Big Data Engineer

JOB DESCRIPTION

Selecting and integrating the Big Data tools and frameworks required to provide requested capabilities.
Implementing ETL processes that move data from OLTP databases to OLAP databases and the data lake, using event-streaming platforms such as Kafka.
Developing and maintaining robust data pipelines that transform large datasets and support various use cases with high performance.
Monitoring performance and advising on any necessary infrastructure changes.
Defining data retention and data governance policies and frameworks.
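The OLTP-to-OLAP transform step described above can be sketched as a toy Java example. The record shape and the per-customer revenue aggregation are illustrative assumptions, not the employer's actual pipeline; in production this logic would typically run inside a Kafka Streams or Spark Streaming topology rather than over an in-memory list.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class EtlSketch {

    // An OLTP-style order event: one row per transaction.
    record OrderEvent(String customerId, double amount) {}

    // Aggregate raw events into per-customer revenue - the kind of
    // denormalized view an OLAP store or data lake would serve.
    static Map<String, Double> toRevenueByCustomer(List<OrderEvent> events) {
        return events.stream().collect(Collectors.groupingBy(
                OrderEvent::customerId,
                Collectors.summingDouble(OrderEvent::amount)));
    }

    public static void main(String[] args) {
        List<OrderEvent> batch = List.of(
                new OrderEvent("c1", 10.0),
                new OrderEvent("c2", 5.0),
                new OrderEvent("c1", 2.5));
        Map<String, Double> revenue = toRevenueByCustomer(batch);
        System.out.println(revenue.get("c1") + " " + revenue.get("c2")); // 12.5 5.0
    }
}
```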

JOB REQUIREMENT

At least 5 years' experience with the Java programming language.
At least 5 years' experience with Big Data frameworks such as Java Spring, Kafka Streams, and Spark Streaming.
Experience with large-scale deployment and performance tuning.
Experience with schema design and dimensional data modeling.
Experience with relational and non-relational databases (e.g., MySQL, MongoDB).
Experience building and optimizing big data pipelines, architectures, and datasets.
Experience with data pipeline and workflow management tools.
Fluent written and spoken English.
Strong analytical and problem-solving skills.
Bonus Points if You
Have experience with Delta Lake.
Have good Docker/Kubernetes knowledge.
Have good knowledge of the ELK stack (Elasticsearch, Kibana).

WHAT'S ON OFFER

We have a track record of success and a vision and plan for a promising future. Our company holds close to 100% market share for player-location regulatory compliance in the US gaming space, and we have fuelled that momentum with expansion into new markets: media & entertainment and fintech.
We are proud of our values and we live them in all of our actions, conversations, and work: There’s always a way; Together we can do more; Aim higher. Then higher; Act with integrity; For the greater good.
We are proud to be part of a global team that develops award-winning solutions for some of the world’s largest and most innovative companies.
We will support you on your learning journey. We invest in employee career growth and development. Our learning & development commitment includes leadership and technical development, a substantial budget for education and training, as well as dedicated work hours for self-study.
We care about our team. Our team is talented, has a bias for action, and is known for their positive attitude and energy. Team members are generously rewarded with competitive salaries, incentives, and a comprehensive benefits package.
We care about giving back to the communities in which we live and work. We support a broad range of community initiatives through donations and employee volunteer activities.
We know that work can be fun. We take the time to create employee events and experiences where everyone can connect and celebrate.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity for your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Product
Technical Skills: Data Engineering, Java
Location: Ho Chi Minh - Viet Nam
Working Policy:
Salary: Negotiable
Job ID: J01078
Status: Close

Related Jobs:

Partner Implementation Engineer (Security & Digital Trust)

Ha Noi - Viet Nam


Outsource

Acting as the key implementation engineer, responsible for deploying, configuring, and integrating Security & Digital Trust solutions (PKI, digital signatures, encryption, MFA) into customers' live systems, ensuring they operate stably, securely, and as designed.
System implementation: prepare the environment by verifying infrastructure (server, OS, database, network); install and configure solutions (PKI / CA / digital signatures / MFA / encryption); set up security policies and business workflows; integrate with security hardware (HSM, key management).
Cloud/container deployment (where applicable): deploy systems on Kubernetes / OpenShift; configure resources (YAML: Pod, Service, Ingress, ConfigMap, Secret); set up storage (block storage) and internal networking; apply container security policies.
System integration: support integration with websites, applications, and APIs as well as IAM / SSO / AD / LDAP; guide customers on API/SDK usage; verify data flows and communication security; coordinate with customer teams (Development / Infrastructure / Security).
Testing and acceptance (QA/UAT): perform technical testing and operational scenario testing; support UAT with customers; verify the correctness of digital signatures, certificates, and authentication flows.
Operations and support: monitor systems, analyze logs, and troubleshoot incidents; provide post-deployment support (L2/L3); ensure stable, highly available (HA) operation.
Documentation and handover: produce deployment documentation (architecture, configuration); provide operational guidance for customers; deliver basic technical training.

Negotiable


AI Product Builder

Ha Noi - Viet Nam


Product

  • AI
  • Backend
  • Frontend
  • DevOps
  • Java
  • Golang
  • Product Management

Collaborate with domain experts to develop business requirements and constraints for designing prompt-driven, AI-assisted workflows and system specifications.
Utilize AI tools, no-code/low-code platforms, and coding to rapidly prototype UI/UX mockups and foundational implementations.
Test prototypes through hypothesis-validation cycles and provide detailed handovers to engineering teams.
Decode legacy specifications and enhance existing products with AI-assisted analysis and implementation.
Continuously enhance the product team's tooling, templates, and practices to adapt to changes in models and platforms.

Negotiable


DevOps Engineer

Others - Viet Nam


Product

  • DevOps
  • Kubernetes
  • Network

Managing and developing our Kubernetes platform across multiple clusters and environments, including production, development, on-premises, and public cloud.
Designing and overseeing hybrid cloud infrastructure across on-premises and public clouds (such as GCP and AWS), including workload placement, cross-cloud networking, and unified resource management.
Taking responsibility for the end-to-end CI/CD and GitOps process, including container build pipelines, image optimization, and progressive delivery using tools like ArgoCD/FluxCD.
Taking charge of the observability stack to provide a comprehensive view across all clusters using tools like Grafana, Mimir, Tempo, Loki, Pyroscope, OnCall, and Prometheus, and supporting agent-assisted SRE workflows.
Managing and enhancing our inference platform, including vLLM serving and AIBrix for multi-model orchestration and autoscaling on a fleet of NVIDIA GPUs.
Operating platform services such as Kafka, Redis, PostgreSQL, and OpenSearch.
Managing identity and access management with Keycloak integrated with Google Workspace, strengthening SSO, RBAC, and secrets management across the platform.
Strengthening network security across private load balancers, firewalls, and VPC segmentation, and designing and maintaining hub-and-spoke/multi-AZ topologies.
Supporting training infrastructure with self-service VM provisioning, RunPod burst capacity, and Weights & Biases integration.
Driving infrastructure reliability, cost efficiency, and capacity planning as the platform scales.

Negotiable
