Cloud Data Engineer

ABOUT CLIENT

Our client is a leading global technology company that provides a wide range of IT services and solutions. With a strong focus on innovation and digital transformation, our client helps businesses adapt to the ever-changing technological landscape. Their expertise in areas like cloud computing, cybersecurity, and AI makes them a valuable partner for organizations.

JOB DESCRIPTION

Develop and execute data processing pipelines using Google Cloud Platform (GCP).
Collaborate with implementation teams throughout the project lifecycle, providing deep technical expertise for deploying enterprise-scale data solutions and leveraging contemporary data/analytics technologies on GCP.
Create data processing pipelines and architectures.
Automate DevOps procedures for all components of the data pipelines, ensuring a seamless transition from development to production.
Translate business challenges into technical data problems while incorporating essential business drivers, in coordination with product management.
Extract, load, transform, cleanse, and verify data (a minimal pipeline sketch follows this list).
Provide assistance and resolution for issues related to data pipelines.
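
For orientation only, the following is a minimal sketch of what such a pipeline might look like: a small Apache Beam job (runnable locally or on Dataflow) that reads CSV events from Cloud Storage, drops rows without a user id as a basic sanitization step, and appends the result to a BigQuery table. The project, bucket, dataset, table, and column names are hypothetical placeholders, not details of the client's environment.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    # Assumes a three-column CSV layout: user_id,event,event_ts
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "event_ts": ts}


def run():
    options = PipelineOptions(
        runner="DataflowRunner",              # use "DirectRunner" to test locally
        project="example-project",            # hypothetical project id
        region="asia-southeast1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
            | "Parse" >> beam.Map(parse_row)
            | "DropEmpty" >> beam.Filter(lambda r: r["user_id"])   # basic sanitization
            | "Load" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="user_id:STRING,event:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()

The same structure runs unchanged from a local DirectRunner test to a managed Dataflow deployment; only the pipeline options differ between environments.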

JOB REQUIREMENT

Minimum of 4 years of experience in Data Engineering or a similar role
Strong Cloud-based Data Engineering experience in AWS, Azure, or GCP with at least 2 years of Cloud experience
Proficiency in GCP Cloud Data Engineering, including general infrastructure and data services such as BigQuery, Dataflow, Airflow, and Cloud Functions (a minimal orchestration sketch follows this list)
Proficiency in AWS Cloud Data Engineering, including data pipeline technologies such as Lake Formation, MWAA, EMR, and Glue, and storage technologies such as S3
Proficiency in Azure Cloud Data Engineering, including Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Synapse
Successful design and implementation of large and complex data solutions using various architectural patterns such as Microservices
Advanced skills in SQL and Python
Experience with DataOps
Experience with DevOps tooling on cloud data platforms, such as Terraform for Infrastructure as Code (IaC), GitOps, Docker, and Kubernetes
Strong educational background in Information Technology (IT) or Information and Communication Technology (ICT)
Ability to influence both technical and business peers and stakeholders
Fluent verbal communication in English
Experience in the Marketing domain is preferred
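
As a similarly hedged illustration of the Airflow and BigQuery items above, the sketch below defines a minimal daily Airflow DAG that runs a single BigQuery transformation through the standard Google provider operator (BigQueryInsertJobOperator). The DAG id, dataset, table, and region are assumptions made for the example only.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                # Aggregate raw events into a daily reporting table
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_event_counts AS
                    SELECT event, DATE(event_ts) AS day, COUNT(*) AS n
                    FROM analytics.events
                    GROUP BY event, day
                """,
                "useLegacySql": False,
            }
        },
        location="asia-southeast1",
    )

Keeping such a DAG, its SQL, and the underlying infrastructure definitions in version control is what ties the DataOps and GitOps requirements above together in day-to-day work.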

WHAT'S ON OFFER

This position offers a hybrid working arrangement, with three days per week in the office and flexible hours.
Salary is negotiable based on candidate expectations.
Employees are entitled to 18 days of paid leave annually, comprising 12 days of annual leave and 6 days of personal leave.
Insurance coverage is based on full salary; a 13th-month salary and performance bonuses are also provided.
A monthly meal allowance of 730,000 VND is provided.
Employees receive 100% full salary and benefits from the start of employment.
Medical benefits are extended to the employee and their family.
The work environment is fast-paced, flexible, and multicultural with opportunities for travel to 49 countries.
The company provides complimentary snacks, refreshments, and parking facilities.
Internal training programs covering technical, functional, and English language skills are offered.
Regular working hours are 08:30 AM to 06:00 PM, Monday to Friday, inclusive of meal breaks.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsource
Technical Skills: Data Engineering, Cloud, Google Cloud, ETL/ELT
Location: Ho Chi Minh - Viet Nam
Working Policy: Hybrid
Salary: Negotiation
Job ID: J01454
Status: Close

Related Job:

Senior Deep Learning Algorithms Engineer

Location: Ho Chi Minh, Ha Noi - Viet Nam
Company Type: Product
Technical Skills: Machine Learning, Algorithm

Analyze and optimize deep learning training and inference workloads on advanced hardware and software platforms. Work with researchers and engineers to enhance workload performance. Develop high-quality software for deep learning platforms. Create automated tools for workload analysis and optimization.

Salary: Negotiation

Software Engineer

Location: Ho Chi Minh - Viet Nam
Company Type: Product

Create and develop the API Platform with a focus on reliability, performance, and providing a top-tier developer experience. Deploy and enhance AI/ML models in scalable, production environments in collaboration with research and applied ML teams. Manage and advance a contemporary, cloud-native infrastructure stack utilizing Kubernetes, Docker, and infrastructure-as-code (IaC) tools. Ensure platform dependability by designing and implementing telemetry, monitoring, alerting, autoscaling, failover, and disaster recovery mechanisms. Contribute to developer and operations workflows, encompassing CI/CD pipelines, release management, and on-call rotations. Work collaboratively across teams to implement secure APIs with fine-grained access control, usage metering, and billing integration. Continuously enhance platform performance, cost-efficiency, and observability to accommodate scaling and serve users globally.

Salary: Negotiation

Product Manager (Data & Models)

Location: Ho Chi Minh - Viet Nam
Company Type: Product
Technical Skills: Product Management, AI

Designing data strategy and model integration for creating efficient data pipelines, evaluation frameworks, and annotation systems to maintain high-performance LLMs. Responsible for ensuring data quality standards and implementing bias mitigation and privacy-preserving techniques. Defining the product's core model roadmaps, taking into account technical feasibility, user needs, and ethical considerations. Collaboration with researchers to incorporate experimental breakthroughs into deployable features. Partnering with Engineering and Research teams to ensure model development aligns with product goals and advocating for transparency in model decision-making to build user trust. Analyzing usage patterns from open-source communities (Discord, Reddit, GitHub) to refine model behavior and address real-world edge cases, contributing to community-driven model evolution. Setting performance benchmarks, cost efficiency, and resource utilization standards for model scalability and reliability.

Salary: Negotiation