Cloud Data Engineer

ABOUT CLIENT

Our client is a leading global technology company that provides a wide range of IT services and solutions. With a strong focus on innovation and digital transformation, our client helps businesses adapt to the ever-changing technological landscape. Their expertise in areas like cloud computing, cybersecurity, and AI makes them a valuable partner for organizations.

JOB DESCRIPTION

Develop and operate data processing pipelines using Google Cloud Platform (GCP).
Collaborate with implementation teams throughout the project lifecycle to offer extensive technical proficiency for deploying enterprise-scale data solutions and leveraging contemporary data/analytics technologies on GCP.
Create data processing pipelines and architectures.
Automate DevOps procedures for all components of the data pipelines, ensuring seamless transition from development to production.
Translate business challenges into technical data problems while incorporating essential business drivers in coordination with product management.
Extract, load, transform, cleanse, and validate data.
Provide assistance and resolution for issues related to data pipelines.

JOB REQUIREMENT

Minimum of 4 years of experience in Data Engineering or a similar role
Strong Cloud-based Data Engineering experience in AWS, Azure, or GCP with at least 2 years of Cloud experience
Proficiency in GCP Cloud Data Engineering, including general infrastructure and data services such as BigQuery, Dataflow, Cloud Composer (Airflow), and Cloud Functions
Proficiency in AWS Cloud Data Engineering, including data pipeline technologies like Lake Formation, MWAA, EMR, and Glue, and storage technologies like S3
Proficiency in Azure Cloud Data Engineering, including Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Synapse
Successful design and implementation of large and complex data solutions using various architectural patterns such as Microservices
Advanced skills in SQL and Python
Experience with DataOps
Experience in using DevOps on Cloud data platforms such as Terraform for Infrastructure as Code (IaC), GitOps, Docker, and Kubernetes
Strong educational background in Information Technology (IT) or Information and Communication Technology (ICT)
Ability to influence both technical and business peers and stakeholders
Fluent verbal communication in English
Experience in Marketing domains is preferred

WHAT'S ON OFFER

This position offers a hybrid working arrangement, with three days per week in the office and flexible hours.
Salary is negotiable based on candidate expectations.
Employees are entitled to 18 days of paid leave annually, comprising 12 days of annual leave and 6 days of personal leave.
Insurance contributions are based on full salary, and compensation includes a 13th-month salary and performance bonuses.
A monthly meal allowance of 730,000 VND is provided.
Employees receive 100% of salary and full benefits from the start of employment.
Medical benefits are extended to the employee and their family.
The work environment is fast-paced, flexible, and multicultural with opportunities for travel to 49 countries.
The company provides complimentary snacks, refreshments, and parking facilities.
Internal training programs covering technical, functional, and English language skills are offered.
Regular working hours are 08:30 AM to 06:00 PM, Monday to Friday, inclusive of meal breaks.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsource
Technical Skills: Data Engineering, Cloud, Google Cloud, ETL/ELT
Location: Ho Chi Minh - Viet Nam
Working Policy: Hybrid
Salary: Negotiation
Job ID: J01454
Status: Close

Related Job:

Senior Signal Processing Engineer

Others - Viet Nam


Outsource

  • Python

Design and improve rPPG/TOI pipelines using RGB/IR video with motion/illumination compensation. Implement multi-stage preprocessing, denoising, and quality scoring; examples include adaptive filtering, ICA/PCA variants, color-space transforms, skin ROI stabilization, and signal confidence metrics. Build cross-device normalization strategies and error-bounded estimators. Define biomarker-level acceptance criteria and failure modes for consumer-grade capture. Partner with iOS and ML teams to integrate algorithms into on-device or hybrid pipelines. Produce technical documentation suitable for regulatory-risk positioning and Apple review support. Deliverables include a benchmark report across device models, skin tones, lighting, and motion conditions; a biomarker feature specification sheet with recommended thresholds and confidence bands; and A/B results showing improvements in stability, missingness, and downstream inference performance.

Negotiation


Senior Mobile App Development Engineer

Others - Viet Nam


Outsource

  • iOS

Develop and enhance capture flows using TrueDepth, ARKit, AVFoundation, and CoreMotion. Convert algorithm requirements into reliable on-device data collection and frame synchronization. Create a user-friendly quality-control interface and capture-state instrumentation, including lighting checks, face alignment, motion thresholds, occlusion detection, and retake prompts. Improve latency, thermal behavior, and battery usage for 10-second diagnostic-style capture. Ensure telemetry handling complies with data-privacy regulations. Create technical documentation on specific API usage and data handling for Apple review. Build a modular capture SDK layer with feature flags and logging capabilities. Develop a cross-device performance matrix and tuning guide. Integrate deterministic quality-control gates to minimize noisy biomarker outputs during capture.

Negotiation


Senior Data Analyst, Anti-spoofing Operation

Ho Chi Minh - Viet Nam


Product

  • Data Analyst

Lead and develop analytical initiatives to address fraud detection challenges, from problem definition through implementation, monitoring, and continuous optimization. Spearhead strategic efforts to combat emerging fraud patterns and develop innovative detection approaches through data analysis, testing, and impact measurement. Design and implement production-grade data pipelines, automated workflows, and advanced analytical frameworks using statistical methods, machine learning, and geospatial analysis. Conduct in-depth analyses of complex fraud operations and develop behavioral analytics systems. Translate technical findings into actionable recommendations for leadership, product teams, and engineering. Lead compliance analytics initiatives and define evaluation metrics for new features. Provide technical guidance and mentorship, establish best practices, and contribute to hiring top talent in analytics.

Negotiation
