Senior/Lead Data Engineer (Data Platform / MLOps)

JOB DESCRIPTION

You will be responsible for designing, managing, and enhancing the data systems and workflows that drive key business decisions. The role is roughly 75% data engineering, building and optimizing data pipelines and architectures, and 25% data science support, collaborating with data science teams on machine learning workflows and advanced analytics. You will leverage technologies such as Python, Airflow, Kubernetes, and AWS to deliver high-quality data solutions.
Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders.
Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments.
Develop and optimize large-scale batch and real-time data processing systems to enhance operational efficiency and meet business objectives.
Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency.
Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows.
Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.
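To give a flavor of the day-to-day work described above, here is a minimal illustrative sketch of an extract-and-transform step. All function names and data are hypothetical; in production each function would typically be wrapped as an Apache Airflow task, with the extract reading from S3 and the result loaded into a warehouse such as Redshift.

```python
# Illustrative sketch only: the kind of pipeline logic this role involves.
# In a real deployment these functions would be Airflow tasks inside a DAG,
# scheduled and monitored by the orchestrator.

def extract() -> list[dict]:
    # Stand-in for reading a daily extract from S3 (e.g. via boto3 or an S3 hook).
    return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 80.0}]

def transform(rows: list[dict]) -> float:
    # Aggregate daily revenue; a separate load step could write this to Redshift.
    return round(sum(r["amount"] for r in rows), 2)

def run_pipeline() -> float:
    # Chains the steps the way a DAG would wire task dependencies.
    return transform(extract())
```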

JOB REQUIREMENT

7-8+ years of dedicated experience as a Data Engineer.
Core Expertise: Proficiency in Python for data processing and scripting (pandas, PySpark) and workflow automation (Apache Airflow), plus hands-on experience with AWS services (Glue, S3, EC2, Lambda).
Containerization & Orchestration: Experience working with Kubernetes and Docker for managing containerized environments in the cloud.
Data Engineering Tools: Hands-on experience with columnar and big data databases (Athena, Redshift, Vertica, Hive/Hadoop), along with version control systems like Git.
Cloud Services: Strong familiarity with AWS services for cloud-based data processing and management.
CI/CD Pipeline: Experience with CI/CD tools such as Jenkins, CircleCI, or AWS CodePipeline for continuous integration and deployment.
Data Engineering Focus (75%): Expertise in building and managing robust data architectures and pipelines for large-scale data operations.
Data Science Support (25%): Ability to support data science workflows, including collaboration on data preparation, feature engineering, and enabling experimentation environments.
Nice-to-have requirements:
LangChain Experience: Familiarity with LangChain for building data applications involving natural language processing or conversational AI.
Advanced Data Science Tools: Experience with Amazon SageMaker or Databricks for enabling machine learning environments.
Databases: Familiarity with both relational (MySQL, PostgreSQL) and NoSQL (DynamoDB, Redis) databases.
BI Tools: Experience with enterprise BI tools like Tableau, Looker, or Power BI.
Messaging & Event Streaming: Familiarity with distributed messaging systems like Kafka or RabbitMQ for event streaming.
Monitoring & Logging: Experience with monitoring and log management tools such as the ELK stack or Datadog.
Data Privacy and Security: Knowledge of best practices for ensuring data privacy and security, particularly in large data infrastructures.

WHAT'S ON OFFER

Competitive salary
13th-month salary guarantee
Performance bonus
Professional English course for employees
Premium health insurance
Extensive annual leave

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsource
Technical Skills:
Location: Ho Chi Minh, Ha Noi - Viet Nam
Working Policy: Hybrid
Salary: Negotiation
Job ID: J01942
Status: Close
