Senior/Lead Data Engineer (Data Platform / MLOps)

JOB DESCRIPTION

You will be responsible for designing, managing, and enhancing the data systems and workflows that drive key business decisions. The role is roughly 75% data engineering, building and optimizing data pipelines and architectures, and 25% data science support, collaborating with data science teams on machine learning workflows and advanced analytics. You will use technologies such as Python, Airflow, Kubernetes, and AWS to deliver high-quality data solutions.
Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders.
Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments.
Develop and optimize large-scale, batch, and real-time data processing systems to enhance operational efficiency and meet business objectives.
Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency.
Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows.
Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.
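As a purely illustrative sketch of the testing-and-validation responsibility above (all field names and rule names are hypothetical, not part of the role's actual stack), a row-level quality gate in plain Python might look like:

```python
# Minimal, hypothetical data-validation sketch: check each record in a batch
# against a few row-level rules before it is loaded downstream.

def validate_record(record):
    """Return a list of rule names the record violates (empty list = valid)."""
    failures = []
    if not record.get("id"):
        failures.append("missing_id")
    if record.get("amount") is not None and record["amount"] < 0:
        failures.append("negative_amount")
    if "@" not in record.get("email", ""):
        failures.append("bad_email")
    return failures

def validate_batch(records):
    """Split a batch into (valid, rejected); rejected items carry reasons."""
    valid, rejected = [], []
    for rec in records:
        failures = validate_record(rec)
        if failures:
            rejected.append({"record": rec, "failures": failures})
        else:
            valid.append(rec)
    return valid, rejected

batch = [
    {"id": "a1", "amount": 10.0, "email": "x@example.com"},
    {"id": "", "amount": -5.0, "email": "nope"},
]
valid, rejected = validate_batch(batch)
print(len(valid), len(rejected))  # prints "1 1"
```

In a real pipeline this kind of check would typically run as a task inside an orchestrator such as Airflow, with rejected records routed to a quarantine location for review.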

JOB REQUIREMENT

7-8+ years of dedicated experience as a Data Engineer.
Core Expertise: Proficiency in Python for data processing and scripting (pandas, pyspark), workflow automation (Apache Airflow), and experience with AWS services (Glue, S3, EC2, Lambda).
Containerization & Orchestration: Experience working with Kubernetes and Docker for managing containerized environments in the cloud.
Data Engineering Tools: Hands-on experience with columnar and big data databases (Athena, Redshift, Vertica, Hive/Hadoop), along with version control systems like Git.
Cloud Services: Strong familiarity with AWS services for cloud-based data processing and management.
CI/CD Pipeline: Experience with CI/CD tools such as Jenkins, CircleCI, or AWS CodePipeline for continuous integration and deployment.
Data Engineering Focus (75%): Expertise in building and managing robust data architectures and pipelines for large-scale data operations.
Data Science Support (25%): Ability to support data science workflows, including collaboration on data preparation, feature engineering, and enabling experimentation environments.
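As a hedged sketch of the data-preparation and feature-engineering support described above (the helper and values are illustrative only, not a prescribed implementation), min-max scaling a numeric feature for downstream modelling could look like:

```python
# Illustrative feature-engineering helper: min-max scale a numeric column
# to the [0, 1] range, a common preparation step before model training.

def min_max_scale(values):
    """Scale a list of numbers to [0, 1]; constant columns map to 0.0."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

ages = [20, 30, 40, 60]
print(min_max_scale(ages))  # prints [0.0, 0.25, 0.5, 1.0]
```

At production scale the same transformation would usually be expressed in pandas or PySpark rather than plain lists; the logic is identical.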
Nice-to-have requirements:
LangChain Experience: Familiarity with LangChain for building data applications involving natural language processing or conversational AI frameworks.
Advanced Data Science Tools: Experience with AWS SageMaker or Databricks for enabling machine learning environments.
Databases: Familiarity with both RDBMS (MySQL, PostgreSQL) and NoSQL (DynamoDB, Redis) databases.
BI Tools: Experience with enterprise BI tools such as Tableau, Looker, or Power BI.
Messaging & Event Streaming: Familiarity with distributed messaging systems like Kafka or RabbitMQ for event streaming.
Monitoring & Logging: Experience with monitoring and log management tools such as the ELK stack or Datadog.
Data Privacy and Security: Knowledge of best practices for ensuring data privacy and security, particularly in large data infrastructures.
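As a small, hypothetical illustration of the data-privacy practices mentioned above (not a prescribed implementation; the key is a placeholder), pseudonymizing an identifier with a keyed hash might look like:

```python
# Illustrative PII pseudonymization sketch: replace a raw identifier with a
# keyed SHA-256 digest so records can still be joined across datasets
# without exposing the original value.
import hashlib
import hmac

# Hypothetical placeholder; in practice, load from a secrets manager.
SECRET_KEY = b"example-rotation-key"

def pseudonymize(identifier: str) -> str:
    """Return a stable, keyed hash of the identifier as a hex string."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

token = pseudonymize("user@example.com")
print(len(token))  # prints 64 (hex length of a SHA-256 digest)
```

Using a keyed HMAC rather than a bare hash means the mapping cannot be reversed by brute-forcing common values without the key, and rotating the key invalidates old tokens.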

WHAT'S ON OFFER

Competitive salary
13th-month salary guarantee
Performance bonus
Professional English course for employees
Premium health insurance
Extensive annual leave

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI, an IT recruitment consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsource
Technical Skills:
Location: Ho Chi Minh, Ha Noi - Viet Nam
Working Policy: Hybrid
Salary: Negotiable
Job ID: J01942
Status: Closed
