Cloud Data Engineer

ABOUT CLIENT

Our client is a leading global technology company that provides a wide range of IT services and solutions. With a strong focus on innovation and digital transformation, our client helps businesses adapt to the ever-changing technological landscape. Their expertise in areas like cloud computing, cybersecurity, and AI makes them a valuable partner for organizations.

JOB DESCRIPTION

Develop and operate data processing pipelines on Google Cloud Platform (GCP).
Collaborate with implementation teams throughout the project lifecycle, providing deep technical expertise in deploying enterprise-scale data solutions and leveraging modern data/analytics technologies on GCP.
Create data processing pipelines and architectures.
Automate DevOps procedures for all components of the data pipelines, ensuring seamless transition from development to production.
Translate business challenges into technical data problems while incorporating essential business drivers in coordination with product management.
Extract, load, transform, sanitize, and validate data.
Provide support and issue resolution for data pipelines.
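The extract/sanitize/validate responsibilities above can be sketched as a minimal pipeline. This is an illustrative, stdlib-only sketch, not the client's actual stack: on GCP these stages would typically run as Dataflow or Airflow tasks loading into BigQuery, and every name below (RAW_CSV, the column names, the function names) is hypothetical.

```python
# Minimal ELT-style pipeline sketch: extract raw CSV, sanitize values,
# validate rows. Plain Python, no cloud dependencies; all names are
# illustrative.
import csv
import io

RAW_CSV = """user_id,amount,country
1, 19.90 ,VN
2,not_a_number,SG
3,42.00,vn
"""

def extract(raw: str) -> list[dict]:
    """Read raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def sanitize(rows: list[dict]) -> list[dict]:
    """Trim whitespace and normalise country codes to upper case."""
    return [
        {k: v.strip() for k, v in row.items()}
        | {"country": row["country"].strip().upper()}
        for row in rows
    ]

def validate(rows: list[dict]) -> list[dict]:
    """Keep only rows whose amount parses as a number."""
    valid = []
    for row in rows:
        try:
            row["amount"] = float(row["amount"])
            valid.append(row)
        except ValueError:
            pass  # in production: route to a dead-letter table instead
    return valid

def run_pipeline(raw: str) -> list[dict]:
    return validate(sanitize(extract(raw)))

if __name__ == "__main__":
    for row in run_pipeline(RAW_CSV):
        print(row)
```

In a real deployment each stage would be a separately monitored task so that failures surface per-stage rather than per-pipeline.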

JOB REQUIREMENT

Minimum of 4 years of experience in Data Engineering or a similar role
Strong Cloud-based Data Engineering experience in AWS, Azure, or GCP with at least 2 years of Cloud experience
Proficiency in GCP Cloud Data Engineering, including general infrastructure and data services such as BigQuery, Dataflow, Airflow, and Cloud Functions
Proficiency in AWS Cloud Data Engineering, including data pipeline technologies such as Lake Formation, MWAA, and EMR, plus storage and catalog services such as S3 and Glue
Proficiency in Azure Cloud Data Engineering, including Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Synapse
Successful design and implementation of large and complex data solutions using various architectural patterns such as Microservices
Advanced skills in SQL and Python
Experience with DataOps
Experience in using DevOps on Cloud data platforms such as Terraform for Infrastructure as Code (IaC), GitOps, Docker, and Kubernetes
Strong educational background in Information Technology (IT) or Information and Communication Technology (ICT)
Ability to influence both technical and business peers and stakeholders
Fluent verbal communication in English
Experience in Marketing domains is preferred
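As an illustration only (not part of the posting), the "advanced skills in SQL and Python" requirement above typically means comfort with patterns such as window functions. The sketch below runs one against an in-memory SQLite database so it is self-contained; the table and column names are hypothetical.

```python
# Illustrative window-function query of the kind "advanced SQL" implies:
# rank orders by amount within each region. Runs against in-memory SQLite
# so no external database is needed; all names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('north', 100), ('north', 300), ('south', 50), ('south', 250);
""")

query = """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY region, rnk;
"""
rows = conn.execute(query).fetchall()
for region, amount, rnk in rows:
    print(region, amount, rnk)
```

The same PARTITION BY / ORDER BY pattern carries over directly to BigQuery, Synapse, or Databricks SQL.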

WHAT'S ON OFFER

This position offers hybrid working arrangements, with three days working in the office and flexible hours.
Salary is negotiable based on candidate expectations.
Employees are entitled to 18 days of paid leave annually: 12 days of annual leave and 6 days of personal leave.
Insurance contributions are based on full salary; a 13th-month salary and performance bonuses are also provided.
A monthly meal allowance of 730,000 VND is provided.
Employees receive 100% of full salary and benefits from the start of employment.
Medical benefits are extended to the employee and their family.
The work environment is fast-paced, flexible, and multicultural with opportunities for travel to 49 countries.
The company provides complimentary snacks, refreshments, and parking facilities.
Internal training programs covering technical, functional, and English language skills are offered.
The regular working hours are from 08:30 AM to 06:00 PM, Monday to Friday, inclusive of meal breaks.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsource
Technical Skills: Data Engineering, Cloud, Google Cloud, ETL/ELT
Location: Ho Chi Minh - Viet Nam
Working Policy: Hybrid
Salary: Negotiable
Job ID: J01454
Status: Closed

Related Job:

Software Engineer

Ho Chi Minh - Viet Nam


Outsource

  • Azure
  • .NET

  • Creating API-based and event-driven integration solutions
  • Developing integration solutions following Azure best practices and cloud-native patterns
  • Constructing integrations using Azure Integration Services like Logic Apps, Functions, API Management, Service Bus, and Event Hubs
  • Installing and managing SAP integrations, such as SAP S/4HANA, SAP PI/PO, or SAP BTP Integration Suite
  • Building and maintaining integrations using C# and the .NET ecosystem
  • Utilizing Infrastructure as Code practices with tools like Terraform
  • Ensuring secure authentication, authorization, and API security utilizing OAuth and best practices
  • Working with architects, developers, and clients to devise end-to-end integration solutions
  • Assisting in deployments, monitoring, and continuous improvement of integration platforms, ensuring reliability and observability in production environments

Negotiation


Senior .NET Engineer

Ho Chi Minh - Viet Nam


Product

  • .NET

  • Take charge of complex workflows: Collaborate with stakeholders to implement and integrate end-to-end processes, from claim intake to booking, stay, and payment platform.
  • Develop scalable, distributed systems: Build resilient backend services using .NET, with a focus on microservices and ensuring high system reliability.
  • Work on integration-heavy systems: Connect with external insurance and accommodation providers as well as internal systems using APIs and messaging patterns.
  • Ensure system quality and reliability: Write unit and integration tests, troubleshoot production issues, and maintain high standards for performance and stability.
  • Contribute to ongoing improvement: Refine and optimize existing systems, enhance architecture, and embrace best practices in software design.
  • Collaborate in a cross-functional environment: Partner with Dev, PM, and QA engineers to deliver high-quality solutions.
  • Drive technical documentation: Maintain clear and structured documentation to support system evolution and onboarding.

Negotiation


Locomotion Research Engineer

Others - Singapore


Product

  • Create and train RL locomotion policies for various movement types
  • Establish and maintain simulation environments using custom actuator models to replicate hardware characteristics
  • Implement domain randomization strategies to address simulation-to-reality discrepancies
  • Validate and fine-tune locomotion controllers in simulation and on physical platforms
  • Utilize Data Engine telemetry data to refine simulation parameters
  • Collaborate with different teams on issues related to locomotion performance
  • Contribute to open-source releases of locomotion models, training code, and simulation assets

Negotiation
