Cloud Data Engineer

ABOUT CLIENT

Our client is a leading global technology company that provides a wide range of IT services and solutions. With a strong focus on innovation and digital transformation, our client helps businesses adapt to the ever-changing technological landscape. Their expertise in areas like cloud computing, cybersecurity, and AI makes them a valuable partner for organizations.

JOB DESCRIPTION

Develop and execute the data processing pipeline using Google Cloud Platform (GCP).
Collaborate with implementation teams throughout the project lifecycle, providing deep technical expertise in deploying enterprise-scale data solutions and leveraging modern data/analytics technologies on GCP.
Create data processing pipelines and architectures.
Automate DevOps procedures for all components of the data pipelines, ensuring seamless transition from development to production.
Translate business challenges into technical data problems while incorporating essential business drivers in coordination with product management.
Extract, transform, load, cleanse, and validate data.
Troubleshoot and resolve data pipeline issues.
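For illustration, the extract-transform-load and data-validation responsibilities above can be sketched as a minimal pipeline. This is a hypothetical, self-contained example with made-up data; a real pipeline on GCP would read from and write to services such as BigQuery or Cloud Storage.

```python
# Minimal ETL sketch (hypothetical data; names are illustrative only).

def extract():
    """Simulate pulling raw records from a source system."""
    return [
        {"id": "1", "amount": " 10.5 "},
        {"id": "2", "amount": "oops"},  # invalid row, to be filtered out
        {"id": "3", "amount": "7"},
    ]

def transform(rows):
    """Cleanse and validate: strip whitespace, coerce types, drop bad rows."""
    clean = []
    for row in rows:
        try:
            clean.append({"id": int(row["id"]),
                          "amount": float(row["amount"].strip())})
        except ValueError:
            continue  # discard records that fail validation
    return clean

def load(rows, sink):
    """Append validated rows to a destination (here, an in-memory list)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # number of rows that survived validation
```

In production the same extract/transform/load boundaries would typically map onto Dataflow steps or Airflow tasks, with the validation logic deciding which records are quarantined rather than silently dropped.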

JOB REQUIREMENT

Minimum of 4 years of experience in Data Engineering or a similar role
Strong Cloud-based Data Engineering experience in AWS, Azure, or GCP with at least 2 years of Cloud experience
Proficiency in GCP Cloud Data Engineering, including general infrastructure and data services such as BigQuery, Dataflow, Airflow, and Cloud Functions
Proficiency in AWS Cloud Data Engineering, including data pipeline technologies such as Lake Formation, MWAA, EMR, and Glue, and storage such as S3
Proficiency in Azure Cloud Data Engineering, including Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Synapse
Proven design and implementation of large, complex data solutions using architectural patterns such as microservices
Advanced skills in SQL and Python
Experience with DataOps
Experience in using DevOps on Cloud data platforms such as Terraform for Infrastructure as Code (IaC), GitOps, Docker, and Kubernetes
Strong educational background in Information Technology (IT) or Information and Communication Technology (ICT)
Ability to influence both technical and business peers and stakeholders
Fluent verbal communication in English
Experience in the Marketing domain is preferred

WHAT'S ON OFFER

This position offers hybrid working arrangements, with three days working in the office and flexible hours.
Salary is negotiable based on candidate expectations.
Employees are entitled to 18 days of paid leave annually, comprising 12 days of annual leave and 6 days of personal leave.
Insurance contributions are based on full salary; employees also receive a 13th-month salary and performance bonuses.
A monthly meal allowance of 730,000 VND is provided.
Employees receive 100% of their salary and benefits from the start of employment.
Medical benefits are extended to the employee and their family.
The work environment is fast-paced, flexible, and multicultural with opportunities for travel to 49 countries.
The company provides complimentary snacks, refreshments, and parking facilities.
Internal training programs covering technical, functional, and English language skills are offered.
Regular working hours are 08:30 AM to 06:00 PM, Monday to Friday, inclusive of meal breaks.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Information Technology & Services
Technical Skills: Data Engineering, Cloud, Google Cloud, ETL/ELT
Location: Ho Chi Minh - Viet Nam
Salary: Negotiation
Job ID: J01454
Status: Closed
