Cloud Platform Engineers/Architects

ABOUT CLIENT

Our client is a leading global technology company that provides a wide range of IT services and solutions. With a strong focus on innovation and digital transformation, our client helps businesses adapt to the ever-changing technological landscape. Their expertise in areas like cloud computing, cybersecurity, and AI makes them a valuable partner for organizations.

JOB DESCRIPTION

Create, execute, and oversee cloud-based infrastructure on GCP and AWS
Develop, automate, and enhance data platform provisioning, scaling, and maintenance tasks
Prioritize security across all infrastructure and data platform operations
Construct automations and frameworks to streamline deployment, scaling, and data analytics tasks
Deliver high-quality content using appropriate document templates
Keep informed about industry best practices and emerging technologies to enhance Platform & Infrastructure
Employ containerization technologies such as Docker and orchestration tools like Kubernetes
Implement CI/CD pipelines and adhere to DevOps principles
Use Python and other common programming languages for automation duties
Collaborate with APIs and scripting/frameworks for operational efficiency
Implement and utilize monitoring and logging tools effectively
Collaborate with a variety of operating systems for compatibility and efficiency
Manage network configurations ensuring reliability, security, and performance
Troubleshoot and resolve infrastructure, applications, and networking issues

JOB REQUIREMENT

Engineer level:
The candidate should be proficient in programming/scripting languages such as Java, Python, SQL, and Bash, and have experience with Infrastructure as Code (IaC) tools such as Terraform.
They should have a strong understanding of DevOps practices, CI/CD processes, and container orchestration, with a preference for experience in Cloud platforms, particularly GCP.
A minimum of 4 years of experience is required, along with fundamentals of Linux operating systems and Git source control.
Experience with data pipelines, ETL/ELT fundamentals, and SQL is essential, as are good verbal communication skills in English.
Additionally, expertise in Docker and Kubernetes, knowledge of Istio service mesh and routing, and proficiency in SQL and dbt for data processing are necessary.
Familiarity with DevSecOps practices, including static analysis, composition analysis, vulnerability scanning, and secret management, is an advantage.
A solid understanding of CI/CD principles is essential, and experience with cloud providers such as GCP and AWS is preferred.
The candidate should be familiar with a range of tooling and frameworks, including GitHub for version control, Artifactory for artifact management, Codefresh for CI/CD pipelines, and Tableau for reporting.
Experience with Airflow for workflow automation, Helm or Kustomize for Kubernetes management, Twistlock for container security, Checkmarx for static application security testing, Black Duck for open-source security scanning, Ansible for configuration management and automation, and OpenShift for container orchestration is also beneficial.
Qualifications for this position include a Bachelor's degree in Computer Science or an Engineering field and certification in Google Cloud Platform (GCP) and/or Amazon Web Services (AWS).
Frontend skills such as JavaScript, Node.js, and React, along with Azure AD knowledge, are considered good-to-have requirements.
 
Architect level:
An architect-level candidate should have all the qualifications above, as well as a minimum of 8 years of experience with strong expertise in cloud architecture, particularly in GCP.
They should also have experience designing resilient, scalable frameworks to address business needs, and be capable of collaborating with stakeholders to translate business requirements into solutions, bridging cross-functional teams with strong communication skills.

WHAT'S ON OFFER

This position offers hybrid working arrangements, with three days working in the office and flexible hours.
Salary is negotiable based on candidate expectations.
Employees are entitled to 18 days of paid leave annually: 12 days of annual leave and 6 days of personal leave.
Insurance contributions are calculated on full salary, and employees receive a 13th-month salary and performance bonuses.
A monthly meal allowance of 730,000 VND is provided.
Employees receive 100% of salary and full benefits from the start of employment.
Medical benefits are extended to the employee and their family.
The work environment is fast-paced, flexible, and multicultural with opportunities for travel to 49 countries.
The company provides complimentary snacks, refreshments, and parking facilities.
Internal training programs covering technical, functional, and English language skills are offered.
The regular working hours are from 08:30 AM to 06:00 PM on Mondays to Fridays, inclusive of meal breaks.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn. Thank you!

Job Summary

Company Type: Information Technology & Services
Technical Skills: DevOps, Google Cloud
Location: Ho Chi Minh - Viet Nam
Working Policy: Hybrid
Salary: Negotiable
Job ID: J01637
Status: Closed
