Solution Architect (Cloud Data)

JOB DESCRIPTION

Overview: We are seeking a Solution Architect (Cloud Data) to join our team. This role involves engineering robust data pipelines and enhancing our data handling capabilities. The ideal candidate will be adept at merging data from multiple sources into actionable insights and at ensuring seamless data integration across platforms.
Responsibilities:
Design and Implement Data Pipelines: Construct scalable, efficient pipelines that turn raw data from diverse sources into consumable formats, using cloud technologies and ETL processes (a minimal sketch follows this list).
Data Sourcing and Gap Analysis: Proactively explore data sources to identify and resolve data issues; conduct data profiling to understand source problems and ensure high data quality.
Estimate Timelines and Workloads: Accurately estimate the time and effort required to develop data pipelines and close gaps, ensuring timely delivery of projects.
Navigate Through Ambiguity: Understand and define the transition from the current to the desired data state, developing strategies to bridge gaps in data handling and infrastructure.
Master Data Management: Address challenges related to combining multiple data sources, focusing on maintaining integrity and consistency across data sets.
Stakeholder Communication: Regularly update stakeholders on project status, issues, and milestones, ensuring transparency and alignment with business goals.
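
As a hedged illustration of the end-to-end pipeline work above, here is a minimal extract-transform-load sketch in Python with pandas and SQLAlchemy. The source URL, column names, and destination table are hypothetical placeholders, not details from this posting.

    # A minimal ETL sketch: pull raw CSV data, normalize it, and load it into
    # a warehouse table. All names below are illustrative assumptions.
    import pandas as pd
    from sqlalchemy import create_engine

    def extract(url: str) -> pd.DataFrame:
        # Extract: read raw data from one of potentially many sources.
        return pd.read_csv(url)

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Transform: deduplicate, fix types, and drop invalid rows.
        df = df.drop_duplicates(subset=["order_id"])
        df["order_date"] = pd.to_datetime(df["order_date"])
        return df[df["amount"] > 0]

    def load(df: pd.DataFrame, engine) -> None:
        # Load: append the cleaned batch to the warehouse table.
        df.to_sql("orders_clean", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        engine = create_engine("postgresql://user:pass@warehouse/analytics")  # hypothetical DSN
        load(transform(extract("https://example.com/orders.csv")), engine)

In practice each stage would be orchestrated by a scheduler and instrumented, but this extract/transform/load split is the shape the role's pipeline work takes.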

JOB REQUIREMENT

Must-have:
Senior-level experience as a Data Engineer
Experience leading a team of at least 5 members
Good communication skills
Experience working with multiple stakeholders
Experience building end-to-end data pipelines
Nice-to-have:
Experience with dbt (Data Build Tool) and Tableau
Experience communicating technical details and project status to stakeholders
Strong visualization and analytical skills (critical for analytics-focused work)
Experience with data mesh models and understanding the role of data products within such frameworks

WHAT'S ON OFFER

Hybrid working mode (Monday - Friday, 3 working days at the office, flexible hours)
Attractive package: full salary + 13th-month salary + performance bonus
18 paid leave days/year (12 annual leave days and 6 personal leave days)
Insurance plan based on full salary
100% of full salary and benefits as an official employee from the first day of work
Medical benefits for employee and family
Fast-paced, flexible, and multinational working environment
Opportunities for onsite travel (to 60 countries), including Australia
Free snacks, refreshments, and parking
Internal training (technical, functional, and English)

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsource
Technical Skills: Data Engineering
Location: Ho Chi Minh - Viet Nam
Working Policy:
Salary: Negotiation
Job ID: J01553
Status: Closed

Related Jobs:

DevOps Engineer

Others - Viet Nam


Company Type: Product
Technical Skills:
  • Devops
  • Kubernetes
  • Network

  • Operate and evolve our Kubernetes platform across multiple clusters and environments (Prod, Dev, hybrid on-prem and public cloud), covering control plane operations, node lifecycle, upgrades, and autoscaling at every layer (Cluster Autoscaler, HPA, KEDA)
  • Architect and manage hybrid cloud infrastructure spanning on-premises and public clouds (GCP, AWS), including workload placement, cross-cloud networking, and unified resource management
  • Own the CI/CD and GitOps experience end-to-end: container build pipelines, image optimization, and progressive delivery via ArgoCD / FluxCD
  • Own the observability stack as a single pane of glass across all clusters (Grafana, Mimir, Tempo, Loki, Pyroscope, OnCall, Prometheus) and help push toward agent-assisted SRE workflows
  • Manage and improve our inference platform: vLLM serving and AIBrix for multi-model orchestration and autoscaling across a fleet of NVIDIA GPUs
  • Operate platform services: Kafka, Redis, PostgreSQL, OpenSearch
  • Manage identity and access via Keycloak integrated with Google Workspace; harden SSO, RBAC, and secrets management across the platform
  • Harden network security across private load balancers, firewalls, and VPC segmentation; design and maintain hub-and-spoke / multi-AZ topologies
  • Support training infrastructure: self-service VM provisioning, RunPod burst capacity, Weights and Biases integration
  • Drive infrastructure reliability, cost efficiency, and capacity planning as the platform scales
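
As a hedged illustration of cross-cluster autoscaling visibility, the sketch below uses the official Kubernetes Python client to list HorizontalPodAutoscaler state across several clusters. The kubeconfig context names are hypothetical; the client calls themselves (load_kube_config, AutoscalingV1Api) exist in the library.

    # Hedged sketch: report HPA scaling state across a fleet of clusters.
    from kubernetes import client, config

    CONTEXTS = ["prod-gcp", "dev-aws", "onprem"]  # hypothetical kubeconfig contexts

    for ctx in CONTEXTS:
        config.load_kube_config(context=ctx)  # point the client at this cluster
        autoscaling = client.AutoscalingV1Api()
        for hpa in autoscaling.list_horizontal_pod_autoscaler_for_all_namespaces().items:
            print(
                f"[{ctx}] {hpa.metadata.namespace}/{hpa.metadata.name}: "
                f"{hpa.status.current_replicas} -> {hpa.status.desired_replicas} "
                f"replicas (max {hpa.spec.max_replicas})"
            )

KEDA ScaledObjects and Cluster Autoscaler state live behind other APIs; this shows only the HPA layer.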

Salary: Negotiation


Platform Engineer

Ho Chi Minh - Viet Nam


Company Type: Product
Technical Skills:
  • Backend
  • Devops
  • Data Engineering

  • Build and maintain distributed infrastructure handling telemetry, sensory, and control data across cloud and edge environments
  • Design and operate data ingestion and streaming pipelines connecting robot fleets to the cloud in real time, covering video, joint states, audio, and LiDAR
  • Develop and maintain backend services and APIs that power the Company's developer-facing platform, with a focus on reliability and developer experience
  • Manage and evolve cloud-native infrastructure using Kubernetes, Docker, and infrastructure-as-code tooling
  • Ensure platform reliability through monitoring, alerting, autoscaling, failover, and incident response
  • Support ML and robotics teams with data infrastructure for training pipelines, policy rollout, and hardware-in-the-loop simulation
  • Implement secure APIs with access control, rate limiting, and usage metering as we scale
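
As a hedged sketch of the real-time ingestion side, the snippet below consumes robot telemetry from a stream and flushes it in batches. It assumes Kafka as the transport via the kafka-python package; the topic name, broker address, and message schema are illustrative, not details from the posting.

    # Hedged sketch: batch telemetry from a stream before handing it downstream.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "robot.telemetry",                # hypothetical topic
        bootstrap_servers="broker:9092",  # hypothetical broker address
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    batch = []
    for msg in consumer:
        batch.append(msg.value)           # e.g. joint states or LiDAR frame metadata
        if len(batch) >= 1000:
            # A real pipeline would land this batch in object storage or a
            # stream processor; here we just report and reset.
            print(f"flushed {len(batch)} telemetry records")
            batch.clear()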

Salary: Negotiation


Software Engineer (Digital Twin)

Ho Chi Minh - Viet Nam


Company Type: Product
Technical Skills:
  • Python
  • C/C++

  • Build and maintain high-fidelity digital twin environments for Asimov across MuJoCo, Isaac Sim, and Unreal Engine, calibrated to real hardware behavior
  • Design and own the systems, not just the environments, that let locomotion, autonomy, and perception teams generate, validate, and iterate on simulation scenarios at scale
  • Build pipelines for asset import, USD and MJCF workflows, sensor modeling, and real-to-sim calibration to keep digital twins synchronized with evolving hardware
  • Develop photorealistic rendering pipelines in Unreal Engine for synthetic data generation and perception model training
  • Work with hardware and mechatronics teams to model actuator dynamics, contact physics, and structural behavior, ensuring simulation parameters reflect physical ground truth
  • Integrate digital twin environments with the Company's locomotion training pipeline (Cyclotron) and autonomy stack, enabling teams to run experiments and close the sim-to-real gap
  • Contribute to the open-source Asimov simulation stack, including tooling, documentation, and reproducible environment workflows
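
As a hedged taste of the MJCF side of this work, the sketch below loads a toy model with the official mujoco Python bindings and steps the physics. The inline model is a placeholder, not an Asimov asset.

    # Hedged sketch: load an MJCF model and advance the simulation.
    import mujoco

    MJCF = """
    <mujoco>
      <worldbody>
        <body name="box" pos="0 0 1">
          <freejoint/>
          <geom type="box" size="0.1 0.1 0.1"/>
        </body>
      </worldbody>
    </mujoco>
    """

    model = mujoco.MjModel.from_xml_string(MJCF)
    data = mujoco.MjData(model)
    for _ in range(500):  # ~1 s at MuJoCo's default 2 ms timestep
        mujoco.mj_step(model, data)
    print("box height after falling:", data.qpos[2])

Real digital-twin work layers asset import, sensor models, and calibration on top of this loop, but mj_step is the core primitive everything drives.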

Salary: Negotiation
