Enterprise Data Architect
JOB DESCRIPTION
JOB REQUIREMENTS
WHAT'S ON OFFER
CONTACT
Job Summary
Company Type:
Product
Technical Skills:
Data, Data Engineering
Location:
Ho Chi Minh - Viet Nam
Working Policy:
Salary:
Negotiable
Job ID:
J01484
Status:
Closed
Related Jobs:
AI-Native Software Engineering Lead
Ho Chi Minh - Viet Nam
Outsource
- Backend
- AI
- Develop and evolve the AI-native SDLC operating model, including agent workflow designs, verification gates, context management standards, and evaluation frameworks.
- Build and lead multi-agent systems using orchestration layers such as Claude Code, GitHub Copilot Workspace, Cursor, LangGraph, CrewAI, or equivalent, from prototype to production.
- Collaborate with the Director of Engineering to maintain the company's AI toolchain selection criteria, evaluate tools with engineering rigor, and provide internal guidance on when AI is beneficial and when it is not.
- Establish engineering standards, agent evaluation loops, and AI output quality gates across the delivery organization.
- Previous experience in a lead, principal, or staff engineer role with demonstrated cross-team influence.
- Experience in outsourcing, consulting, or multi-client delivery environments.
- Track record of building or leading an internal community of practice, guild, or AI adoption program.
- Develop and continuously evolve the company's AI-native SDLC playbook, including standards, workflow templates, case studies, and guardrails that delivery teams can adopt immediately.
- Design and lead internal upskilling programs that transition engineers from AI-assisted to AI-native working patterns.
- Track the AI capability frontier, model improvements, new agent frameworks, and emerging risks, translating those signals into timely updates to KMS's practices.
- Work closely with delivery teams as an AI transformation advisor and execution partner, identifying the highest-value automation opportunities across the SDLC and coordinating their implementation.
- Design and deploy agent-orchestrated workflows tailored to each client's stack, team maturity, and delivery context, with measurable ROI.
- Build business cases for AI-native adoption with clients and account managers, framing the value in terms of velocity, quality, and cost.
- Represent the company's AI-native engineering capabilities in client conversations, QBRs, and RFP responses as a credible technical authority.
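The verification-gate and agent-evaluation responsibilities described above can be sketched, at a minimal level, as a quality gate that scores an agent's output before it enters the delivery pipeline. All names below are hypothetical illustrations (stdlib-only Python), not an actual KMS framework or any orchestration library's API.

```python
# Minimal sketch of an AI output quality gate (hypothetical names, stdlib only).
# A gate runs a list of checks over an agent's output and either accepts it
# or sends the task back to the agent for another attempt.

from dataclasses import dataclass, field
from typing import Callable, List, Optional

Check = Callable[[str], bool]

@dataclass
class QualityGate:
    checks: List[Check] = field(default_factory=list)

    def evaluate(self, output: str) -> bool:
        # Output passes only if every registered check passes.
        return all(check(output) for check in self.checks)

def run_with_gate(agent: Callable[[str], str], task: str,
                  gate: QualityGate, max_iters: int = 3) -> Optional[str]:
    """Re-invoke the agent until its output clears the gate or retries run out."""
    for _ in range(max_iters):
        candidate = agent(task)
        if gate.evaluate(candidate):
            return candidate
    return None

# Example usage with trivial checks: output must be non-empty and define a function.
gate = QualityGate(checks=[lambda s: bool(s.strip()), lambda s: "def " in s])
result = run_with_gate(lambda task: "def handler(): pass", "write a handler", gate)
```

A real evaluation loop would replace the boolean checks with richer scoring (test execution, static analysis, rubric-based model grading), but the gate-then-retry shape stays the same.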
Negotiable
Senior System Software Engineer - AI Data Platform - Inference Factory
Ho Chi Minh - Viet Nam
Product
- Devops
- C/C++
- Python
- Golang
- Create infrastructure and tools to automate complex software processes effectively.
- Improve performance: deploy advanced test harnesses, benchmarking frameworks, and analytical tools to evaluate and enhance the performance and efficiency of software and hardware platforms.
- Apply expertise in operating systems, kernel internals, device drivers, memory management, storage, networking, and high-speed interconnects to build and troubleshoot high-performance systems.
- Collaborate with engineering teams to understand requirements and deliver efficient solutions.
- Establish performance objectives, assess feedback, analyze data, and continually enhance system reliability.
- Shape technical strategies: contribute to technical strategies and roadmaps for platform automation initiatives, keeping them aligned with company goals and industry best practices.
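As a minimal illustration of the benchmarking-framework work described above (Python here for brevity, though the role lists C/C++, Python, and Golang), a harness might time a workload over repeated trials and report the best per-call latency. Everything below is a hypothetical stdlib-only sketch, not an actual internal tool.

```python
# Minimal benchmarking harness sketch (hypothetical, stdlib only).
# Times a callable over several repeated trials and reports the best
# per-call latency; taking the minimum is the usual way to reduce noise
# from scheduling jitter and cold caches.

import timeit
from typing import Callable

def benchmark(fn: Callable[[], object], repeats: int = 5, number: int = 1000) -> float:
    """Return the best average seconds-per-call across `repeats` trials."""
    timings = timeit.repeat(fn, repeat=repeats, number=number)
    return min(timings) / number

def workload() -> int:
    # Placeholder workload standing in for the system under test.
    return sum(i * i for i in range(100))

best = benchmark(workload)
```

A production harness would add warm-up runs, statistical reporting across trials, and comparisons against recorded baselines to catch regressions.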
Negotiable
Featured Job
Lead Data Engineer
Ho Chi Minh, Ha Noi - Viet Nam
Outsource
- Data Engineering
- Management
- Design, build, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, to ensure accurate and timely delivery of data to stakeholders.
- Collaborate with data scientists to develop and maintain data models, integrate data sources, and support machine learning workflows and experimentation environments.
- Build and enhance large-scale batch and real-time data processing systems to improve operational efficiency and align with business goals.
- Use Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
- Use AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources for high performance, scalability, and cost-effectiveness.
- Implement comprehensive testing and validation methods to guarantee the reliability, accuracy, and security of data processing workflows.
- Stay current with industry best practices and emerging technologies in data engineering and data science to propose innovative solutions and enhancements.
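The testing-and-validation responsibility described above can be sketched as a pipeline stage that checks each record before it is written downstream. This is a stdlib-only illustration with a hypothetical schema; a production version would typically run as an Apache Airflow task with output landing in S3 rather than being returned in memory.

```python
# Stdlib-only sketch of a validated batch pipeline stage (hypothetical schema).
# Records that fail validation are quarantined instead of being loaded,
# protecting downstream consumers from partial or corrupt data.

from typing import Dict, List, Tuple

REQUIRED_FIELDS = {"id", "timestamp", "value"}

def validate(record: Dict) -> bool:
    # A record is valid if every required field is present and non-null.
    return REQUIRED_FIELDS <= record.keys() and all(
        record[f] is not None for f in REQUIRED_FIELDS
    )

def run_stage(records: List[Dict]) -> Tuple[List[Dict], List[Dict]]:
    """Split a batch into records safe to load and records to quarantine."""
    good = [r for r in records if validate(r)]
    bad = [r for r in records if not validate(r)]
    return good, bad
```

Routing failures to a quarantine list (rather than raising on the first bad record) lets the batch complete while still surfacing data-quality issues for monitoring.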