Enterprise Data Architect

JOB DESCRIPTION

Own the Data Architecture Principles;
Co-own and lead the Data Strategy;
Co-own the enterprise data model at the Conceptual Level;
Own the enterprise data model at the Logical Level;
Consult and oversee solution architects at the Physical Level;
Be the Lead Data Governor, incl. Quality, Security, Privacy, and GDPR.

JOB REQUIREMENT

Be able to communicate the value of data at all levels and actively shape how the company handles data today and tomorrow;
Have strong communication skills as a key task is to connect people from different business units and different roles;
Be able to work conceptually and communicate the impact of this type of work;
Be able to define and foster data architecture principles;
Be able to contribute to the creation, implementation, and maintenance of an enterprise data model;
Be able to think about and communicate the big picture for data while, at the same time, linking it to individual solutions;
Actively drive collaboration with technical architects at the enterprise and solution levels;
Actively engage with business process architects;
Have 7.5+ years of experience in complex data management scenarios;
Have a considerable interest in the business processes and business challenges a company is facing; experience in the wholesale/retail domain is a plus but not mandatory;
Have coded data processing pipelines or data applications and should be knowledgeable in modern software delivery and lifecycle management;
Have deep knowledge in data processing technologies;
Have profound knowledge of different database technologies, such as relational, NoSQL, and data lakes, including their differences and tradeoffs;
Have good knowledge of public clouds (GCP, Azure, or AWS);
Have a consulting mindset. Sharing information and proactively supporting projects is a must;
Should be able to guide and respect people and act as an internal salesperson for all topics around data;
Should have skills in data modeling and governance models and in making them a reality in a large corporation;
Should have strong methodology skills when engaging with business units and data teams;
Should have a background in how to bring AI/ML into business processes and what role data management plays in such advanced solutions;
Should have knowledge of GDPR and other data privacy regulations;
Should be willing to represent the company at external conferences;
Should be willing to engage in hiring data professionals and mentoring people on their data-savvy career paths;
Should be familiar with agile software delivery by teams across various locations.

WHAT'S ON OFFER

Flexible and remote work: create your own schedule! Flexibility defines the way we work and interact with each other. At our company, you can work remotely and adapt your working hours in a very flexible way.
People development: when you grow, so do we! We want you to become the best version of yourself with individual and company-wide programs and trainings for people development. Focused, among other things, on development, leadership, and appreciation ... it's time to upskill your career.
Support with individual solutions: we are people-caring! Life is full of surprises and challenges, and we want to support you - whenever YOU need it - at an individual level and during every stage of your life.
You can choose your location from our tech hubs: Bucharest, Cluj-Napoca, Brasov, Berlin, Dusseldorf, Ho Chi Minh, or you can work remotely anywhere in Romania or Germany. Let's discuss what suits you better!

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity for your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Product
Technical Skills: Data, Data Engineering
Location: Ho Chi Minh - Viet Nam
Working Policy:
Salary: Negotiation
Job ID: J01484
Status: Close

Related Job:

AI-Native Software Engineering Lead

Ho Chi Minh - Viet Nam


Outsource

  • Backend
  • AI

  • Responsible for developing and evolving the AI-native SDLC operating model, including agent workflow designs, verification gates, context management standards, and evaluation frameworks
  • Build and lead multi-agent systems using orchestration layers such as Claude Code, GitHub Copilot Workspace, Cursor, LangGraph, CrewAI, or equivalent, from prototype to production
  • Collaborate with the Director of Engineering to contribute to and maintain the company's AI toolchain selection criteria and evaluate tools with engineering rigor, providing internal guidance on when AI is beneficial and when it is not
  • Establish engineering standards, agent evaluation loops, and AI output quality gates across the delivery organization
  • Previous experience in a lead, principal, or staff engineer role with demonstrated cross-team influence
  • Experience in outsourcing, consulting, or multi-client delivery environments
  • Track record of building or leading an internal community of practice, guild, or AI adoption program
  • Develop and continuously evolve the company's AI-native SDLC playbook, including standards, workflow templates, case studies, and guardrails that delivery teams can adopt immediately
  • Design and lead internal upskilling programs that transition engineers from AI-assisted to AI-native working patterns
  • Keep track of the AI capability frontier, model improvements, new agent frameworks, and emerging risks, translating signals into timely updates to KMS's practices
  • Work closely alongside Delivery Teams as an AI transformation advisor and execution partner, identifying the highest-value automation opportunities across the SDLC and coordinating with the team to implement them
  • Design and deploy agent-orchestrated workflows tailored to each client's stack, team maturity, and delivery context, with measurable ROI
  • Build business cases for AI-native adoption with clients and account managers, framing the value in terms of velocity, quality, and cost
  • Represent the company's AI-native engineering capabilities in client conversations, QBRs, and RFP responses as a credible technical authority
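To give candidates a concrete picture of the "AI output quality gates" mentioned above, here is a minimal sketch of such a gate in plain Python. This is our illustration only, not the company's implementation; the check names and helper functions are hypothetical.

```python
# Illustrative sketch of an AI output quality gate: run a set of cheap
# checks over generated code before it is accepted into a delivery pipeline.
from dataclasses import dataclass
from typing import Callable, List, Tuple


def _compiles(source: str) -> bool:
    """Cheap syntactic gate: does the generated Python even parse?"""
    try:
        compile(source, "<agent-output>", "exec")
        return True
    except SyntaxError:
        return False


# Each check is a (name, predicate) pair; these three are hypothetical examples.
CHECKS: List[Tuple[str, Callable[[str], bool]]] = [
    ("non_empty", lambda out: bool(out.strip())),
    ("no_todo_markers", lambda out: "TODO" not in out),
    ("compiles", _compiles),
]


@dataclass
class GateResult:
    passed: bool
    failures: List[str]


def quality_gate(agent_output: str) -> GateResult:
    """Run every check; the output passes the gate only if all checks pass."""
    failures = [name for name, check in CHECKS if not check(agent_output)]
    return GateResult(passed=not failures, failures=failures)
```

A real gate would add project-specific checks (tests, linters, security scans), but the shape stays the same: named predicates, and a result that records exactly which gates failed.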

Salary: Negotiation


Senior System Software Engineer - AI Data Platform - Inference Factory

Ho Chi Minh - Viet Nam


Product

  • Devops
  • C/C++
  • Python
  • Golang

  • Create infrastructure and tools to automate complex software processes effectively.
  • Improve performance: deploy advanced test harnesses, benchmarking frameworks, and analytical tools to thoroughly evaluate and enhance the performance and efficiency of software and hardware platforms.
  • Utilize expertise in operating systems, kernel internals, device drivers, memory management, storage, networking, and high-speed interconnects to construct and troubleshoot high-performance systems.
  • Collaborate with engineering teams to comprehend requirements and deliver efficient solutions.
  • Establish performance objectives, assess feedback, analyze data, and continually enhance system reliability.
  • Shape technical strategies: contribute to developing technical strategies and roadmaps for platform automation initiatives to ensure they are in line with company goals and industry best practices.
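The "benchmarking frameworks" this role builds can start from something very small. The sketch below is our own minimal micro-benchmark harness (Python is one of the listed skills); the function name and reported fields are illustrative, not part of any real framework.

```python
# Illustrative micro-benchmark harness: time a callable over several repeats
# and report per-call latency statistics in seconds.
import statistics
import time
from typing import Callable, Dict


def benchmark(fn: Callable[[], object], repeats: int = 5, inner: int = 1000) -> Dict[str, float]:
    """Run `fn` inner times per repeat; return median and best per-call latency."""
    samples = []
    for _ in range(repeats):
        start = time.perf_counter()
        for _ in range(inner):
            fn()
        # Average the inner loop to amortize timer overhead per call.
        samples.append((time.perf_counter() - start) / inner)
    return {"median_s": statistics.median(samples), "min_s": min(samples)}
```

Real harnesses add warm-up runs, CPU pinning, and outlier handling, but the repeat/inner-loop structure shown here is the common core.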

Salary: Negotiation


Lead Data Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Outsource

  • Data Engineering
  • Management

  • Design, create, and maintain scalable data infrastructure, which includes data lakes, pipelines, and metadata repositories, to ensure accurate and timely delivery of data to stakeholders.
  • Collaborate with data scientists to develop and maintain data models, integrate data sources, and facilitate machine learning workflows and experimentation environments.
  • Build and enhance large-scale, batch, and real-time data processing systems to improve operational efficiency and align with business goals.
  • Use Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
  • Utilize AWS services like S3, Glue, EC2, and Lambda to manage data storage and compute resources, striving for high performance, scalability, and cost-effectiveness.
  • Implement comprehensive testing and validation methods to guarantee the reliability, accuracy, and security of data processing workflows.
  • Keep updated on the latest industry best practices and emerging technologies in data engineering and data science to suggest innovative solutions and enhancements.
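As a rough sketch of the "testing and validation methods" a pipeline like this needs, here is a small batch-validation step in plain Python. The field names and rules are hypothetical; in practice this logic would sit inside an Airflow task before data is loaded to S3.

```python
# Illustrative batch validation: partition incoming records into loadable
# rows and rejects, with per-record error messages for the rejects.
from datetime import datetime
from typing import Dict, List, Tuple

REQUIRED_FIELDS = {"id", "amount", "ts"}  # hypothetical schema


def validate_record(rec: Dict) -> List[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - rec.keys()]
    if "amount" in rec and not isinstance(rec["amount"], (int, float)):
        errors.append("amount is not numeric")
    if "ts" in rec:
        try:
            datetime.fromisoformat(rec["ts"])
        except (TypeError, ValueError):
            errors.append("ts is not ISO-8601")
    return errors


def split_batch(batch: List[Dict]) -> Tuple[List[Dict], List[Tuple[Dict, List[str]]]]:
    """Partition a batch into loadable records and rejects paired with their errors."""
    good, bad = [], []
    for rec in batch:
        errs = validate_record(rec)
        if errs:
            bad.append((rec, errs))
        else:
            good.append(rec)
    return good, bad
```

Keeping rejects alongside their error messages (rather than silently dropping them) is what makes a pipeline's quality measurable and debuggable.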

Salary: Negotiation
