MLOps Engineer

JOB DESCRIPTION

Be part of building the ideal data and ML/AI ecosystem from scratch. Spearhead the integration of the latest capabilities to enhance customer experiences and transform business operations. Embrace the vision of democratizing ML/AI technology, making it accessible to all by establishing robust engineering standards, simplifying complexities, and designing effective controls and guardrails. This leadership role goes beyond conventional boundaries, empowering you to lead and innovate across many aspects of our data enablement value stream.
Your role as an MLOps Engineer will be similar to that of a DevOps engineer, with an extended focus on productionizing machine learning features:
Design and implement scalable AI solutions that enable data engineers and ML scientists to train, build, and maintain machine learning models effectively.
Develop automated processes for continuous model training and evaluation pipelines specifically for ML applications (see the sketch after this list).
Ensure the seamless integration of Company Plus's current architecture with newly added ML functionalities, enhancing overall system capabilities.
Collaborate with diverse stakeholders, including business partners, risk, legal, and security teams, as well as UX designers and architects, to define and implement robust validation and verification strategies.
Foster a culture of quality coding practices, including test-driven development, unit testing, and secure coding awareness.
Focus on business practicality and the 80/20 rule, aiming for a high bar on code quality while recognizing the business benefit of "having something now" versus "perfection sometime in the future".
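
For illustration only (not part of the role description), a continuous retrain-and-evaluate step of the kind mentioned above might look like the following minimal Python sketch. The dataset path, the accuracy gate, and the register_model() hook are hypothetical placeholders, assuming a tabular CSV with numeric features and a "label" column.

# Minimal sketch of a scheduled retrain-and-evaluate step (illustrative only).
# Assumptions: a CSV with numeric feature columns and a "label" column, scikit-learn
# installed, and a hypothetical register_model() standing in for a real model registry.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

ACCURACY_GATE = 0.85  # hypothetical promotion threshold

def register_model(model, metrics: dict) -> None:
    """Placeholder for pushing an approved model to a registry or artifact store."""
    print(f"Registering model with metrics: {metrics}")

def retrain_and_evaluate(data_path: str = "training_data.csv") -> None:
    df = pd.read_csv(data_path)  # hypothetical dataset
    X, y = df.drop(columns=["label"]), df["label"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = LogisticRegression(max_iter=1000)
    model.fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))
    if accuracy >= ACCURACY_GATE:
        register_model(model, {"accuracy": accuracy})
    else:
        print(f"Model rejected: accuracy {accuracy:.3f} is below the gate {ACCURACY_GATE}")

if __name__ == "__main__":
    retrain_and_evaluate()

In a real pipeline this step would typically run on a schedule or on new-data triggers, with the evaluation gate deciding whether the candidate model is promoted.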

JOB REQUIREMENT

To grow and succeed in this role, you will bring strong analytical and technical skills, business acumen, and natural curiosity to deliver product investigations and analyses and to support initiatives with insights.
You will ideally bring the following:
Proficiency in at least one scripting/programming language, primarily Python.
Experience in building data products using GCP/AWS technologies.
Experience with containerization, Terraform, and GitOps principles for automation and deployment.
Strong background in ML concepts and applications and in-depth knowledge of MLOps best practices.
Agile development mindset, appreciating the benefit of constant iteration and improvement.
Experience addressing tech debt while minimizing production incidents.
Familiarity with RAG architectures and a good understanding of their application (a brief sketch follows this list).
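
For context only, a RAG-style flow of the kind referenced above can be sketched in a few lines of Python. The embed() and call_llm() helpers below are hypothetical stand-ins for a real embedding model and LLM endpoint; only the cosine-similarity retrieval uses a real library (NumPy).

# Minimal retrieval-augmented generation (RAG) sketch, illustrative only.
# embed() and call_llm() are hypothetical placeholders; retrieval is a simple
# cosine-similarity search over document embeddings.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding function; a real system would call an embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.random(8)

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; a real system would hit a hosted or self-managed model."""
    return f"[answer generated from prompt of {len(prompt)} chars]"

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    q = embed(query)
    scores = []
    for doc in documents:
        d = embed(doc)
        scores.append(float(np.dot(q, d) / (np.linalg.norm(q) * np.linalg.norm(d))))
    ranked = sorted(zip(scores, documents), reverse=True)
    return [doc for _, doc in ranked[:top_k]]

def answer(query: str, documents: list[str]) -> str:
    context = "\n".join(retrieve(query, documents))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

if __name__ == "__main__":
    docs = ["Refunds are processed within 5 days.", "Support is available 24/7."]
    print(answer("How long do refunds take?", docs))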

WHAT'S ON OFFER

Attractive package including fixed 13-month salary and variable performance bonus
Insurance plan based on full salary
100% of full salary and benefits as an official employee from the first day of work
Medical benefits (private insurance) for employees and their families
18 days of paid leave per year (12 annual leave days and 6 personal leave days)
A fast-paced, flexible, and multinational working environment
Opportunities for international business travel
Free snacks, refreshments, and parking
Career development at a major tech hub newly entering the Vietnam market, with challenging projects
Hybrid working model with flexible hours (3 days in the office per week)

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – an IT recruitment consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Information Technology & Services

Technical Skills: Machine Learning, DevOps, Data Science, Python, Java

Location: Ho Chi Minh - Viet Nam

Salary: Negotiation

Job ID: J01554

Status: Closed

Related Jobs:

Senior DevOps (Data Platform)

Ho Chi Minh - Viet Nam


Digital Bank, Product

  • DevOps
  • Spark

Managing workloads on EC2 clusters using Databricks/EMR for efficient data processing. Collaborating with stakeholders to implement a Data Mesh architecture for multiple closely related enterprise entities. Utilizing Infrastructure as Code (IaC) tools for defining and managing data platform user access. Implementing role-based access control (RBAC) mechanisms to enforce least-privilege principles. Collaborating with cross-functional teams to design, implement, and optimize data pipelines and workflows. Utilizing distributed engines such as Spark for efficient data processing and analysis. Establishing operational best practices for data warehousing tools. Managing storage technologies to meet business requirements. Troubleshooting and resolving platform-related issues. Staying updated on emerging technologies and industry trends. Documenting processes, configurations, and changes for comprehensive system documentation.
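
Purely as an illustration of the kind of Spark-based processing this related role describes, here is a minimal PySpark sketch. The bucket paths, the "event_timestamp" and "event_type" columns, and the job name are hypothetical placeholders, assuming the job runs as a scheduled Databricks or EMR task.

# Minimal PySpark batch job sketch (illustrative only).
# Input/output paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-event-rollup").getOrCreate()

# Read raw events from a hypothetical object-storage location.
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Aggregate event counts per day and event type.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the curated rollup back to storage.
daily_counts.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_counts/")

spark.stop()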

Negotiation

View details

Python Developer (Distributed Systems)

Ho Chi Minh - Viet Nam


Outsourcing

  • Python
  • Flask

Engage in architecture, design, and code reviews. Contribute to strategic project development, testing, and deployment. Tackle scalability and reliability challenges, leading to meaningful discussions on distributed systems. Collaborate within a high-impact, cross-functional team. Utilize technologies including Kafka, PostgreSQL, Spark, BigQuery, GitLab with integrated CI/CD, etc.

Negotiation

View details

Senior Machine Learning Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

  • Machine Learning

Creating the V1 Evaluation Platform: You will be responsible for designing and building the core backend systems for our new LLM Evaluation Platform, using Arize Phoenix as the basis for traces, evaluations, and experiments. Implementing Production Observability: You will need to architect and implement the observability backbone for our AI services by integrating Phoenix with OpenTelemetry to establish a centralized system for logging, tracing, and evaluating LLM behavior in production. Standardizing LLM Deployment Pipeline: You will be in charge of designing and implementing the CI/CD framework for versioning, testing, and deploying prompt-based logic and LLM configurations, ensuring reproducible and auditable deployments across all AI features. Providing Practical Solutions: Your role will involve making pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment. Collaborating with Other Teams: You will work closely with the Data Science team to understand their workflow and ensure that the platform you build meets their core needs for experiment tracking and validation. Establishing Core Patterns: You will also help in establishing and documenting the initial technical patterns for MLOps and model evaluation that will serve as the foundation for future development.
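
As a hedged illustration of the observability work this related role describes, the following minimal Python sketch wires OpenTelemetry tracing around a hypothetical LLM call. The tracer name, span attributes, and call_llm() helper are assumptions; a console exporter is used here for simplicity, whereas a real setup would export spans to a collector or a backend such as Arize Phoenix.

# Minimal OpenTelemetry tracing sketch for an LLM call (illustrative only).
# call_llm() is a hypothetical stub; spans are printed to the console.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("llm-service")  # hypothetical tracer name

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call."""
    return f"[response to prompt of {len(prompt)} chars]"

def answer_question(question: str) -> str:
    # Record prompt/response sizes as span attributes for later evaluation.
    with tracer.start_as_current_span("llm.generate") as span:
        span.set_attribute("llm.prompt_chars", len(question))
        response = call_llm(question)
        span.set_attribute("llm.response_chars", len(response))
        return response

if __name__ == "__main__":
    print(answer_question("Summarize the latest release notes."))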

Negotiation

View details