Data Engineer

JOB DESCRIPTION

Work with fellow engineers and data scientists to develop and maintain our core product.
Implement our data pipelines (Java, Elasticsearch, Redis, Apache Beam); a minimal pipeline sketch follows this list.
Implement new features, ensuring they are properly built, well documented, and delivered on time.
Always look for opportunities to create and improve, from our development process to the end user's experience.
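
As a hedged illustration of this kind of pipeline work (a sketch, not this employer's actual code), a minimal Apache Beam word-count pipeline in Java might look like the following; the bucket paths and transform names are assumptions.

    import java.util.Arrays;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.TextIO;
    import org.apache.beam.sdk.options.PipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Filter;
    import org.apache.beam.sdk.transforms.FlatMapElements;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.KV;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class WordCountSketch {
        public static void main(String[] args) {
            PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create();
            Pipeline pipeline = Pipeline.create(options);

            pipeline
                // Hypothetical input; a production job might read from GCS or Pub/Sub.
                .apply("ReadLines", TextIO.read().from("gs://example-bucket/input.txt"))
                .apply("SplitWords", FlatMapElements
                    .into(TypeDescriptors.strings())
                    .via((String line) -> Arrays.asList(line.split("\\W+"))))
                .apply("DropEmpty", Filter.by((String word) -> !word.isEmpty()))
                .apply("CountWords", Count.perElement())
                .apply("FormatResults", MapElements
                    .into(TypeDescriptors.strings())
                    .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
                .apply("WriteCounts", TextIO.write().to("gs://example-bucket/counts"));

            pipeline.run().waitUntilFinish();
        }
    }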

JOB REQUIREMENTS

Solid skills with Java. Python is a plus.
Solid understanding of databases.
Relevant experience with Google Cloud or AWS.
Superior analytical, conceptual and problem-solving skills.
The ability to learn and iterate quickly.
An obsession with agile and lean principles (GitHub, Trello).
Experience with complex text parsing and web scraping is a plus.
Experience with data pipelines or ETL is a plus.
Education and Experience 
University degree in computer science or a similar field.
Minimum 2 years of experience with focus on backend development.
Strong verbal and written communication skills in English. 

WHAT'S ON OFFER

Internal training by an ex-Silicon Valley CTO and award-winning AI researcher
Two trips to Singapore per year
Stipend for technical certification and training 
Career path coaching
Flexible hours
Health insurance
15 days of vacation
13th-month bonus

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI, an IT recruitment consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn. Thank you!

Job Summary

Company Type: Product, AI platform
Technical Skills: Big Data, Data, Python
Location: Ho Chi Minh - Viet Nam
Salary: $800 - $1,500
Job ID: J00287
Status: Closed

Related Jobs:

Senior DevOps (Data Platform)

Ho Chi Minh - Viet Nam


Digital Bank, Product

  • DevOps
  • Spark

  • Managing workloads on EC2 clusters using Databricks/EMR for efficient data processing
  • Collaborating with stakeholders to implement a Data Mesh architecture for multiple closely related enterprise entities
  • Utilizing Infrastructure as Code (IaC) tools for defining and managing data platform user access
  • Implementing role-based access control (RBAC) mechanisms to enforce least-privilege principles
  • Collaborating with cross-functional teams to design, implement, and optimize data pipelines and workflows
  • Utilizing distributed engines such as Spark for efficient data processing and analysis
  • Establishing operational best practices for data warehousing tools
  • Managing storage technologies to meet business requirements
  • Troubleshooting and resolving platform-related issues
  • Staying updated on emerging technologies and industry trends
  • Documenting processes, configurations, and changes for comprehensive system documentation
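
As a hedged illustration of the Spark work described above (a sketch, not this employer's code), a minimal batch aggregation using Spark's Java API might look like this; the S3 paths, column names, and schema are assumptions.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    import static org.apache.spark.sql.functions.col;
    import static org.apache.spark.sql.functions.sum;

    public class DailyRevenueJob {
        public static void main(String[] args) {
            // On EMR or Databricks the session comes preconfigured by the cluster;
            // the local master here is only for standalone experimentation.
            SparkSession spark = SparkSession.builder()
                .appName("daily-revenue")
                .master("local[*]")
                .getOrCreate();

            // Hypothetical input: one Parquet record per order.
            Dataset<Row> orders = spark.read().parquet("s3://example-bucket/orders/");

            // Aggregate revenue per day, keeping only completed orders.
            Dataset<Row> dailyRevenue = orders
                .filter(col("status").equalTo("COMPLETED"))
                .groupBy(col("order_date"))
                .agg(sum(col("amount")).alias("revenue"));

            // Partitioned output so downstream consumers can prune by date.
            dailyRevenue.write()
                .mode("overwrite")
                .partitionBy("order_date")
                .parquet("s3://example-bucket/daily_revenue/");

            spark.stop();
        }
    }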

Salary: Negotiable

Python Developer (Distributed Systems)

Ho Chi Minh - Viet Nam


Outsourcing

  • Python
  • Flask

Engage in architecture, design, and code reviews. Contribute to strategic project development, testing, and deployment. Tackle scalability and reliability challenges that lead to meaningful discussions on distributed systems. Collaborate within a high-impact, cross-functional team. Utilize technologies including Kafka, PostgreSQL, Spark, BigQuery, and GitLab with integrated CI/CD.
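
This is a Python role, but to keep all sketches in this posting in one language, here is a hedged illustration of the Kafka piece of such a stack using Kafka's standard Java producer client; the broker address, topic name, and payload are assumptions.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Assumed local broker; production configs point at the real cluster.
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            // Wait for acknowledgement from all in-sync replicas: durability over latency.
            props.put(ProducerConfig.ACKS_CONFIG, "all");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic and event; the key controls partition assignment.
                producer.send(new ProducerRecord<>("user-events", "user-42", "{\"action\":\"login\"}"));
                producer.flush();
            }
        }
    }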

Salary: Negotiable

Senior Machine Learning Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

  • Machine Learning

We are seeking a pragmatic Senior Machine Learning Engineer to accelerate our MLOps roadmap. Your primary mission will be to own the design and implementation of our V1 LLM Evaluation Platform, a critical system that will serve as the quality gate for all our AI features. You will be a key builder on a new initiative, working alongside dedicated Data Engineering and DevOps experts to deliver a tangible, high-impact platform. This role is for a hands-on engineer who thrives on building robust systems that provide leverage. You will be fully empowered to own the implementation and success of this project.

  • Build the V1 Evaluation Platform: Proactively own the end-to-end process of designing and building the core backend systems for our new LLM Evaluation Platform, leveraging Arize Phoenix as the foundational framework for traces, evaluations, and experiments.
  • Implement Production Observability: Architect and implement the observability backbone for our AI services, integrating Phoenix with OpenTelemetry to create a centralized system for logging, tracing, and evaluating LLM behavior in production.
  • Standardize the LLM Deployment Pipeline: Design and implement the CI/CD framework for versioning, testing, and deploying prompt-based logic and LLM configurations, ensuring reproducible and auditable deployments across all AI features.
  • Deliver Pragmatic Solutions: Consistently make pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment.
  • Cross-functional Collaboration: Work closely with our Data Science team to understand their workflow and ensure the platform you build meets their core needs for experiment tracking and validation.
  • Establish Core Patterns: Help establish and document the initial technical patterns for MLOps and model evaluation that will serve as the foundation for future development.
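
As a hedged sketch of what this kind of LLM observability can look like (not this employer's implementation), the following instruments a model call with the OpenTelemetry Java API; the span and attribute names are hypothetical, and exporting traces to Phoenix is assumed to be configured elsewhere (e.g. via an OTLP exporter).

    import io.opentelemetry.api.GlobalOpenTelemetry;
    import io.opentelemetry.api.trace.Span;
    import io.opentelemetry.api.trace.StatusCode;
    import io.opentelemetry.api.trace.Tracer;
    import io.opentelemetry.context.Scope;

    public class LlmTraceSketch {
        // Assumes an OpenTelemetry SDK/exporter is configured elsewhere
        // (e.g. the Java agent shipping OTLP to a collector Phoenix reads from).
        private static final Tracer TRACER = GlobalOpenTelemetry.getTracer("llm-service");

        static String generate(String prompt) {
            Span span = TRACER.spanBuilder("llm.generate").startSpan();
            try (Scope ignored = span.makeCurrent()) {
                // Hypothetical attribute names; a real system would follow a
                // semantic convention shared with the evaluation platform.
                span.setAttribute("llm.model", "example-model-v1");
                span.setAttribute("llm.prompt.length", prompt.length());

                String completion = callModel(prompt); // stand-in for the real client
                span.setAttribute("llm.completion.length", completion.length());
                return completion;
            } catch (RuntimeException e) {
                span.setStatus(StatusCode.ERROR, e.getMessage());
                throw e;
            } finally {
                span.end();
            }
        }

        private static String callModel(String prompt) {
            return "stub completion for: " + prompt; // placeholder model call
        }
    }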

Salary: Negotiable