Data Engineer

JOB DESCRIPTION

Work with fellow engineers and data scientists to develop and maintain our core product.
Implement our data pipelines (Java, Elasticsearch, Redis, Apache Beam).
Implement new features, ensuring they are properly built, well documented, and delivered on time.
Always be on the lookout for opportunities to create and improve, from our development process to the final user experience.

JOB REQUIREMENTS

Solid skills with Java. Python is a plus.
Solid understanding of databases.
Relevant experience with Google Cloud or AWS.
Superior analytical, conceptual and problem-solving skills.
The ability to learn and iterate quickly.
An obsession with agile and lean principles (GitHub, Trello).
Experience with complex text parsing and web scraping a plus.
Experience with data pipelines or ETL a plus.
Education and Experience 
University degree in computer science or a related field.
Minimum 2 years of experience with a focus on backend development.
Strong verbal and written communication skills in English. 

WHAT'S ON OFFER

Internal training by an ex-Silicon Valley CTO and an award-winning AI researcher
Two trips to Singapore per year
Stipend for technical certification and training 
Career path coaching
Flexible hours
Health insurance
15 days vacation
13th-month bonus

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn. Thank you!

Job Summary

Company Type:

Product, AI platform

Technical Skills:

Big Data, Data, Python

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Salary:

$800 - $1,500

Job ID:

J00287

Status:

Close

Related Job:

Data Experience Lead

Ho Chi Minh - Viet Nam


Product

Enable Product Teams to Deliver Data Products
  • Coach pods to design, build, deploy, and maintain Data Products as per our playbooks.
  • Translate platform playbooks into simple, actionable user guides.
  • Guide teams evolving into new Data Mesh roles (DPO, Steward, Data Architect, Analytics Engineer, etc.).
  • Provide high-touch support for early-wave or complex Data Products.

Build a High-Quality Enablement Engine
  • Make data products and the underlying platform relatable and interesting for everyone across the organization.
  • Develop a digital enablement portal (guides, checklists, templates, videos).
  • Create structured training pathways and capability uplift programs for all impacted staff and users.
  • Produce clear visual artifacts (diagrams, flows, web-style docs, promo videos) to aid adoption and understanding.
  • Run onboarding, workshops, roadshows, Q&A sessions, town hall presentations, and demos.

Consulting, Support & POCs
  • Provide structured guidance across ingestion patterns, medallion design, semantics, quality, and metrics; in other words, be the voice of our playbooks, ensuring consistency in a mesh environment.
  • Execute or coordinate targeted POCs for pods needing specialized help.
  • Coordinate the expert resourcing from the Mesh Platform Team required to build POCs.
  • Manage each POC as a mini-project.
  • Identify and relay reusable patterns back to the Data Mesh Platform Team.
  • Organize showcases to create visibility, build excitement, and promote reuse.

Own the Mesh Experience Layer of the Platform
  • Own the end-to-end user experience design for the Data Mesh Platform, ensuring clarity, trust, and ease of use.
  • Shape how users discover, understand, and interact with data products across domains.
  • Maintain UX standards in partnership with our customer-facing UX Design team.
  • Take a deeply user-centric approach: reduce friction, minimize cognitive load, and elevate the data mesh experience, driving change through intuitive and guided technology.
  • Engage with end users (product teams, analysts, engineers, bank users, and business leaders) to understand needs and gather insights.
  • Integrate continuous feedback loops and iterate quickly to improve platform usability.
  • Ensure all Mesh Experience features reinforce the "data-as-a-product" mindset and support adoption.

Drive Communication & Alignment
  • Maintain active channels (Slack, monthly showcases, team updates, wins, success stories).
  • Communicate expectations, standards, and timelines clearly.
  • Highlight wins and success stories to build momentum.
  • Curate external content relevant to our transformation (e.g., the Databricks newsletter).

Monitor Adoption
  • Track rollout progress and leaderboards, and raise blockers with the appropriate stakeholders.
  • Use data to highlight platform adoption, culture change, wins, and challenges.
  • Produce clear, compelling summaries of adoption progress to enable decision making.
  • Manage end-user feedback and requests for assistance, acting as the link between users and the platform team.

Negotiation


Data Scientist Lead

Ho Chi Minh - Viet Nam


Outsource

  • Machine Learning
  • Data Engineering
  • Cloud
  • Management

  • Creating robust ETL/ELT data pipelines for structured and unstructured data
  • Developing interactive dashboards and visualizations for effective communication of insights
  • Developing, deploying, and evaluating machine learning and/or generative AI models
  • Applying statistical analysis and mathematical modeling to extract insights from complex datasets
  • Working with various teams to deliver data-driven solutions
  • Creating and maintaining scalable ML pipelines and APIs for real-time and batch inference
  • Ensuring best practices in model versioning, reproducibility, observability, and governance (MLOps)
  • Staying up to date with AI/ML trends and contributing to projects involving semantic search, knowledge graphs, or retrieval-augmented generation as needed

Negotiation


NLP Data Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Product

  • Data Engineering
  • Python
  • NLP

The NLP Data Engineer role involves designing, implementing, and overseeing complex data pipelines drawing from varied sources with different formats and latencies. Close collaboration with the data, technology, and research teams is essential to develop and test robust data onboarding and ETL systems, which will be used directly by quantitative investment strategies.

Negotiation
