Data Engineer

JOB DESCRIPTION

About the Role
The ideal candidate is an experienced data pipeline builder and data wrangler who will act as a technical data engineering expert in an international, multi-location team. They will actively participate in optimizing data systems, or building them from the ground up, for our overseas customers.
Responsibilities:
The Data Engineer will be responsible for:
Developing and maintaining data pipelines using ETL processes.
Working closely with the data science team to implement data analytics pipelines.
Maintaining security and data privacy, working closely with the data protection officer.
Implementing scalable architectural models for data processing and storage.
Building functionality for data ingestion from multiple heterogeneous sources in batch and real-time modes.
Helping scope, estimate, and plan various projects across the enterprise.
Providing technical support to project teams as needed.
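By way of illustration, the batch side of an ETL pipeline like the one described above boils down to three stages: extract, transform, load. A minimal sketch in plain Python, assuming a hypothetical CSV source, `payments` table, and in-memory SQLite sink as stand-ins for real source systems and a warehouse:

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV records from a source system (hypothetical feed)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalise types and drop records missing the business key."""
    out = []
    for row in rows:
        if not row.get("customer_id"):
            continue  # skip unkeyed records
        out.append({
            "customer_id": row["customer_id"].strip(),
            "amount": round(float(row["amount"]), 2),
        })
    return out

def load(rows, conn):
    """Load: upsert cleaned records into a warehouse table (SQLite here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (customer_id TEXT PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO payments VALUES (:customer_id, :amount) "
        "ON CONFLICT(customer_id) DO UPDATE SET amount = excluded.amount",
        rows,
    )
    conn.commit()

raw = "customer_id,amount\nC1,10.5\n,3.0\nC2,7.25\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0])  # 2 rows survive
```

A production pipeline would swap the stand-ins for the real connectors (e.g. Spark jobs reading from Kafka or HDFS), but the extract/transform/load decomposition stays the same.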

JOB REQUIREMENTS

Must-have Technical Requirements / Qualifications
B.S. in Computer Science or a related field, or commensurate work experience.
5+ years of experience in software development, including 3+ years of relevant data engineering experience (e.g. Spark, PySpark, Hive, HDFS, Pig) and ETL with large volumes of data.
Solid knowledge of and experience with data processing languages such as SQL, Python, and/or Scala.
Hands-on experience with real-time data streaming platforms such as Kafka and Spark Streaming.
Knowledge of and experience with both relational databases (e.g. Oracle) and NoSQL databases (e.g. MongoDB); strong SQL querying and performance tuning skills are required.
Experience with complex regulatory data integration projects.
Knowledge of Agile-based delivery.
Excellent English communication – verbal, written, and presentation skills.
Strong team-building skills and teamwork orientation.
Strong creative problem-solving skills.
Nice-to-have Technical Requirements / Qualifications
DP-203 (Data Engineering on Microsoft Azure) certification.
Knowledge of at least one cloud environment (Azure, GCP, AWS, IBM).
Experience with data warehouse tooling such as Teradata SQL, Informatica, Unix, and Control-M.
Experience with data visualisation tools (e.g. Tableau, Quantexa, and SAS).
Experience creating Slowly Changing Dimension (SCD) tables in Hive using the Spark framework.
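For context on the last bullet: an SCD Type 2 table keeps history by expiring the current row and appending a new version whenever a tracked attribute changes. A minimal sketch of that merge logic in plain Python; a real pipeline would express this over Spark DataFrames and Hive tables, and the record shape here is hypothetical:

```python
from datetime import date

def scd2_merge(dimension, updates, today):
    """Apply SCD Type 2: close out changed rows and append new versions.

    `dimension` rows carry (key, attrs, valid_from, valid_to, is_current);
    `updates` carry only (key, attrs). The schema is illustrative.
    """
    current = {r["key"]: r for r in dimension if r["is_current"]}
    merged = list(dimension)
    for upd in updates:
        live = current.get(upd["key"])
        if live is not None and live["attrs"] == upd["attrs"]:
            continue  # unchanged: keep the existing current row
        if live is not None:
            live["valid_to"] = today   # expire the old version
            live["is_current"] = False
        merged.append({                # append the new current version
            "key": upd["key"], "attrs": upd["attrs"],
            "valid_from": today, "valid_to": None, "is_current": True,
        })
    return merged

dim = [{"key": "C1", "attrs": {"city": "Hanoi"},
        "valid_from": date(2024, 1, 1), "valid_to": None, "is_current": True}]
out = scd2_merge(dim, [{"key": "C1", "attrs": {"city": "Saigon"}}], date(2024, 6, 1))
print(len(out))  # 2: the expired row plus its replacement
```

In Spark the same effect is usually achieved with a join between the dimension and the update batch followed by a union, rather than row-at-a-time mutation.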

WHAT'S ON OFFER

Competitive salary; health insurance covered for employees and their dependents
Work on international projects in a professional and dynamic environment
Gain valuable experience from a variety of projects, new technologies, and hundreds of talented colleagues
Training opportunities, including technical seminars and soft-skills courses
Good promotion prospects through a regular performance review system
Hybrid work

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – an IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn. Thank you!

Job Summary

Company Type: Outsourcing
Technical Skills: Data Engineering
Location: Ho Chi Minh - Viet Nam
Salary: Negotiable
Job ID: J01231
Status: Closed

Related Jobs:

Senior DevOps (Data Platform)

Ho Chi Minh - Viet Nam


Digital Bank, Product

  • Devops
  • Spark

  • Managing workloads on EC2 clusters using Databricks/EMR for efficient data processing
  • Collaborating with stakeholders to implement a Data Mesh architecture for multiple closely related enterprise entities
  • Utilizing Infrastructure as Code (IaC) tools for defining and managing data platform user access
  • Implementing role-based access control (RBAC) mechanisms to enforce least-privilege principles
  • Collaborating with cross-functional teams to design, implement, and optimize data pipelines and workflows
  • Utilizing distributed engines such as Spark for efficient data processing and analysis
  • Establishing operational best practices for data warehousing tools
  • Managing storage technologies to meet business requirements
  • Troubleshooting and resolving platform-related issues
  • Staying updated on emerging technologies and industry trends
  • Documenting processes, configurations, and changes for comprehensive system documentation

Negotiable


Senior Machine Learning Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

  • Machine Learning

  • Creating the V1 Evaluation Platform: You will be responsible for designing and building the core backend systems for our new LLM Evaluation Platform, using Arize Phoenix as the basis for traces, evaluations, and experiments.
  • Implementing Production Observability: You will architect and implement the observability backbone for our AI services by integrating Phoenix with OpenTelemetry to establish a centralized system for logging, tracing, and evaluating LLM behavior in production.
  • Standardizing the LLM Deployment Pipeline: You will be in charge of designing and implementing the CI/CD framework for versioning, testing, and deploying prompt-based logic and LLM configurations, ensuring reproducible and auditable deployments across all AI features.
  • Providing Practical Solutions: Your role will involve making pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment.
  • Collaborating with Other Teams: You will work closely with the Data Science team to understand their workflow and ensure that the platform you build meets their core needs for experiment tracking and validation.
  • Establishing Core Patterns: You will also help establish and document the initial technical patterns for MLOps and model evaluation that will serve as the foundation for future development.

Negotiable


Fullstack Engineer - BRAIN

Ho Chi Minh - Viet Nam


Product, Investment Management

  • Frontend
  • Backend

  • Create intricate single-page applications.
  • Construct components that can be used across various interfaces.
  • Design layouts that are responsive for both desktop and mobile devices.
  • Automate the testing procedures for the user interface.
  • Develop services and APIs for backend applications.
  • Incorporate AWS and external cloud services.
  • Enhance application speed and scalability.
  • Actively contribute to an agile engineering team focused on continual improvement.
  • Utilize leading open-source technologies like MySQL, PostgreSQL, the ELK stack, Sentry, Redis, Git, etc.
  • Take part in periodic on-call responsibilities.

Negotiable
