Senior Data Engineer

JOB DESCRIPTION

In this role, you will work with Data Analysts to create data models, develop ETL pipelines, and perform validation and testing. You will partner closely with our clients on a wide variety of collaborative and innovative engagements.
You should be a phenomenal teammate with a forward-thinking mindset and the ability and confidence to challenge the status quo in order to define future visions.
Work with Data Analysts to create data models
Implement ETL pipelines using AWS Glue, covering data extraction, uploading, formatting, and loading
Implement complex data transformations in Python or Scala, together with unit tests (a minimal sketch follows this list)
Verify data transformations using AWS Athena
Implement and verify data loading into an Oracle database
Ensure the best possible performance and quality of transformed and loaded data
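
To make the Glue and testing responsibilities above concrete, here is a minimal, hedged sketch of how such a pipeline is often structured in PySpark: the transformation is kept as a pure function so it can be covered by unit tests, while extract and load remain thin I/O steps. All names (normalize_policies, the S3 paths, the Oracle JDBC URL, the column names) are illustrative assumptions, not details of the actual project.

```python
# Minimal sketch, assuming a Glue 4.x / PySpark runtime.
# All table, column, bucket, and connection names are hypothetical.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def normalize_policies(df: DataFrame) -> DataFrame:
    """Pure transformation: trim keys, cast amounts, drop rows without a key.

    Keeping this free of I/O makes it easy to unit test with a local SparkSession.
    """
    return (
        df.withColumn("policy_id", F.trim(F.col("policy_id")))
          .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
          .filter(F.col("policy_id").isNotNull())
    )


def test_normalize_policies() -> None:
    """Example unit test (pytest style) for the pure transformation."""
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    raw = spark.createDataFrame([(" P-001 ", "120.50"), (None, "99")],
                                ["policy_id", "premium"])
    rows = normalize_policies(raw).collect()
    assert len(rows) == 1 and rows[0]["policy_id"] == "P-001"


if __name__ == "__main__":
    spark = SparkSession.builder.appName("policy-etl").getOrCreate()
    raw = spark.read.parquet("s3://example-bucket/raw/policies/")            # extract
    curated = normalize_policies(raw)                                        # transform
    curated.write.mode("overwrite").parquet(
        "s3://example-bucket/curated/policies/")                             # load (queryable via Athena)
    # Loading into Oracle would typically use a JDBC write, for example:
    # curated.write.format("jdbc") \
    #        .option("url", "jdbc:oracle:thin:@//host:1521/SERVICE") \
    #        .option("dbtable", "POLICIES") \
    #        .option("driver", "oracle.jdbc.OracleDriver") \
    #        .mode("append").save()
```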

JOB REQUIREMENT

Primary Skills:
Proficiency in programming languages such as Python or Scala
Solid experience with ETL tools, database management systems and data integration techniques
Proficient in database query languages such as SQL, and familiar with NoSQL databases
Secondary Skills:
Scripting languages
AWS Platforms: Glue (Console, Studio, Crawler, Job, Data Catalog, DataBrew), Athena
Insurance or banking domain
 
BS/MS degree in Computer Science, Engineering or a related subject
Good English communication is a must
At least 2 years of relevant experience, primarily in ETL development, data analysis, and modelling
Experience working in an agile team practicing Scrum or Kanban
Good communication, interpersonal, and teamwork skills
Proactive and flexible working approach
Knowledge of the Insurance (Life/Non-life) or Banking business domains is a plus
Team player with experience working in international, cross-functional teams
Self-development skills to keep up to date with fast-changing trends

WHAT'S ON OFFER

Competitive salary; health insurance covered for the employee and dependents
Work on international projects in a professional and dynamic working environment
Gain valuable experience with a variety of projects, new technologies, and hundreds of talented colleagues
Training opportunities, including technical seminars and soft-skill courses
Good promotion opportunities through a regular performance review system

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Outsourcing
Technical Skills: Data, Data Analyst, ETL/ELT, Data Engineering
Location: Ho Chi Minh - Viet Nam
Salary: Negotiation
Job ID: J01506
Status: Closed

Related Jobs:

Senior DevOps (Data Platform)

Ho Chi Minh - Viet Nam


Digital Bank, Product

  • DevOps
  • Spark

  • Managing workloads on EC2 clusters using Databricks/EMR for efficient data processing
  • Collaborating with stakeholders to implement a Data Mesh architecture for multiple closely related enterprise entities
  • Utilizing Infrastructure as Code (IaC) tools for defining and managing data platform user access (a sketch follows this list)
  • Implementing role-based access control (RBAC) mechanisms to enforce least-privilege principles
  • Collaborating with cross-functional teams to design, implement, and optimize data pipelines and workflows
  • Utilizing distributed engines such as Spark for efficient data processing and analysis
  • Establishing operational best practices for data warehousing tools
  • Managing storage technologies to meet business requirements
  • Troubleshooting and resolving platform-related issues
  • Staying updated on emerging technologies and industry trends
  • Documenting processes, configurations, and changes for comprehensive system documentation
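
For the IaC and RBAC duties above, a common pattern is to declare least-privilege roles in code. The sketch below uses the AWS CDK (Python) purely as an illustration of that pattern; the stack name, trust principal, bucket ARNs, and allowed actions are assumptions, not details of the actual platform.

```python
# Hedged IaC sketch (AWS CDK v2, Python): a least-privilege, role-based grant
# for a hypothetical analyst role on a hypothetical S3 data lake prefix.
from aws_cdk import App, Stack
from aws_cdk import aws_iam as iam
from constructs import Construct


class DataPlatformAccessStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Role that data analysts assume; the trust policy here is a placeholder.
        analyst_role = iam.Role(
            self, "AnalystRole",
            assumed_by=iam.AccountRootPrincipal(),
            description="Read-only access to the curated zone only",
        )

        # Least privilege: read-only actions, scoped to the curated prefix only.
        analyst_role.add_to_policy(iam.PolicyStatement(
            effect=iam.Effect.ALLOW,
            actions=["s3:GetObject", "s3:ListBucket"],
            resources=[
                "arn:aws:s3:::example-data-lake",
                "arn:aws:s3:::example-data-lake/curated/*",
            ],
        ))


app = App()
DataPlatformAccessStack(app, "DataPlatformAccess")
app.synth()
```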

Negotiation


Senior Machine Learning Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

  • Machine Learning

  • Creating the V1 Evaluation Platform: designing and building the core backend systems for our new LLM Evaluation Platform, using Arize Phoenix as the basis for traces, evaluations, and experiments
  • Implementing Production Observability: architecting and implementing the observability backbone for our AI services by integrating Phoenix with OpenTelemetry to establish a centralized system for logging, tracing, and evaluating LLM behavior in production (a sketch follows this list)
  • Standardizing the LLM Deployment Pipeline: designing and implementing the CI/CD framework for versioning, testing, and deploying prompt-based logic and LLM configurations, ensuring reproducible and auditable deployments across all AI features
  • Providing Practical Solutions: making pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment
  • Collaborating with Other Teams: working closely with the Data Science team to understand their workflow and ensure the platform meets their core needs for experiment tracking and validation
  • Establishing Core Patterns: helping to establish and document the initial technical patterns for MLOps and model evaluation that will serve as the foundation for future development
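
The observability bullet above centers on OpenTelemetry traces. As a rough illustration only, the following sketch wires up a standard OpenTelemetry tracer that exports spans over OTLP/HTTP to a collector endpoint, here assumed to be a locally running Phoenix instance; the endpoint, span name, and attribute keys are assumptions rather than the team's actual configuration.

```python
# Minimal sketch: export LLM spans via OpenTelemetry to an OTLP/HTTP endpoint.
# Assumes `opentelemetry-sdk` and `opentelemetry-exporter-otlp-proto-http` are installed;
# the endpoint below presumes a local Phoenix/OTLP collector and is a placeholder.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("llm-eval-sketch")


def generate_answer(prompt: str) -> str:
    """Wrap an LLM call in a span so latency, inputs, and outputs are traceable."""
    with tracer.start_as_current_span("llm.generate") as span:
        span.set_attribute("llm.prompt", prompt)          # hypothetical attribute key
        completion = "stubbed completion"                 # placeholder for the real model call
        span.set_attribute("llm.completion", completion)  # hypothetical attribute key
        return completion


if __name__ == "__main__":
    print(generate_answer("What does this policy cover?"))
```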

Negotiation


Fullstack Engineer - BRAIN

Ho Chi Minh - Viet Nam


Product, Investment Management

  • Frontend
  • Backend

  • Create intricate single-page applications
  • Construct components that can be used across various interfaces
  • Design layouts that are responsive for both desktop and mobile devices
  • Automate the testing procedures for the user interface
  • Develop services and APIs for backend applications
  • Incorporate AWS and external cloud services
  • Enhance application speed and scalability
  • Actively contribute to an agile engineering team focused on continual improvement
  • Utilize leading open-source technologies like MySQL, PostgreSQL, the ELK stack, Sentry, Redis, Git, etc.
  • Take part in periodic on-call responsibilities

Negotiation
