Data Engineer

ABOUT CLIENT

Our client uses new technology to develop products for the banking industry.

JOB DESCRIPTION

Our client is currently building some of the world's fastest-growing digital banks, and the data team plays a crucial role in shaping the bank's vision. The aim is to create a platform that encourages economic participation and broadens financial inclusion. This is achieved through innovative data and analytics solutions that deliver high-quality services and products to customers and, ultimately, optimize the business.
If you join as a Data Engineer, you will build solutions that support informed decision-making and innovation by turning clean, protected, quality, and auditable data from various sources into fit-for-purpose data products. This includes designing, developing, testing, deploying, and monitoring both data pipelines in Databricks on AWS, drawing from a wide variety of data sources, and scalable PySpark and SQL code in Databricks.
Identifying opportunities to enhance internal processes through code optimization and automation will also be part of the role, as will building data quality dashboards, lineage flows, and monitoring tools around the data pipeline, providing active monitoring and actionable insight into overall data quality and data governance. You will also assist in migrating data from legacy systems onto newly developed solutions, and follow and lead best practices on all data security, retention, and privacy policies.
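The kind of data quality monitoring described above can be sketched in plain Python. This is an illustrative example only, not the client's implementation: in the actual role such checks would run as PySpark jobs in Databricks, and the column names, records, and 95% threshold below are hypothetical.

```python
# Minimal data-quality gate: compute per-column completeness and flag
# columns that fall below a threshold. Illustrative sketch only -- the
# production version would operate on Spark DataFrames over Delta tables.

def completeness(rows, columns):
    """Return the fraction of non-null values for each column."""
    total = len(rows)
    return {
        col: sum(1 for r in rows if r.get(col) is not None) / total
        for col in columns
    }

def quality_gate(rows, columns, threshold=0.95):
    """Return the columns whose completeness falls below the threshold."""
    scores = completeness(rows, columns)
    return sorted(col for col, score in scores.items() if score < threshold)

# Hypothetical sample records from a source system
records = [
    {"account_id": "A1", "balance": 100.0, "opened_at": "2024-01-02"},
    {"account_id": "A2", "balance": None,  "opened_at": "2024-01-03"},
    {"account_id": "A3", "balance": 250.5, "opened_at": None},
]

failing = quality_gate(records, ["account_id", "balance", "opened_at"])
print(failing)  # -> ['balance', 'opened_at']
```

A dashboard or alerting job would then surface the failing columns, which is the "active monitoring and actionable insight" the role calls for.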

JOB REQUIREMENT

Completion of a Bachelor's degree is required.
A minimum of 4 years of experience in developing ETL/ELT pipelines is necessary.
Demonstrated proficiency in solution design, development, implementation, reporting, and analysis is essential.
Skillfulness in Apache Spark, Python, and SQL is required.
Proficiency in working with various data formats such as Text, Delta, Parquet, JSON, CSV, and XML is important.
Familiarity with Spark structured streaming is necessary.
Experience with AWS infrastructure, particularly S3, is required.
Solid understanding of git-based version control, DevOps, and CI/CD is essential. Experience with the Atlassian stack is a bonus.
Knowledge of common web API frameworks and web services is necessary.
Strong teamwork, relationship, and client management skills are required, along with the ability to influence peers and senior management to achieve team goals.
Willingness to adopt modern technology, best practices, and ways of working is important.
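In practice, the formats listed above (JSON, CSV, Parquet, Delta, etc.) would be read with Spark's DataFrame readers over S3 paths. As a stdlib-only illustration of the underlying idea — normalizing records arriving in different formats into one common shape — consider this sketch, where the `txn_id`/`amount` fields and sample batches are hypothetical:

```python
import csv
import io
import json

# Normalize records arriving as CSV or JSON Lines into one common shape.
# In the actual stack this would be spark.read.csv / spark.read.json over
# s3:// paths; this stdlib sketch only illustrates the pattern.

def from_csv(text):
    """Parse a CSV batch into a list of dicts."""
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def from_jsonl(text):
    """Parse a JSON Lines batch into a list of dicts."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def normalize(record):
    """Coerce a raw record into the target schema, defaulting missing amounts."""
    return {
        "txn_id": str(record.get("txn_id", "")),
        "amount": float(record.get("amount") or 0.0),
    }

csv_batch = "txn_id,amount\nT1,10.5\nT2,\n"
jsonl_batch = '{"txn_id": "T3", "amount": 7}\n{"txn_id": "T4"}'

rows = [normalize(r) for r in from_csv(csv_batch) + from_jsonl(jsonl_batch)]
print(rows)
```

The same shape-unification step is what a Spark job does when it merges heterogeneous sources into a single fit-for-purpose table.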

WHAT'S ON OFFER

The company offers meal and parking benefits.
Full benefits and salary are provided during the probationary period.
Insurance coverage as per Vietnamese labor law and premium health care for employees and their families.
Work environment is values-driven, international, and agile in nature.
Opportunities for overseas travel related to training and work.
Participation in internal Hackathons and company events such as team building, coffee runs, and blue card activities.
Additional benefits include a 13th-month salary and performance bonuses.
Employees receive 15 days of annual leave and 3 days of sick leave per year.
Work-life balance with a 40-hour workweek from Monday to Friday.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Offshore
Technical Skills: Data Engineering, Big Data
Location: Ho Chi Minh - Viet Nam
Working Policy: Hybrid
Salary: Negotiation
Job ID: J01710
Status: Active

Related Job:

Senior Signal Processing Engineer
Others - Viet Nam | Outsource | Python | Salary: Negotiation

Design and improve rPPG/TOI pipelines using RGB/IR video with motion/illumination compensation. Implement multi-stage preprocessing, denoising, and quality scoring (for example: adaptive filtering, ICA/PCA variants, color-space transforms, skin ROI stabilization, and signal confidence metrics). Build cross-device normalization strategies and error-bounded estimators. Define biomarker-level acceptance criteria and failure modes for consumer-grade capture. Partner with iOS and ML teams to integrate algorithms into on-device or hybrid pipelines. Produce technical documentation suitable for regulatory-risk positioning and Apple review support. Deliverables include a benchmark report across device models, skin tones, lighting, and motion conditions; a biomarker feature specification sheet with recommended thresholds and confidence bands; and A/B results showing improvements in stability, missingness, and downstream inference performance.

Senior Mobile App Development Engineer
Others - Viet Nam | Outsource | iOS | Salary: Negotiation

Develop and enhance capture flows using TrueDepth, ARKit, AVFoundation, and CoreMotion. Convert algorithm requirements into reliable on-device data collection and frame synchronization. Create a user-friendly quality-control interface and capture-state instrumentation, including lighting checks, face alignment, motion thresholds, occlusion detection, and retake prompts. Improve latency, thermal behavior, and battery usage for 10-second diagnostic-style capture. Ensure data privacy and compliance with privacy regulations in telemetry. Create technical documentation covering specific API usage and data handling for Apple's internal use. Build a modular capture SDK layer with feature flags and logging capabilities. Develop a cross-device performance matrix and tuning guide. Integrate deterministic quality-control gates to minimize noisy biomarker outputs during capture.

Senior Data Analyst, Anti-spoofing Operation
Ho Chi Minh - Viet Nam | Product | Data Analyst | Salary: Negotiation

Lead and develop analytical initiatives to address fraud detection challenges, from problem definition through implementation, monitoring, and continuous optimization. Spearhead strategic efforts to combat emerging fraud patterns and develop innovative detection approaches through data analysis, testing, and impact measurement. Design and implement production-grade data pipelines, automated workflows, and advanced analytical frameworks using statistical methods, machine learning, and geospatial analysis. Conduct in-depth analyses of complex fraud operations and develop behavioral analytics systems. Translate technical findings into actionable recommendations for leadership, product teams, and engineering. Lead compliance analytics initiatives and define evaluation metrics for new features. Provide technical guidance and mentorship, establish best practices, and contribute to hiring top talent in analytics.