Data Engineer

JOB DESCRIPTION

About the Role
The ideal candidate is an experienced data pipeline builder and data wrangler who will act as a technical data engineering expert in an international, multi-location team. He/she will actively participate in optimizing data systems or building them from the ground up for our overseas customers.
Responsibilities:
The Data Engineer will:
Develop and maintain data pipelines using ETL processes.
Work closely with the data science team to implement data analytics pipelines.
Maintain security and data privacy, working closely with the data protection officer.
Implement scalable architectural models for data processing and storage.
Build functionality for data ingestion from multiple heterogeneous sources in both batch and real-time modes.
Help in scoping, estimation, and planning for various projects in the enterprise.
Provide technical support to project teams as needed.

JOB REQUIREMENTS

Must-Have Technical Requirements / Qualifications
B.S. in Computer Science or a related field, or commensurate work experience.
5+ years of experience in software development, including 3+ years with relevant data engineering technologies (e.g., Spark, PySpark, Hive, HDFS, Pig) and ETL with large volumes of data.
Solid knowledge of and experience with data processing languages such as SQL, Python, and/or Scala.
Hands-on experience with real-time data streaming platforms such as Kafka and Spark Streaming.
Knowledge of and experience with both relational databases (e.g., Oracle) and NoSQL databases (e.g., MongoDB); strong SQL querying and performance-tuning skills are required.
Experience with complex regulatory data integration projects.
Knowledge of Agile-based delivery.
Excellent English communication skills – verbal, written, and presentation.
Strong team-building skills and teamwork orientation.
Strong creative problem-solving skills.
Nice-to-Have Technical Requirements / Qualifications
DP-203: Data Engineering on Microsoft Azure certification.
Knowledge of at least one cloud environment (Azure, GCP, AWS, IBM).
Experience with data warehouse technologies such as Teradata SQL, Informatica, Unix, and Control-M.
Experience with data visualisation tools (e.g., Tableau, Quantexa, and SAS).
Experience creating Slowly Changing Dimension (SCD) tables in Hive using the Spark framework.

WHAT'S ON OFFER

Competitive salary; health insurance covered for employees and their dependents
Work on international projects in a professional and dynamic environment
Gain valuable experience with a variety of projects, new technologies, and hundreds of talented colleagues
Training opportunities, including technical seminars and soft-skill courses
Good promotion prospects through a regular performance review system
Hybrid work

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Outsource

Technical Skills:

Data Engineering

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Hybrid

Salary:

Negotiable

Job ID:

J01231

Status:

Closed
