Senior Big Data Engineer

JOB DESCRIPTION

Design, develop, and enhance Big Data systems such as data warehouses and data lakes
Design and develop big data applications
Create data processing frameworks
Extract data and isolate data clusters
Conduct POCs and present Big Data solutions for new projects
Troubleshoot application bugs
Maintain data security
Develop and update technical documentation
Develop test scripts and analyze results

JOB REQUIREMENT

At least 5 years of experience with the relevant technologies
Bachelor's degree in IT/Computer Science or a related field
Experience in the Hadoop ecosystem and its components: HDFS, Yarn, MapReduce, Apache Spark (Python/Scala), Apache Sqoop, Apache Impala, Apache Avro, Apache Flume, Apache Kafka
Preferred: CCA175 – Spark and Hadoop Developer certification
Experience designing and developing ETL processes (see the illustrative sketch after this list)
Experience with Unix; scripting experience is preferred
Strong knowledge of data warehousing models, data ingestion patterns, data quality, and data governance
Hands-on experience with Hadoop systems and a good understanding of Hadoop clusters
Good English communication skills
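
For candidates gauging what the Spark/Hadoop and ETL requirements above translate to in practice, here is a minimal sketch of a PySpark batch ETL job: read raw CSV from HDFS, apply basic data-quality transforms, and write partitioned Parquet into a data lake. The paths and column names are illustrative assumptions, not part of the posting.

    from pyspark.sql import SparkSession, functions as F

    # Illustrative HDFS locations; real layouts depend on the project.
    RAW_PATH = "hdfs:///data/raw/transactions/"
    LAKE_PATH = "hdfs:///data/lake/transactions/"

    spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

    # Extract: read raw CSV files landed in HDFS (e.g. by Sqoop or Flume).
    raw = spark.read.option("header", True).csv(RAW_PATH)

    # Transform: basic typing, de-duplication, and data-quality filtering.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("event_date", F.to_date("event_ts"))
           .dropDuplicates(["txn_id"])
           .filter(F.col("amount").isNotNull())
    )

    # Load: write partitioned Parquet so Impala or Spark can query the lake.
    clean.write.mode("overwrite").partitionBy("event_date").parquet(LAKE_PATH)

    spark.stop()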

WHAT'S ON OFFER

Working in one of the Best Places to Work in Vietnam
Join a dynamic and fast growing global company (English-speaking environment)
13th-month salary bonus + attractive performance bonus (you'll love it!) + annual performance appraisal
100% of monthly base salary and mandatory social insurance during the 2-month probation period
Onsite opportunities: short-term and long-term assignments
15+ days of annual leave + 1 day of birthday leave
Premium health insurance for employees and 02 family members
Flexible working time
Lunch and parking allowance
Various training on trending technologies, foreign languages (English/Chinese/Japanese), and soft skills
Fitness & sport activities: football, badminton, yoga, aerobics
Free in-house entertainment facilities and snacks
Various team-building activities, company trips, year-end parties, tech talks, and charity events

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Outsourcing, German company

Technical Skills:

Data Engineering, Big Data

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Salary:

Negotiable

Job ID:

J01277

Status:

Closed

Related Job:

Senior SAP FI Consultant

Ho Chi Minh - Viet Nam


Outsourcing, German company

  • SAP

Lead a team of Finance specialists onsite and assist in configuring the solution (must have hands-on configuration experience). Provide guidance in the definition of solution design practices and standards that link back to SAP best practices. Understand the customer's business processes and IT landscape rapidly and be able to foresee likely challenges. Lead and facilitate design workshops (Blueprint), assessments, and planning sessions. Work effectively with customer business teams and project teams. Translate business goals into appropriate solutions while assessing feasibility and optimization of the solution. Conduct work-effort estimation and develop work plans. Develop and maintain working relationships with a diverse group of business, functional, and technical teams. Adhere to project plans, tasks, and deliverables; identify dependencies and resource requirements.

Negotiable


Senior Product Manager

Ho Chi Minh, Ha Noi - Viet Nam


No.1 Construction Tech company in Japan

  • Product Management

The Opportunity: As the promoter of product development projects at the company, you will gain experience in project management and development direction, working with the business department from project definition to requirements definition, development, testing, and release. In the future, you will be responsible for product management of the company's products. Job Scope: This is an important position that involves understanding the characteristics of the service's users and driving product development projects while balancing the needs of product, sales, customer success, and engineering. Product Management: Define the scope, objectives, and goals of product development; build the product structure and process; create and drive schedules. Development Direction: Work with product managers to organize requirements, design business processes, and define requirements; create and execute test and release plans; create explanatory materials for internal/customer use; verify and report on post-release effectiveness; create and maintain comprehensive product documentation. Development Environment: Backend: Ruby on Rails, Go, AWS, Elasticsearch, MySQL, DynamoDB, Redis, Terraform (IaC), OIDC (authentication). Frontend: Next.js/React.js, TypeScript, Vue.js/Nuxt.js. Mobile App: Kotlin, Swift, Flutter. CI/CD & DevOps: Docker, Kubernetes, CodePipeline, CodeBuild, CircleCI, GitHub Actions. Monitoring & Tools: Datadog, Sentry, Bugsnag, Swagger, ZenHub, Figma.

Negotiable


Senior/ Lead Data Engineer (Data Platform / MLOps)

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

You will be responsible for managing, designing, and enhancing data systems and workflows that drive key business decisions. The role is focused 75% on data engineering, involving the construction and optimization of data pipelines and architectures, and 25% on supporting data science initiatives through collaboration with data science teams for machine learning workflows and advanced analytics. You will leverage technologies like Python, Airflow, Kubernetes, and AWS to deliver high-quality data solutions. Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders. Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments. Develop and optimize large-scale, batch, and real-time data processing systems to enhance operational efficiency and meet business objectives. Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring. Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency. Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows. Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.

Negotiable

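For readers comparing this listing with the Big Data role above, here is a minimal sketch of the batch-pipeline pattern it describes, assuming Apache Airflow 2.x. The DAG name, schedule, and task bodies are illustrative assumptions, not taken from the posting.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull raw data from a source system (illustrative).
        print("extracting raw data")


    def transform(**context):
        # Placeholder: clean and aggregate the extracted data.
        print("transforming data")


    def load(**context):
        # Placeholder: publish curated tables for analytics / ML workflows.
        print("loading curated data")


    with DAG(
        dag_id="daily_batch_pipeline",   # illustrative name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Simple linear dependency: extract, then transform, then load.
        extract_task >> transform_task >> load_task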