Data Analyst

ABOUT CLIENT

Our client is one of the world's largest providers of Consulting, Outsourcing, and Technology Services.

JOB DESCRIPTION

This role involves close collaboration with clients on a diverse range of innovative projects. Key responsibilities include:
Interpreting data and converting raw data into structured information
Collaborating with a Business Analyst to create data models
Working with an ETL developer to build data pipelines and perform validations
Collaborating with an ETL developer to create complex data transformations and mappings using Python
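As a hedged illustration of the kind of Python transformation-and-mapping work listed above, the sketch below maps raw string records into typed, structured records. The field names (policy_no, premium, start) and sample data are hypothetical, not taken from the posting.

```python
# Illustrative sketch only: mapping raw input rows to structured records,
# the kind of transformation an ETL pipeline step might perform.
from datetime import datetime

# Hypothetical raw rows as they might arrive from a source extract.
RAW_ROWS = [
    {"policy_no": " P-001 ", "premium": "1,200.50", "start": "01/03/2023"},
    {"policy_no": "P-002", "premium": "980", "start": "15/07/2023"},
]

def transform(row: dict) -> dict:
    """Map one raw record to a cleaned, typed record."""
    return {
        # Strip stray whitespace from identifiers.
        "policy_no": row["policy_no"].strip(),
        # Remove thousands separators and parse as a number.
        "premium": float(row["premium"].replace(",", "")),
        # Normalize day/month/year dates to ISO 8601.
        "start_date": datetime.strptime(row["start"], "%d/%m/%Y").date().isoformat(),
    }

structured = [transform(r) for r in RAW_ROWS]
```

In practice a step like this would run inside a scheduled pipeline (for example an AWS Glue job) with validation checks on the output, but the core map-and-clean logic looks much the same.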

JOB REQUIREMENT

Primary Skills:
Proficient in data modeling (conceptual, logical, and physical)
Skilled in SQL and Python languages
Experienced in ETL framework and SQL databases
Secondary Skills:
Familiarity with AWS Glue and AWS Athena
A degree in Computer Science, Computer Engineering, Information Management, Economics, or a related field
Strong English communication skills
Minimum of 4 years of relevant data analysis experience
Exceptional analytical ability for collecting, organizing, and analyzing large amounts of information with attention to detail
Innovative problem-solving capabilities for complex data challenges
Proficiency in data analysis languages such as SQL and Python
Knowledge of the Insurance (Life/Non-life) domain is beneficial
Experience working with AWS Glue and AWS Athena is advantageous
Familiarity with Agile methodologies like Scrum and Kanban
Good communication and teamwork skills
Proactive and flexible working approach
Team-player with experience in international and multi-functional team environments
Self-development skills to stay updated with fast-changing trends

WHAT'S ON OFFER

Competitive compensation, comprehensive health insurance for employees and dependents.
Participation in international projects within a professional and dynamic work setting.
Gaining valuable experience with diverse projects, new technologies, and talented colleagues.
Access to training opportunities, including technical seminars and soft skill courses.
Potential for promotion through a regular performance review system.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Outsource

Technical Skills:

Data Analyst

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Hybrid

Salary:

Negotiable

Job ID:

J01504

Status:

Close
