Data Engineer

JOB DESCRIPTION

Develop and maintain database scripts, functions, and stored procedures to support application development
Assist application developers and testers with database activities
Participate in database design
Develop and maintain ETL solutions
Develop, deploy, and maintain BI solutions
Process client requests and feedback, and fix bugs
Troubleshoot database performance issues and tune complex database queries
Review SQL statements and stored procedures written by developers

JOB REQUIREMENT

Must have
At least an intermediate level of English
Typically 3+ years of experience, depending on how quickly you learn and develop your technical capabilities
Ability to obtain deep knowledge of the project technologies and work independently with minimum guidance
Ability to self-learn and adapt to new technologies quickly
Fluency in SQL and experience in Extract, Transform, and Load (ETL) development in SQL
Good knowledge of and hands-on experience with popular ETL tools such as SSIS, Talend, or Pentaho
Good knowledge of MS SQL Server (especially with monitoring, performance tuning, optimization)
Solid skills in developing and optimizing SQL queries and stored procedures in MS SQL Server
Good analytical skills; able to analyze, design, and initialize database models for small and medium-sized applications
Experience in designing and developing data integration, data warehousing, and business intelligence solutions
Knowledge of data design (star schema, snowflake schema)
Nice to have
Experience with reporting and BI tools such as SSRS, Power BI, Tableau
Experience using cloud platforms (AWS/Azure/GCP)
Experience with Python programming
Interest in business data analysis to build metrics and reports

WHAT'S ON OFFER

Competitive salary and bonuses: medical, social, and unemployment insurance, as well as personal income tax, are fully covered by the company.
Guaranteed 13th month salary.
Loyalty bonus equal to 50% of your monthly NET salary each year after the first working year.
Monthly lunch allowance, free daily fruit, snacks, and coffee, and sponsored sports clubs.
Premium health insurance & Free annual medical check.
14 days of annual leave, with 1 additional day every two years.
A clear career path for engineers: our client offers many online and in-house training courses, covering not only hard/technical skills but also soft skills. We also sponsor technical certificates that you can use for your qualifications.
An English-speaking environment: tuition sponsorship and an English proficiency bonus will help you become more confident in your English skills.
Regular parties and gifts on special occasions throughout the year: team dinners, year-end party, company trips, team-building activities, Christmas, Tet holiday, etc.
The chance to work with top talent from Switzerland, Germany, and Greece, take on challenges with the latest technologies (microservices, CI/CD, the latest versions of .NET Core, Angular, etc.), and work across different business domains (e-commerce, automotive, logistics, insurance, healthcare).
Professional Agile software development.
Exchanging knowledge with 20 internal communities (Java, .NET, PHP, Cloud Computing, Mobile Development, IoT, Cryptocurrencies).

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Global Outsourcing

Technical Skills:

Data Engineering, ETL/ELT

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Salary:

Negotiation

Job ID:

J01038

Status:

Close

Related Job:

Senior DevOps Engineer

Ho Chi Minh - Viet Nam


Product

  • Devops
  • Cloud
  • Kubernetes

  • Manage VM/Cloud Infrastructure: Ensure that web servers and cloud services (AWS, GCP) are stable and perform optimally. Manage both on-prem and cloud-based infrastructure following DevSecOps best practices, including network design and segmentation.
  • Develop and Maintain Scripts and Tools: Write and maintain scripts (bash, python) to automate routine tasks and improve system efficiency.
  • Build and Contribute to Our Monitoring System: Set up and manage monitoring systems using Prometheus and Grafana to track system performance and send alerts.
  • Prepare CI/CD Pipelines: Implement and maintain automated deployment pipelines using GitLab CI, ArgoCD, and FluxCD.
  • Design and Optimize Infrastructure: Design and optimize the infrastructure to ensure system stability, minimize downtime, and enhance overall performance.
  • Web Server and Platform Management: Manage web server configurations and security, including Nginx, Kubernetes ingress, load balancers, DNS, WAF, and firewall rules, ensuring high availability and secure operations.
  • Collaborate with Development Teams: Work closely with development and production teams to streamline deployment processes and resolve system and security-related issues.

Negotiation


Data Experience Lead

Ho Chi Minh - Viet Nam


Product

  • Data Science
  • Management

  • Train pods in designing, building, deploying, and maintaining Data Products based on established playbooks.
  • Simplify and translate platform playbooks into actionable user guides.
  • Assist teams transitioning into new Data Mesh roles (DPO, Steward, Data Architect, Analytics Engineer, etc.).
  • Provide hands-on support for early-wave or complex Data Products.
  • Make data products and the platform accessible and engaging for all staff across the organization.
  • Develop a digital enablement portal including guides, checklists, templates, and videos.
  • Create structured training pathways and capability improvement programs for all affected staff/users.
  • Generate clear visual materials such as diagrams, flows, web-style docs, and promotional videos to aid adoption and understanding.
  • Facilitate onboarding, workshops, roadshows, Q&A sessions, town hall presentations, and demos.
  • Offer structured guidance across ingestion patterns, medallion design, semantics, quality, and metrics to ensure consistency in a mesh environment.
  • Execute or coordinate targeted POCs for pods needing specialized help.
  • Identify and communicate reusable patterns back to the Data Mesh Platform Team.
  • Organize showcases to create visibility and excitement and to promote reuse.
  • Oversee the end-to-end user experience design for the Data Mesh Platform, aiming for clarity, trust, and ease of use.
  • Shape how users discover, understand, and interact with data products across domains.
  • Maintain UX standards in partnership with the customer-facing UX Design team.
  • Take a deeply user-centric approach to drive change through intuitive and guided technology.
  • Engage with end users to understand needs and gather insights.
  • Integrate continuous feedback loops and iterate quickly to improve platform usability.
  • Ensure all Mesh Experience features support adoption and reinforce the "data-as-a-product" mindset.
  • Maintain active channels for communication and updates.
  • Communicate expectations, standards, and timelines clearly.
  • Highlight wins and success stories to build momentum.
  • Curate relevant external content to support the transformation.
  • Monitor rollout progress and leaderboards, and raise blockers with appropriate stakeholders.
  • Utilize data to highlight platform adoption, culture change, wins, and challenges.
  • Produce clear and compelling summaries on adoption progress for decision making.
  • Manage end-user feedback and act as the link between users and the platform team.

Negotiation


Data Scientist Lead

Ho Chi Minh - Viet Nam


Outsource

  • Machine Learning
  • Data Engineering
  • Cloud
  • Management

  • Creating robust ETL/ELT data pipelines for structured and unstructured data
  • Developing interactive dashboards and visualizations for effective communication of insights
  • Developing, deploying, and evaluating machine learning and/or generative AI models
  • Applying statistical analysis and mathematical modeling to extract insights from complex datasets
  • Working with various teams to deliver data-driven solutions
  • Creating and maintaining scalable ML pipelines and APIs for real-time and batch inference
  • Ensuring best practices in model versioning, reproducibility, observability, and governance (MLOps)
  • Staying updated with AI/ML trends and contributing to projects involving semantic search, knowledge graphs, or retrieval-augmented generation as necessary

Negotiation
