Talend Engineer (Big Data)

ABOUT CLIENT

Our client is a software company that provides SaaS solutions for businesses

JOB DESCRIPTION

Create, test, and implement new ETL pipelines or improvements to existing pipelines in a Big Data environment using Talend
Carry out software development for applications, including new development, maintenance and support, and provide production support
Translate functional requirements into technical designs
Automate the ETL process for various datasets being ingested into the big data platform
Develop and integrate software applications following appropriate development methodologies and standards, applying standard architectural patterns, and considering performance and security measures
Address customer complaints using data and evaluate suggestions for improvements and enhancements
Make recommendations for technical aspects of projects and system improvements
Ensure compliance with established standards and advise senior managers on technology solutions where appropriate

JOB REQUIREMENT

More than 5 years of relevant IT experience
Over 3 years of development experience specifically with the Talend Data Integration module
2+ years of experience in Talend Administration Center, Data Quality, API Designer and Services, and Big Data Frameworks modules
2+ years of hands-on experience with Hadoop, Hive, HDFS, and Oracle RDBMS
Proficient in database schema, object management, data modeling & architecture, and data warehouse design
Strong knowledge of Java programming and PL/SQL
Proficient in API integration (REST)
Familiarity with Unix/Linux and shell scripting
Competent in source-code control using a tool such as GitLab
Experience with CI/CD and DevOps development following secure coding practices
Additional expertise in performance tuning is an advantage
Familiarity with emerging cloud technologies related to Big Data is an added bonus
Previous experience in the Finance/Banking industry would be an advantage
Willingness to undertake any additional job duties as required

WHAT'S ON OFFER

Comprehensive healthcare coverage
Dental care benefits
Generous paid leave policy
Support for new parents
Assistance with childcare costs

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Information Technology & Services

Technical Skills:

Data Engineering, ETL/ELT, Big Data

Location:

Ho Chi Minh - Viet Nam

Salary:

Negotiable

Job ID:

J01507

Status:

Closed

Related Job:

Senior DevOps (Data Platform)

Ho Chi Minh - Viet Nam


Digital Bank, Product

  • DevOps
  • Spark

  • Managing workloads on EC2 clusters using Databricks/EMR for efficient data processing
  • Collaborating with stakeholders to implement a Data Mesh architecture for multiple closely related enterprise entities
  • Utilizing Infrastructure as Code (IaC) tools for defining and managing data platform user access
  • Implementing role-based access control (RBAC) mechanisms to enforce least-privilege principles
  • Collaborating with cross-functional teams to design, implement, and optimize data pipelines and workflows
  • Utilizing distributed engines such as Spark for efficient data processing and analysis
  • Establishing operational best practices for data warehousing tools
  • Managing storage technologies to meet business requirements
  • Troubleshooting and resolving platform-related issues
  • Staying updated on emerging technologies and industry trends
  • Documenting processes, configurations, and changes for comprehensive system documentation

Negotiable


Senior Machine Learning Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

  • Machine Learning

We are seeking a pragmatic Senior Machine Learning Engineer to accelerate our MLOps roadmap. Your primary mission will be to own the design and implementation of our V1 LLM Evaluation Platform, a critical system that will serve as the quality gate for all our AI features. You will be a key builder on a new initiative, working alongside dedicated Data Engineering and DevOps experts to deliver a tangible, high-impact platform. This role is for a hands-on engineer who thrives on building robust systems that provide leverage. You will be fully empowered to own the implementation and success of this project.

  • Build the V1 Evaluation Platform: Proactively own the end-to-end process of designing and building the core backend systems for our new LLM Evaluation Platform, leveraging Arize Phoenix as the foundational framework for traces, evaluations, and experiments.
  • Implement Production Observability: Architect and implement the observability backbone for our AI services, integrating Phoenix with OpenTelemetry to create a centralized system for logging, tracing, and evaluating LLM behavior in production.
  • Standardize LLM Deployment Pipeline: Design and implement the CI/CD framework for versioning, testing, and deploying prompt-based logic and LLM configurations, ensuring reproducible and auditable deployments across all AI features.
  • Deliver Pragmatic Solutions: Consistently make pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment.
  • Cross-functional Collaboration: Work closely with our Data Science team to understand their workflow and ensure the platform you build meets their core needs for experiment tracking and validation.
  • Establish Core Patterns: Help establish and document the initial technical patterns for MLOps and model evaluation that will serve as the foundation for future development.

Negotiable


Fullstack Engineer - BRAIN

Ho Chi Minh - Viet Nam


Product, Investment Management

  • Frontend
  • Backend

  • Create intricate single-page applications
  • Construct components that can be used across various interfaces
  • Design layouts that are responsive for both desktop and mobile devices
  • Automate the testing procedures for the user interface
  • Develop services and APIs for backend applications
  • Incorporate AWS and external cloud services
  • Enhance application speed and scalability
  • Actively contribute to an agile engineering team focused on continual improvement
  • Utilize leading open-source technologies like MySQL, PostgreSQL, ELK stack, Sentry, Redis, Git, etc.
  • Take part in periodic on-call responsibilities

Negotiable
