Big Data Engineer

JOB DESCRIPTION

Selecting and integrating the Big Data tools and frameworks required to deliver requested capabilities.
Implementing ETL processes that move data from OLTP databases to OLAP databases and the data lake, using event-streaming platforms such as Kafka.
Developing and transforming large datasets, and maintaining robust, high-performance data pipelines that support a variety of use cases.
Monitoring performance and advising on any necessary infrastructure changes.
Defining data retention and data governance policies and frameworks.
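To give candidates a feel for the OLTP-to-OLAP transformation mentioned above, here is a heavily simplified, framework-free sketch: it aggregates raw order events into an OLAP-style daily-revenue fact table. The event fields (order_id, amount, created_at) are hypothetical examples; a production pipeline would consume such events from Kafka and write to a warehouse or data lake rather than work on an in-memory list.

```python
# Illustrative sketch only: turning OLTP-style order events into an
# OLAP-style daily-revenue fact table. Field names are hypothetical.
from collections import defaultdict
from datetime import date

def build_daily_revenue_facts(order_events):
    """Group raw order events by day and sum revenue per day."""
    revenue_by_day = defaultdict(float)
    for event in order_events:
        revenue_by_day[event["created_at"]] += event["amount"]
    # Emit one fact row per day, sorted for deterministic output.
    return [{"day": d, "revenue": revenue_by_day[d]}
            for d in sorted(revenue_by_day)]

events = [
    {"order_id": 1, "amount": 25.0, "created_at": date(2024, 1, 1)},
    {"order_id": 2, "amount": 40.0, "created_at": date(2024, 1, 1)},
    {"order_id": 3, "amount": 10.0, "created_at": date(2024, 1, 2)},
]
print(build_daily_revenue_facts(events))
# [{'day': datetime.date(2024, 1, 1), 'revenue': 65.0},
#  {'day': datetime.date(2024, 1, 2), 'revenue': 10.0}]
```

In a real deployment the same grouping logic would typically be expressed as a windowed aggregation in Kafka Streams or Spark Structured Streaming.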

JOB REQUIREMENT

At least 5 years' experience with the Java programming language.
At least 5 years' experience with Big Data frameworks such as Java Spring, Kafka Streams, and Spark Streaming.
Experience in large scale deployment and performance tuning.
Experience with schema design and dimensional data modeling.
Experience with relational and non-relational databases (e.g. MySQL, MongoDB).
Experience building and optimizing big data pipelines, architectures, and datasets.
Experience with data pipeline and workflow management tools.
Fluent written and spoken English.
Strong analytical and problem-solving skills.
Bonus points if you:
Have experience with Delta Lake.
Have good Docker/Kubernetes knowledge.
Have good Elasticsearch/Kibana (ELK stack) knowledge.

WHAT'S ON OFFER

We have a track record of success and a vision and plan for a promising future. Our company has close to 100% market share for player location regulatory compliance in the US gaming space, and we have fuelled that momentum with expansion into new markets: media & entertainment and fintech.
We are proud of our values and we live them in all of our actions, conversations, and work: There’s always a way; Together we can do more; Aim higher. Then higher; Act with integrity; For the greater good.
We are proud to be part of a global team that develops award-winning solutions for some of the world’s largest and most innovative companies.
We will support you on your learning journey. We invest in employee career growth and development. Our learning & development commitment includes leadership and technical development, a substantial budget for education and training, as well as dedicated work hours for self-study.
We care about our team. Our team is talented, has a bias for action, and is known for their positive attitude and energy. Team members are generously rewarded with competitive salaries, incentives, and a comprehensive benefits package.
We care about giving back to the communities in which we live and work. We support a broad range of community initiatives through donations and employee volunteer activities.
We know that work can be fun. We take the time to create employee events and experiences where everyone can connect and celebrate.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, please visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Product

Technical Skills:

Data Engineering, Java

Location:

Ho Chi Minh - Viet Nam

Salary:

Negotiable

Job ID:

J01078

Status:

Closed

Related Job:

Senior DevOps (Data Platform)

Ho Chi Minh - Viet Nam


Digital Bank, Product

  • DevOps
  • Spark

  • Managing workloads on EC2 clusters using Databricks/EMR for efficient data processing
  • Collaborating with stakeholders to implement a Data Mesh architecture for multiple closely related enterprise entities
  • Utilizing Infrastructure as Code (IaC) tools for defining and managing data platform user access
  • Implementing role-based access control (RBAC) mechanisms to enforce least-privilege principles
  • Collaborating with cross-functional teams to design, implement, and optimize data pipelines and workflows
  • Utilizing distributed engines such as Spark for efficient data processing and analysis
  • Establishing operational best practices for data warehousing tools
  • Managing storage technologies to meet business requirements
  • Troubleshooting and resolving platform-related issues
  • Staying updated on emerging technologies and industry trends
  • Documenting processes, configurations, and changes for comprehensive system documentation

Negotiable


Senior Machine Learning Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

  • Machine Learning

  • Creating the V1 Evaluation Platform: You will be responsible for designing and building the core backend systems for our new LLM Evaluation Platform, using Arize Phoenix as the basis for traces, evaluations, and experiments.
  • Implementing Production Observability: You will need to architect and implement the observability backbone for our AI services by integrating Phoenix with OpenTelemetry to establish a centralized system for logging, tracing, and evaluating LLM behavior in production.
  • Standardizing the LLM Deployment Pipeline: You will be in charge of designing and implementing the CI/CD framework for versioning, testing, and deploying prompt-based logic and LLM configurations, ensuring reproducible and auditable deployments across all AI features.
  • Providing Practical Solutions: Your role will involve making pragmatic technical decisions that prioritize business value and speed of delivery, in line with our early-stage startup environment.
  • Collaborating with Other Teams: You will work closely with the Data Science team to understand their workflow and ensure that the platform you build meets their core needs for experiment tracking and validation.
  • Establishing Core Patterns: You will also help establish and document the initial technical patterns for MLOps and model evaluation that will serve as the foundation for future development.

Negotiable


Fullstack Engineer - BRAIN

Ho Chi Minh - Viet Nam


Product, Investment Management

  • Frontend
  • Backend

  • Create intricate single-page applications.
  • Construct components that can be used across various interfaces.
  • Design layouts that are responsive for both desktop and mobile devices.
  • Automate the testing procedures for the user interface.
  • Develop services and APIs for backend applications.
  • Incorporate AWS and external cloud services.
  • Enhance application speed and scalability.
  • Actively contribute to an agile engineering team focused on continual improvement.
  • Utilize leading open-source technologies such as MySQL, PostgreSQL, the ELK stack, Sentry, Redis, Git, etc.
  • Take part in periodic on-call responsibilities.

Negotiable
