Big Data Engineer

JOB DESCRIPTION

Selecting and integrating any Big Data tools and frameworks required to provide requested capabilities.
Implementing ETL processes to transform data from OLTP databases into OLAP databases and a Data Lake using event-streaming platforms such as Kafka.
Developing and transforming large datasets and maintaining robust data pipelines that support various use cases with high performance.
Monitoring performance and advising on any necessary infrastructure changes.
Defining data retention and data governance policies and frameworks.

JOB REQUIREMENT

At least 5 years of experience with the Java programming language.
At least 5 years of experience with Big Data frameworks such as Java Spring, Kafka Streams, and Spark Streaming.
Experience in large-scale deployment and performance tuning.
Experience with schema design and dimensional data modeling.
Experience with relational and non-relational databases (e.g., MySQL, MongoDB).
Experience building and optimizing big data pipelines, architectures, and data sets.
Experience with data pipeline and workflow management tools.
Fluent written and spoken English.
Strong analytical and problem-solving skills.
Bonus Points if You:
Have experience with Delta Lake technology.
Have good Docker/Kubernetes knowledge.
Have good Elasticsearch/Kibana (ELK stack) knowledge.

WHAT'S ON OFFER

We have a track record of success and a vision and a plan for a promising future. Our company has close to 100% market share for player location regulatory compliance in the US gaming space, and we have fuelled that momentum with expansion into new markets: media & entertainment and fintech.
We are proud of our values and we live them in all of our actions, conversations, and work: There’s always a way; Together we can do more; Aim higher. Then higher; Act with integrity; For the greater good.
We are proud to be part of a global team that develops award-winning solutions for some of the world’s largest and most innovative companies.
We will support you on your learning journey. We invest in employee career growth and development. Our learning & development commitment includes leadership and technical development, a substantial budget for education and training, as well as dedicated work hours for self-study.
We care about our team. Our team is talented, has a bias for action, and is known for their positive attitude and energy. Team members are generously rewarded with competitive salaries, incentives, and a comprehensive benefits package.
We care about giving back to the communities in which we live and work. We support a broad range of community initiatives through donations and employee volunteer activities.
We know that work can be fun. We take the time to create employee events and experiences where everyone can connect and celebrate.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity for your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Product

Technical Skills:

Data Engineering, Java

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Salary:

Negotiation

Job ID:

J01078

Status:

Close

Related Job:

Senior Deep Learning Algorithms Engineer

Ho Chi Minh, Ha Noi - Viet Nam


Product

  • Machine Learning
  • Algorithm

Analyze and optimize deep learning training and inference workloads on advanced hardware and software platforms. Work with researchers and engineers to enhance workload performance. Develop high-quality software for deep learning platforms. Create automated tools for workload analysis and optimization.

Negotiation


Software Engineer

Ho Chi Minh - Viet Nam


Product

Create and develop the API Platform with a focus on reliability, performance, and a top-tier developer experience. Deploy and enhance AI/ML models in scalable production environments in collaboration with research and applied ML teams. Manage and advance a contemporary, cloud-native infrastructure stack utilizing Kubernetes, Docker, and infrastructure-as-code (IaC) tools. Ensure platform dependability by designing and implementing telemetry, monitoring, alerting, autoscaling, failover, and disaster recovery mechanisms. Contribute to developer and operations workflows, encompassing CI/CD pipelines, release management, and on-call rotations. Work collaboratively across teams to implement secure APIs with fine-grained access control, usage metering, and billing integration. Continuously enhance platform performance, cost-efficiency, and observability to accommodate scaling and serve users globally.

Negotiation


Product Manager (Data & Models)

Ho Chi Minh - Viet Nam


Product

  • Product Management
  • AI

Designing data strategy and model integration for creating efficient data pipelines, evaluation frameworks, and annotation systems to maintain high-performance LLMs. Responsible for ensuring data quality standards and implementing bias mitigation and privacy-preserving techniques. Defining the product's core model roadmaps, taking into account technical feasibility, user needs, and ethical considerations. Collaboration with researchers to incorporate experimental breakthroughs into deployable features. Partnering with Engineering and Research teams to ensure model development aligns with product goals and advocating for transparency in model decision-making to build user trust. Analyzing usage patterns from open-source communities (Discord, Reddit, GitHub) to refine model behavior and address real-world edge cases, contributing to community-driven model evolution. Setting performance benchmarks, cost efficiency, and resource utilization standards for model scalability and reliability.

Negotiation
