Cloud Engineer (AWS Kafka)

ABOUT CLIENT

Our client uses new technologies to develop products for the banking industry.

JOB DESCRIPTION

Designing, implementing, and maintaining streaming solutions using AWS Managed Streaming for Apache Kafka (MSK).
Monitoring and managing Kafka clusters to ensure optimal performance, scalability, and uptime.
Configuring and fine-tuning MSK clusters, including partitioning strategies, replication, and retention policies (see the sketch after this list).
Collaborating with engineering teams to design and implement event-driven systems and microservices architectures.
Developing and maintaining robust data pipelines for real-time data processing and streaming using Kafka.
Ensuring seamless integration between MSK/SQS/SNS and other AWS services such as Lambda, EventBridge Pipes, and S3.
Analyzing and optimizing the performance of Kafka clusters and streaming pipelines to meet high-throughput and low-latency requirements.
Implementing best practices for Kafka topic design, consumer group management, and message serialization (e.g., Avro).
Implementing security best practices for MSK, including encryption, authentication, and access controls.
Ensuring compliance with industry standards and regulations related to data streaming and event processing.
Setting up comprehensive monitoring and alerting for Kafka clusters and streaming applications using AWS CloudWatch and Datadog.
Troubleshooting and resolving issues related to data loss, message lag, and streaming failures.
Designing and implementing data integration solutions to stream data between various sources and targets using MSK.
Leading data transformation and enrichment processes to ensure data quality and consistency in streaming applications.
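
To make the topic-design and retention work above concrete, here is a minimal sketch using the confluent-kafka Python client to create a topic with explicit partitioning, replication, and retention settings. The bootstrap endpoint and topic name are hypothetical placeholders, and a real MSK cluster would typically also require TLS or SASL/IAM client settings.

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Placeholder bootstrap endpoint; an actual MSK cluster would also need
# TLS or SASL/IAM entries in this configuration dict.
admin = AdminClient(
    {"bootstrap.servers": "b-1.example.kafka.us-east-1.amazonaws.com:9092"}
)

# 6 partitions for consumer parallelism, replication factor 3 for durability,
# and a 7-day retention policy (604800000 ms).
topic = NewTopic(
    "payments.transactions",  # hypothetical topic name
    num_partitions=6,
    replication_factor=3,
    config={"retention.ms": "604800000", "cleanup.policy": "delete"},
)

# create_topics() is asynchronous and returns a dict of futures keyed by topic.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()  # raises if the broker rejected the request
        print(f"Created topic {name}")
    except Exception as exc:
        print(f"Failed to create {name}: {exc}")
```

A replication factor of 3 matches a typical three-broker MSK deployment; retention and cleanup policy are exactly the per-topic knobs this role would be expected to tune.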

JOB REQUIREMENT

Bachelor's or Master's degree in Computer Science, Information Technology, or related field.
Minimum 5 years of experience in event-driven architectures and streaming solutions.
Proficiency in Apache Kafka, with at least 2 years specifically in AWS MSK.
Experience designing and implementing high-throughput, low-latency streaming applications in AWS environments.
Strong understanding of Kafka internals and proficiency in programming languages such as Java, Python, or Scala.
Experience with AWS services like Lambda, Kinesis, S3, and IAM in conjunction with MSK.
Familiarity with CI/CD pipelines and IaC tools such as CloudFormation, Terraform, or CDK (a minimal CDK sketch follows this list).
Strong analytical and problem-solving skills with effective communication and collaboration abilities.
AWS Certified Solutions Architect, AWS Certified Developer, or similar AWS certification.
Ability to manage multiple priorities and projects in a fast-paced environment.
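
For the IaC requirement, a minimal AWS CDK (Python) sketch of an MSK cluster definition could look like the following. The Kafka version, instance type, and subnet IDs are illustrative assumptions, not values taken from the posting.

```python
from aws_cdk import App, Stack
from aws_cdk import aws_msk as msk
from constructs import Construct

class StreamingStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # L1 (CloudFormation-level) MSK cluster; all values are placeholders.
        msk.CfnCluster(
            self,
            "EventsCluster",
            cluster_name="events-cluster",
            kafka_version="3.5.1",
            number_of_broker_nodes=3,
            broker_node_group_info=msk.CfnCluster.BrokerNodeGroupInfoProperty(
                instance_type="kafka.m5.large",
                client_subnets=["subnet-aaa111", "subnet-bbb222", "subnet-ccc333"],
            ),
            # Enforce TLS between clients and brokers and within the cluster,
            # matching the encryption-in-transit responsibility above.
            encryption_info=msk.CfnCluster.EncryptionInfoProperty(
                encryption_in_transit=msk.CfnCluster.EncryptionInTransitProperty(
                    client_broker="TLS",
                    in_cluster=True,
                )
            ),
        )

app = App()
StreamingStack(app, "StreamingStack")
app.synth()
```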

WHAT'S ON OFFER

Meal and parking allowances.
Full benefits and salary during the probationary period.
Insurance as per Vietnamese labor law, plus premium health care for employees and their families.
A values-driven, international, and agile work environment.
Opportunities for overseas travel for training and work.
Internal hackathons and company events such as team building, coffee runs, and blue card activities.
13th-month salary and performance bonuses.
15 days of annual leave and 3 days of sick leave per year.
Work-life balance with a 40-hour workweek, Monday to Friday.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Digital Bank, Product

Technical Skills:

Kafka, AWS

Location:

Ho Chi Minh - Viet Nam

Working Policy:

Hybrid

Salary:

Negotiable

Job ID:

J01556

Status:

Closed

Related Job:

Senior/Lead Data Engineer (Data Platform / MLOps)

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

You will be responsible for managing, designing, and enhancing data systems and workflows that drive key business decisions. The role is focused 75% on data engineering, involving the construction and optimization of data pipelines and architectures, and 25% on supporting data science initiatives through collaboration with data science teams on machine learning workflows and advanced analytics. You will leverage technologies like Python, Airflow, Kubernetes, and AWS to deliver high-quality data solutions.

  • Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders.
  • Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments.
  • Develop and optimize large-scale, batch, and real-time data processing systems to enhance operational efficiency and meet business objectives.
  • Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring (see the sketch below).
  • Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency.
  • Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows.
  • Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.
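
As a flavor of the Airflow automation mentioned above, here is a minimal sketch of a daily DAG that lands a marker object in S3. The DAG id, bucket, and key are hypothetical, and the extract step is a placeholder for real source logic.

```python
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3() -> None:
    """Placeholder extract step: write a marker object to a hypothetical bucket."""
    s3 = boto3.client("s3")
    s3.put_object(Bucket="example-data-lake", Key="raw/marker.txt", Body=b"ok")

with DAG(
    dag_id="daily_ingest",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",       # run once per day
    catchup=False,                    # do not backfill past runs
) as dag:
    PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
```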

Negotiable

View details

Senior Software Engineer (Backend + Network)

Ho Chi Minh - Viet Nam


Product

  • Backend
  • Network

  • Continuously monitor and analyze new VPN providers, proxy services, and anonymization tools.
  • Conduct investigations on IP allocation patterns, hosting provider behaviors, and network infrastructure changes.
  • Develop and maintain comprehensive databases of known VPN/proxy IP ranges and behavioral signatures (see the sketch below).
  • Research emerging threats such as residential proxies, mobile proxies, and distributed proxy networks.
  • Monitor darkweb marketplaces and security forums for emerging proxy/VPN trends.
  • Perform deep packet analysis and network traffic pattern recognition.
  • Develop and maintain the system using PHP, Python, and/or Go.
  • Optimize VPN/proxy detection algorithms.
  • Design scalable infrastructure to handle millions of IP lookups per day.
  • Implement monitoring and alerting systems for detection accuracy and system performance.
  • Analyze production incidents related to false positives/negatives in threat detection.
  • Collaborate with the DevOps team on deployment of detection rule updates and model improvements.
  • Investigate customer-reported bypass attempts and develop rapid response solutions.
  • Provide technical expertise during customer security consultations.
  • Support the sales engineering team with technical demonstrations and proof-of-concepts.
  • Document threat analysis findings and detection methodologies for internal and customer use.
  • Maintain relationships with cybersecurity vendors, threat intelligence providers, and ISPs.
  • Monitor industry threat reports, security advisories, and academic research.
  • Participate in cybersecurity conferences and forums to stay current with the threat landscape.
  • Contribute to open-source security tools and research when appropriate.
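
To make the IP-range lookup concrete, here is a minimal Python sketch of a CIDR-based check. The sample ranges are documentation-reserved test networks standing in for a curated VPN/proxy database; at millions of lookups per day, a radix trie or similar index would replace the linear scan.

```python
import ipaddress

# Illustrative ranges only (documentation-reserved test networks),
# standing in for a maintained database of VPN/proxy CIDR blocks.
KNOWN_PROXY_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_known_proxy(ip: str) -> bool:
    """Return True if the address falls inside a known VPN/proxy range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in KNOWN_PROXY_RANGES)

print(is_known_proxy("203.0.113.45"))  # True: inside the first sample range
print(is_known_proxy("192.0.2.10"))    # False: not in the sample ranges
```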

Negotiable

View details

Data Engineer - RefData

Ho Chi Minh, Ha Noi - Viet Nam


Product, Investment Management

  • Data Engineering

  • Developing an automated data processing system and overseeing its maintenance.
  • Consolidating and integrating various data sources and databases into a unified system.
  • Designing interfaces and microservices using Python (see the sketch below).
  • Enhancing the organization's data through NLP and AI models.
  • Preparing and cleaning semi-structured or unstructured data.
  • Creating effective algorithms for data processing.
  • Testing and incorporating external APIs.
  • Assisting the Business Analysts team.
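
As an illustration of the Python microservice work, a minimal FastAPI sketch of a reference-data lookup endpoint could look like this. The route, ISIN sample, and in-memory store are hypothetical stand-ins for the unified system described above.

```python
from fastapi import FastAPI, HTTPException

app = FastAPI(title="refdata-service")  # hypothetical service name

# In-memory stand-in for the consolidated reference-data store.
INSTRUMENTS = {
    "US0378331005": {"isin": "US0378331005", "name": "Apple Inc."},
}

@app.get("/instruments/{isin}")
def get_instrument(isin: str) -> dict:
    """Look up a single instrument record by ISIN."""
    record = INSTRUMENTS.get(isin)
    if record is None:
        raise HTTPException(status_code=404, detail="instrument not found")
    return record

# Run locally with: uvicorn refdata:app --reload
```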

Negotiable

View details