Cloud Engineer (AWS Kafka)

ABOUT CLIENT

Our client uses new technologies to develop products for the banking industry.

JOB DESCRIPTION

We are looking for a highly experienced Cloud Engineer with strong expertise in AWS Managed Streaming for Apache Kafka (MSK) to join our engineering team. The role involves designing, implementing, and managing event-driven architectures and real-time streaming solutions using AWS MSK. This position is vital in ensuring our systems can efficiently process large-scale data streams while maintaining high availability, scalability, and reliability.
 
AWS MSK Management
Designing, implementing, and maintaining streaming solutions using AWS Managed Streaming for Apache Kafka (MSK).
Monitoring and managing Kafka clusters to ensure optimal performance, scalability, and uptime.
Configuring and fine-tuning MSK clusters, including partitioning strategies, replication, and retention policies.
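For illustration only, a minimal sketch of the kind of topic-level configuration described above, assuming the confluent-kafka Python client; broker endpoint, topic name, and settings are placeholders to be tuned per workload:

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Placeholder bootstrap broker for an MSK cluster (illustrative only).
admin = AdminClient({"bootstrap.servers": "b-1.example-msk.kafka.ap-southeast-1.amazonaws.com:9092"})

# Example topic with explicit partitioning, replication, and retention settings.
topic = NewTopic(
    "payments.transactions",   # hypothetical topic name
    num_partitions=12,         # partitioning strategy
    replication_factor=3,      # replication across brokers
    config={
        "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # 7-day retention
        "cleanup.policy": "delete",
    },
)

for name, future in admin.create_topics([topic]).items():
    future.result()  # raises if topic creation failed
```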
 
Event-Driven Architecture
Collaborating with engineering teams to design and implement event-driven systems and microservices architectures.
Developing and maintaining robust data pipelines for real-time data processing and streaming using Kafka.
Ensuring seamless integration between MSK/SQS/SNS and other AWS services such as Lambda, EventBridge Pipes, and S3 (see the sketch below).
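As a sketch of the MSK-to-Lambda integration mentioned above: an MSK event source mapping delivers records to Lambda grouped by topic-partition, with base64-encoded payloads. The topic name and downstream routing here are hypothetical.

```python
import base64
import json

def handler(event, context):
    """Hypothetical Lambda handler behind an MSK event source mapping."""
    for topic_partition, records in event.get("records", {}).items():
        for record in records:
            # Kafka message values arrive base64-encoded in the Lambda event.
            payload = json.loads(base64.b64decode(record["value"]))
            # Route the event onward, e.g. to S3, SNS, or another service.
            print(topic_partition, record["offset"], payload)
```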
 
Performance Optimization
Analyzing and optimizing the performance of Kafka clusters and streaming pipelines to meet high-throughput and low-latency requirements.
Implementing best practices for Kafka topic design, consumer group management, and message serialization (e.g., Avro).
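A minimal sketch of the throughput/latency trade-offs involved, using confluent-kafka producer settings; the broker endpoint, topic, and values are assumptions to be tuned per workload:

```python
from confluent_kafka import Producer

# Illustrative producer settings that trade a little latency for throughput.
producer = Producer({
    "bootstrap.servers": "b-1.example-msk.kafka.ap-southeast-1.amazonaws.com:9092",
    "acks": "all",                # durability: wait for all in-sync replicas
    "enable.idempotence": True,   # avoid duplicates on retries
    "linger.ms": 10,              # small batching window to boost throughput
    "compression.type": "lz4",    # cheaper network/storage at slight CPU cost
})

producer.produce("payments.transactions", key=b"account-42", value=b"{}")
producer.flush()
```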
 
Security and Compliance
Implementing security best practices for MSK, including encryption, authentication, and access controls.
Ensuring compliance with industry standards and regulations related to data streaming and event processing.
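One common client-side pattern for MSK IAM authentication, sketched with the aws-msk-iam-sasl-signer-python helper and kafka-python; treat the package, port, region, and endpoint as assumptions to verify against the MSK documentation:

```python
from kafka import KafkaProducer
from aws_msk_iam_sasl_signer import MSKAuthTokenProvider  # AWS IAM SASL signer helper


class IamTokenProvider:
    """Supplies short-lived IAM auth tokens to the Kafka client."""

    def token(self):
        token, _expiry_ms = MSKAuthTokenProvider.generate_auth_token("ap-southeast-1")
        return token


producer = KafkaProducer(
    bootstrap_servers="b-1.example-msk.kafka.ap-southeast-1.amazonaws.com:9098",  # IAM port
    security_protocol="SASL_SSL",       # TLS in transit
    sasl_mechanism="OAUTHBEARER",       # MSK IAM auth is surfaced as OAUTHBEARER
    sasl_oauth_token_provider=IamTokenProvider(),
)
```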
 
Monitoring and Troubleshooting
Setting up comprehensive monitoring and alerting for Kafka clusters and streaming applications using AWS CloudWatch and Datadog.
Troubleshooting and resolving issues related to data loss, message lag, and streaming failures.
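For example, consumer-lag alerting might start from the MSK metrics that CloudWatch exposes. A hedged boto3 sketch follows; the metric and dimension names should be checked against the MSK monitoring documentation, and the cluster, group, and topic names are placeholders:

```python
from datetime import datetime, timedelta, timezone

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-southeast-1")
now = datetime.now(timezone.utc)

# Pull the worst consumer-group offset lag over the last 30 minutes.
response = cloudwatch.get_metric_statistics(
    Namespace="AWS/Kafka",          # MSK metric namespace (assumed)
    MetricName="MaxOffsetLag",      # consumer lag metric (assumed)
    Dimensions=[
        {"Name": "Cluster Name", "Value": "example-msk-cluster"},
        {"Name": "Consumer Group", "Value": "settlement-service"},
        {"Name": "Topic", "Value": "payments.transactions"},
    ],
    StartTime=now - timedelta(minutes=30),
    EndTime=now,
    Period=60,
    Statistics=["Maximum"],
)

for point in sorted(response["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], point["Maximum"])
```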
 
Data Integration and ETL
Designing and implementing data integration solutions to stream data between various sources and targets using MSK.
Leading data transformation and enrichment processes to ensure data quality and consistency in streaming applications.
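As an illustrative sketch of such a streaming integration, assuming the confluent-kafka client, a hypothetical enrichment step, and a placeholder S3 bucket:

```python
import json
from datetime import datetime, timezone

import boto3
from confluent_kafka import Consumer

s3 = boto3.client("s3")
consumer = Consumer({
    "bootstrap.servers": "b-1.example-msk.kafka.ap-southeast-1.amazonaws.com:9092",
    "group.id": "etl-enrichment",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments.transactions"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Placeholder enrichment step: stamp the record with processing time.
        event["enriched_at"] = datetime.now(timezone.utc).isoformat()
        # Land each enriched record in S3, keyed by partition and offset.
        s3.put_object(
            Bucket="example-streaming-lake",
            Key=f"transactions/{msg.partition()}/{msg.offset()}.json",
            Body=json.dumps(event).encode("utf-8"),
        )
finally:
    consumer.close()
```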

JOB REQUIREMENT

Must have:
Bachelor's or Master's degree in Computer Science, Information Technology, or related field.
Minimum 5 years of experience in event-driven architectures and streaming solutions.
Proficiency in Apache Kafka, with at least 2 years specifically in AWS MSK.
Experience designing and implementing high-throughput, low-latency streaming applications in AWS environments.
Strong understanding of Kafka internals and proficiency in programming languages such as Java, Python, or Scala.
Experience with AWS services like Lambda, Kinesis, S3, and IAM in conjunction with MSK.
Familiarity with CI/CD tools and IaC tools like CloudFormation, Terraform, or CDK.
Strong analytical and problem-solving skills with effective communication and collaboration abilities.
 
Nice to have:
AWS Certified Solutions Architect, AWS Certified Developer, or similar AWS certification.
Ability to manage multiple priorities and projects in a fast-paced environment.

WHAT'S ON OFFER

Meal and parking allowances.
Full benefits and salary during the probationary period.
Insurance coverage per Vietnamese labor law, plus premium health care for employees and their families.
A values-driven, international, and agile work environment.
Opportunities for overseas travel for training and work.
Internal hackathons and company events such as team building, coffee runs, and blue card activities.
13th-month salary and performance bonuses.
15 days of annual leave and 3 days of sick leave per year.
Work-life balance with a 40-hour workweek, Monday to Friday.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Digital Bank, Product
Technical Skills: Kafka, AWS
Location: Ho Chi Minh - Viet Nam
Salary: Negotiation
Job ID: J01556
Status: Active
