AWS DevOps Lead

ABOUT CLIENT

Our client uses new technology to develop products for the banking industry.

JOB DESCRIPTION

Work with various teams to collect requirements and create scalable and sustainable software solutions
Create integration solutions to facilitate smooth communication between microservices, APIs, and external systems
Contribute to the development of continuous delivery, automation frameworks, and pipelines to enhance the developer and customer experience
Improve database interactions and maintain data integrity across distributed systems
Establish best practices for messaging, integration, and data pipeline architectures
Identify and implement enhancements for automation processes and tools
Acquire new skills and support the adoption of a continuous delivery and cloud-first approach.

JOB REQUIREMENT

Minimum 7 years of backend development experience using Python or Java, including at least 2 years in a lead role.
Proficiency in Apache Kafka, including Kafka Connect, Schema Registry, and related components.
Expertise in building and deploying microservices and event-driven architectures, including distributed systems, event sourcing, and CQRS patterns.
Experience with foundational AWS services such as VPC, ECS, Lambda, RDS, SNS, SQS, and EventBridge.
Hands-on experience with tools like Kafka Connectors and Debezium.
Strong experience with application integration patterns, RESTful APIs, and messaging protocols.
Ability to conduct hands-on troubleshooting and optimization of the platform, collaborating closely with team members.
Ability to design scalable systems and multi-country deployment patterns for the platform.
Familiarity with AWS CloudFormation, Terraform, or CDK for infrastructure provisioning.
A focus on automation and the ability to develop tooling that improves the efficiency, reliability, and performance of repeatable tasks.
Understanding of cloud change management practices, compliance, and security standards.
Strong English language skills for effective communication and coordination with business partners and technical teams.
Strong logical thinking and problem-solving abilities.
Curiosity and a self-learning attitude are highly desirable.
Big Plus:
AWS certification in DevOps, SysOps, or Advanced Networking Specialty.

WHAT'S ON OFFER

Meal and parking benefits.
Full benefits and salary during the probationary period.
Insurance coverage as per Vietnamese labor law and premium health care for employees and their families.
A values-driven, international, and agile work environment.
Opportunities for overseas travel for training and work.
Participation in internal Hackathons and company events such as team building, coffee runs, and blue card activities.
13th-month salary and performance bonuses.
15 days of annual leave and 3 days of sick leave per year.
Work-life balance with a 40-hour workweek, Monday to Friday.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Digital Bank, Product

Technical Skills: DevOps, AWS

Location: Ho Chi Minh - Viet Nam

Working Policy: Hybrid

Salary: Negotiation

Job ID: J01325

Status: Active

Related Jobs:

Product Quality Engineer

Location: Ho Chi Minh - Viet Nam
Company Type: Product
Technical Skills: Automation Test, DevOps
Salary: Negotiation

Develop and implement thorough testing strategies across web, API platform, desktop, and mobile platforms
Create automated test suites and Auto QA agents for continuous releases, model updates, and API integrations
Manage build and CI/CD pipelines to ensure product functionality across major operating systems
Verify compatibility between web clients and different server node versions, including upgrade paths and backwards compatibility testing
Validate resource management and performance optimizations across various hardware configurations, including GPU acceleration
Engage in the Discord community and GitHub Issues to translate feedback into practical test cases
Oversee release cycles, prioritize bugs, and provide timely alerts
Generate user-friendly documentation to help users resolve issues

Growth Marketing Lead

Location: Ho Chi Minh - Viet Nam
Company Type: Product
Technical Skills: Marketing
Salary: Negotiation

Leading growth strategy and execution for user acquisition, activation, and retention.
Developing clear and consistent positioning and messaging around the product's mission of local, open intelligence.
Managing the product's Discord and social media accounts, growing online engagement, and organizing in-person meetups in various locations.
Creating and distributing product explainers, updates, and thought leadership content to build trust and awareness.
Utilizing analytics tools to track funnels and optimize outcomes.
Collaborating with founders, designers, and engineers to align growth with the product direction and user feedback.

Senior/Lead Data Engineer (Data Platform / MLOps)

Location: Ho Chi Minh, Ha Noi - Viet Nam
Company Type: Information Technology & Services
Salary: Negotiation

You will be responsible for managing, designing, and enhancing data systems and workflows that drive key business decisions. The role is focused 75% on data engineering, involving the construction and optimization of data pipelines and architectures, and 25% on supporting data science initiatives through collaboration with data science teams on machine learning workflows and advanced analytics. You will leverage technologies like Python, Airflow, Kubernetes, and AWS to deliver high-quality data solutions.

Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders.
Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments.
Develop and optimize large-scale, batch, and real-time data processing systems to enhance operational efficiency and meet business objectives.
Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency.
Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows.
Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.
