Site Reliability Engineer

JOB DESCRIPTION

We are seeking an engineer to ensure the reliability and performance of Our Client's Data Platform. Successful candidates will work with researchers, operations, and other technology teams to keep our production data pipeline, sourced from an enormous and continuously updating catalog of vendor and market data, running smoothly. This engineer will also develop solutions to improve the efficiency and scalability of our ever-growing, business-critical management system.
Operate, monitor, and provision the system to ensure it runs smoothly
Provide feedback for system improvement
Provide solutions for live monitoring of the production data pipeline
Design and implement continuous integration and test automation
Deliver release management solutions
Collaborate with engineering, analyst, and research teams to ensure the reliability and operability of new data pipeline components
Analyze and diagnose platform performance and reliability problems
Understand, manage, and utilize the right technologies for building our platforms, such as Kubernetes, Kafka, and Spark

JOB REQUIREMENTS

Bachelor’s degree in Computer Science or equivalent experience
Excellent analytical skills and a passion for solving problems
Experience in Linux administration; fluent in Linux standard command line programs
Fluency in Python and its ecosystem (numpy, pandas, etc.) is strongly preferred
Experience in metrics and logs aggregation and analysis with a focus on performance optimization
Understanding of Git and CI/CD concepts
A great support attitude (our job is to make life easier for other teams!)
Strong written and verbal communication skills; fluency in English
Knowledgeable in:
Computer science fundamentals (algorithms and data structures)
Relational databases
Modern service architectures
Experience with any of the following technologies is relevant: Kafka, Docker, Helm, Kubernetes, GCP, AWS, Spark and PySpark, Hadoop, Redis, MySQL, gRPC, Apache Arrow, Apache Airflow

WHAT'S ON OFFER

Competitive and attractive compensation package with a clear career roadmap that keeps you challenged every day
A strong culture of learning and development: training courses, a library, guest speakers, and share-and-learn events
Learn from whoever sits next to you! Working in our client's environment, you are surrounded by smart and talented people
Employee resource groups with a strong diversity and inclusion culture
Premium Health Insurance and Employee Assistance Program
Generous time-off policy, unlimited sick days, recreation sabbatical leave (based on tenure), and Trade Union benefits for staff and family
Monthly team-building activities: local engagement events and team lunches; employee clubs for football, ping-pong, badminton, yoga, running, PS5, movies, etc.
Annual company trips and occasional global conferences – the opportunity to travel and connect with our global teams
Daily happy hour with tea breaks, snacks, and meals in the office!

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI, an IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type: Product, Investment Management
Technical Skills: DevOps, Kubernetes, Kafka, Python
Location: Ho Chi Minh, Ha Noi - Viet Nam
Working Policy:
Salary: Negotiable
Job ID: J01251
Status: Closed

Related Jobs:

Senior/ Lead Data Engineer (Data Platform / MLOps)

Ho Chi Minh, Ha Noi - Viet Nam


Information Technology & Services

You will be responsible for managing, designing, and enhancing data systems and workflows that drive key business decisions. The role is focused 75% on data engineering, involving the construction and optimization of data pipelines and architectures, and 25% on supporting data science initiatives through collaboration with data science teams on machine learning workflows and advanced analytics. You will leverage technologies like Python, Airflow, Kubernetes, and AWS to deliver high-quality data solutions.

  • Architect, develop, and maintain scalable data infrastructure, including data lakes, pipelines, and metadata repositories, ensuring the timely and accurate delivery of data to stakeholders.
  • Work closely with data scientists to build and support data models, integrate data sources, and support machine learning workflows and experimentation environments.
  • Develop and optimize large-scale, batch, and real-time data processing systems to enhance operational efficiency and meet business objectives.
  • Leverage Python, Apache Airflow, and AWS services to automate data workflows and processes, ensuring efficient scheduling and monitoring.
  • Utilize AWS services such as S3, Glue, EC2, and Lambda to manage data storage and compute resources, ensuring high performance, scalability, and cost-efficiency.
  • Implement robust testing and validation procedures to ensure the reliability, accuracy, and security of data processing workflows.
  • Stay informed of industry best practices and emerging technologies in both data engineering and data science to propose optimizations and innovative solutions.

Salary: Negotiable


Senior Software Engineer (Backend + Network)

Ho Chi Minh - Viet Nam


Product

  • Backend
  • Network

  • Continuously monitor and analyze new VPN providers, proxy services, and anonymization tools.
  • Conduct investigations on IP allocation patterns, hosting provider behaviors, and network infrastructure changes.
  • Develop and maintain comprehensive databases of known VPN/proxy IP ranges and behavioral signatures.
  • Research emerging threats such as residential proxies, mobile proxies, and distributed proxy networks.
  • Monitor darkweb marketplaces and security forums for emerging proxy/VPN trends.
  • Perform deep packet analysis and network traffic pattern recognition.
  • Develop and maintain the system using PHP, Python, and/or Go.
  • Optimize VPN/Proxy detection algorithms.
  • Design scalable infrastructure to handle millions of IP lookups per day.
  • Implement monitoring and alerting systems for detection accuracy and system performance.
  • Analyze production incidents related to false positives/negatives in threat detection.
  • Collaborate with the DevOps team on deployment of detection rule updates and model improvements.
  • Investigate customer-reported bypass attempts and develop rapid response solutions.
  • Provide technical expertise during customer security consultations.
  • Support the sales engineering team with technical demonstrations and proof-of-concepts.
  • Document threat analysis findings and detection methodologies for internal and customer use.
  • Maintain relationships with cybersecurity vendors, threat intelligence providers, and ISPs.
  • Monitor industry threat reports, security advisories, and academic research.
  • Participate in cybersecurity conferences and forums to stay current with the threat landscape.
  • Contribute to open-source security tools and research when appropriate.

Salary: Negotiable


Data Engineer - RefData

Ho Chi Minh, Ha Noi - Viet Nam


Product, Investment Management

  • Data Engineering

  • Developing an automated data processing system and overseeing its maintenance
  • Consolidating and integrating various data sources and databases into a unified system
  • Designing interfaces and microservices using Python
  • Enhancing the organization's data through NLP and AI models
  • Preparing and cleaning semi-structured or unstructured data
  • Creating effective algorithms for data processing
  • Testing and incorporating external APIs
  • Assisting the Business Analysts team

Salary: Negotiable
