Splunk Telemetry Engineer

ABOUT CLIENT

Our client is one of the world's largest providers of consulting, outsourcing, and technology services.

JOB DESCRIPTION

Compile and engineer data to support cyber security outcomes.
Maintain high engineering standards and thorough documentation.
Assist in determining the urgency of addressing issues within the team.
Possess a strong understanding of data processing, storage, alerting, and search technologies to guide and validate the team's work.
Adhere to strict processes.

JOB REQUIREMENT

Proficiency in programming languages such as Python and SQL
Familiarity with cloud computing platforms such as AWS and Azure, including messaging services like Amazon SNS/SQS, Azure Service Bus, Azure Event Hubs, and Apache Kafka
Prior experience in creating and managing ETL pipelines for data lakes or SIEM platforms
Knowledge of technologies including Windows Event Forwarding, Syslog, Cribl, Splunk (including Data Modelling), and data lake concepts (e.g. Databricks)
Experience in the cyber security domain is preferred
A tertiary qualification in a Technology discipline or related field would be advantageous

WHAT'S ON OFFER

Competitive compensation and comprehensive health insurance for employees and dependents.
Participation in international projects within a professional and dynamic work setting.
Valuable experience with diverse projects, new technologies, and talented colleagues.
Access to training opportunities, including technical seminars and soft skill courses.
Potential for promotion through a regular performance review system.

CONTACT

PEGASI – IT Recruitment Consultancy | Email: recruit@pegasi.com.vn | Tel: +84 28 3622 8666
We are PEGASI – IT Recruitment Consultancy in Vietnam. If you are looking for a new opportunity in your career path, kindly visit our website www.pegasi.com.vn for reference. Thank you!

Job Summary

Company Type:

Outsource

Technical Skills:

Data Engineering, Security

Location:

Ho Chi Minh, Ha Noi - Viet Nam

Working Policy:

Hybrid

Salary:

Negotiable

Job ID:

J01751

Status:

Closed

Related Job:

Backend Engineer (Python/Kotlin)

Ho Chi Minh, Ha Noi - Viet Nam


Outsource

  • Python
  • Kotlin

  • Design and develop financial products built on top of our core banking platform, Thought Machine Vault
  • Design and develop event-driven microservices to enhance the functionality of our core banking platform
  • Maintain and improve the reliability of our services using effective simulation, e2e, and performance tests
  • Improve SRE processes and provide production support for our services
  • Write high-quality, maintainable code using TDD
  • Use Kubernetes and Docker to schedule and run microservices
  • Our technology stack is predominantly Python and Kotlin/Java, but our architecture allows for using the most appropriate language to solve a given problem
  • PostgreSQL, Aurora, and S3 for persistence
  • Leverage our elastic AWS infrastructure
  • Practice continuous integration and delivery
  • You build it, you run it

Negotiable

View details

Senior Full-stack Java Software Engineer

Ho Chi Minh - Viet Nam


Outsource

  • Java
  • Angular
  • Cloud

  • Develop and maintain full-stack web applications using Java (main core) with Spring Boot on the backend and Angular (TypeScript) on the frontend
  • Design and implement RESTful APIs, ensuring scalability, security, and performance
  • Participate in system design, code review, and technical discussions
  • Work with AWS cloud services to deploy and operate applications in production
  • Write unit tests and integration tests to ensure high code quality
  • Collaborate in Agile/Scrum teams with Product Owners, QA, and DevOps
  • Support and mentor junior developers

Negotiable

View details

Senior Data Engineer

Ho Chi Minh - Viet Nam


Product

We're seeking a Staff Data Engineer to own and evolve our data infrastructure as we scale globally. You'll design and build the data systems that power our platform, from real-time pipelines and analytics infrastructure to the AI/ML foundations enabling intelligent insurance products.

Data Architecture & Engineering
  • Design and implement scalable, future-proof data architectures aligned with business objectives across multiple regions and regulatory environments
  • Build and maintain data pipelines for ingestion, transformation, and delivery using modern orchestration tools (Airflow, Spark, Kafka)
  • Architect data solutions spanning data warehousing, data lakes, and real-time analytics
  • Create and maintain data models (conceptual, logical, physical) using recognized modeling approaches
  • Develop and document the enterprise data landscape, mapping data stores and flows across our microservices architecture

AI/ML Infrastructure
  • Build and maintain data infrastructure supporting ML model training, deployment, and monitoring (MLOps)
  • Design and implement vector database solutions for AI-powered features (e.g. MongoDB Atlas Vector Search, Pinecone, Weaviate)
  • Develop data pipelines feeding recommendation engines, claims processing automation, fraud detection, and other AI-driven capabilities
  • Ensure AI infrastructure scales globally while meeting data residency and compliance requirements

Data Operations & Quality
  • Implement DataOps practices ensuring data quality, lineage, and governance across the platform
  • Define and enforce data strategy and architectural principles across engineering teams
  • Build monitoring and alerting for pipeline health, data quality, and SLA compliance
  • Optimize query performance and cost efficiency across data systems

Technical Leadership
  • Collaborate with product and engineering teams to translate business requirements into data solutions
  • Act as a change agent, driving adoption of modern data practices across the organization
  • Contribute to architectural reviews and technical decision-making
  • Own data problems through to resolution

Negotiable

View details