Data Platform Engineer
Office: Singapore
Data DevOps
We are seeking a Data Platform Engineer to design, operate and enhance a global data platform supporting large-scale analytics and data science workloads. You will focus on platform reliability, automation, security and CI/CD for data pipelines across on-premises and hybrid environments. Here's your chance to collaborate closely with security, engineering and cloud operations teams while applying DevSecOps and Agile practices to deliver resilient, high-performing data solutions. If you are a senior data platform or DevOps engineer, this role is for you.
Responsibilities
- Operate and enhance global data platform components including VM servers, Kubernetes, Kafka and enterprise data applications
- Design and implement automation for infrastructure, security and CI/CD to support ELT/ETL pipelines
- Build resilient and observable data pipelines with health checks, monitoring, alerting and data quality controls
- Investigate incidents, perform root cause analysis and drive continuous improvement in platform performance and stability
- Apply DevSecOps and Agile methodologies to deliver secure, scalable solutions
- Collaborate with security, cloud operations and engineering teams on architecture and standards
- Stay current with industry trends to introduce new platform capabilities
Requirements
- Proven experience designing or operating large-scale, fault-tolerant distributed systems
- Strong experience with data platform technologies such as data lakes, streaming platforms and data pipelines
- Hands-on expertise with Kafka, Kubernetes, Spark, distributed storage (Parquet, S3-compatible object stores) and platform health management including monitoring and alerting
- Proven experience in DevOps or CI/CD automation using tools such as Jenkins or Octopus
- Strong programming skills in Python and at least one of Java, Scala or R, with experience building and supporting ELT/ETL pipelines and streaming or file-based ingestion
- Hands-on experience with containerization and orchestration using Docker, Kubernetes and CI/CD image pipelines
- Experience integrating data science platforms such as Dataiku
- No visa sponsorship available
Nice to have
- Experience with on-premises Big Data architectures and cloud migration initiatives
- Exposure to configuration management and release tools like Ansible, Chef, XL Deploy and XL Release
- Familiarity with enterprise analytics and BI tools and ML Ops workflows