
Senior Data DevOps (Machine Learning)

Location: Remote in Colombia & 6 others
Category: Data DevOps & 7 others

We are seeking a highly skilled remote Senior Data DevOps Engineer to join our team, working on a cutting-edge project that involves the development and deployment of large-scale data processing pipelines.

In this position, you will play a critical role in designing, implementing, and maintaining the infrastructure that enables data processing, storage, and analysis. You will work with a team of experienced professionals, tackling complex challenges and driving innovation in the field of data engineering. If you are passionate about DevOps and have a solid understanding of data processing technologies, we invite you to apply for this exciting opportunity.

Responsibilities
  • Design, implement, and maintain large-scale data processing pipelines
  • Develop and maintain CI/CD pipelines for data processing applications, ensuring efficient and reliable deployment
  • Implement and manage containerization technologies to enable scalable and flexible infrastructure
  • Collaborate with data scientists and analysts to design and implement data storage and retrieval solutions
  • Ensure the security and availability of data processing infrastructure, implementing best practices for data protection and disaster recovery
  • Monitor and troubleshoot data processing pipelines and infrastructure, identifying and resolving issues promptly (a monitoring sketch follows this list)
  • Continuously improve data processing infrastructure, staying up-to-date with the latest technologies and industry trends
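
For illustration, the monitoring work above might look like the following minimal Python sketch, which publishes a pipeline backlog metric to Amazon CloudWatch using boto3. The bucket, prefix, namespace, and metric names are hypothetical, not part of the project.

  import boto3

  # Hypothetical example: report the number of objects still waiting in a
  # staging prefix as a custom CloudWatch metric, so an alarm can page the team.
  s3 = boto3.client("s3")
  cloudwatch = boto3.client("cloudwatch")

  def report_backlog(bucket: str) -> None:
      # Count unprocessed objects under the assumed staging/ layout.
      response = s3.list_objects_v2(Bucket=bucket, Prefix="staging/")
      backlog = response.get("KeyCount", 0)

      # Publish the backlog size as a custom metric for alerting.
      cloudwatch.put_metric_data(
          Namespace="DataPipeline",  # hypothetical namespace
          MetricData=[{
              "MetricName": "StagingBacklog",
              "Value": backlog,
              "Unit": "Count",
          }],
      )

  if __name__ == "__main__":
      report_backlog("example-data-bucket")  # hypothetical bucket name
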
Requirements
  • A minimum of 3 years of experience in DevOps, with a focus on data engineering and infrastructure management
  • Expertise in CI/CD processes and tools, including Git, Jenkins, and TeamCity
  • Hands-on experience with containerization technologies such as Docker, along with container orchestration tools like Kubernetes or Amazon ECS
  • In-depth knowledge of Amazon Web Services (AWS), including EC2, S3, and Lambda
  • Strong proficiency in Linux system administration and shell scripting
  • Experience with infrastructure as code tools such as Terraform, Ansible, or CloudFormation
  • Familiarity with the Elastic Stack (Elasticsearch, Logstash, and Kibana) for log management and analysis (a query sketch follows this list)
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment
  • Fluent spoken and written English at an Upper-Intermediate level or higher (B2+)
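
To make the Elastic Stack requirement concrete, here is a minimal sketch using the official elasticsearch Python client (8.x) to pull recent error logs; the endpoint, index pattern, and field names are assumptions for illustration only.

  from elasticsearch import Elasticsearch

  # Hypothetical example: fetch the ten most recent ERROR entries.
  es = Elasticsearch("http://localhost:9200")  # assumed endpoint

  results = es.search(
      index="logs-*",                           # assumed index pattern
      query={"match": {"level": "ERROR"}},      # assumed log field
      sort=[{"@timestamp": {"order": "desc"}}],
      size=10,
  )

  for hit in results["hits"]["hits"]:
      print(hit["_source"].get("message", ""))
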
Nice to have
  • Experience with other cloud providers such as Google Cloud Platform or Microsoft Azure
  • Experience with Big Data technologies such as Hadoop, Hive, and Pig (a Hive sketch follows this list)
  • Familiarity with configuration management tools such as Chef or Puppet
  • Knowledge of scripting languages such as Python or Ruby
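
As a sketch of the Big Data side, querying a Hive table from Python with PyHive could look like the snippet below; the host, port, and table name are assumptions.

  from pyhive import hive

  # Hypothetical example: run a simple aggregation against a Hive table.
  conn = hive.connect(host="localhost", port=10000)  # assumed HiveServer2 endpoint
  cursor = conn.cursor()

  cursor.execute("SELECT event_date, COUNT(*) FROM events GROUP BY event_date")
  for event_date, count in cursor.fetchall():
      print(event_date, count)

  cursor.close()
  conn.close()
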
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn