Senior Data DevOps Engineer
Data DevOps
Colombia
We are seeking a highly skilled remote Senior Data DevOps Engineer to join our team, working on a cutting-edge project that leverages data to drive business insights and innovation.
In this position, you will work closely with cross-functional teams to design, build, and maintain data pipelines and ETL/ELT solutions, enabling the seamless integration of data across various systems. You will also play a key role in managing and optimizing our cloud infrastructure, utilizing Amazon Web Services and Databricks to ensure high levels of performance, security, and reliability.
Responsibilities
- Design, build, and maintain data pipelines and ETL/ELT solutions, ensuring the seamless integration of data across various systems
- Manage and optimize cloud infrastructure, utilizing AWS and Databricks to ensure high levels of performance, security, and reliability
- Collaborate with cross-functional teams to identify and resolve issues related to data pipelines and infrastructure, ensuring smooth operations
- Develop and maintain monitoring and alerting systems for data pipelines and infrastructure, ensuring timely identification and resolution of issues
- Automate deployment and testing processes, utilizing CI/CD pipelines and infrastructure as code (IaC) tools
- Contribute to the development of best practices and standards for data engineering and DevOps
- Provide guidance and mentorship to junior engineers on the team
Requirements
- A minimum of 3 years of experience in DevOps, with a focus on data engineering and data pipelines
- Expertise in designing and implementing ETL/ELT solutions, with practical experience in tools such as Apache NiFi, Apache Airflow, or similar
- Strong hands-on experience with Amazon Web Services (AWS), including EC2, S3, RDS, Lambda, and CloudFormation
- Proficiency in Databricks and Spark
- Experience with containerization and orchestration technologies such as Docker and Kubernetes
- Strong knowledge of CI/CD pipelines and tools such as Jenkins, GitLab CI, or CircleCI
- Experience in infrastructure as code (IaC) tools such as Terraform, CloudFormation, or Ansible
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams
- Fluent spoken and written English at an Upper-intermediate level or higher (B2+)
Nice to have
- Experience with database technologies such as MySQL, PostgreSQL, or Oracle
- Knowledge of Big Data technologies such as Hadoop, Hive, or Presto
- Experience with data visualization tools such as Tableau or Power BI
- Familiarity with machine learning and data science concepts and tools
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn