Senior Build Engineer (Data DevOps)
Data DevOps
Argentina and 6 other locations
We are seeking a highly skilled remote Senior Data DevOps Engineer to join our team on a project built with Azure Data Factory, Azure DevOps, and Databricks.
In this position, you will play a critical role in designing, building, and deploying data pipelines and in ensuring the reliability, scalability, and performance of our data systems. You will collaborate with a team of talented professionals on a project that directly impacts our business and the industry.
Responsibilities
- Design, build, and deploy data pipelines using Azure Data Factory, Databricks, and other related technologies
- Automate data operations using Python or Java, ensuring the reliability and scalability of data systems
- Collaborate with cross-functional teams to understand business requirements and design data solutions that meet those needs
- Implement DataOps and MLOps practices, ensuring the quality and accuracy of data systems
- Monitor and troubleshoot data pipelines, identifying and resolving issues promptly
- Develop and maintain documentation for data systems and processes
- Stay up-to-date with the latest trends and technologies in data engineering and DevOps
Requirements
- A minimum of 3 years of experience in Data DevOps, demonstrating your expertise in designing, building, and deploying data pipelines in Microsoft Azure
- In-depth knowledge of Azure Data Factory, Azure DevOps, Databricks, and other related technologies
- Experience with DataOps and MLOps
- Strong programming skills in Python or Java
- Good understanding of cloud infrastructure and networking concepts, including security, scalability, and resilience
- Excellent communication skills and the ability to work collaboratively with cross-functional teams
- Strong analytical and problem-solving skills, enabling you to identify and resolve complex issues
- Spoken and written English at an Upper-Intermediate level or higher (B2+)
Nice to have
- Experience with other cloud platforms such as AWS or Google Cloud
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka
- Experience with containerization technologies such as Docker and Kubernetes
- Certifications in Microsoft Azure or related technologies
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn