Data DevOps Engineer with Azure, Databricks [Senior/Lead]
Hybrid in Ukraine: Lviv
Data DevOps
We are looking for a Senior/Lead Data DevOps Engineer to join EPAM and contribute to a project for a large customer.
As a Senior/Lead Data DevOps Engineer on the Data Platform team, you will focus on maintaining the data transformation architecture that forms the backbone of the customer's analytical data platform and on implementing new features for it. As a key member of the team, you'll deliver high-performance data processing solutions that are efficient and reliable at scale.
Responsibilities
- Design, build, and maintain highly available production systems using Azure data services, including Data Lake Storage, Databricks, Azure Data Factory (ADF), and Synapse Analytics
- Design and implement build, deployment, and configuration management systems, and improve CI/CD workflows, using Terraform and Azure DevOps pipelines across multiple subscriptions and environments
- Improve the user experience of the Databricks platform by applying best practices for cluster management, cost-effective setups, data security models, and more (see the sketch after this list)
- Design, implement, and improve the monitoring and alerting system
- Collaborate with Architecture teams to ensure platform architecture and design standards align with support model requirements
- Identify opportunities to optimize platform activities and processes, and implement automation to streamline operations
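For the Databricks item above, cluster management and cost control usually come down to codifying sensible cluster defaults such as autoscaling limits and idle auto-termination. The snippet below is a minimal sketch, assuming the databricks-sdk Python package and workspace credentials exposed through environment variables; the cluster name, node selection, autoscaling limits, and idle timeout are illustrative choices, not this project's actual configuration.

```python
# Minimal sketch (not project code): create a cost-conscious Databricks cluster
# with autoscaling and auto-termination using the Databricks SDK for Python.
# Assumes `pip install databricks-sdk` and DATABRICKS_HOST / DATABRICKS_TOKEN
# set in the environment.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()  # picks up host and token from the environment

cluster = w.clusters.create(
    cluster_name="analytics-shared",  # illustrative name
    spark_version=w.clusters.select_spark_version(long_term_support=True),
    node_type_id=w.clusters.select_node_type(local_disk=True),
    autoscale=compute.AutoScale(min_workers=1, max_workers=4),
    autotermination_minutes=30,  # shut down idle clusters to control cost
).result()  # wait until the cluster is running

print(f"Cluster {cluster.cluster_id} is {cluster.state}")
```

In practice, defaults like these would typically be enforced through cluster policies and provisioned with Terraform and CI/CD pipelines rather than ad-hoc scripts, in line with the responsibilities listed above.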
Requirements
- 4+ years of professional experience
- 2+ years of hands-on experience with a variety of Azure services
- Proficiency in Azure data solutions including Data Lake Storage, Databricks, ADF, and Synapse Analytics
- Solid Linux/Unix systems administration background
- Advanced skills in configuring, managing, and maintaining networking in the Azure cloud
- Solid experience in managing production infrastructure with Terraform
- Hands-on experience with at least one of Azure DevOps, GitLab CI, or GitHub Actions for infrastructure management and automation
- Hands-on experience with the Databricks platform
- Practical knowledge of Python combined with solid SQL skills
- Hands-on experience with at least one scripting language: Bash, Perl, or Groovy
- Advanced skills in Kubernetes/Docker
- Good knowledge of security best practices
- Good knowledge of monitoring best practices
- Good organizational, analytical, and problem-solving skills
- Ability to present and communicate the architecture in a visual form
- English language proficiency sufficient to communicate directly with the customer; B2 level is required