
Senior Data DevOps Engineer

Remote in Argentina & 5 others
Data DevOps & 4 others

We are seeking an experienced remote Senior Data DevOps Engineer to join our team.

You will be working on a project to monitor and manage Hadoop and big data infrastructure for production and UAT environments, ensuring optimal utilization across all teams and use cases. If you are passionate about data and have a strong track record of managing complex infrastructure projects, we invite you to apply for this exciting opportunity.

Responsibilities
  • Monitor and manage Hadoop and big data infrastructure for production and UAT environments, ensuring optimal utilization across all teams and use cases
  • Set up Hadoop infrastructure across dev, pre-prod, and prod environments, kept separate from the Compliance environment
  • Separate compute clusters from storage clusters so that cloud options for storage and compute can be explored independently
  • Manage customer escalations and be the first point of contact for troubleshooting issues
  • Enhance troubleshooting tooling and enrich debugging and monitoring alarms to support engineers in incident analysis (a brief monitoring sketch follows this list)
  • Collaborate with cross-functional teams to ensure successful project delivery
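
For illustration only: a minimal sketch of the kind of capacity check that sits behind such monitoring alarms, assuming a Hadoop 3.x NameNode that exposes JMX over HTTP on its default web port (9870). The host name, metric bean, and alert threshold below are placeholders, not details from this posting.

```python
"""Minimal HDFS capacity check sketch (assumptions: Hadoop 3.x NameNode JMX
on port 9870; host name and threshold are placeholders)."""
import sys
import requests

NAMENODE_JMX = "http://namenode.example.internal:9870/jmx"  # placeholder host
ALERT_THRESHOLD = 0.80  # alert when more than 80% of HDFS capacity is used


def hdfs_capacity_used(jmx_url: str) -> float:
    """Return the fraction of HDFS capacity currently in use."""
    resp = requests.get(
        jmx_url,
        params={"qry": "Hadoop:service=NameNode,name=FSNamesystem"},
        timeout=10,
    )
    resp.raise_for_status()
    bean = resp.json()["beans"][0]
    return bean["CapacityUsed"] / bean["CapacityTotal"]


if __name__ == "__main__":
    used = hdfs_capacity_used(NAMENODE_JMX)
    print(f"HDFS capacity used: {used:.1%}")
    # A non-zero exit code lets a cron job or alerting agent raise an alarm.
    sys.exit(1 if used > ALERT_THRESHOLD else 0)
```

In practice a check like this would feed an existing alerting pipeline rather than print to stdout; the sketch only shows the shape of the work.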
Requirements
  • A minimum of 3 years of experience in DevOps, with a focus on data and infrastructure management
  • Expertise in Apache Hadoop and Big Data, with experience setting up and managing Hadoop infrastructure across multiple environments
  • Strong knowledge of Security Incident Management, with experience in managing customer escalations and troubleshooting issues
  • Experience separating compute clusters from storage clusters so that cloud options for storage and compute can be evaluated independently
  • Strong analytical and problem-solving skills, with a focus on enhancing troubleshooting tools and debugging and monitoring alarms
  • Excellent communication skills and the ability to work collaboratively with cross-functional teams
  • Spoken and written English at an Upper-Intermediate level or higher
Nice to have
  • Experience in cloud platforms, such as AWS or Azure
  • Knowledge of containerization technologies, such as Docker or Kubernetes
  • Experience with scripting languages, such as Python or Bash (see the sketch after this list)
  • Familiarity with configuration management tools, such as Ansible or Chef
  • Expertise in performance tuning and optimization for big data applications
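
As an illustration of the scripting-language point above, here is a hedged sketch of a DataNode health check driven from Python. It assumes the `hdfs` CLI is on PATH and that `hdfs dfsadmin -report` prints a "Dead datanodes (N):" summary line, which matches typical Hadoop output but can vary by distribution.

```python
"""DataNode health check sketch (assumptions: `hdfs` CLI available; report
output contains a "Dead datanodes (N):" summary line)."""
import re
import subprocess
import sys


def dead_datanode_count() -> int:
    """Run `hdfs dfsadmin -report` and parse the dead-node summary."""
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"Dead datanodes\s*\((\d+)\)", report)
    return int(match.group(1)) if match else 0


if __name__ == "__main__":
    dead = dead_datanode_count()
    print(f"Dead DataNodes: {dead}")
    sys.exit(1 if dead > 0 else 0)  # non-zero exit triggers downstream alerting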
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn