
Lead Data Software Engineer

Remote in India
Data Software Engineering
& 6 others
Sorry, this position is no longer available

We are seeking a highly skilled Lead Data Software Engineer to join our remote team and provide critical support to our Data Science teams.

As a key figure within the team, you will be responsible for building datamarts, developing REST APIs, and providing on-call support for ad-hoc requests. You will play a pivotal role in driving the success of our projects, which focus on delivering industry-leading solutions for our clients in the healthcare sector.

Responsibilities
  • Lead and mentor the Data Software Engineering team, fostering a culture of growth and continuous learning within the group
  • Collaborate with cross-functional teams to deliver high-quality software solutions in line with project goals and timelines
  • Build and maintain datamarts, ensuring efficient and scalable data management
  • Develop REST APIs, enabling seamless integration with other applications and systems
  • Provide on-call support for ad-hoc requests, ensuring the efficient resolution of issues
  • Design and implement data pipelines using Apache Airflow and Apache Spark
  • Work closely with Data Science teams, providing technical support and guidance
  • Ensure the implementation and maintenance of CI/CD pipelines for efficient delivery of application releases
  • Continuously evaluate industry trends and best practices to refine and implement the most effective software engineering strategies
  • Collaborate with stakeholders, demonstrating excellent communication and leadership skills
Requirements
  • Minimum of 5 years of experience in Data Software Engineering, working with complex infrastructures and large-scale projects
  • At least 1 year of demonstrated leadership experience, managing and guiding a team of Data Software Engineers
  • Expertise in Amazon Web Services, designing and implementing scalable solutions in the cloud environment
  • Strong experience in Apache Airflow and Apache Spark for building and scheduling data pipelines
  • Proficiency in Python and SQL for ETL and data manipulation tasks
  • Experience with CI/CD pipelines for efficient delivery of application releases
  • Excellent analytical and problem-solving skills, enabling effective decision-making in complex environments
  • Strong English language proficiency at the Upper-Intermediate level, enabling clear communication and collaboration with the team and stakeholders
Nice to have
  • Experience with Redshift and Databricks for data processing and analysis
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
