
Lead Data Software Engineer

Remote in Argentina & 6 others
Data Software Engineering & 6 others

We are looking for a skilled Lead Data Software Engineer to join our remote team and provide essential support to our Data Science units.

As a key member of the team, you will build datamarts, develop REST APIs, and provide on-call support for ad hoc requests. Your work will be instrumental to the success of our initiatives, which focus on delivering cutting-edge solutions for our clients in the healthcare domain.

Responsibilities
  • Lead and mentor the Data Software Engineering team, fostering a culture of continuous learning and growth
  • Coordinate with cross-functional teams to deliver high-quality software solutions that meet project objectives and timelines
  • Build and maintain datamarts, ensuring efficient and scalable data management
  • Develop REST APIs that enable seamless integration with other applications and systems
  • Provide on-call support for ad hoc requests, ensuring timely issue resolution
  • Design and implement data pipelines using Apache Airflow and Apache Spark
  • Collaborate closely with Data Science teams, providing technical support and guidance
  • Implement and maintain CI/CD pipelines to enable efficient deployment of application releases
  • Continuously evaluate industry trends and best practices, refining and adopting effective software engineering approaches
  • Engage with stakeholders, demonstrating strong communication and leadership skills
Requirements
  • A minimum of 5 years of experience in Data Software Engineering, including complex infrastructures and large-scale projects
  • At least 1 year of leadership experience managing and guiding a team of Data Software Engineers
  • Proficiency in Amazon Web Services, with experience building and deploying scalable solutions in the cloud
  • Strong familiarity with Apache Airflow and Apache Spark for designing and scheduling data pipelines
  • Strong knowledge of Python and SQL for ETL and data manipulation tasks
  • Experience with CI/CD pipelines for streamlined deployment of application releases
  • Excellent analytical and problem-solving skills, enabling sound decision-making in complex environments
  • Advanced English language proficiency, ensuring clear communication and collaboration with the team and stakeholders
Nice to have
  • Familiarity with Redshift and Databricks for data processing and analysis
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn