
Senior Data Software Engineer

Data Software Engineering, Python, Amazon Web Services, Databricks, Apache Airflow, Apache Spark, CI/CD, SQL, REST API
Sorry, this position is no longer available

Are you an experienced Data Software Engineer looking for a new challenge? Join our team as a Senior Data Software Engineer and play a pivotal role in supporting Data Science teams, building efficient data solutions, and contributing to cutting-edge analytics projects. Proficiency in Python and AWS is key to success in this role.

Responsibilities
  • Collaborate closely with Data Science teams, providing essential support and mentorship as they work on analytical projects
  • Take charge of building and maintaining tables, responding to incoming support tickets, and resolving them with minimal back-and-forth
  • Utilize your expertise to construct data marts and dedicated data models that cater to specific analytics needs
  • Engage in sprint-based development, ensuring timely delivery of data solutions aligned with Data Scientists' requirements
  • Handle ad-hoc data requests and provide on-call support as necessary
  • Contribute to the development of data ingestion pipelines, enhancing the efficiency of data processing workflows
  • Publish comprehensive documentation to aid users in effectively utilizing the data solutions you provide
  • Collaborate effectively within the team and demonstrate exceptional communication and documentation skills
  • Embrace constructive feedback and exhibit a proactive attitude towards continuous learning and adapting to new technologies
Requirements
  • A minimum of 3 years of relevant experience in Data Software Engineering, showcasing a strong track record of successful project delivery
  • Proficiency in Python, enabling you to develop robust and efficient data solutions
  • In-depth knowledge of Amazon Web Services, utilizing cloud resources to optimize data processing and storage
  • Experience with Apache Airflow, contributing to streamlined workflow orchestration
  • Familiarity with Apache Spark, enhancing your ability to work with large-scale data processing
  • Solid understanding of ETL processes, ensuring efficient data movement and transformation
  • Proficiency in CI/CD practices, facilitating smooth and automated software delivery
  • Strong command of SQL, enabling you to manipulate and query complex datasets
  • Experience working with REST APIs, facilitating seamless data integration with other systems
  • Fluent English communication skills at a B2+ level, ensuring effective collaboration within an international team
Nice to have
  • Familiarity with Databricks and Spark, further enhancing your ability to work with big data
  • Experience with Redshift, contributing to optimized data warehousing solutions
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
