Senior Data Software Engineer

Data Software Engineering, Databricks, Microsoft Azure, PySpark, Python, SQL

Join our remote team as a Senior Data Software Engineer and contribute to a project centered on Databricks workflows, APIs, analytical development, and data engineering.

In this role, you will play a pivotal part in building and maintaining complex data pipelines and enabling smooth deployments to production. You will also craft end-to-end production solutions and collaborate with cross-functional teams to deliver high-quality results.

Responsibilities
  • Contribute to the design and development of novel features within the Agile development framework (Scrum)
  • Prioritize and uphold high-quality standards across all development stages
  • Ensure the reliability, availability, performance, and scalability of systems
  • Troubleshoot and maintain code within large, complex environments
  • Collaborate with Developers, Product and Program Management, and senior technical personnel to provide customer-centric solutions
  • Offer technical insights for new feature requirements, collaborating with business owners and architects
  • Stay abreast of industry trends and emerging technologies for continuous improvement
  • Implement solutions aligned with business objectives
  • Guide and mentor less experienced team members to foster skill enhancement and career growth
  • Participate in code reviews, upholding code quality and adherence to standards
  • Actively engage in architectural and technical discussions within cross-functional teams to achieve project goals
Requirements
  • Minimum of 3 years of hands-on experience in Data Software Engineering in a production setting
  • Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment to production
  • Familiarity with Azure DevOps, GitHub (or alternative platforms), and version control for effective project management
  • Capability to architect end-to-end production solutions
  • Strong experience with one or more cloud platforms such as Azure, GCP, or AWS
  • Proven track record in constructing resilient data pipelines
  • Ability to integrate disparate components into comprehensive, cross-system solutions
  • Exceptional communication skills in both spoken and written English, at an upper-intermediate level or higher
Nice to have
  • Exposure to REST APIs and Power BI would be advantageous
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn