Senior Data Software Engineer
Join our remote team as a Senior Data Software Engineer on a project centered on Databricks workflows, APIs, analytical development, and data engineering.
In this role, you will build and maintain complex data pipelines and deploy them to production. You will design end-to-end production solutions and work with cross-functional teams to deliver high-quality results.
Responsibilities
- Participate in the Agile (Scrum) development process to design and implement new features
- Maintain high quality standards throughout every phase of development
- Ensure system reliability, availability, performance, and scalability
- Troubleshoot and maintain code in large, complex environments
- Work closely with developers, product and program management, and senior technical staff to deliver customer-focused solutions
- Provide technical insights for new feature requirements in collaboration with business owners and architects
- Keep up with industry trends and emerging technologies to drive continuous improvement
- Drive the delivery of solutions aligned with business objectives
- Mentor junior team members, supporting their skill development and career growth
- Participate in code reviews, ensuring adherence to standards and code quality
- Collaborate with cross-functional teams to achieve project goals
- Contribute actively to architectural and technical discussions
Requirements
- A minimum of 3 years of hands-on experience in Data Software Engineering
- Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment in production
- Familiarity with Azure DevOps, GitHub (or similar platforms), and version control workflows
- Ability to develop end-to-end production solutions
- Solid experience with one or more cloud platforms such as Azure, GCP, or AWS
- Proven track record in constructing resilient data pipelines
- Ability to integrate disparate components into solutions spanning multiple systems
- Strong spoken and written English communication skills (upper-intermediate level or higher)
Nice to have
- Experience with REST APIs and Power BI