Senior Data Software Engineer
Remote in India
Data Software Engineering

We are looking for a highly skilled Senior Data Software Engineer to join our remote team and provide critical support to our Data Science teams.
In this role, you will build datamarts and provide ad-hoc support in a fast-paced, changing environment. You will work closely with Data Scientists and other stakeholders to understand their requirements and design solutions that meet their needs.
Responsibilities
- Build datamarts and data pipelines that serve the needs of the Data Science teams (see the illustrative sketch after this list)
- Provide ad-hoc support to Data Scientists and stakeholders to keep data pipelines and related processes running smoothly
- Work with cross-functional teams to understand their needs and design solutions that meet their specifications
- Design, develop, and maintain efficient, scalable ETL processes
- Optimize complex SQL queries and database operations
- Implement CI/CD pipelines for data engineering tasks
- Build and maintain REST APIs for data processing and consumption
- Collaborate with stakeholders to define project requirements and timelines
- Provide technical guidance and mentorship to junior team members
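To give a flavor of the pipeline work described above, here is a minimal, illustrative sketch of a daily Apache Airflow DAG that triggers a PySpark job to build a datamart. The DAG name, script path, and S3 locations are hypothetical placeholders, not part of this posting.

```python
# Illustrative sketch only: a daily Airflow DAG that runs a PySpark job
# building a hypothetical "sales" datamart. Names and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="sales_datamart_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job that rebuilds the datamart for the execution date.
    build_datamart = BashOperator(
        task_id="build_sales_datamart",
        bash_command=(
            "spark-submit jobs/build_sales_datamart.py "
            "--date {{ ds }} --output s3://example-bucket/datamarts/sales/"
        ),
    )
```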
Requirements
- At least 4 years of experience as a Data Software Engineer working with large datasets and complex data pipelines
- Proficiency with Amazon Web Services, particularly services such as S3 and EC2
- Advanced knowledge of Apache Airflow and Apache Spark for data processing and workflow orchestration
- Strong Python programming skills, with the ability to write efficient, scalable code
- Solid understanding of SQL and relational databases, with experience writing and optimizing complex queries
- Expertise in CI/CD pipelines, with exposure to tools such as Jenkins or GitLab
- Proficiency in PySpark and in using REST APIs for data engineering tasks (see the PySpark sketch after this list)
- Thorough understanding of ETL processes and data modeling
- Excellent communication skills, enabling effective collaboration with technical and non-technical stakeholders
- Upper-intermediate English, supporting clear communication and collaboration with the team and stakeholders
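As an illustration of the PySpark and S3 skills listed above, the sketch below reads raw events from S3, aggregates them, and writes a partitioned datamart table. All bucket paths and column names are hypothetical placeholders chosen for the example.

```python
# Illustrative sketch only: read raw order events from S3, aggregate them,
# and write a partitioned datamart table. Paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("build_sales_datamart").getOrCreate()

# Read raw order events (Parquet assumed; adjust to the real source format).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Aggregate to one row per customer per day -- the shape downstream queries need.
daily_sales = (
    orders
    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Partition by date so downstream SQL scans stay cheap.
(
    daily_sales.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/datamarts/sales_daily/")
)
```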
Nice to have
- Familiarity with Redshift and Databricks for data processing and analysis
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn