Senior Data Software Engineer
Remote in India: Hyderabad
Data Software Engineering

We are looking for a remote Senior Data Software Engineer to build an Enterprise Data Platform and the Data Products that run on top of it. As a senior member of the team, you will be responsible for building and maintaining high-performance Big Data solutions. The role offers the opportunity to work with cutting-edge technologies and collaborate with a talented team of developers.
Responsibilities
- Design, develop, and implement high-performance Big Data solutions using Python and Databricks
- Collaborate with cross-functional teams to define project goals and requirements
- Create and maintain data pipelines and ETL processes to support data-driven applications
- Develop and maintain RESTful APIs and microservices for data-driven applications
- Implement and maintain an Event-driven architecture for scalable and reliable systems
- Deploy and manage applications using containerization technologies like Docker and Kubernetes
- Participate in code reviews, ensuring code quality and adherence to standards
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
Requirements
- At least 3 years of experience in building Data Platforms
- Proven track record of successfully implementing high-performance Big Data solutions
- Strong expertise in Python and Databricks for building Big Data solutions
- In-depth knowledge of Event-driven architecture for building scalable and reliable systems
- Experience with Microsoft Azure Big Data Services for developing and deploying cloud-based applications
- Strong understanding of Big Data technologies and data modeling concepts
- Experience with large-scale data processing technologies such as Hadoop and Spark
- Expertise in developing RESTful APIs and microservices for data-driven applications
- Knowledge of containerization technologies like Docker and Kubernetes for deploying and managing applications
- Experience with Agile development methodologies and tools like Jira and Git
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
Nice to have
- Experience with Machine Learning and AI technologies
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn