
Senior Big Data Engineer

Remote in Ukraine
Data Software Engineering
Sorry, this position is no longer available

We are seeking a Senior Big Data Engineer to join our remote team. Working closely with Product Management, you will identify solution requirements and develop data platform capabilities, contributing to data analytics development with the latest data technology stack. You will create and present data platform solution documents to support the development of data pipelines, and work as part of an agile team to implement those pipelines and support testing and DevOps activities through go-live.

Responsibilities
  • Design and develop scalable data pipelines using Azure Databricks and other standard tools
  • Collaborate with Product Management teams to identify solution requirements and develop data platform capabilities
  • Create and present data platform solution documents to support the development effort to build data pipelines
  • Work as a team member in an agile environment to implement data pipelines and support all the testing and DevOps activities through go-live
  • Contribute to the development of data analytics by utilizing the latest data technology stack based on Azure Databricks and Microsoft Azure
  • Ensure data quality and reliability by designing and implementing efficient data pipelines
  • Provide technical guidance and support to junior team members
Requirements
  • A minimum of 3 years of experience in Data Software Engineering, demonstrating your proficiency in designing, developing, and maintaining big data solutions
  • Expertise in Apache Spark and Scala, showcasing your ability to develop distributed data processing applications
  • Hands-on experience with Azure Databricks and Microsoft Azure, highlighting your familiarity with cloud-based data platforms
  • Strong understanding of data modeling, data warehousing, and data architecture principles, enabling you to design and implement efficient data pipelines
  • Experience in developing and deploying data pipelines at scale, utilizing tools like Azure Data Factory and Azure Event Hubs
  • Expertise in Agile methodologies, including Scrum and Kanban, demonstrating your ability to work in an agile environment
  • Excellent communication skills and strong critical thinking capabilities to effectively convey feedback and insights
  • Good organizational skills and a detail-oriented mindset, crucial for meticulous testing efforts
  • Fluent spoken and written English at an Upper-Intermediate level or higher, enabling effective communication
Nice to have
  • Experience in data visualization and reporting tools like Power BI or Tableau
  • Familiarity with other big data technologies like Hadoop, Hive, or Pig
  • Knowledge of DevOps methodologies and tools like Jenkins
  • Understanding of machine learning principles and algorithms
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
