Senior Java Data Pipeline Engineer

Java, Google Cloud Dataflow, Spring, Google Cloud Platform

We are currently seeking an experienced Senior Java Data Pipeline Engineer to join our team remotely. In this role, you will design and implement robust data pipelines in Google Cloud Dataflow using Java and Apache Beam to process large volumes of data efficiently. Your work will incorporate the Spring Framework, which underpins the microservices that support our pipelines. You will also use Google Cloud Platform services to deploy and manage data pipelines, ensuring high availability and reliability.
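For illustration only, below is a minimal sketch of the kind of Beam pipeline this role centers on, closely following the standard Apache Beam word-count pattern; the gs://example-bucket paths are placeholders, and the Dataflow runner, project, and region would be supplied as launch flags rather than hard-coded.

import java.util.Arrays;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.Filter;
import org.apache.beam.sdk.transforms.FlatMapElements;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

public class MinimalWordCount {
  public static void main(String[] args) {
    // The runner is chosen at launch time, e.g. --runner=DataflowRunner plus project/region flags.
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline pipeline = Pipeline.create(options);

    pipeline
        // Placeholder GCS paths; real sources and sinks depend on the project.
        .apply("ReadLines", TextIO.read().from("gs://example-bucket/input/*.txt"))
        .apply("SplitWords", FlatMapElements.into(TypeDescriptors.strings())
            .via((String line) -> Arrays.asList(line.split("\\W+"))))
        .apply("DropEmpty", Filter.by((String word) -> !word.isEmpty()))
        .apply("CountWords", Count.perElement())
        .apply("Format", MapElements.into(TypeDescriptors.strings())
            .via((KV<String, Long> wordCount) -> wordCount.getKey() + ": " + wordCount.getValue()))
        .apply("WriteCounts", TextIO.write().to("gs://example-bucket/output/counts"));

    pipeline.run().waitUntilFinish();
  }
}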

Responsibilities
  • Design and build robust data pipelines using Java and Apache Beam in Google Cloud Dataflow
  • Use the Spring Framework to build the microservices that support our pipelines
  • Use Google Cloud Platform services to deploy and manage data pipelines
  • Collaborate with data scientists and analysts to implement solutions for data modeling, mining, and extraction processes
  • Optimize data pipelines for performance and scalability, identifying bottlenecks and implementing improvements
  • Develop and maintain documentation for data pipeline architectures and operational procedures
  • Ensure data integrity and compliance with all governance and security policies throughout the data processing lifecycle
  • Monitor pipeline performance and implement logging and alerting mechanisms to detect and address issues proactively
Requirements
  • A Bachelor's or Master's degree in Computer Science, Engineering, Information Technology, or a related field
  • At least 3 years of experience in data pipeline development, backed by a strong background in Java and cloud-based data processing technologies
  • A proven track record of designing, implementing, and optimizing data pipelines capable of processing large data sets in a cloud environment
  • A thorough understanding of the Spring Framework, including Spring Boot, for building high-performance applications
  • Hands-on experience with Google Cloud Dataflow and Apache Beam for building and managing data pipelines
  • Experience with GCP services, particularly those related to data storage, processing, and analytics
  • Proficiency with version control systems such as Git for code management and collaboration
  • B2+ English level proficiency
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn