
Senior Data Engineer

Data Software Engineering, Python, Amazon Web Services, Databricks, Apache Spark, CI/CD, SQL, Terraform
Sorry, this position is no longer available

Join our Corporate Data Engineering team as a Senior Data Engineer and help build cutting-edge data solutions and applications that drive essential business decisions across the organization. We are seeking a forward-thinking, detail-oriented engineer who thrives on building scalable systems.

Responsibilities
  • Collaborate with cross-functional teams to design, develop, and implement data solutions and applications that align with business needs
  • Build and maintain scalable data pipelines for efficient data collection, processing, and storage
  • Perform data modeling and transformation to facilitate data analysis and reporting
  • Implement best practices for data quality, security, and compliance
  • Contribute to the enhancement of data engineering processes through continuous improvement and adoption of CI/CD practices
  • Collaborate within an Agile environment, participating in sprint planning, daily stand-ups, and other Agile ceremonies
Requirements
  • A minimum of 3 years of relevant experience in Data Engineering or a similar role, demonstrating your proficiency in developing and optimizing data solutions
  • Proficiency in Python, leveraging its capabilities to construct robust and scalable data pipelines
  • Strong familiarity with AWS, using its resources for efficient data processing and storage
  • Expertise in Databricks, contributing to efficient data processing and analysis
  • Proficiency in either Apache Spark or Hive, enabling you to work with large-scale data processing and analysis
  • Knowledge of CI/CD practices, promoting efficient and reliable software development processes
  • Proficiency in SQL, enabling you to query and manipulate data effectively
  • Experience with Terraform, enhancing your ability to manage and automate infrastructure
  • Strong understanding of Agile methodologies, enabling you to work collaboratively within dynamic teams
  • English communication skills at a B2+ level, enabling effective collaboration across teams
Nice to have
  • Familiarity with containerization and orchestration tools such as Docker and Kubernetes
  • Knowledge of streaming data processing technologies like Apache Kafka or Apache Flink
  • Experience with data warehousing solutions like Amazon Redshift or Snowflake
  • Understanding of machine learning concepts and their integration into data engineering pipelines


Benefits | Community | Professional Development

For you
  • Discounts on health insurance, sport clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-working spaces
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to the LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service