Senior Python Data Engineer with AI

Hybrid in Ukraine

We are seeking a highly skilled Senior Python Data Engineer to join our dynamic team and play a pivotal role in building scalable, robust APIs and innovative backend solutions for generative AI applications.

This position will involve close collaboration with cross-functional teams to deliver high-performance, well-architected solutions aimed at solving complex problems.

Responsibilities
  • Develop and deploy scalable RESTful APIs to serve web apps and integrations with other software (see the API sketch after this list)
  • Ensure API performance, scalability, and maintainability with focus on clear, concise code and robust logging/monitoring
  • Support and optimize database integrations, ensuring efficient data modeling and query performance
  • Collaborate with data engineers on pipelines and orchestration frameworks to deliver clean, organized data to APIs
  • Conduct thorough unit and integration testing in collaboration with testing teams
  • Review code from peers to ensure high-quality solutions and provide constructive feedback
  • Create and maintain technical documentation for solutions and Software Development Lifecycle (SDLC) processes
  • Apply modern software design best practices including version control, containerization, and scalable service design
  • Work collaboratively with product teams throughout all phases of the development lifecycle
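
To give a flavor of this work, below is a minimal sketch of an async FastAPI endpoint with request validation and basic logging, assuming FastAPI and Pydantic. The /items routes, Item model, and in-memory store are hypothetical stand-ins for a real service backed by a database.

    # Minimal sketch of an async RESTful endpoint with logging.
    # The /items routes and Item model are hypothetical illustrations.
    import logging

    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger(__name__)

    app = FastAPI()

    class Item(BaseModel):
        name: str
        price: float

    # In-memory store standing in for a real database integration.
    ITEMS: dict[int, Item] = {}

    @app.post("/items/{item_id}", status_code=201)
    async def create_item(item_id: int, item: Item) -> Item:
        if item_id in ITEMS:
            logger.warning("Duplicate item id %d rejected", item_id)
            raise HTTPException(status_code=409, detail="Item already exists")
        ITEMS[item_id] = item
        logger.info("Created item %d: %s", item_id, item.name)
        return item

    @app.get("/items/{item_id}")
    async def read_item(item_id: int) -> Item:
        if item_id not in ITEMS:
            raise HTTPException(status_code=404, detail="Item not found")
        return ITEMS[item_id]

Such an app would be served by an ASGI server such as Uvicorn (e.g. uvicorn main:app).
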
Requirements
  • 5+ years of professional experience in Python-focused backend development
  • Extensive experience with at least one Python backend framework, such as FastAPI
  • Proficiency in asynchronous programming and advanced Python development (see the database sketch after this list)
  • Expertise in relational database systems, including PostgreSQL and data modeling
  • Knowledge of RESTful API design principles and modern software development practices
  • Familiarity with scalable architecture patterns, containerization tools, and cloud-based operations
  • Extensive background in version control systems and CI/CD pipelines
  • Experience with data engineering frameworks such as Spark or PySpark, and with orchestration tools like Airflow, is a plus
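
As a rough illustration of the asynchronous-programming and PostgreSQL points above, here is a sketch of an async query using asyncpg; the connection string and the orders table are assumptions for illustration only.

    # Sketch of asynchronous PostgreSQL access with asyncpg.
    # The DSN and the orders table are assumed for illustration.
    import asyncio

    import asyncpg

    async def top_customers(limit: int = 5):
        # A real service would use a connection pool; a single
        # connection keeps the sketch short.
        conn = await asyncpg.connect("postgresql://user:pass@localhost/appdb")
        try:
            # Parameterized query: asyncpg uses $1-style placeholders.
            return await conn.fetch(
                """
                SELECT customer_id, SUM(total) AS spend
                FROM orders
                GROUP BY customer_id
                ORDER BY spend DESC
                LIMIT $1
                """,
                limit,
            )
        finally:
            await conn.close()

    if __name__ == "__main__":
        for row in asyncio.run(top_customers()):
            print(row["customer_id"], row["spend"])
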
Nice to have
  • Knowledge of Amazon Redshift and Databricks for data engineering workflows
  • Background in ETL/ELT solutions for data transformation and loading
  • Experience using Apache Airflow to orchestrate data workflows (see the DAG sketch after this list)
  • Familiarity with PySpark and GitHub Actions for workflow automation and development pipelines
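
For the Airflow point above, here is a minimal DAG sketch assuming Airflow 2.x; the dag_id, schedule, and placeholder task are illustrative, and a production task might submit a PySpark job instead of printing.

    # Sketch of a daily Airflow DAG with a single Python task.
    # Assumes Airflow 2.x; names and schedule are illustrative.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_transform(**context):
        # Placeholder transform; "ds" is the run's logical date.
        print("Transforming data for", context["ds"])

    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule_interval" on Airflow < 2.4
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="transform",
            python_callable=run_transform,
        )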