
Middle Data Software Engineer

Remote in Colombia, Mexico
Data Software Engineering

We are searching for a skilled Middle Data Software Engineer with expertise in data engineering and development using a modern cloud data warehouse stack (BigQuery/Databricks).

This position requires crafting a reliable and efficient data infrastructure while collaborating with Engineers, Data Analysts, and Data Scientists to deliver meaningful insights that inform decision-making. The ideal candidate is a proactive team player who prioritizes code quality, performance, and automation.

Responsibilities
  • Build reliable and scalable data pipelines using Python and SQL
  • Participate in the design and development of data solutions, integrating data ingestion and transformation workflows
  • Support both real-time and batch data workflows with tools including BigQuery/Databricks, Apache Airflow, and dbt (see the pipeline sketch after this list)
  • Write clean, maintainable code, automating routine tasks wherever feasible
  • Collaborate with cross-functional teams to gather and implement business requirements
  • Implement monitoring and alerting tools to ensure pipeline reliability
  • Participate in discussions about architecture and cloud infrastructure for data engineering initiatives
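
For illustration only, a minimal sketch of the kind of batch workflow described above, assuming Apache Airflow 2.x orchestrating a Python extract step followed by dbt transformations in the warehouse; the DAG, task, and model names are hypothetical and not part of any actual project codebase.

# A minimal sketch, not an actual EPAM pipeline: a daily batch workflow in Apache Airflow 2.x
# that extracts data with Python and then runs dbt transformations in the warehouse.
# The names orders_daily, extract_orders, and the "orders" dbt selector are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders(ds: str, **_) -> None:
    # Placeholder extract step; a real task would pull from a source system
    # and land the data in BigQuery/Databricks staging tables.
    print(f"Extracting orders for {ds}")


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # "schedule" on Airflow >= 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)

    # dbt handles the SQL transformations; monitoring and alerting would hook into task callbacks.
    transform = BashOperator(task_id="run_dbt_models", bash_command="dbt run --select orders")

    extract >> transform

A real-time counterpart would replace the batch extract with a streaming source such as Apache Kafka or Spark Streaming, as noted in the nice-to-have list below.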
Requirements
  • Bachelor's degree in Computer Science, Software Engineering, or a related field
  • 2+ years of experience in data engineering
  • Proficiency in Python and SQL, along with skills in data frameworks such as Spark for processing
  • Knowledge of modern cloud data warehousing tools such as BigQuery or Databricks
  • Competency in version control systems like Git and CI/CD pipelines
  • Familiarity with workflow orchestration and transformation tools such as Apache Airflow or dbt
  • Understanding of cloud platforms such as AWS, GCP, or Azure
  • Basic knowledge of Agile or DevOps methodologies
  • English proficiency at a B1+ level or higher
Nice to have
  • Familiarity with MySQL and visualization tools such as Looker/Tableau, as well as analytics platforms like Amplitude or Segment
  • Basic skills in Linux/Unix system administration and shell scripting
  • Understanding of machine learning pipelines and MLOps practices
  • Knowledge of real-time analytics and streaming technologies, including Apache Kafka or Spark Streaming
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn