We are seeking a dynamic and experienced Lead Data Engineer to architect, develop, and maintain cutting-edge data solutions within a fast-paced and collaborative environment. This role requires expertise in modern data platforms, business intelligence tools, process optimization, and significant experience in agile methodologies, ensuring scalable and efficient data management for actionable business insights.
responsibilities
Lead the design and production deployment of ETL/ELT processes to ensure efficient data ingestion and transformation
Architect and implement Data Platforms/Data Warehousing solutions on technologies such as Microsoft SQL Server, Azure Data Warehouse, or Azure Synapse
Manage and optimize traditional relational Data Warehouse platforms (e.g., SQL Server, Oracle, Teradata) with expertise in structured data management
Craft reporting data models using multidimensional/Kimball design principles for high-performance analytics
Develop ML algorithms or frameworks with Python to support advanced analytics initiatives
Oversee fine-tuning of T-SQL queries and performance optimization for existing solutions
Define and enhance workflows leveraging tools like Power BI, Tableau, and QlikView for business intelligence needs
Create conceptual and logical solution designs through ERD diagramming and Project Start Architecture documentation
Apply ITIL practices to manage change, incident, problem resolution, release, and service request processes
Collaborate in agile settings to translate complex challenges into actionable business solutions
Implement Master Data Management strategies (or Reference Data Management) and design interfaces such as APIs for seamless data exchange
Ensure scalable and efficient pipelines using platforms like Azure Data Factory, Data Lake, and Databricks
requirements
Minimum 6 years of experience designing and developing data ingestion processes within Data Warehouse and Data Lake environments
Proficiency in Microsoft SQL Server, Azure Data Warehouse, or Azure Synapse; expertise implementing relational Data Warehouse platforms (SQL Server, Oracle, Teradata)
Advanced competency in multidimensional/Kimball modeling and structured data management
Strong skills in T-SQL, including advanced query optimization techniques and execution plans
Proficiency in BI tools such as Power BI, Tableau, or QlikView
Knowledge of Python for ML algorithm development; flexibility to design or work with ML frameworks
Capability to define architectural standards and create Project Start Architecture documentation
Familiarity with ITIL processes related to change, incident, and problem management
Experience in Azure Data Lake, Data Factory, and Azure DevOps or equivalent AWS/GCP tools
Background in agile environments with a focus on collaborative, cross-functional problem-solving
nice to have
Agile/SAFe certifications
Azure Data Engineer or equivalent AWS/GCP certifications
Azure Administrator or Solutions Architect certifications or their AWS/GCP equivalents
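As a small illustration of the multidimensional/Kimball modeling mentioned in this role, the sketch below assigns surrogate keys to a dimension the way a Kimball-style warehouse load would. All table and column names are hypothetical; this is a minimal sketch, not a prescribed design.

```python
# Minimal sketch of Kimball-style surrogate-key assignment for a dimension
# table. Column names ("customer_code") are illustrative assumptions.

def assign_surrogate_keys(dim_rows, natural_key):
    """Map each natural key to a stable integer surrogate key."""
    key_map = {}
    next_key = 1
    enriched = []
    for row in dim_rows:
        nk = row[natural_key]
        if nk not in key_map:
            key_map[nk] = next_key
            next_key += 1
        enriched.append({**row, "surrogate_key": key_map[nk]})
    return enriched, key_map

customers = [
    {"customer_code": "C001", "name": "Acme"},
    {"customer_code": "C002", "name": "Globex"},
    {"customer_code": "C001", "name": "Acme"},  # repeat reuses the same key
]
rows, mapping = assign_surrogate_keys(customers, "customer_code")
```

In a production warehouse the same logic would live in the ETL layer (e.g., a T-SQL MERGE or a Data Factory mapping flow) rather than application code.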
We are seeking a Senior Data Engineer to join a dynamic team dedicated to building a secure, scalable, and highly available cloud-based data platform. This role focuses on delivering a unified data and reporting solution by consolidating multiple data sources, enabling data-driven decision-making and empowering self-service analytics.
responsibilities
Monitor Snowflake health and performance
Debug and resolve failures in data ingestion or transformation
Support platform users in Snowflake and Apache Airflow-related queries
Build and maintain end-to-end testing frameworks to ensure platform changes don’t disrupt data flows
Tune queries and optimize data storage for speed and cost efficiency
Automate manual processes using GitLab CI/CD pipelines
Support and enhance the release process
Document data architecture, pipeline processes, and best practices
Collaborate on researching and adopting new tools and frameworks
Deliver reports and insights to unlock business value and enhance analytics capabilities
requirements
3+ years of experience in Data Software Engineering or similar roles
Proficiency in SQL with advanced query writing and optimization skills
Background in Python for data processing and automation
Expertise in Snowflake (preferred), Databricks, or Oracle
Familiarity with AWS (preferred) or Azure, including cloud services for data engineering
Basic knowledge of CI/CD pipelines using GitLab or similar tools
English language proficiency at an Upper-Intermediate level (B2) or higher
nice to have
Understanding of Terraform for infrastructure as code
Familiarity with Apache Airflow for workflow orchestration
Background in dbt for data transformation and modeling
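The "debug and resolve failures in data ingestion" duty above often comes down to making tasks resilient to transient errors. The hedged sketch below shows one common pattern, a bounded retry wrapper; the task function and retry limits are illustrative assumptions, not part of the role.

```python
# Minimal sketch of bounded-retry logic for a flaky ingestion task.
# In practice an orchestrator such as Apache Airflow provides retries
# natively; this stdlib-only version only illustrates the idea.
import time

def run_with_retries(task, max_attempts=3, delay_seconds=0):
    """Run `task`, retrying on exception up to `max_attempts` times."""
    attempts = 0
    while True:
        attempts += 1
        try:
            return task(), attempts
        except Exception:
            if attempts >= max_attempts:
                raise
            time.sleep(delay_seconds)

calls = {"n": 0}

def flaky_ingest():
    # Hypothetical source that fails twice, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient source error")
    return "loaded"

result, attempts = run_with_retries(flaky_ingest)
```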
We are seeking a highly skilled Senior Data Test Engineer to design and oversee robust automated testing frameworks, ensuring data integrity, reliability, and performance within complex enterprise data pipelines. You will lead quality engineering initiatives and collaborate with cross-functional teams to deliver high-performance solutions. Join us and help drive exceptional data quality across high-scale systems.
responsibilities
Design and implement automated testing frameworks for data systems
Lead and mentor the quality engineering team
Develop and execute advanced API test cases for large-scale systems
Define and execute performance testing strategies
Select performance testing tools and create scripts
Prepare and manage test data for various scenarios
Validate data integrity and reliability across pipelines
Analyze test results to identify performance bottlenecks
Collaborate with development teams to optimize performance
Document test plans, cases, and results
Contribute to continuous improvement of QA processes
Coordinate with cross-functional teams to ensure compliance with quality standards
requirements
6+ years of test automation experience
Proven experience in data and data pipeline testing
Experience in performance testing of large-scale APIs
Hands-on expertise with performance testing tools such as Apache JMeter or LoadRunner
Ability to prepare and manage test data for performance and functional testing
Excellent analytical and troubleshooting skills
Strong documentation and reporting abilities
English language proficiency at B2 level (Upper-Intermediate)
nice to have
Experience in cloud-based environments
Familiarity with CI/CD pipeline integration
Proficiency in scripting or programming for test automation
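To illustrate the kind of data-integrity validation this role covers, here is a minimal sketch of an automated check comparing a source extract to a target load. The datasets and column names are hypothetical, and a real framework would check far more (types, nullability, aggregates).

```python
# Hedged sketch of a source-vs-target data validation check: compares row
# counts and column sets. All data here is illustrative.

def validate_load(source_rows, target_rows):
    """Return a dict of simple data-integrity findings."""
    findings = {
        "row_count_match": len(source_rows) == len(target_rows),
        "schema_match": True,
        "missing_columns": set(),
    }
    if source_rows and target_rows:
        src_cols = set(source_rows[0])
        tgt_cols = set(target_rows[0])
        findings["missing_columns"] = src_cols - tgt_cols
        findings["schema_match"] = not findings["missing_columns"]
    return findings

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
report = validate_load(source, target)
```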
We are seeking a highly skilled and detail-oriented Senior Data Engineer to join our growing team in Birkirkara. In this role, you will be a key contributor to building and optimizing our data infrastructure, pipelines, and analytics systems. You will be responsible for designing, building, and maintaining highly scalable and secure ETL/ELT data pipelines to support the needs of analytics, data science, and business teams. The ideal candidate has strong technical expertise, problem-solving skills, and leadership capabilities to support the development of a scalable and robust data engineering ecosystem. This role offers a hybrid work setup, providing flexibility to work both remotely and in-office, helping you achieve a balanced professional and personal life.
responsibilities
Architect and maintain modern data platforms, warehouses and lakes (e.g., Snowflake, BigQuery, Redshift, Databricks)
Optimize the storage and retrieval of data and ensure performance and cost efficiency
Establish processes and systems for monitoring data quality, completeness and reliability
Automate manual processes and optimize data delivery workflows to reduce latency and improve job reliability
Implement and maintain Kafka-based streaming data pipelines for real-time data processing and integration with various systems
Integrate with third-party databases and APIs
Continuously refine and improve existing data systems and pipelines for scalability
Implement monitoring and alerting systems for data pipelines
Ensure data infrastructure uptime and availability
requirements
5–8+ years of experience in data engineering or related roles, including experience with large-scale data processing
Proficiency in programming languages such as Python and SQL
Expertise in building and maintaining ETL/ELT workflows using tools like Apache Airflow
Hands-on experience with Big Data technologies like Spark, Hadoop and Kafka
Working experience with version control systems (Git) and CI/CD pipelines
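The Kafka-based streaming duty above typically involves windowed aggregation of events. A real pipeline would consume from Kafka (e.g., via a client library); the stdlib-only sketch below only illustrates the aggregation step, and the event fields and window size are assumptions.

```python
# Pure-Python sketch of the windowed-aggregation step of a streaming
# pipeline. Event schema ({"ts": ..., "value": ...}) is hypothetical.
from collections import defaultdict

def aggregate_by_window(events, window_seconds=60):
    """Sum event values into fixed-size time windows keyed by window start."""
    windows = defaultdict(float)
    for event in events:
        window_start = (event["ts"] // window_seconds) * window_seconds
        windows[window_start] += event["value"]
    return dict(windows)

events = [
    {"ts": 5, "value": 2.0},
    {"ts": 59, "value": 3.0},
    {"ts": 61, "value": 1.5},
]
totals = aggregate_by_window(events)
```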
We are looking for a Senior Data Integration Engineer to join our dynamic Data Practice team and contribute to EPAM's strategic projects. As a Senior Data Integration Engineer, your primary responsibility will be to develop data systems and applications that facilitate decision-making. You will focus on creating and maintaining data pipelines and solutions in public cloud environments such as Azure, and assist in migrating data from on-premises data centers to the cloud. Additionally, you will contribute to enterprise data platform development. Based in our Cyprus office in a hybrid setup, this is your chance to make a meaningful impact in a forward-thinking environment. Would you like to be a part of this ambitious and innovative project? We look forward to receiving your application.
responsibilities
Design and implement enterprise data solutions in Public Cloud environments (Azure)
Create components for the enterprise data platform to ensure data consistency and availability
Work with Data Scientists, Data Engineers, and Data Architects to address data needs
Ensure secure and auditable processes to maintain data integrity
Identify and resolve data-related issues to enhance system quality
requirements
Proven track record as a Data Engineer or similar role with at least 3 years of experience
Proficiency in SQL and Python
Background in Public Cloud platforms (Azure), database management systems and ETL/ELT tools
Hands-on expertise with Databricks and/or Snowflake
Experience with data acquisition methods such as API calls and FTP downloads, data transformation/normalization, storage solutions (raw files, database servers) and distribution mechanisms (access layers, APIs, entitlements)
Skills in building components for enterprise data platforms, including Data Warehouses, Operational Data Stores and API-driven access layers
A bachelor's degree in computer science, information technology or other relevant discipline
We are seeking an experienced Senior Data Integration Engineer with expertise in Markit EDM and GTMatch, who can deliver high-quality data solutions and successfully support production environments while contributing to ongoing improvements.
responsibilities
Work on requirement analysis, ensuring accurate interpretation and implementation of data integration needs
Handle post-implementation tasks, including production support and user guidance
Manage incident resolution and user support through ITSM tools
Conduct risk analysis and devise mitigation strategies for critical situations
Perform system integration and unit testing to ensure seamless data processing
Develop and maintain scripts using Java, PL/SQL, and Shell Scripting
Facilitate integration of Markit EDM and GTFrame/GTMatch solutions into existing workflows
Utilize strong analytical skills to diagnose and resolve issues within data integration processes
Collaborate effectively with stakeholders and team members to enhance workflows and propose improvements
Ensure compliance with SWIFT banking protocol and its integration with data platforms
Maintain effective communication to deliver project updates and address client concerns
requirements
Minimum 3 years of working experience in data integration or a related field
Expertise in Markit EDM, Bottomline GTFrame/GTMatch, and SWIFT messaging protocols
Proficiency in Microsoft SQL Server and Oracle for database management
Background in Java coding language, PL/SQL, and Shell Scripting
Knowledge of incident management processes and ITSM tools
Capability to perform requirement analysis and system integration testing
Competency in risk analysis and mitigation strategies
Understanding of production support and user incident management
Strong communication skills with a collaborative mindset
Excellent command of written and spoken English (B2+ level)
nice to have
Familiarity with MarkitSERV DSMatch and its capabilities
Showcase of experience in risk analysis and management
Background in carrying out system integration testing for large-scale solutions
We are seeking a skilled Data Science Consultant to join our team and contribute to the delivery of AI and Data Cloud Solutions. As a member of our team, you will collaborate with data scientists, engineers, and product owners to advance our AI delivery framework. Your expertise in data structures, databases, and ETL tools will help optimize AI processes while ensuring compliance with security and ethical standards. This role provides the opportunity to drive AI innovation in a collaborative and dynamic environment.
responsibilities
Drive and implement continuous improvements in the client's delivery framework
Apply Agile (SAFe) operational standards and practices to optimize efficiency and AI investment returns
Act as a subject matter expert in Data Science or Data Engineering
Collaborate with Product Owners to gather, address, and align their needs and requirements
Ensure our client's solutions comply with security, AI ethics, DPP, legal, and works council standards
Facilitate quarterly planning activities for the client
Contribute to operational KPIs reporting and drive improvements in these KPIs
Contribute to the development of reusable and enablement assets relevant to the client's context
Educate customers and stakeholders on AI and machine learning concepts
requirements
Bachelor’s or master’s degree in machine learning, computer science, engineering, or related technical fields
Background in working with teams of AI Scientists, MLOps Engineers, and Product Owners
Showcase of stakeholder communication skills to understand SAP business processes
Understanding of agile team practices and methods including Scrum, Kanban, and SAFe
Proficiency in AI and ML concepts, including MLOps, technologies, and cloud frameworks such as Jupyter, Docker, Kubernetes, GitHub, SAP BTP, OCR, NLP, and CV
Expertise in Python libraries and machine learning frameworks such as NumPy, Pandas, Keras, scikit-learn, TensorFlow, PyTorch, and Gensim
Interest in business process engineering and modeling
Qualifications in ITIL and ITSM practices
Capability to take ownership of tasks and demonstrate collaborative teamwork skills
Ability to communicate effectively in both written and spoken English (B2 level or higher)
We are seeking a highly skilled Senior Data Lead with deep expertise in PySpark and strong experience in Azure Data Factory/Synapse to lead the development team, ensuring high-quality code delivery and adherence to architectural standards. The ideal candidate will have a proven track record in building and optimizing data pipelines, implementing scalable data solutions, and working with modern DevOps practices in cloud environments.
responsibilities
Design, develop, and optimize large-scale data processing solutions using PySpark
Implement and maintain advanced data pipelines and workflows in Azure Data Factory and Azure Synapse
Develop, automate, and monitor development and build pipelines for robust data engineering solutions
Collaborate with cross-functional teams to deliver high-quality, scalable, and reliable data solutions
Apply best practices for code optimization, version control (Git), and infrastructure automation
Integrate Azure Functions for data orchestration and transformation tasks as needed
Contribute to infrastructure setup and automation using Terraform
Document solutions and mentor junior team members
requirements
8+ years of experience in data engineering
Extensive hands-on experience with PySpark, including writing optimized and scalable code
Strong background in Azure Data Factory and/or Azure Synapse for data integration and orchestration
Hands-on experience with Azure services and Azure Functions
Experience designing and managing development and build pipelines in cloud environments
Proficient with Git for version control and collaboration
Familiarity with Terraform for infrastructure automation
Experience integrating Azure Functions into data workflows
Advanced knowledge of Azure DevOps, CI/CD, infrastructure as code, and automation tools
Excellent problem-solving, communication, and documentation skills
Ability to work independently and mentor junior engineers
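For the PySpark work described above, a typical transformation might be expressed as `df.groupBy("region").agg(F.sum("sales"))`. The sketch below mirrors that aggregation in plain Python so it runs without a Spark cluster; the column names are illustrative assumptions, not part of the role.

```python
# Pure-Python analogue of a PySpark groupBy/sum aggregation, for
# illustration only. Real pipelines would use Spark DataFrames.
from collections import defaultdict

def sum_sales_by_region(rows):
    """Group rows by 'region' and sum the 'sales' column."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["sales"]
    return dict(totals)

rows = [
    {"region": "EU", "sales": 100.0},
    {"region": "US", "sales": 250.0},
    {"region": "EU", "sales": 50.0},
]
result = sum_sales_by_region(rows)
```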
We are excited to announce an opportunity for a Manager, Data Analytics Consulting at EPAM, a role essential for assisting our customers in creating information strategies and supporting the data-driven transformation of their enterprises. Our ideal candidate will come with a robust Life Sciences background, prepared to lead and manage data analytics initiatives. If you are ready to make significant impacts and thrive in a vibrant, engaging environment, we welcome you to join our team!
responsibilities
Engage with customer business stakeholders to identify Data & Analytics opportunities
Lead and challenge customers and team members to develop innovative strategies and solutions
Map customer visions and requirements to specific products and components
Advocate for data and work to eliminate business, technical, and political impediments
Mentor and coach clients and team members on data & analytics best practices
Oversee consulting engagements, ensuring quality delivery and output
Guarantee strategies are actionable and well documented for smooth handover to implementation teams
Assemble EPAM analysts & engineers to deliver MVPs
Collaborate with account managers to nurture and manage customer relationships
Design and lead workshops & interviews to define and prioritize client needs
Collaborate with Solution Architects and Technical Leads to design solutions that drive business value
Assist the business development team with proposals and pitches for new business
Respond to RFxs, create proposals, and present to potential clients
Lead Data & Analytics presale activities in the Healthcare segment
requirements
7+ years of total experience in the data and analytics domain
3+ years in roles focused on management, delivery, consulting, or presales
5+ years in the Life Science industry or with Life Science clients
Knowledge of one or more Life Science segments such as clinical trials, R&D, Commercial, Regulatory & Compliance
Proficiency in facilitating and driving strategy discussions
Competency to engage in and influence solution design discussions
Solid knowledge of the data and analytics landscape and recent trends
Experience in comprehensive data and analytics project delivery
Excellent communication skills and dynamic presentation abilities
Skills in managing customer expectations and presenting project deliverables to senior stakeholders
Capability to shape a presales opportunity into a customer engagement
Primary career focus on areas such as Cloud Data Platforms, BI, Data Warehousing, Data Lakes, Data Science & Predictive Analytics
Experience leading medium to large-scale software/technology projects
Understanding of Cloud, Data, Analytics, and Data Science technologies and trends
Flexibility to learn & expand knowledge in new technology trends within Data & Analytics
English level of minimum B2 (Upper-Intermediate) for effective communication
We're seeking a Data Technology Consultant to join the Data Practice team and help our clients unlock their data's full potential. In this role, you'll contribute to projects centered around digital transformation, data platforms & science, business analytics, intelligent automation, and cloud solutions.
responsibilities
Work with European technical and business data practices, assisting clients in their Data Analytics strategy and delivery programs
Maximize the value of clients' Data & Analytics initiatives by recognizing appropriate solutions and services
Act as Data Technology Consultant and/or Data Solution Architect, working with the delivery team in complex programs
Maintain understanding of technical solutions, architecture design trends and best practices
Drive Data Analytics initiatives and technology consulting engagements
Collaborate with internal, client and third-party teams to execute transformations
Understand the intersection between technology, customers and business
Stay updated on emerging trends and challenges in clients' markets and geographies and how they affect clients' business and initiatives
Work closely with project/program management to ensure successful delivery through an integrated delivery model
Deliver clear and consistent communications within projects with relevant stakeholders
Establish and cultivate strong relationships with clients
requirements
Strong experience as a Data Technical Consultant and Data Solution Architect
Hands-on technology experience in the areas of Data Analytics
Skills in one of the following: Big Data, BI, Data Warehousing, Data Science, Data Management, Data Storage, Data Visualization
Good knowledge in at least one of the Cloud providers (AWS, Azure, GCP)
Background in continuous delivery tools and technologies
Ability to work with relevant delivery teams
Skill in effectively communicating technology pros & cons and presenting rational options to clients
Confidence in expressing viewpoints, making recommendations and presenting analysis when needed
English language proficiency at an Upper-Intermediate level (B2) or higher