
Senior Palantir Data Engineer (Python/Pyspark) (relocation to Cyprus)

Location: Office in Spain & 12 others
Category: Data Software Engineering & 3 others
Relocation available

Are you passionate about solving complex big data analytics problems using cutting-edge technologies?

EPAM is looking for a skilled Senior Palantir Data Engineer to join our growing, globally distributed team.

In this role, you’ll work on a high-impact Data Transformation project with our client from the insurance sector. This initiative leverages Big Data and Machine Learning technologies to shape data-driven decisions in the Property & Casualty business domain.

If you’re eager to apply your expertise in Python/PySpark, SQL and Palantir to design and implement complex data pipelines, while collaborating with a multicultural and dynamic team, we’d love to hear from you! We offer a hybrid work model with a mix of remote and on-site work at EPAM’s Nicosia office.

Responsibilities
  • Lead the design and implementation of robust, large-scale data pipelines and analytics solutions
  • Monitor and optimize data pipelines for performance and scalability using advanced tools and techniques, including Python/PySpark and SQL
  • Optimize data workflows to support critical decision-making processes
  • Harness state-of-the-art tools and technologies (including Palantir Foundry) to address new and emerging business challenges
  • Partner with cross-functional and globally distributed teams (e.g., data scientists, analysts, business stakeholders) to align project goals and execution strategies
  • Contribute to a global strategic initiative focused on enhancing the ability to make data-driven decisions across the Property & Casualty value chain
  • Stay ahead of emerging technologies and trends (e.g., Generative AI, Machine Learning) and recommend potential applications in the data ecosystem
Requirements
  • A Bachelor’s degree (or equivalent) in Computer Science, Data Science or a related discipline
  • 5+ years of experience working with large-scale distributed computing systems
  • Proficiency in Python/PySpark to build and optimize complex data pipelines
  • Hands-on experience working with Databricks for large-scale data processing and analytics
  • Strong SQL skills (preferably Spark SQL) for data querying and manipulation
  • Deep understanding of data warehousing concepts and ELT techniques
  • Experience with Palantir Foundry is a must
  • Familiarity with Agile and Scrum development methodologies
Nice to have
  • Knowledge of HTML, CSS, JavaScript and Gradle
  • Experience in the Insurance domain or the financial industry
  • Familiarity with Microsoft Power BI
  • Exposure to Machine Learning or Generative AI technologies
We offer
  • Private healthcare insurance
  • Global travel medical and accident insurance
  • Regular performance assessments
  • Referral bonuses
  • Family friendly initiatives
  • Learning and development opportunities including in-house training and coaching, professional certifications, over 22,000 courses on LinkedIn Learning Solutions and much more
*All benefits and perks are subject to certain eligibility requirements