We are seeking a skilled Big Data Engineer to join our forward-looking, dynamic team.
In this role, you will leverage advanced cloud technologies to deliver impactful solutions.
If you excel at tackling complex challenges, building robust cloud-based solutions, and driving meaningful business outcomes, this position places you in a fast-paced, collaborative environment where your work shapes global success.
Responsibilities
- Build scalable, cloud-based systems in Azure using tools like Azure Data Factory, Azure Data Lake Storage, and Databricks to deliver seamless, high-performance data pipelines
- Migrate on-premises MS SQL databases to Azure Data Lake, utilizing the Delta Lake format to optimize operational performance
- Develop interfaces that connect non-Microsoft proprietary applications, enabling interoperability and unlocking valuable data insights
- Provide expertise in Data Lakes, Data Warehouses (DWH), and Delta Lakehouse architectures to guide transformative business solutions
- Assess new feature proposals, prioritize them based on business value, and collaborate to make impactful product development decisions
Requirements
- Expertise in Azure Data Services, including Data Factory, Data Lake Storage, and Databricks
- Strong SQL skills with experience in Python, Scala, or C# (versatility is highly valued)
- Experience working with Agile or XP methodologies and thriving in fast-moving, adaptive environments
- Good English communication skills (Upper-Intermediate/B2+ or higher) to engage effectively with diverse teams and stakeholders
Nice to have
- Proficiency in advanced Azure services such as Synapse Analytics and Cosmos DB, or in Apache Spark
- Flexibility to use core Python or Scala for automation and advanced data processing