PySpark Jobs
Viewing 1-10 out of 13 jobs found
40 hrs/week
12+ months
Poland
Join our remote team as a Senior Data Software Engineer.
Join our remote team as a Senior Data Software Engineer within a global leader at the forefront of data analytics and insights. We are actively seeking a hands-on and deeply technical developer to collaborate closely with development peers, product leadership, and other technical staff to create innovative and impactful data-driven solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment.
responsibilities
- Design and develop new features using the Agile development process (Scrum)
- Prioritize and ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of systems
- Maintain and troubleshoot code in large-scale, complex environments
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions
requirements
- At least 3 years of production experience in Data Software Engineering
- Hands-on experience and deep expertise in server-side development with Python and PySpark
- Deep expertise in Azure Data Factory for building scalable and high-performance applications
- Experience with Advanced SQL for designing and managing database schema, including procedures, triggers, and views
- Experience in Data analysis and troubleshooting
- Knowledge of integration testing and support for version control, integration, and deployment
- Ability to support applications and systems in a production environment, ensuring timely resolution of issues
- Experience reviewing requirements and translating them into a documented technical design for implementation
- Exposure to Databricks, HDInsight, Azure Data Lake, Data API, Spark, Scala, and Kafka
- Primary skills in Big Data and a strong data background for designing and building scalable applications
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
- Experience with EDL changes in DB Views/Stored procedures is a plus
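For illustration only, here is a minimal PySpark sketch of the kind of batch transformation work this listing's requirements describe (Python/PySpark processing that an Azure Data Factory pipeline might orchestrate). All paths, table names, and column names are assumptions made for the example, not details of the actual project.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily-revenue-aggregate").getOrCreate()

    # Read a raw dataset (the path is a placeholder)
    orders = spark.read.parquet("/data/raw/orders")

    # Filter, derive a date column, and aggregate, a typical curation step
    daily = (
        orders
        .where(F.col("status") == "COMPLETED")
        .withColumn("order_date", F.to_date("created_at"))
        .groupBy("order_date", "country")
        .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
    )

    # Write the curated output, partitioned by date (path is again a placeholder)
    daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_revenue")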
40 hrs/week
12+ months
Poland
Become a key player in our remote team by taking on the role of a Senior Data Software Engineer dedicated to a project centered around Databricks workflows, APIs, analytical development, and data engineering.
In this role, your primary focus will involve constructing and sustaining intricate data pipelines, facilitating seamless deployment to production. Your responsibilities extend to crafting end-to-end production solutions while engaging with cross-functional teams to deliver top-notch results.
responsibilities
- Engage in the Agile development process (Scrum) to conceive and implement innovative features
- Prioritize and uphold high-quality standards throughout each developmental phase
- Ensure the dependability, accessibility, performance, and scalability of systems
- Troubleshoot and maintain code within expansive, intricate environments
- Work in tandem with Developers, Product and Program Management, and seasoned technical professionals to furnish customer-centric solutions
- Provide technical insights for new feature requirements in collaboration with business owners and architects
- Stay abreast of industry trends and emerging technologies for continuous improvement
- Champion the execution of solutions aligned with business objectives
- Guide and mentor less seasoned team members, fostering skill enhancement and career growth
- Participate in code reviews, ensuring adherence to standards and code quality
- Collaborate seamlessly with cross-functional teams to achieve project objectives
- Actively contribute to architectural and technical discourse
requirements
- A minimum of 3 years of hands-on experience in Data Software Engineering
- Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment in production
- Familiarity with Azure DevOps, GitHub (or alternative platforms), and version control for efficient project management
- Capability to develop comprehensive end-to-end production solutions
- Robust experience with one or more cloud platforms such as Azure, GCP, or AWS
- Proven track record in constructing resilient data pipelines
- Capacity to integrate disparate elements for solutions spanning multiple systems
- Exceptional communication skills in both spoken and written English, at an upper-intermediate level or higher
nice to have
- Experience with REST APIs and Power BI would be an advantage
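As a purely illustrative companion to the requirements above, the sketch below shows a small Databricks-style pipeline step in PySpark that publishes a curated Delta table for downstream consumers. It assumes a Databricks (or other Delta-enabled) environment; the storage location and table names are invented for the example.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("events-curation").getOrCreate()

    # Ingest raw JSON events (the mount path is a placeholder)
    events = spark.read.json("/mnt/raw/events/")

    # Deduplicate and stamp the records before publishing them downstream
    curated = (
        events
        .dropDuplicates(["event_id"])
        .withColumn("ingested_at", F.current_timestamp())
    )

    # Publish as a Delta table; Delta is assumed to be available (it is the default on Databricks)
    curated.write.format("delta").mode("overwrite").saveAsTable("analytics.curated_events")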
40 hrs/week
12+ months
Poland
We are seeking a remote Senior Data Software Engineer with experience in PySpark, Azure Data Factory, and advanced SQL for the project.
The ideal candidate must have at least 2 years of solid, hands-on experience in Data Software Engineering, with primary skills in Big Data and a strong data background.
responsibilities
- Conduct data analysis and troubleshooting
- Plan and implement new requirements/data entities in the EDL
- Provide support for integration testing
- Make sure data pipelines are scalable and efficient
requirements
- 3+ years of relevant work experience
- Must-have skills: Python (Data Software Engineering) and Azure Databricks
- Familiarity with EDL changes in DB Views/Stored procedures and integration testing support
- Advanced knowledge of PySpark, Azure Data Factory, and SQL
- Ability to collaborate effectively with the team
- Excellent communication skills with an upper-intermediate level of English
nice to have
- Experience with HDInsight, Azure Data Lake, Data API, Spark, Scala, and Kafka will be an added advantage
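For illustration, a change to a database view over data lake (EDL) tables, of the sort mentioned above, might look like the following minimal PySpark/Spark SQL sketch; the database, table, and column names are placeholders, not details of the project.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("edl-view-update").getOrCreate()

    # Recreate a reporting view on top of a data lake table (names are placeholders)
    spark.sql("""
        CREATE OR REPLACE VIEW reporting.active_customers AS
        SELECT customer_id, country, last_order_date
        FROM edl.customers
        WHERE is_active = true
    """)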
40 hrs/week
12+ months
Portugal
We are looking for a talented and driven Data Engineer to become a part of our dynamic team.
The chosen candidate will handle the upkeep of existing data systems, boost the capabilities of our existing Data Platform, and manage upcoming modifications to align with our clients' evolving consumption requirements and data ingestion. This position demands proficiency in managing intricate datasets and working in close collaboration with senior team members on projects of significant impact.
responsibilities
- Uphold and boost the capabilities of existing data systems
- Enhance the existing Data Platform
- Manage modifications to meet growing consumption requirements and data ingestion
- Handle complex datasets, including the creation of new datasets from the ground up
- Partner with senior team members on complex projects, especially when introducing new large datasets
- Evolve with the technological landscape, embracing new tools and methodologies as they appear
requirements
- 2+ years in Data Software Engineering
- Proficient in PySpark
- Skilled in Azure Databricks, Azure DevOps, and Azure Event Hubs
- Knowledgeable about the Azure cloud stack
- Experienced in handling large and complex datasets
- Capable of teamwork and collaboration with various stakeholders
- Knowledge of Databricks Unity Catalog and Terraform
nice to have
- Background in Azure Analytics Engineering
- Knowledge of DevOps practices
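Since this listing calls out Azure Event Hubs together with Databricks, here is a hypothetical Structured Streaming sketch that reads from an Event Hubs namespace through its Kafka-compatible endpoint and lands the data in a bronze table. The connection details, schema, and table names are placeholders; the Kafka connector must be available on the cluster, and the exact JAAS login-module class name can differ by runtime.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

    spark = SparkSession.builder.appName("eventhubs-ingest").getOrCreate()

    # Expected shape of the incoming JSON payload (illustrative)
    schema = StructType([
        StructField("device_id", StringType()),
        StructField("temperature", DoubleType()),
        StructField("event_time", TimestampType()),
    ])

    # Event Hubs exposes a Kafka-compatible endpoint on port 9093; all values below are placeholders
    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "<namespace>.servicebus.windows.net:9093")
        .option("subscribe", "telemetry")
        .option("kafka.security.protocol", "SASL_SSL")
        .option("kafka.sasl.mechanism", "PLAIN")
        .option("kafka.sasl.jaas.config",
                'org.apache.kafka.common.security.plain.PlainLoginModule required '
                'username="$ConnectionString" password="<event-hubs-connection-string>";')
        .load()
    )

    # Parse the JSON value column into typed fields
    parsed = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("e")).select("e.*")

    # Append into a bronze table with a checkpoint for reliable progress tracking
    (parsed.writeStream
        .format("delta")
        .option("checkpointLocation", "/mnt/checkpoints/telemetry")
        .outputMode("append")
        .toTable("bronze.telemetry"))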
40 hrs/week
12+ months
Portugal
We are seeking a highly skilled and motivated Data Engineer to join our team.
The ideal candidate will be responsible for maintaining existing data systems, enhancing our current Data Platform, and implementing future changes as our clients' consumption needs and ingestion evolve. This role requires expertise in handling complex datasets and collaborating with senior team members on high-impact projects.
responsibilities
- Maintain and enhance existing data systems
- Develop enhancements for the existing Data Platform
- Implement future changes to accommodate growing consumption needs and ingestion
- Work with complex datasets, including the addition of new datasets from scratch
- Collaborate with senior team members on complex projects, particularly when adding new large datasets
- Grow with the technological landscape, adapting to new tools and methodologies as they emerge
requirements
- 2+ years of experience in Data Software Engineering
- Proficiency in PySpark
- Experience with Azure Databricks, Azure DevOps, and Azure Event Hubs
- Knowledge of the Azure cloud stack
- Experience in working with large and complex datasets
- Ability to work in a team and collaborate with various stakeholders
- Familiarity with Databricks Unity Catalog and Terraform
nice to have
- Experience in Azure Analytics Engineering
- Familiarity with DevOps practices
40 hrs/week
12+ months
Portugal
We are seeking a highly skilled and motivated Senior Data Engineer to join our team.
The ideal candidate will be responsible for maintaining existing data systems, enhancing our current Data Platform, and implementing future changes as our clients' consumption needs and ingestion evolve. This role requires expertise in handling complex datasets and collaborating with senior team members on high-impact projects.
responsibilities
- Maintain and enhance existing data systems
- Develop enhancements for the existing Data Platform
- Implement future changes to accommodate growing consumption needs and ingestion
- Work with complex datasets, including the addition of new datasets from scratch
- Collaborate with senior team members on complex projects, particularly when adding new large datasets
- Grow with the technological landscape, adapting to new tools and methodologies as they emerge
- Ensure data quality and integrity in all solutions delivered
- Optimize data retrieval and data storage processes
- Conduct data analysis and provide insights to inform business decisions
requirements
- 3+ years of experience in Data Software Engineering
- Proficiency in PySpark
- Experience with Azure Databricks, Azure DevOps, and Azure Event Hubs
- Knowledge of the Azure cloud stack
- Experience working with large and complex datasets
- Ability to work in a team and collaborate with various stakeholders
- Familiarity with Databricks Unity Catalog and Terraform
- Strong problem-solving skills and attention to detail
- Fluent English communication skills at a B2+ level
nice to have
- Experience in Azure Analytics Engineering
- Familiarity with DevOps practices
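As a small illustration of the Unity Catalog and data-quality points raised in this and the previous listings, the sketch below reads a table through a three-level Unity Catalog name and runs two basic integrity checks. The catalog, schema, table, and column names are assumptions for the example.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    # Unity Catalog tables are addressed as catalog.schema.table (this one is a placeholder)
    orders = spark.table("main.sales.orders")

    # Two simple integrity checks before the data is published downstream
    null_keys = orders.where(F.col("order_id").isNull()).count()
    dup_keys = orders.groupBy("order_id").count().where(F.col("count") > 1).count()

    if null_keys or dup_keys:
        raise ValueError(f"Data quality check failed: {null_keys} null keys, {dup_keys} duplicated keys")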
40 hrs/week
12+ months
Poland
Join our remote team as a Senior Data Software Engineer.
Join our remote team as a Senior Data Software Engineer. We are actively seeking a hands-on and deeply technical engineer to collaborate closely with development peers, product leadership, and other technical staff to create innovative and impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment, with a focus on Databricks workflows, APIs, and Data Engineering.
responsibilities
- Design and develop new features using the Agile development process (Scrum)
- Prioritize and ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of systems
- Maintain and troubleshoot code in large-scale, complex environments
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions
requirements
- At least 3 years of production experience in Data Software Engineering
- Expertise in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment to production
- Experience with Azure DevOps, GitHub (or similar platforms), and version control for effective project management
- Ability to develop end-to-end production solutions
- Strong experience working on one or more cloud platforms such as Azure, GCP, or AWS
- Experience in building out robust data pipelines
- Ability to integrate disparate components into solutions that span multiple systems
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
- Experience with REST APIs and Power BI would be a plus
40 hrs/week
12+ months
Poland
Join our remote team as a Senior Data Software Engineer.
Join our remote team as a Senior Data Software Engineer within a global leader in providing cutting-edge cloud-based solutions. We are actively seeking a hands-on and deeply technical developer to collaborate closely with development peers, product leadership, and other technical staff to create innovative and impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment, using technologies such as Python, PySpark, Microsoft Azure, SQL Azure, and Databricks.
responsibilities
- Setting up required Azure services
- Building and deploying a POC to replace the external vendor within the infrastructure
- Extracting data from data lake (EDL)
- Processing data based on application requirements & architecture
- Mimicking the existing application
- Prioritizing and ensuring high-quality standards at every stage of development
- Guaranteeing reliability, availability, performance, and scalability of systems
- Collaborating with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Providing technical input for new feature requirements, partnering with business owners and architects
- Ensuring continuous improvement by staying abreast of industry trends and emerging technologies
- Actively contributing to architectural and technical discussions
requirements
- At least 3 years of production experience in Data Software Engineering
- Hands-on experience and deep expertise in Data Engineering, in both functional and non-functional areas
- Deep expertise in PySpark for building scalable and high-performance data applications
- Experience with Microsoft Azure for cloud-based infrastructure and application management
- Familiarity with SQL Azure for designing and managing database schema, including procedures, triggers, and views
- Exposure to Databricks for creating unified data analytics platforms
- Knowledge of Python web services such as Django and Flask for building efficient APIs and web solutions
- Ability to support applications and systems in a production environment, ensuring timely resolution of issues
- Expertise in build and test tools for managing build and testing processes
- Excellent communication skills in spoken and written English at an upper-intermediate level or higher
nice to have
- Experience in Big Data technologies such as Hadoop, Spark, Kafka, and Hive is a plus
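Because this listing mentions Python web services such as Django and Flask alongside the data stack, here is a deliberately small Flask sketch that exposes an aggregate a PySpark job has already written out. The framework choice, route, and file path are assumptions for the example, not project details.

    import json
    from flask import Flask, jsonify

    app = Flask(__name__)

    # Location of an aggregate written by the batch pipeline (placeholder path)
    SUMMARY_PATH = "/data/curated/daily_summary.json"

    @app.route("/api/daily-summary")
    def daily_summary():
        # Serve the precomputed result rather than querying Spark per request
        with open(SUMMARY_PATH) as f:
            return jsonify(json.load(f))

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)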
40 hrs/week
12+ months
Poland
Join our remote team as a Senior Data Software Engineer.
Join our remote team as a Senior Data Software Engineer within a global leader at the forefront of cybersecurity technologies, unified endpoint management, and intelligent automation services. We are actively seeking a hands-on and deeply technical developer to collaborate closely with development peers, product leadership, and other technical staff to create innovative and impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile development environment.
responsibilities
- Develop and implement a Proof of Concept (POC) to replace the external vendor within the Estee Lauder Azure infrastructure
- Setup required Azure services for POC
- Extract data from data lake (EDL)
- Process data based on application requirements & architecture
- Mimic the existing application
- Prioritize and ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of systems
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions
requirements
- At least 3 years of production experience in Data Software Engineering
- Hands-on experience and deep expertise in Python and PySpark for building scalable data processing pipelines
- Deep expertise in Microsoft Azure for designing and building cloud-based infrastructure and applications
- Experience with SQL Azure for designing and managing database schema, including procedures, triggers, and views
- Familiarity with Databricks for data engineering and analytics
- Ability to support applications and systems in a production environment, ensuring timely resolution of issues
- Experience reviewing requirements and translating them into a documented technical design for implementation
- Exposure to containerization technologies such as Docker for application packaging and deployment
- Experience with Python web services (e.g., Django, Flask) for building scalable and high-performance applications
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher
nice to have
- Experience with Advanced SQL is a plus
40 hrs/week
12+ months
Poland
Join our remote team as a Senior Data Software Engineer contributing to a project centered around Databricks workflows, APIs, analytical development, and data engineering.
In this role, you will play a pivotal part in constructing and sustaining intricate data pipelines while facilitating seamless deployments to production. Your involvement will extend to crafting end-to-end production solutions and collaborating with cross-functional teams to deliver top-tier solutions.
responsibilities
- Contribute to the design and development of novel features within the Agile development framework (Scrum)
- Prioritize and uphold high-quality standards across all development stages
- Ensure the reliability, availability, performance, and scalability of systems
- Troubleshoot and maintain code within expansive and intricate environments
- Collaborate with Developers, Product and Program Management, and senior technical personnel to provide customer-centric solutions
- Offer technical insights for new feature requirements, collaborating with business owners and architects
- Stay abreast of industry trends and emerging technologies for continuous improvement
- Implement solutions aligned with business objectives
- Guide and mentor less experienced team members to foster skill enhancement and career growth
- Participate in code reviews, upholding code quality and adherence to standards
- Actively engage in architectural and technical discussions within cross-functional teams to achieve project goals
requirements
- Minimum of 3 years of hands-on experience in Data Software Engineering in a production setting
- Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment to production
- Familiarity with Azure DevOps, GitHub (or alternative platforms), and version control for effective project management
- Capability to architect end-to-end production solutions
- Robust experience with one or more cloud platforms such as Azure, GCP, or AWS
- Proven track record in constructing resilient data pipelines
- Ability to integrate disparate elements for comprehensive solutions across systems
- Exceptional communication skills in both spoken and written English, at an upper-intermediate level or higher
nice to have
- Exposure to REST APIs and Power BI would be advantageous
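For the REST API integration mentioned as a nice-to-have here and in similar listings above, a minimal ingestion sketch might look like the following: pull records over HTTP and land them as a Spark DataFrame. The endpoint URL, response fields, and output path are invented for the example.

    import requests
    from pyspark.sql import SparkSession
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("rest-ingest").getOrCreate()

    # Fetch records from a hypothetical REST endpoint
    resp = requests.get("https://api.example.com/v1/exchange-rates", timeout=30)
    resp.raise_for_status()
    records = resp.json()  # assumed shape: a list of {"currency": ..., "rate": ...} objects

    schema = StructType([
        StructField("currency", StringType()),
        StructField("rate", DoubleType()),
    ])

    # Convert to tuples matching the schema and land the data for downstream processing
    rows = [(r["currency"], float(r["rate"])) for r in records]
    rates = spark.createDataFrame(rows, schema=schema)
    rates.write.mode("overwrite").parquet("/data/raw/exchange_rates")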