We are seeking a Cloud Security Architect to lead the design and implementation of secure cloud solutions across diverse platforms and client environments. This role is ideal for a seasoned security leader with deep expertise in cloud-native architectures, threat modelling, and enterprise security strategy. You will collaborate with engineering, DevOps, and business stakeholders to embed security into every layer of cloud infrastructure and application lifecycle.
responsibilities
Define and drive cloud security architecture strategies across AWS, Azure, GCP, and hybrid environments
Lead threat modelling, risk assessments, and architecture reviews for cloud-native and containerised applications
Develop and enforce cloud security standards, policies, and reference architectures
Guide engineering teams in implementing secure CI/CD pipelines, infrastructure-as-code (IaC), and automated security controls
Evaluate and integrate cloud security tools (e.g., CSPM, CWPP, CIEM) into enterprise environments
Collaborate with application architects and product teams to ensure secure design patterns and compliance with regulatory frameworks (e.g., ISO 27001, NIST, GDPR)
Mentor and coach cloud security engineers and contribute to internal knowledge sharing
Support incident response and forensic investigations related to cloud environments
Stay current with emerging threats, technologies, and best practices in cloud security
requirements
8+ years of experience in information security, with 4+ years focused on cloud security architecture
Proven expertise in designing secure solutions on AWS, Azure, or GCP
Strong understanding of cloud-native services, IAM, network security, encryption, and logging
Experience with IaC tools (Terraform, CloudFormation), container security (Kubernetes, Docker), and DevSecOps practices
Familiarity with cloud security frameworks (CSA, CIS Benchmarks, NIST)
Excellent communication and stakeholder management skills
Relevant certifications (e.g., CCSP, AWS Certified Security Specialty, AWS Certified Solutions Architect, Azure Security Engineer Associate, Azure Solutions Architect Expert, Google Professional Cloud Security Engineer, Google Professional Cloud Architect) are a plus
We are looking for a skilled and innovative BTP Cloud Developer with expertise in Node.js to join our forward-thinking development team. In this role, you will design, develop, and deploy cutting-edge cloud-native solutions leveraging modern frameworks, tools, and technologies while contributing to transformative projects in a collaborative environment.
responsibilities
Design and implement scalable cloud-based applications leveraging SAP Business Technology Platform (BTP)
Develop efficient and robust backend solutions using Node.js and related frameworks
Integrate microservice architectures and event-driven design principles to build enterprise-grade applications
Collaborate with cross-functional teams to integrate features, troubleshoot issues, and maintain system performance
Enhance application performance through in-memory database optimization
Promote DevOps practices to streamline deployment pipelines and CI/CD processes
Participate in Agile ceremonies and contribute to iterative development cycles
Implement and refine security best practices for cloud-native applications
Explore AI integration to optimize software development lifecycle processes when applicable
Build APIs and services leveraging standards like REST or GraphQL while ensuring high-quality code
requirements
3+ years of professional experience in software development
Strong expertise in Node.js, including building and deploying scalable, robust solutions
Knowledge of SAP BTP, SAP CAP, SAP Fiori, or UI5 is an advantage
Background in cloud platforms such as AWS, Azure, or GCP
Familiarity with microservice architecture and event-driven systems design
Understanding of in-memory database solutions for performance optimization (e.g., Redis or HANA)
Competency in working within Agile environments and using DevOps tools for automation
Skills in modern API design, including REST or GraphQL
Flexibility to adapt to new technologies and frameworks as needed
nice to have
Expertise in frontend frameworks like React, Vue, or Angular
Familiarity with other SAP products
Practical experience with CI/CD pipelines or automation-focused DevOps practices
Interest in AI-driven approaches within software development and delivery processes
We are seeking a Senior BTP Cloud Developer with exceptional Node.js expertise to join our innovative team, building scalable and high-performing cloud-native applications that empower businesses to succeed in a rapidly changing tech landscape.
responsibilities
Design, develop, and deploy highly performant cloud-native software solutions
Create microservice-based architectures for enterprise-grade applications
Integrate in-memory database technologies for optimized application performance
Collaborate on projects utilizing AI to enhance software development processes
Apply Agile methodologies and DevOps practices to ensure efficient project lifecycles
requirements
5+ years of professional development experience
Strong expertise in Node.js, with advanced skills in building large-scale applications
Background in SAP BTP, SAP CAP, Fiori, or UI5, with an understanding of their integrations
Knowledge of at least one hyperscaler (AWS, Azure, GCP)
Familiarity with microservices and event-driven designs for enterprise applications
Competency in building and managing APIs with secure and optimized workflows
Demonstrated experience delivering solutions in Agile and DevOps settings
nice to have
Proficiency in frontend frameworks like React, Angular, or Vue
Familiarity with other SAP products
Flexibility to work with tools supporting CI/CD and automation pipelines
Interest in leveraging AI to transform software development methods
We are looking for a Jira Administrator & Application Support Specialist to join our international team supporting one of the world’s largest Jira environments. This role offers the opportunity to collaborate directly with Atlassian Premier Support and engage in cloud migration, app modernization, and SRE practices within a dynamic agile ecosystem.
responsibilities
Support end-user requests and process them in line with operational guidelines
Maintain and extend automation rules using Jira Automation
Develop and manage workflows with Jira Misc Workflow Extensions (JMWE)
Manage Xray Test Management configurations and permissions
Configure and support Structure for Jira boards and reports
Troubleshoot and resolve configuration or integration issues for end-users
Develop Python scripts or REST integrations for custom reporting or automation
Participate in incident resolution and root cause analysis
Support cloud migration initiatives including app gap analysis and re-platforming
requirements
2+ years of experience in Jira Administration (Data Center and Cloud)
Knowledge of Jira Automation, including rules, triggers and actions
Expertise in JMWE, Structure and Xray plugins
Competency in workflow design, permission schemes and field configurations
Understanding of Scrum, Kanban and Agile methodology
Skills in using AI tools and agents such as ChatGPT or Copilot for automation and productivity in Jira
Proficiency in Python scripting or REST API integrations
Familiarity with Jira Software Cloud migration
Background in integrating ServiceNow, Confluence, Bitbucket or GitHub
Knowledge of Kubernetes or GCP for Data Center environments
Strong communication and documentation skills
Capability to work proactively and solve problems in a fast-paced agile environment
English level of B2 (Upper-Intermediate) or higher for daily collaboration and documentation
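As a flavor of the Python/REST reporting work this role involves, here is a minimal sketch. The site URL, JQL filter, and credentials are placeholders; `/rest/api/3/search` is Jira Cloud's issue-search endpoint, and the status summary is factored into a pure helper so it can be tested without calling the API.

```python
"""Minimal sketch of a Jira Cloud REST reporting script (placeholder values)."""
import base64
import json
import urllib.parse
import urllib.request
from collections import Counter

JIRA_BASE = "https://your-site.atlassian.net"  # placeholder site URL

def search_issues(jql: str, email: str, api_token: str, max_results: int = 50) -> dict:
    """Call Jira Cloud's issue-search endpoint and return the parsed JSON."""
    query = urllib.parse.urlencode({"jql": jql, "maxResults": max_results})
    req = urllib.request.Request(f"{JIRA_BASE}/rest/api/3/search?{query}")
    # Jira Cloud uses basic auth with an email + API token pair.
    token = base64.b64encode(f"{email}:{api_token}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    req.add_header("Accept", "application/json")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def count_by_status(search_response: dict) -> dict:
    """Summarize a search response into {status name: issue count}."""
    statuses = (
        issue["fields"]["status"]["name"]
        for issue in search_response.get("issues", [])
    )
    return dict(Counter(statuses))
```

Keeping `count_by_status` free of I/O makes the reporting logic unit-testable against sample payloads, independent of Jira access.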
We are looking for a DevOps Engineer with GCP expertise, motivated to solve technical challenges and grow in modern cloud-native environments. In this role, you will contribute to our infrastructure and automation strategy, supporting scalable, secure, and high-performing systems. You will work closely with senior engineers and cross-functional teams to improve reliability, streamline deployments, and help development teams deliver high-quality software faster and more efficiently.
responsibilities
Configure and maintain CI/CD pipelines
Contribute to Infrastructure as Code implementations
Operate and support Kubernetes-based workloads
Assist in implementing GitOps-based deployments
Support observability and monitoring solutions
Create and maintain dashboards in Prometheus and Grafana/Plutono
Assist with logging and tracing solutions
Support error tracking and monitoring tools
Work with Helm charts and contribute to their maintenance
Provide operational support and contribute to infrastructure reliability improvements
Collaborate with stakeholders and team members to ensure successful delivery
requirements
2+ years of experience in a DevOps / Cloud Engineering role
Hands-on experience with GCP or similar cloud platforms
Practical experience with Kubernetes
Experience with CI/CD pipelines and automation
Basic understanding of GitOps concepts
Experience with monitoring tools
Scripting or automation experience (Bash, Python or similar)
Good analytical and problem-solving skills
Strong communication skills in English, meeting at least a B2 proficiency level (written and spoken)
Willingness to work in a hybrid setup (home office and office)
We are looking for a skilled Software Architect with experience in SAP CAP and Node.js to design and implement cutting-edge solutions using SAP technologies and modern cloud-native methods. Join our team to shape scalable systems, lead with expertise, and drive innovation for our customers.
responsibilities
Design and architect SAP solutions, including CAP Node.js applications
Collaborate with teams to ensure delivery of cloud-native solutions
Provide expertise in ERP and S/4HANA processes for end-to-end workflows
Ensure seamless integration with SAP BTP, OData, and other technologies
Guide projects with a customer-oriented approach and solid communication
Maintain code quality, scalability, and best practices in software development
Support DevOps principles like automation and CI/CD pipelines
Deliver presentations and communicate effectively in English
Travel occasionally to support customer projects
Uphold Agile methodologies and team collaboration
requirements
7+ years of experience in SAP, with expertise in BTP, ABAP, Fiori, HANA, and Integration
Background in ERP or S/4HANA with understanding of end-to-end processes
Strong expertise in Java and Spring Boot, paired with robust application development skills
Familiarity with microservice architectures and event-driven design
Proficiency in major cloud providers like AWS, Azure or GCP
Knowledge of SAP CAP, Fiori, and UI5 is advantageous
Excellent presentation and communication skills in English (B2+)
Willingness to travel occasionally to customer sites
nice to have
Understanding of Agile methodologies and practical DevOps skills (CI/CD, automation)
Interest in AI-driven SDLC methodologies
Competency in additional SAP products beyond CAP and BTP
We are seeking a qualified Engineering Manager with expertise in SAP CAP and Node.js to design and implement innovative solutions using SAP technologies and modern cloud-native approaches. Join us to build scalable systems, lead with proficiency, and foster innovation for our clients.
responsibilities
Design SAP solutions, including CAP Node.js applications
Ensure collaboration across teams to deliver cloud-native solutions
Offer expertise in ERP and S/4HANA workflows for end-to-end processes
Achieve seamless integration with SAP BTP, OData, and related technologies
Lead projects with a customer-focused mindset and effective communication
Maintain high standards of code quality, scalability, and best practices
Support DevOps principles, including automation and CI/CD pipelines
Deliver presentations and communicate fluently in English
Travel to customer sites when required
Encourage team collaboration and implement Agile methodologies
requirements
9+ years of experience in SAP, with expertise in BTP, ABAP, Fiori, HANA, and Integration
Background in ERP or S/4HANA with understanding of end-to-end workflows
Strong skills in Java and Spring Boot, with a focus on application development
Familiarity with microservice architectures and event-driven design
Proficiency with major cloud platforms such as AWS, Azure or GCP
Knowledge of SAP CAP, Fiori, and UI5 is a plus
Excellent presentation and communication skills in English (B2+)
Willingness for occasional travel to customer locations
nice to have
Understanding of Agile methodologies and adaptable DevOps practices, including CI/CD and automation
Interest in applying AI-driven methods to SDLC optimization
Competency with additional SAP solutions beyond CAP and BTP
We are looking for a motivated Data Engineer to join our growing Data Engineering team. You will work as part of one of our Platform Teams, developing and maintaining data pipelines in Databricks with PySpark on Microsoft Azure for the AI Factory team. This role offers the opportunity to work with modern cloud and big data technologies, contributing to reliable and scalable data platforms that support innovation across the organization.
responsibilities
Develop and maintain data pipelines in Databricks with PySpark on Azure
Support the design and implementation of cloud-based analytical solutions using Big Data and NoSQL technologies
Assist in building and maintaining data lakes and warehouses to ensure reliability and performance
Participate in the development of ETL/ELT workflows to collect, clean, and structure data
Help implement data quality, lineage, and monitoring frameworks
Collaborate with ML and analytics teams to deliver clean, production-ready datasets
Participate in code reviews and follow technical standards and best practices
Work with CI/CD methodologies in data engineering workflows using tools like Jenkins or GitLab CI/CD
Collaborate with architects, technical leads, and cross-functional teams to deliver solutions aligned with requirements
Engage with stakeholders to understand processes and ensure deliverable alignment
requirements
2+ years of experience in Data Engineering or a related field
Proficiency in Python and PySpark
Experience with Databricks and Microsoft Azure cloud services
Familiarity with software version control tools (e.g., GitHub, Git)
Exposure to CI/CD frameworks such as Jenkins, Concourse, or GitLab CI/CD
Ability to build reliable and scalable data solutions
Strong problem-solving and analytical skills, with effective communication and teamwork abilities
nice to have
Experience with additional programming languages such as Java, SQL, or Scala
Knowledge of SAP BTP or similar enterprise data platforms
We are seeking an experienced and driven Senior Data Engineer to join our expanding Data Engineering team. This position involves working as part of one of our established Platform Teams, where you will design, develop, and scale data pipelines in Databricks with PySpark on Microsoft Azure, empowering our AI Factory teams to create and deploy advanced machine learning solutions. This role offers the chance to contribute at the nexus of big data, cloud engineering, and MLOps, delivering reliable, scalable, and high-performance data platforms that support innovation across the organization.
responsibilities
Design cloud-native analytical solutions using Big Data and NoSQL technologies
Build scalable data pipelines in Databricks with PySpark on Azure
Collaborate with MLOps and ML Engineering teams to deliver data platforms for AI development and deployment
Support requirements gathering and deliver aligned solutions with architects, technical leads, and cross-functional teams
Assist the SAP Platform Team in utilizing SAP BTP and hyperscaler offerings for enterprise-grade data solutions
Conduct code reviews to ensure technical standards and practices are maintained
Provide mentorship to junior engineers to cultivate a high-performance culture
Incorporate CI/CD methodologies into data engineering workflows using tools like Jenkins or GitLab CI/CD
Engage with stakeholders to understand processes, model input data, and ensure deliverable alignment
requirements
4+ years of experience in Software Engineering with a focus on Data Engineering, Machine Learning, or MLOps
Proficiency in Python and PySpark
Background in Databricks and Microsoft Azure cloud services
Knowledge of software version control tools, including GitHub or Git
Capability to work with CI/CD frameworks such as Jenkins, Concourse, or GitLab CI/CD
Expertise in building scalable, robust, and highly available data solutions
Strong problem-solving and analytical skills, along with effective stakeholder engagement
nice to have
Skills in an additional programming language like Java, SQL, or Scala
Understanding of SAP BTP or similar enterprise data platforms
We are seeking an experienced and driven Databricks Engineer to join our expanding Data Engineering team. In this role, you will be a key member of one of our established Platform Teams, designing, developing, and scaling data pipelines in Databricks with PySpark on Microsoft Azure for the AI Factory team. This is an exciting opportunity to work at the intersection of big data and cloud engineering, delivering reliable, scalable, and high-performance data platforms that drive innovation across our organization.
responsibilities
Design cloud-native analytical solutions using Big Data and NoSQL technologies
Build and optimize scalable data pipelines in Databricks with PySpark on Azure
Develop and maintain data lakes and data warehouses to ensure reliability and performance
Design and implement ETL/ELT workflows to collect, clean, and structure data
Implement data quality, lineage, and monitoring frameworks
Collaborate with ML and analytics teams to deliver clean, production-ready datasets
Conduct code reviews to uphold technical standards and best practices
Mentor junior engineers and foster a high-performance, collaborative culture
Integrate CI/CD methodologies into data engineering workflows using tools such as Jenkins or GitLab CI/CD
Support requirements gathering and deliver solutions in alignment with architects, technical leads, and cross-functional teams
Engage with stakeholders to understand business processes, model input data, and ensure deliverables meet requirements
requirements
2+ years of experience in Data Engineering or a related field
Proficiency in Python and PySpark
Hands-on experience with Databricks and Microsoft Azure cloud services
Familiarity with software version control tools (e.g., GitHub, Git)
Experience with CI/CD frameworks such as Jenkins, Concourse, or GitLab CI/CD
Proven ability to build scalable, robust, and highly available data solutions
Strong problem-solving, analytical, and stakeholder engagement skills
English level of minimum B2 (Upper-Intermediate) for effective communication
nice to have
Experience with additional programming languages such as Java, SQL, or Scala
Knowledge of SAP BTP or similar enterprise data platforms
Familiarity with agile development methodologies
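As an illustration of the data-quality work these data engineering roles describe, here is a minimal row-level quality check in plain Python. The column names and rules are hypothetical examples; in a Databricks pipeline the same idea would typically be expressed as PySpark column expressions over a DataFrame.

```python
"""Minimal sketch of a row-level data-quality check (hypothetical rules)."""
from typing import Callable

# Each rule maps a column name to a predicate the value must satisfy.
Rules = dict[str, Callable[[object], bool]]

def validate_rows(rows: list[dict], rules: Rules) -> tuple[list[dict], list[dict]]:
    """Split rows into (valid, rejected) according to the given rules."""
    valid, rejected = [], []
    for row in rows:
        ok = all(col in row and check(row[col]) for col, check in rules.items())
        (valid if ok else rejected).append(row)
    return valid, rejected

# Example rule set: a non-empty id and a non-negative amount.
EXAMPLE_RULES: Rules = {
    "order_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}
```

Quarantining rejected rows rather than dropping them is a common pattern, since the rejects feed the monitoring and lineage frameworks mentioned above.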