We are seeking a Cloud Security Architect to lead the design and implementation of secure cloud solutions across diverse platforms and client environments. This role is ideal for a seasoned security leader with deep expertise in cloud-native architectures, threat modelling, and enterprise security strategy. You will collaborate with engineering, DevOps, and business stakeholders to embed security into every layer of cloud infrastructure and application lifecycle.
responsibilities
Define and drive cloud security architecture strategies across AWS, Azure, GCP, and hybrid environments
Lead threat modelling, risk assessments, and architecture reviews for cloud-native and containerised applications
Develop and enforce cloud security standards, policies, and reference architectures
Guide engineering teams in implementing secure CI/CD pipelines, infrastructure-as-code (IaC), and automated security controls (an illustrative sketch follows this list)
Evaluate and integrate cloud security tools (e.g., CSPM, CWPP, CIEM) into enterprise environments
Collaborate with application architects and product teams to ensure secure design patterns and compliance with regulatory frameworks (e.g., ISO 27001, NIST, GDPR)
Mentor and coach cloud security engineers and contribute to internal knowledge sharing
Support incident response and forensic investigations related to cloud environments
Stay current with emerging threats, technologies, and best practices in cloud security
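For illustration, below is a minimal Python sketch of the kind of automated security control this role would guide teams toward, in the spirit of CSPM tooling and CIS Benchmark checks. It assumes boto3 is installed and AWS credentials are available in the environment; it simply flags S3 buckets without default server-side encryption and is a sketch, not a complete tool.

```python
"""Minimal CSPM-style control check (illustrative sketch, not a full tool).

Assumes boto3 is installed and AWS credentials are configured; the check
mirrors a common CIS Benchmark control for S3 default encryption.
"""
import boto3
from botocore.exceptions import ClientError


def bucket_has_default_encryption(s3, bucket: str) -> bool:
    """Return True if the bucket has default server-side encryption enabled."""
    try:
        s3.get_bucket_encryption(Bucket=bucket)
        return True
    except ClientError as err:
        if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
            return False
        raise


def main() -> None:
    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        status = "OK" if bucket_has_default_encryption(s3, name) else "FINDING: no default encryption"
        print(f"{name}: {status}")


if __name__ == "__main__":
    main()
```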
requirements
8+ years of experience in information security, with 4+ years focused on cloud security architecture
Proven expertise in designing secure solutions on AWS, Azure, or GCP
Strong understanding of cloud-native services, IAM, network security, encryption, and logging
Experience with IaC tools (Terraform, CloudFormation), container security (Kubernetes, Docker), and DevSecOps practices
Familiarity with cloud security frameworks (CSA, CIS Benchmarks, NIST)
Excellent communication and stakeholder management skills
Relevant certifications (e.g., CCSP, AWS Certified Security Specialty, AWS Certified Solutions Architect, Azure Security Engineer Associate, Azure Solutions Architect Expert, Google Professional Cloud Security Engineer, Google Professional Cloud Architect) are a plus
We are looking for a skilled and innovative BTP Cloud Developer with expertise in Node.js to join our forward-thinking development team. In this role, you will design, develop, and deploy cutting-edge cloud-native solutions leveraging modern frameworks, tools, and technologies while contributing to transformative projects in a collaborative environment.
responsibilities
Design and implement scalable cloud-based applications leveraging SAP Business Technology Platform (BTP)
Develop efficient and robust backend solutions using Node.js and related frameworks
Integrate microservice architectures and event-driven design principles to build enterprise-grade applications
Collaborate with cross-functional teams to integrate features, troubleshoot issues, and maintain system performance
Enhance application performance through in-memory database optimization
Promote DevOps practices to streamline deployment pipelines and CI/CD processes
Participate in Agile ceremonies and contribute to iterative development cycles
Implement and refine security best practices for cloud-native applications
Explore AI integration to optimize software development lifecycle processes when applicable
Build APIs and services leveraging standards like REST or GraphQL while ensuring high-quality code
requirements
3+ years of professional experience in software development
Strong expertise in Node.js, including building and deploying scalable, robust solutions
Knowledge of SAP BTP, SAP CAP, SAP Fiori, or UI5 is an advantage
Background in cloud platforms such as AWS, Azure, or GCP
Familiarity with microservice architecture and event-driven systems design
Understanding of in-memory database solutions for performance optimization (e.g., Redis or HANA)
Competency in working within Agile environments and using DevOps tools for automation
Skills in modern API design, including REST or GraphQL
Flexibility to adapt to new technologies and frameworks as needed
nice to have
Expertise in frontend frameworks like React, Vue, or Angular
Familiarity with other SAP products
Practical experience with CI/CD pipelines or automation-focused DevOps practices
Interest in AI-driven approaches within software development and delivery processes
We are seeking a highly skilled and motivated Senior Java Developer to join our team, focusing on SAP Cloud solutions. This role is perfect for an experienced developer who is passionate about creating scalable, efficient, and innovative web applications while working within a collaborative Agile environment.
responsibilities
Design, develop, and maintain scalable web applications using Java and Spring Boot
Collaborate with cross-functional teams to integrate SAP-related technologies into cloud-based solutions
Work with databases like PostgreSQL, MySQL, or HANA Cloud to ensure reliable data handling
Implement Agile development methodologies, including Scrum or Kanban, to deliver high-quality results
Troubleshoot, debug, and optimize application performance
Participate in code reviews to maintain code quality and best practices
Write unit, integration, and performance tests to ensure system reliability
Provide technical guidance and mentorship to junior team members
Analyze business requirements to propose suitable technical solutions
Ensure proper documentation for all development activities
requirements
3+ years of experience building web applications in professional settings
Skills in Java, Spring Boot, and relational databases like PostgreSQL, MySQL, or HANA Cloud
Expertise in SAP-related technologies and their integration into development projects
Familiarity with Agile development methodologies, including Scrum or Kanban
Team player with strong communication and collaboration skills
English level of minimum B2 (Upper-Intermediate) for effective communication
nice to have
Experience with event-driven architectures
Familiarity with DevOps technologies and tools such as CI/CD pipelines, Kubernetes, or Docker
Understanding of modern cloud infrastructure concepts
We are looking for an experienced Lead Java Developer to join our team, focusing on SAP Cloud solutions. This role is designed for a seasoned professional who thrives on leading development efforts, driving innovation, and creating scalable, efficient web applications within a collaborative Agile environment.
responsibilities
Lead the design, development, and delivery of scalable web applications using Java and Spring Boot
Collaborate with cross-functional teams and stakeholders to integrate SAP-related technologies into advanced cloud-based solutions
Oversee the use of databases like PostgreSQL, MySQL, or HANA Cloud to ensure robust and efficient data operations
Drive the implementation of Agile development methodologies, such as Scrum or Kanban, ensuring successful team delivery
Identify and resolve complex system issues to maintain optimal application performance
Lead code reviews and enforce coding standards, fostering a culture of technical excellence
Guide teams in the creation of unit, integration, and performance tests for application reliability
Provide mentorship and technical leadership to team members, ensuring skill development and alignment with project goals
Evaluate business requirements to architect and deliver effective, scalable technical solutions
Maintain comprehensive documentation for all development processes and solutions
requirements
5+ years of experience building web applications in professional settings
1+ years of leadership experience in relevant roles
In-depth expertise in Java, Spring Boot, and relational databases such as PostgreSQL, MySQL, or HANA Cloud
Proven ability to incorporate SAP-related technologies into dynamic development projects
Strong hands-on experience with Agile methodologies, including Scrum or Kanban, in a leadership capacity
Demonstrated leadership, collaboration, and communication skills to effectively manage teams and drive results
Excellent command of written and spoken English (B2+ level)
nice to have
Deep familiarity with event-driven architectural patterns and best practices
Proven experience with DevOps technologies such as CI/CD pipelines, Kubernetes, or Docker in an operational setting
Strong understanding of modern cloud infrastructure concepts and their strategic applications
We are seeking a Senior BTP Cloud Developer with exceptional Node.js expertise to join our innovative team, building scalable and high-performing cloud-native applications that empower businesses to succeed in a rapidly changing tech landscape.
responsibilities
Design, develop, and deploy highly performant cloud-native software solutions
Create microservice-based architectures for enterprise-grade applications
Integrate in-memory database technologies for optimized application performance
Collaborate on projects utilizing AI to enhance software development processes
Apply Agile methodologies and DevOps practices to ensure efficient project lifecycles
requirements
5+ years of professional development experience
Strong expertise in Node.js, with advanced skills in building large-scale applications
Background in SAP BTP, SAP CAP, Fiori, or UI5, with an understanding of how they integrate
Knowledge of at least one hyperscaler (AWS, Azure, GCP)
Familiarity with microservices and event-driven designs for enterprise applications
Competency in building and managing APIs with secure and optimized workflows
Demonstrated experience delivering solutions within Agile and DevOps settings
nice to have
Proficiency in frontend frameworks like React, Angular, or Vue
Familiarity with other SAP products
Flexibility to work with tools supporting CI/CD and automation pipelines
Interest in leveraging AI to transform software development methods
Are you an industry visionary and technologist at heart with a passion for designing complex Cloud solutions? Are you able to guide and advise business and IT teams on creating new Cloud-native applications and services? Do you have strong soft skills, solid stakeholder management and technology governance experience? Do you want to transform businesses so that they are able to operate and grow successfully in the cloud-first world? If this sounds like you, this could be the perfect opportunity to join EPAM as a Cloud / DevOps Engineer.
responsibilities
Develop and maintain IaC to provision infrastructure and ensure its integrity
Develop and maintain configuration management and monitoring
Provide operational support for the entire infrastructure and its automation tools
Ensure that any technical issues faced by project teams are resolved
Communicate effectively to all the stakeholders and team members
requirements
At least 2 years of experience in a similar role
Solid understanding of cloud computing fundamentals and Amazon Web Services
Hands-on experience with programmatic infrastructure provisioning using Terraform or similar tools
Ability to write declarative pipelines with a CI/CD tool such as GitHub Actions, GitLab CI/CD, or Azure DevOps
Background in scripting and coding for automation using Bash, Python, or PowerShell (a small Python sketch follows this list)
Strong analytical and problem-solving skills with attention to detail
Motivation to continuously acquire new skills and adapt to evolving technologies
Demonstrated clarity in communication and a good command of written and spoken English
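As referenced above, here is a minimal Python automation sketch combining scripting with Terraform-based provisioning. It assumes the Terraform CLI is on PATH and that the working directory holds an initialised configuration; it relies on the documented behaviour of `terraform plan -detailed-exitcode` (exit 0 for no changes, 1 for errors, 2 for pending changes) to report drift. This is a sketch under those assumptions, not project tooling.

```python
"""Illustrative drift-detection helper (a sketch, not project tooling).

Assumes the Terraform CLI is installed and the target directory contains an
initialised Terraform configuration.
"""
import subprocess
import sys


def detect_drift(workdir: str = ".") -> int:
    """Run a read-only plan and report whether live infrastructure drifts from code."""
    result = subprocess.run(
        ["terraform", "plan", "-detailed-exitcode", "-input=false", "-no-color"],
        cwd=workdir,
        capture_output=True,
        text=True,
    )
    if result.returncode == 0:
        print("No drift detected: infrastructure matches the configuration.")
    elif result.returncode == 2:
        print("Drift detected: pending changes found.\n")
        print(result.stdout)
    else:
        # Exit code 1 indicates a Terraform error; surface it for troubleshooting.
        print(result.stderr, file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    sys.exit(detect_drift(sys.argv[1] if len(sys.argv) > 1 else "."))
```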
nice to have
Good knowledge of TCP networking and monitoring tools
Proficiency in containers and high-level orchestrators
Demonstrated experience with tools like Git and basic familiarity with branching strategies
We are looking for a motivated Data Engineer to join our growing Data Engineering team. You will work as part of one of our Platform Teams, developing and maintaining data pipelines in Databricks with PySpark on Microsoft Azure for the AI Factory team. This role offers the opportunity to work with modern cloud and big data technologies, contributing to reliable and scalable data platforms that support innovation across the organization.
responsibilities
Develop and maintain data pipelines in Databricks with PySpark on Azure (a minimal sketch follows this list)
Support the design and implementation of cloud-based analytical solutions using Big Data and NoSQL technologies
Assist in building and maintaining data lakes and warehouses to ensure reliability and performance
Participate in the development of ETL/ELT workflows to collect, clean, and structure data
Help implement data quality, lineage, and monitoring frameworks
Collaborate with ML and analytics teams to deliver clean, production-ready datasets
Participate in code reviews and follow technical standards and best practices
Work with CI/CD methodologies in data engineering workflows using tools like Jenkins or GitLab CI/CD
Collaborate with architects, technical leads, and cross-functional teams to deliver solutions aligned with requirements
Engage with stakeholders to understand processes and ensure deliverable alignment
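The minimal PySpark sketch below shows the shape of such a pipeline: extract from a landing zone, apply basic cleaning, and load a curated Delta table. The storage path, column names, and table names are hypothetical, and Delta Lake support is assumed (it is available on Databricks by default); on Databricks the `spark` session already exists, so the builder is only there to keep the example self-contained.

```python
"""Minimal PySpark ETL sketch (illustrative only; paths, columns, and tables are hypothetical)."""
from pyspark.sql import SparkSession, functions as F

# On Databricks, `spark` is pre-created; this keeps the sketch runnable elsewhere.
spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw landing-zone data (hypothetical ADLS path).
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/orders/")

# Transform: deduplicate, drop invalid totals, and derive a date column.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_total") >= 0)
       .withColumn("order_date", F.to_date("order_timestamp"))
)

# Load: write a curated Delta table partitioned by date (assumes Delta Lake is available).
clean.write.format("delta").mode("overwrite").partitionBy("order_date").saveAsTable("curated.orders")
```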
requirements
2+ years of experience in Data Engineering or a related field
Proficiency in Python and PySpark
Experience with Databricks and Microsoft Azure cloud services
Familiarity with software version control tools (e.g., GitHub, Git)
Exposure to CI/CD frameworks such as Jenkins, Concourse, or GitLab CI/CD
Ability to build reliable and scalable data solutions
Strong problem-solving and analytical skills, with effective communication and teamwork abilities
nice to have
Experience with additional programming languages such as Java, SQL, or Scala
Knowledge of SAP BTP or similar enterprise data platforms
We are seeking an experienced and driven Senior Data Engineer to join our expanding Data Engineering team. This position involves working as part of one of our established Platform Teams, where you will design, develop, and scale data pipelines in Databricks with PySpark on Microsoft Azure, empowering our AI Factory teams to create and deploy advanced machine learning solutions. This role offers the chance to contribute at the nexus of big data, cloud engineering, and MLOps, delivering reliable, scalable, and high-performance data platforms that support innovation across the organization.
responsibilities
Design cloud-native analytical solutions using Big Data and NoSQL technologies
Build scalable data pipelines in Databricks with PySpark on Azure
Collaborate with MLOps and ML Engineering teams to deliver data platforms for AI development and deployment
Support requirements gathering and deliver solutions in alignment with architects, technical leads, and cross-functional teams
Assist the SAP Platform Team in utilizing SAP BTP and hyperscaler offerings for enterprise-grade data solutions
Conduct code reviews to ensure technical standards and practices are maintained
Provide mentorship to junior engineers to cultivate a high-performance culture
Incorporate CI/CD methodologies into data engineering workflows using tools like Jenkins or GitLab CI/CD (a test sketch follows this list)
Engage with stakeholders to understand processes, model input data, and ensure deliverable alignment
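As a sketch of CI/CD applied to data workflows, the test below exercises a hypothetical PySpark transformation; a Jenkins or GitLab CI/CD job could run it on every change. The transformation, test data, and the use of pytest are assumptions for illustration, not mandated tooling, and a local Java runtime is required to start Spark.

```python
"""Sketch of a pipeline unit test suitable for a Jenkins or GitLab CI/CD job.

The transformation and test data are hypothetical; written in pytest style.
"""
from pyspark.sql import DataFrame, SparkSession, functions as F


def deduplicate_orders(df: DataFrame) -> DataFrame:
    """Hypothetical transformation under test: drop duplicate order ids and negative totals."""
    return df.dropDuplicates(["order_id"]).filter(F.col("order_total") >= 0)


def test_deduplicate_orders():
    spark = SparkSession.builder.master("local[1]").appName("ci-test").getOrCreate()
    rows = [(1, 10.0), (1, 10.0), (2, -5.0)]
    df = spark.createDataFrame(rows, ["order_id", "order_total"])

    result = deduplicate_orders(df)

    assert result.count() == 1                 # duplicate and negative rows removed
    assert result.first()["order_id"] == 1     # the valid order survives
    spark.stop()
```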
requirements
4+ years of experience in Software Engineering with a focus on Data Engineering, Machine Learning, or MLOps
Proficiency in Python and PySpark
Background in Databricks and Microsoft Azure cloud services
Knowledge of software version control tools, including GitHub or Git
Capability to work with CI/CD frameworks such as Jenkins, Concourse, or GitLab CI/CD
Expertise in building scalable, robust, and highly available data solutions
Strong problem-solving and analytical skills, along with effective stakeholder engagement
nice to have
Skills in an additional programming language like Java, SQL, or Scala
Understanding of SAP BTP or similar enterprise data platforms
We are seeking a skilled, motivated, and forward-thinking Senior Java Software Engineer to enhance our dynamic team at EPAM Bulgaria. In this position, you'll collaborate with leading professionals, solve advanced technical problems, and support the development of scalable, high-performance solutions for a prominent technology-focused client. This role also offers opportunities to mentor, design, and create systems from the ground up.
responsibilities
Design software components and microservices from the ground up
Cover all phases of the software development lifecycle (SDLC) from a technical perspective
Collaborate with engineers, architects, and product managers to develop and maintain performance-driven platforms
Use advanced technologies within a modern tech stack
Recommend scalable architectural solutions
Foster a productive development workflow emphasizing code quality and maintainability (e.g., TDD, Clean Code, pair programming)
Participate in design discussions, code reviews, and team ceremonies
Mentor and coach junior team members
Communicate with client stakeholders to provide project updates, address priorities, and resolve technical concerns
requirements
5+ years of software development experience with a proven record of producing end-to-end solutions
Expertise in Java and Microservices architecture
Background in the Spring ecosystem, including Spring Boot, Spring Cloud, Spring Data, and Spring Security
Proficiency in REST APIs, Microservices concepts, and relational database design
Knowledge of Design Patterns and their application
Competency in TDD/ATDD and writing testable code
Familiarity with CI/CD tools, particularly Jenkins
Understanding of Clean Code and Software Craftsmanship principles
Strong analytical thinking, problem-solving, and debugging capabilities
Capability to communicate effectively and negotiate in English
Commitment to contributing to collaborative workflows like code reviews and pair programming
Experience mentoring and guiding team members
nice to have
Knowledge of Google Cloud Platform (GCP) or other cloud-native development environments
Background in Guice, Guava, and Protocol Buffers
Skills in Big Data technologies or non-relational databases
Expertise in creating and maintaining real-time business-critical systems
Familiarity with event-driven architecture and distributed systems
We are seeking an experienced and driven Senior Data Engineer to join our expanding Data Engineering team. In this role, you will be a key member of one of our established Platform Teams, designing, developing, and scaling data pipelines in Databricks with PySpark on Microsoft Azure for the AI Factory team. This is an exciting opportunity to work at the intersection of big data and cloud engineering, delivering reliable, scalable, and high-performance data platforms that drive innovation across our organization.
responsibilities
Design cloud-native analytical solutions using Big Data and NoSQL technologies
Build and optimize scalable data pipelines in Databricks with PySpark on Azure
Develop and maintain data lakes and data warehouses to ensure reliability and performance
Design and implement ETL/ELT workflows to collect, clean, and structure data
Implement data quality, lineage, and monitoring frameworks (a minimal check sketch follows this list)
Collaborate with ML and analytics teams to deliver clean, production-ready datasets
Conduct code reviews to uphold technical standards and best practices
Mentor junior engineers and foster a high-performance, collaborative culture
Integrate CI/CD methodologies into data engineering workflows using tools such as Jenkins or GitLab CI/CD
Support requirements gathering and deliver solutions in alignment with architects, technical leads, and cross-functional teams
Engage with stakeholders to understand business processes, model input data, and ensure deliverables meet requirements
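A minimal sketch of the kind of data quality gate mentioned above follows. The table name, columns, and thresholds are hypothetical; the idea is simply to fail fast when basic expectations on a curated dataset are violated.

```python
"""Minimal data quality gate sketch (table, columns, and thresholds are hypothetical)."""
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

df = spark.table("curated.orders")  # hypothetical curated table
total = df.count()

# Rule 1: the primary key must be unique.
duplicate_keys = total - df.select("order_id").distinct().count()

# Rule 2: no more than 1% of rows may have a null order_date.
null_dates = df.filter(F.col("order_date").isNull()).count()

failures = []
if duplicate_keys > 0:
    failures.append(f"{duplicate_keys} duplicate order_id values")
if total > 0 and null_dates / total > 0.01:
    failures.append(f"{null_dates} null order_date values exceed the 1% threshold")

if failures:
    raise ValueError("Data quality checks failed: " + "; ".join(failures))
print("All data quality checks passed.")
```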
requirements
4+ years of experience in Data Engineering or a related field
Proficiency in Python and PySpark
Hands-on experience with Databricks and Microsoft Azure cloud services
Familiarity with software version control tools (e.g., GitHub, Git)
Experience with CI/CD frameworks such as Jenkins, Concourse, or GitLab CI/CD
Proven ability to build scalable, robust, and highly available data solutions
Strong problem-solving, analytical, and stakeholder engagement skills
nice to have
Experience with additional programming languages such as Java, SQL, or Scala
Knowledge of SAP BTP or similar enterprise data platforms