We are looking for a passionate and capable Java Developer to join our team.
You will work in a dynamic Agile environment, using TDD, pair programming, and XP practices to build reliable backend solutions.
Responsibilities
- Develop, maintain, and improve backend services using Java (8/11/17) with an emphasis on scalability, reliability, and performance
- Build and advance microservices with Spring Boot, leveraging Spring Integration, Spring Cloud, and Spring Data
- Write and optimize concurrent, multithreaded code using Java's core concurrency utilities
- Create and optimize data access layers and complex queries in PostgreSQL, focusing on data integrity and efficiency
- Support data pipelines by implementing data ingestion, transformation, and integration with other systems
- Use Google Cloud Platform tools, including Google Cloud Storage and BigQuery, for cloud-based data storage and processing
- Set up and manage CI/CD pipelines with Jenkins to ensure consistent builds, automated tests, and deployments
- Work closely with product, data, and DevOps teams to deliver solutions that align with business objectives
- Engage in code reviews, technical discussions, and ongoing process improvements
Requirements
- At least 2 years of experience in backend Java development
- Strong practical knowledge of Java 8/11/17, including core Java and concurrency (threads, executors, synchronization)
- Experience with Spring Boot and related technologies such as Spring Integration, Spring Cloud, and Spring Data
- Skilled in SQL and experienced with PostgreSQL, including schema design, query tuning, and transaction handling
- Hands-on experience with Google Cloud Platform, particularly Google Cloud Storage and BigQuery
- Proficient in setting up and maintaining CI/CD pipelines using Jenkins
- Understanding of RESTful API design and microservices architecture
- Comfortable working in an Agile environment and communicating with both technical and non-technical colleagues
- Strong English communication skills at B2 level or above
Nice to have
- Awareness of data pipeline concepts, including ETL/ELT, data flow, and the distinction between batch and streaming processing