We are looking for a Data Engineer to join our organization.
You will leverage your broad expertise in data and analytics, focusing on our AWS-based data platform. In this position, you will play a vital role in making clinical trial data available to stakeholders by building and supporting scalable, cloud-driven ETL processes managed through a web interface.
Responsibilities
- Create, improve, and support robust ETL and ELT pipelines to streamline data ingestion and transformation
- Develop, upgrade, and maintain RESTful APIs to enable seamless system integrations
- Contribute to the development and upkeep of frontend components within the data platform
- Provide technical assistance to team members across data, design, product, and executive functions
- Oversee and enhance AWS infrastructure to ensure smooth data operations
- Optimize SQL statements and ETL workflows for better performance
- Discover and implement ways to automate and improve operational efficiency
- Diagnose and resolve data-related issues, reviewing end-to-end pipelines and working with users to address concerns
- Keep up with the latest advancements in technology and industry best practices
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- Minimum 2 years of hands-on experience in data engineering
- Experience with frontend development in JavaScript or TypeScript, though the role's primary emphasis is on backend and data systems
- Advanced SQL skills, preferably with PostgreSQL
- Proficient in Python development, including experience with Git for source control and version management
- Practical experience with big data, ETL, and cloud solutions, especially AWS services such as Glue, S3, and SQS, along with Python and Terraform
- Thorough understanding of AWS services and architecture
- DevOps-oriented approach with a background in automation using GitHub and GitHub Actions
- Familiarity with automated testing for both frontend and backend systems
- Experience handling and resolving support tickets
- Proven ability to work effectively in a global team setting
- Capable of designing and documenting best practices for development
- Strong background in API design and documentation, especially RESTful APIs
- Excellent analytical thinking and data manipulation skills
- English proficiency at B2 level or higher, both written and spoken
Nice to have
- Experience working with Agile frameworks and tools like Jira
- Knowledge of Snowflake, dbt, Redshift (including Spectrum), and Aurora
- Familiarity with data visualization tools, particularly Power BI
- Understanding of AI code generation technologies, their strengths, and limitations