We are seeking a remote Senior AWS Data Engineer to join our team, working on a cutting-edge data project for one of our delivery teams. You will be responsible for the design, architecture, and end-to-end execution of big data projects. Your expertise in Amazon Web Services and Python will be essential to developing ETL pipelines with AWS Glue jobs, AWS Lambda functions, and microservices. Strong communication skills, coupled with the ability to write complex queries that address business requirements, will be crucial when presenting data insights to stakeholders.
Responsibilities
- Design and architecture of data products, ensuring alignment with business objectives and target market needs
- Develop ETL pipelines using AWS Glue jobs and AWS Lambda functions
- Execute end-to-end big data projects with a strong focus on delivering high-quality results
- Collaborate with cross-functional team members, including developers, business analysts, and stakeholders, to achieve project goals
- Write complex queries to address business requirements and present data insights to stakeholders
- Continuously optimize Spark job performance to ensure maximum efficiency
- Investigate and implement data streaming solutions such as AWS Kinesis, Apache Flink, and Snowflake
- Implement solutions in accordance with Infrastructure as Code (IaC) principles using tools such as Atlassian Bitbucket, AWS CodeBuild, and Terraform
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience
- 3+ years of experience in data engineering, preferably in an AWS environment
- In-depth understanding of the data domain, including Apache Spark, Python (PySpark), and SQL
- Extensive experience with AWS CLI, S3, Glue, Lambda, Athena, Step Functions, and RDS
- Proficiency in microservices, OpenAPI, API Gateway, RDS, and unit testing
- Ability to work within an agile team environment, with a strong focus on delivering high-quality results
- B2+ English level proficiency
Nice to have
- Experience in Spark job performance optimization
- Data streaming knowledge: AWS Kinesis / Apache Flink / Snowflake
- CI/CD experience: Atlassian Bitbucket, AWS CodeBuild, Terraform
- Experience with IoT
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn