
Senior Big Data Software Engineer for a Computer Software Company

Data Software Engineering, Big Data
Sorry, this position is no longer available

We are currently looking for a remote Senior Big Data Software Engineer with 5+ years of relevant experience in Big Data engineering to join our team.

The customer is a provider of software as a service and cloud-based remote connectivity services for collaboration, IT management and customer engagement. The company's products give users and administrators access to remote computers.

Responsibilities
  • Apply broad knowledge of technology options, technology platforms, design techniques and approaches across the data warehouse lifecycle phases to design an integrated, quality solution that addresses the requirements
  • Ensure completeness and compatibility of the technical infrastructure required to support system performance, availability and architecture requirements
  • Design and plan the integration of all data warehouse technical components
  • Provide input and recommendations on technical issues to the team
  • Develop implementation and operation support plans
  • Provide data design, data extracts and transforms
  • Build robust and scalable data integration (ETL) pipelines using AWS services such as EMR, along with Python, Pig and Spark (see the sketch after this list)
  • Mentor and develop junior data engineers
  • Contribute to a high-quality data architecture that supports business analysts, data scientists and customer reporting needs
  • Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
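
For illustration only, a minimal sketch of the kind of Spark-on-EMR ETL job this role involves; the S3 paths, column names and aggregation logic below are hypothetical placeholders, not part of the customer's actual pipeline:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Spark session; on EMR this picks up the cluster configuration automatically
    spark = SparkSession.builder.appName("daily_purchases_etl").getOrCreate()

    # Extract: read raw JSON events from S3 (placeholder path)
    raw = spark.read.json("s3://example-bucket/raw/events/")

    # Transform: keep purchase events, derive a date column, aggregate per day and product
    daily = (
        raw.filter(F.col("event_type") == "purchase")
           .withColumn("event_date", F.to_date("event_ts"))
           .groupBy("event_date", "product_id")
           .agg(F.count("*").alias("purchases"), F.sum("amount").alias("revenue"))
    )

    # Load: write partitioned Parquet back to S3 for analysts and downstream jobs
    (daily.write
          .mode("overwrite")
          .partitionBy("event_date")
          .parquet("s3://example-bucket/curated/daily_purchases/"))

    spark.stop()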
Requirements
  • 5+ years of experience in writing complex, highly optimized SQL queries across large data sets
  • Bachelor's degree in Computer Science
  • 5+ years of relevant experience in Big Data engineering
  • Skills in data modeling, ETL development and data warehousing
  • Skills with AWS services, including S3, EMR, Kinesis and RDS
  • Knowledge of big data stack of technologies, including Hadoop, HDFS, Hive, Spark, Pig, Presto
  • Experience using Airflow, including creating and maintaining DAGs, Operators and Hooks (see the sketch after this list)
  • Knowledge of distributed systems as they pertain to data storage and computing
  • Good problem solving and analytical skills
  • Knowledge of software engineering best practices across the development lifecycle, including Agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
  • English level - B1
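
As a rough illustration of the Airflow requirement above, a minimal DAG sketch; the dag_id, schedule and task callables are hypothetical examples, not a prescribed design:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # placeholder: pull data from a source system
        pass

    def load():
        # placeholder: write transformed data to the warehouse
        pass

    # A simple daily DAG with two dependent tasks (hypothetical example)
    with DAG(
        dag_id="example_daily_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task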
Nice to have
  • Master’s degree in Computer Science
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library with 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
