
Lead GCP Network and Data Engineer

DevOps, GCP Networking, Google Cloud Dataflow, Google Cloud Pub/Sub, Kubernetes, Python, Terraform, Data Modeling Fundamentals, Databricks, Microsoft Azure, PySpark, SQL

We are looking for a skilled Lead GCP Network and Data Engineer to join our team.

In this role, you will focus on the design, deployment, and optimization of our network and data architecture on Google Cloud Platform (GCP), with working experience in Microsoft Azure. The ideal candidate combines end-to-end data engineering expertise with strong network management skills across cloud platforms.

Responsibilities
  • Develop and manage secure, well-governed cloud infrastructure, primarily on GCP and Azure
  • Build and maintain scalable, reliable cloud network architectures using GCP Networking
  • Create and implement data pipelines using PySpark, ensuring data quality and governance
  • Employ Google Cloud Dataflow and Google Cloud Pub/Sub for data processing and event-based architectures
  • Apply infrastructure as code using Terraform for consistent and reproducible infrastructure setup
  • Oversee continuous integration and continuous deployment (CI/CD) practices for data pipelines
  • Analyze and enhance the performance of SQL and Python applications
  • Work collaboratively with the team to develop our Kubernetes environment, focusing on scalability and security
  • Advance the organization's data modeling practices and improve existing data architectures
  • Adhere to security best practices and organizational policies
Requirements
  • Minimum of 5 years in network and data engineering
  • At least 1 year of relevant leadership experience
  • Deep knowledge of and hands-on experience with GCP cloud computing, networking, and infrastructure
  • Basic skills in Azure Networking and Azure Identity/Principal Management
  • Proficiency in Python, PySpark, and SQL
  • Hands-on experience with Databricks and extensive expertise in Kubernetes
  • Familiarity with Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage
  • Ability to build and maintain data pipelines with PySpark
  • Experience managing data quality and governance requirements
  • Expertise in Infrastructure as Code using Terraform
  • Background in CI/CD processes for data pipelines
Nice to have
  • Certifications or formal qualifications in data modeling
Benefits
  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn