
Egen

Senior Data Engineer - GCP

Reposted 12 Days Ago
Hybrid
Hyderabad, Telangana
Senior level
The Senior Data Engineer designs and maintains data pipelines on Google Cloud Platform, focusing on ETL processes while ensuring data quality and collaboration with cross-functional teams.

Job Overview:

We are looking for a skilled and motivated Senior Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.

Experience Level: 7 to 10 years of relevant IT experience

Key Responsibilities:

  • Design, develop, test, and maintain scalable ETL data pipelines using Python.
  • Work extensively on Google Cloud Platform (GCP) services such as:
      • Dataflow for real-time and batch data processing
      • Cloud Functions for lightweight serverless compute
      • BigQuery for data warehousing and analytics
      • Cloud Composer (based on Apache Airflow) for orchestration of data workflows
      • Google Cloud Storage (GCS) for managing data at scale
      • IAM for access control and security
      • Cloud Run for containerized applications

Should also have experience in the following areas:

  • API framework: Python FastAPI
  • Processing engine: Apache Spark
  • Messaging and streaming data processing: Kafka
  • Storage: MongoDB, Redis/Bigtable
  • Orchestration: Airflow
  • Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
  • Implement and enforce data quality checks, validation rules, and monitoring.
  • Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
  • Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
  • Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
  • Document pipeline designs, data flow diagrams, and operational support procedures.
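The ingestion, transformation, and data-quality duties above follow the classic extract-transform-load shape. As a minimal, hypothetical Python sketch (the function names and in-memory source/sink are illustrative only, not part of this role's actual stack):

```python
def extract(rows):
    """Simulate ingestion from a source system (here, an in-memory list)."""
    return list(rows)

def transform(rows):
    """Cleanse and validate: enforce a data-quality rule and normalize fields."""
    cleaned = []
    for row in rows:
        if row.get("id") is None:
            continue  # data-quality check: reject records missing a primary key
        cleaned.append({"id": row["id"], "name": row.get("name", "").strip().lower()})
    return cleaned

def load(rows, sink):
    """Write validated rows to a target store (here, a dict keyed by id)."""
    for row in rows:
        sink[row["id"]] = row
    return sink

source = [
    {"id": 1, "name": "  Alice "},
    {"id": None, "name": "ghost"},  # fails validation, dropped in transform
    {"id": 2, "name": "Bob"},
]
warehouse = {}
load(transform(extract(source)), warehouse)
print(warehouse)  # → {1: {'id': 1, 'name': 'alice'}, 2: {'id': 2, 'name': 'bob'}}
```

In a production GCP pipeline these three stages would typically map to a GCS or Kafka source, a Dataflow or Spark transform, and a BigQuery load, orchestrated as Cloud Composer (Airflow) tasks.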

Required Skills:

  • 7–10 years of hands-on experience in Python for backend or data engineering projects.
  • Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
  • Solid understanding of data pipeline architecture, data integration, and transformation techniques.
  • Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
  • Experience with Apache Spark, Kafka, Redis, FastAPI, Airflow, and Cloud Composer DAGs.
  • Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
  • Experience in data migrations from on-premises data sources to cloud platforms.

Good to Have (Optional Skills):

  • Experience working with Snowflake cloud data platform.
  • Experience in deployments in GKE, Cloud Run.
  • Hands-on knowledge of Databricks for big data processing and analytics.
  • Familiarity with Azure Data Factory (ADF) and other Azure data engineering tools.

Additional Details:

  • Excellent problem-solving and analytical skills.
  • Strong communication skills and ability to collaborate in a team environment.

Education:

  • Bachelor's degree in Computer Science, a related field, or equivalent experience.

Top Skills

Airflow
Spark
BigQuery
Cloud Composer
Cloud Functions
Dataflow
Git
Google Cloud Platform (GCP)
Google Cloud Storage
Kafka
MongoDB
Oracle
Postgres
Python
Redis
SQL Server
