
Assurant

Senior Data Engineer (8+ Years in Python Development + ETL Development + Snowflake Development + 2 Years in ADF)

In-Office
2 Locations
Senior level

Senior Data Engineer, Assurant, GCC-India

Reports To: Director of Product Engineering & Integrations.
 

Position Summary

We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize data pipelines and cloud-based data solutions. The ideal candidate will have strong expertise in Azure Data Factory, Snowflake, and modern ETL/ELT practices, enabling scalable, secure, and high-performance data workflows. This role will collaborate closely with analytics and BI teams to deliver reliable data infrastructure supporting enterprise reporting and advanced analytics.

This position will be based in Bangalore, Chennai, or Hyderabad at one of our India locations.

Work Time: 3:30 PM to 12:30 AM IST

What will be my duties and responsibilities in this job?

Data Engineering & ETL Development

  • Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory, Snowflake Tasks, and Snowpipe for real-time and batch ingestion.
  • Implement best practices for data modeling, transformation, and performance tuning within Snowflake.
  • Build and manage pipelines connecting multiple structured and unstructured data sources across cloud and on-prem environments.
  • Automate data quality checks, data lineage tracking, and error handling within ETL workflows.
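A rough sketch of the kind of automated data-quality checks and error handling described in the last bullet, assuming a Pandas-based batch step; the column names, threshold, and the commented-out loader call are hypothetical and only for illustration.

    import logging

    import pandas as pd

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("etl_quality")

    # Hypothetical rules: required columns and a maximum null rate per column.
    REQUIRED_COLUMNS = {"policy_id", "premium_amount", "effective_date"}
    MAX_NULL_RATE = 0.01

    def validate(df: pd.DataFrame) -> pd.DataFrame:
        """Run basic quality checks before a batch is loaded downstream."""
        missing = REQUIRED_COLUMNS - set(df.columns)
        if missing:
            raise ValueError(f"Missing required columns: {sorted(missing)}")

        null_rates = df[list(REQUIRED_COLUMNS)].isna().mean()
        too_sparse = null_rates[null_rates > MAX_NULL_RATE]
        if not too_sparse.empty:
            raise ValueError(f"Null rate above threshold: {too_sparse.to_dict()}")

        # Drop exact duplicates rather than failing the whole batch.
        return df.drop_duplicates()

    def run_batch(path: str) -> None:
        try:
            frame = validate(pd.read_csv(path, parse_dates=["effective_date"]))
            logger.info("Batch %s passed checks with %d rows", path, len(frame))
            # load_to_snowflake(frame)  # hypothetical loader for the next stage
        except ValueError as exc:
            logger.error("Batch %s rejected: %s", path, exc)
            raise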

Snowflake Development & Optimization

  • Develop and maintain Snowflake schemas, views, stored procedures, and materialized views.
  • Configure and optimize Snowpipe for continuous data loading (see the sketch after this list).
  • Utilize Snowsight for monitoring query performance, cost optimization, and workload analysis.
  • Implement role-based access control and ensure data security in Snowflake.
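A minimal sketch of the Snowpipe configuration and role grants mentioned above, using the snowflake-connector-python package; the account, object, and role names are placeholders, and real credentials would come from a secrets store rather than being hard-coded.

    import snowflake.connector

    # Placeholder connection parameters; in practice these would be read from
    # Azure Key Vault or another secrets store, not hard-coded.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_service",
        password="***",
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="RAW",
    )

    try:
        cur = conn.cursor()

        # Continuous loading: a pipe that auto-ingests files landing on an
        # external stage (e.g. backed by Azure Blob Storage) into a raw table.
        cur.execute("""
            CREATE PIPE IF NOT EXISTS raw_claims_pipe
              AUTO_INGEST = TRUE
              AS COPY INTO raw_claims
                 FROM @claims_stage
                 FILE_FORMAT = (TYPE = 'JSON')
        """)

        # Role-based access control: read-only grants for a hypothetical analyst role.
        cur.execute("GRANT USAGE ON SCHEMA ANALYTICS.REPORTING TO ROLE ANALYST_RO")
        cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.REPORTING TO ROLE ANALYST_RO")
    finally:
        conn.close()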

Azure & Cloud Integration

  • Integrate Azure Data Factory with other Azure services (Blob Storage, Synapse, Key Vault); see the Key Vault sketch after this list.
  • Design scalable cloud architectures and orchestrate pipelines across hybrid environments.
  • Implement CI/CD pipelines for data workflows using GitHub Actions.
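To show one way the Key Vault integration noted above can look from the pipeline side, the sketch below fetches a credential with the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are invented for the example.

    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    # DefaultAzureCredential resolves a managed identity, environment variables,
    # or a developer login, so the same code runs locally and in Azure.
    credential = DefaultAzureCredential()

    # Hypothetical vault URL and secret name for a Snowflake service account.
    client = SecretClient(
        vault_url="https://my-data-vault.vault.azure.net",
        credential=credential,
    )
    snowflake_password = client.get_secret("snowflake-etl-password").value

    # The retrieved value can then be passed to snowflake.connector.connect()
    # instead of embedding credentials in pipeline definitions.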

Analytics & Reporting Enablement

  • Collaborate with business analysts and BI teams to enable Power BI dashboards backed by optimized Snowflake data models.
  • Create semantic models and data marts to support self-service analytics and reporting.
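One possible shape for the data marts mentioned above is a thin reporting view over a star schema that a Power BI semantic model can query directly; the object and column names below are invented, and the connection is assumed to be a snowflake-connector-python connection like the one sketched earlier.

    def build_claims_mart(conn) -> None:
        """Create a hypothetical monthly claims mart view for reporting tools."""
        cur = conn.cursor()
        cur.execute("""
            CREATE OR REPLACE VIEW ANALYTICS.REPORTING.CLAIMS_MONTHLY_MART AS
            SELECT
                d.year_month,
                p.product_line,
                COUNT(*)           AS claim_count,
                SUM(f.paid_amount) AS total_paid
            FROM ANALYTICS.CORE.FACT_CLAIMS f
            JOIN ANALYTICS.CORE.DIM_DATE    d ON f.date_key = d.date_key
            JOIN ANALYTICS.CORE.DIM_PRODUCT p ON f.product_key = p.product_key
            GROUP BY d.year_month, p.product_line
        """)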

Scripting & Automation

  • Develop Python scripts for automation, data processing, and custom integrations.
  • Leverage Python-based frameworks (Pandas, PySpark, Airflow) to enhance pipeline capabilities.
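As a small example of the Python tooling named above, the sketch below wires a Pandas transform into an Airflow 2.4+ DAG; the DAG id, schedule, and file paths are made up for illustration.

    from datetime import datetime

    import pandas as pd
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def transform_claims() -> None:
        """Hypothetical transform step: clean a daily extract and write Parquet."""
        df = pd.read_csv("/data/raw/claims.csv", parse_dates=["effective_date"])
        df = df.dropna(subset=["policy_id"]).drop_duplicates()
        df.to_parquet("/data/curated/claims.parquet", index=False)

    with DAG(
        dag_id="claims_daily_refresh",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; earlier versions use schedule_interval
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="transform_claims",
            python_callable=transform_claims,
        )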

What are the requirements needed for this position?

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field.
  • 8+ years of experience in data engineering, ETL development, and cloud data platforms.
  • Strong proficiency in Snowflake, Azure Data Factory, and Python.
  • Experience with CI/CD, data security, and performance optimization.
  • Familiarity with BI tools (Power BI, Looker, etc.) and data modeling best practices.
  • Excellent problem-solving skills and ability to work in a fast-paced environment.

What are the preferred requirements for this position?

  • Knowledge of Airflow, PySpark, and data orchestration frameworks.
  • Experience with real-time data ingestion and streaming architectures.
  • Understanding of cost optimization in cloud environments.

Top Skills

Airflow
Azure Data Factory
GitHub Actions
Pandas
Power BI
PySpark
Python
Snowflake


