
Synechron

Data Engineer (Python, PySpark, Snowflake, Databricks) | Cloud Data Pipelines, Automation & Compliance

Remote
Hiring Remotely in Hinjawadi, Pune, Maharashtra, India
Senior level

Job Summary
Synechron is seeking an experienced Snowflake Support Engineer with strong expertise in Python, Databricks, and PySpark to support enterprise data platform operations. This role involves managing Snowflake data warehousing solutions, optimizing data workflows, and providing technical support to ensure high availability, security, and performance for critical business processes. The ideal candidate will drive automation, guide technical teams, and stay current with industry trends to enhance data platform reliability and scalability.

Software Requirements

Required Software Proficiency:

  • Python (version 3.7+) — extensive experience in scripting, automation, and data processing

  • Databricks — hands-on experience in managing and developing data pipelines within Databricks environments

  • PySpark — strong skills in large-scale data processing and transformation workflows

  • Snowflake — expertise in data warehousing, data loading, schema design, and performance tuning

  • SQL — advanced query writing, optimization, and data validation

  • Git-based version control (GitHub, Bitbucket) — for code management and collaboration

  • Cloud platforms (AWS, Azure, GCP) — familiarity with deploying and managing cloud-hosted data solutions (a representative PySpark-to-Snowflake sketch follows this list)
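
To illustrate how this stack typically fits together, here is a minimal PySpark sketch that loads raw files from cloud storage, aggregates them, and writes the result to Snowflake through the Spark-Snowflake connector. The paths, table names, and connection values are placeholders, not details from this role.

    # Minimal PySpark-to-Snowflake sketch; all names and credentials are
    # placeholders. On Databricks the "snowflake" format is built in; on
    # open-source Spark use "net.snowflake.spark.snowflake" instead.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_to_snowflake").getOrCreate()

    # Hypothetical raw data landed in cloud storage.
    orders = spark.read.parquet("s3://example-bucket/raw/orders/")

    # Simple transformation: daily order totals.
    daily_totals = (
        orders
        .withColumn("order_date", F.to_date("order_ts"))
        .groupBy("order_date")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"),
        )
    )

    # Placeholder connection options; in practice these come from a
    # secrets manager, never from source code.
    sf_options = {
        "sfURL": "example_account.snowflakecomputing.com",
        "sfUser": "SVC_PIPELINE",
        "sfPassword": "<from-secrets-manager>",
        "sfDatabase": "ANALYTICS",
        "sfSchema": "SALES",
        "sfWarehouse": "LOAD_WH",
    }

    (
        daily_totals.write
        .format("snowflake")
        .options(**sf_options)
        .option("dbtable", "DAILY_ORDER_TOTALS")
        .mode("overwrite")
        .save()
    )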

Preferred Software Skills:

  • Automation and orchestration tools: Jenkins, Azure DevOps, or Terraform — for CI/CD and infrastructure automation

  • Monitoring tools: Prometheus, Grafana, CloudWatch — for system health and performance monitoring (see the metrics sketch after this list)

  • Data modeling and metadata management tools supporting data governance standards
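
As one hedged example of how a batch pipeline might feed these monitoring tools, the sketch below pushes run metrics to a Prometheus Pushgateway using the prometheus_client package; the gateway address, job name, and metric names are illustrative assumptions, not part of this posting.

    # Hedged sketch: a batch job reporting health metrics for Grafana
    # dashboards and alerting. Batch jobs push to a Pushgateway because
    # they are too short-lived for Prometheus to scrape directly.
    from prometheus_client import CollectorRegistry, Gauge, push_to_gateway

    registry = CollectorRegistry()

    last_success = Gauge(
        "pipeline_last_success_unixtime",
        "Unix time of the last successful pipeline run",
        registry=registry,
    )
    rows_loaded = Gauge(
        "pipeline_rows_loaded",
        "Row count written by the last pipeline run",
        registry=registry,
    )

    def report_success(row_count: int) -> None:
        """Push run metrics after a successful load."""
        last_success.set_to_current_time()
        rows_loaded.set(row_count)
        # Hypothetical Pushgateway endpoint.
        push_to_gateway("pushgateway.example.com:9091",
                        job="orders_pipeline", registry=registry)

    report_success(row_count=125_000)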

Overall Responsibilities

  • Support and maintain scalable, secure, and high-performance data pipelines in Snowflake and Databricks environments

  • Automate operational workflows, data load processes, and pipeline health monitoring to improve efficiency and reliability (a load health-check sketch follows this list)

  • Troubleshoot and resolve data ingestion, transformation, and performance issues promptly

  • Collaborate with data engineers, data scientists, and business analysts to translate requirements into optimized data workflows

  • Support schema design, data validation, and data security compliance in line with industry regulations (GDPR, HIPAA, etc.)

  • Monitor, analyze, and optimize resource consumption and performance of data workflows

  • Develop and maintain documentation including process flows, operational procedures, and best practices

  • Lead efforts in automation and continuous delivery through CI/CD pipelines and infrastructure as code frameworks

  • Support data migrations, system upgrades, and audits for data security and compliance
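
As a concrete, hedged illustration of the health-monitoring responsibility above, the sketch below uses snowflake-connector-python to flag COPY INTO failures from the last 24 hours for a hypothetical ORDERS table, exiting non-zero so a scheduler such as Jenkins or Databricks Jobs can alert. Account, credentials, and table names are assumptions.

    # Pipeline health check sketch. COPY_HISTORY is Snowflake's
    # INFORMATION_SCHEMA table function for recent COPY INTO activity.
    import os
    import sys

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        database="ANALYTICS",
        schema="SALES",
    )

    query = """
        SELECT file_name, status, first_error_message
        FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
            TABLE_NAME => 'ORDERS',
            START_TIME => DATEADD(hour, -24, CURRENT_TIMESTAMP())))
        WHERE status != 'Loaded'
    """

    failures = conn.cursor().execute(query).fetchall()
    conn.close()

    if failures:
        for file_name, status, error in failures:
            print(f"LOAD FAILURE {file_name}: {status} - {error}",
                  file=sys.stderr)
        sys.exit(1)  # non-zero exit lets the orchestrator raise an alert
    print("All loads healthy in the last 24 hours.")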

Technical Skills (By Category)

  • Languages & Data Tools (Essential):

    • Python (3.7+), PySpark, SQL — for scripting, data transformation, and automation

    • Snowflake — data warehousing, schema management, and query optimization

  • Databases & Data Management:

    • Snowflake, relational databases, data modeling, and schema design

    • Metadata and data lineage tools supporting data governance and compliance

  • Cloud & Infrastructure:

    • AWS, Azure, or GCP data services supporting cloud-native data architectures (preferred)

    • Infrastructure as Code tools: Terraform, CloudFormation (preferred)

  • Monitoring & Automation:

    • Prometheus, Grafana, CloudWatch — for system monitoring and alerting

    • Jenkins, Azure DevOps, or similar tools for CI/CD pipeline automation

  • Security & Data Governance:

    • Knowledge of encryption standards, access controls, and compliance frameworks (GDPR/HIPAA); see the masking-policy sketch after this list
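
To make the access-control item concrete, here is a hedged sketch (hypothetical table, column, and role names) that applies a Snowflake dynamic masking policy through snowflake-connector-python, so customer emails are visible only to an authorized role.

    # Column-level data protection sketch; names are illustrative only.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        database="ANALYTICS",
        schema="SALES",
    )

    statements = [
        # Email values are masked unless the session runs under the
        # hypothetical PII_READER role.
        """
        CREATE MASKING POLICY IF NOT EXISTS email_mask AS (val STRING)
        RETURNS STRING ->
            CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
                 ELSE '***MASKED***' END
        """,
        "ALTER TABLE customers MODIFY COLUMN email "
        "SET MASKING POLICY email_mask",
    ]

    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
    conn.close()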

Experience Requirements

  • 6+ years supporting enterprise data platforms with a focus on Snowflake, Databricks, and cloud data workflows

  • Proven experience optimizing large-scale data pipelines for performance and cost efficiency

  • Demonstrated ability to troubleshoot, automate, and support high-availability data systems in cloud environments

  • Experience supporting data migration projects, schema management, and data security compliance in regulated industries (preferred)

  • Familiarity with enterprise metadata management and data governance standards

Day-to-Day Activities

  • Develop, support, and optimize data pipelines in Snowflake and Databricks environments

  • Automate ingestion, transformation, and validation workflows supporting business analytics and compliance (a validation sketch follows this list)

  • Monitor system health, troubleshoot errors, and implement performance tuning and security improvements

  • Collaborate with data engineers, data scientists, and business stakeholders to refine workflows

  • Support data migration, configuration, and operational readiness activities

  • Conduct root cause analysis, incident response, and performance reviews

  • Maintain operational documentation including architecture diagrams, runbooks, and data policies

  • Support infrastructure automation and CI/CD pipelines for deployment and upgrades
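
As a hedged sketch of the validation workflows mentioned above, the PySpark snippet below runs basic row-count and null checks on a hypothetical staging table before it is promoted; real pipelines would typically extend this with schema and referential checks.

    # Validation sketch with illustrative table and column names:
    # fail fast before promoting staged data downstream.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("validate_staging").getOrCreate()

    staged = spark.read.table("staging.orders")  # hypothetical staged table

    checks = {
        "non_empty": staged.count() > 0,
        "no_null_keys": staged.filter(F.col("order_id").isNull()).count() == 0,
        "amounts_non_negative": staged.filter(F.col("amount") < 0).count() == 0,
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        raise ValueError(f"Staging validation failed: {', '.join(failed)}")
    print("Staging validation passed; safe to promote.")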

Qualifications

  • Bachelor’s or Master’s degree in Data Engineering, Computer Science, or related field

  • 6+ years of experience supporting, deploying, and managing enterprise data solutions built on Snowflake and Databricks in cloud environments

  • Relevant certifications such as SnowPro, AWS Data Analytics, or GCP Professional Data Engineer are a plus

  • Experience working in regulated industries with data security, privacy, and compliance requirements

Professional Competencies

  • Strong analytical and troubleshooting skills for complex data systems

  • Leadership qualities to guide junior team members and support best practices in automation and performance optimization

  • Excellent communication for stakeholder engagement and documentation

  • Adaptability to evolving cloud technologies, data security standards, and industry regulations

  • Focus on operational security, data quality, and system reliability

  • Time management skills to handle multiple tasks and ensure timely delivery in a fast-paced environment

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT

Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture, promoting equality and diversity, and maintaining an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disability or veteran status, or any other characteristic protected by law.
