
3Pillar Global

Senior Data Engineer (Snowflake)

Sorry, this job was removed at 08:06 a.m. (IST) on Thursday, Aug 28, 2025
Remote
Hiring Remotely in India

Embark on an exciting journey into the realm of software development with 3Pillar! We extend an invitation for you to join our team and gear up for a thrilling adventure. At 3Pillar, our focus is on leveraging cutting-edge technologies that revolutionize industries by enabling data-driven decision making.

As a Senior Data Engineer, you will hold a crucial position within our dynamic team, actively contributing to thrilling projects that reshape data analytics for our clients and give them a competitive advantage in their respective industries.
If you have a passion for data engineering solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering! 🌍🔥

Responsibilities:

  • Lead the design and implementation of scalable ETL/data pipelines using Python and Luigi for data processing.
  • Ensure efficient data processing for high-volume clickstream, demographics, and business data.
  • Guide the team in adopting best practices for data pipeline development, code quality, and performance optimization.
  • Configure, deploy, and maintain AWS infrastructure, primarily AWS EC2, S3, RDS, and EMR, to ensure scalability, availability, and security.
  • Support data storage and retrieval workflows using S3 and SQL-based storage solutions.
  • Provide architectural guidance for cloud-native data solutions and infrastructure design.
  • Oversee legacy framework maintenance, identify improvement areas, and propose comprehensive cloud migration or modernization plans.
  • Lead the strategic planning, execution, and optimization of large-scale data migration initiatives to Snowflake, ensuring data integrity, security, and minimal business disruption.
  • Coordinate infrastructure changes with stakeholders to align with business needs and budget constraints.
  • Develop and implement robust monitoring solutions to track system health, performance, and data pipeline accuracy.
  • Set up alerts and dashboards for proactive issue detection and collaborate with cross-functional teams to resolve critical issues.
  • Lead efforts in incident response, root cause analysis, and post-mortem processes for complex data engineering challenges.
  • Document workflows, troubleshooting procedures, and code for system transparency and continuity.
  • Provide mentoring and training to team members on best practices and technical skills.
  • Foster a culture of continuous learning, knowledge sharing, and technical excellence within the data engineering team.
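
The pipeline responsibilities above center on Luigi's task pattern: each step declares its dependencies (`requires`), an output target (`output`), and the work itself (`run`), and the scheduler executes only steps whose outputs are missing. A minimal, stdlib-only sketch of that pattern follows — this is an illustrative stand-in, not Luigi itself, and all class and file names are invented:

```python
import json
import tempfile
from pathlib import Path

class Task:
    """Minimal stand-in for a Luigi-style task: requires / output / run."""
    def requires(self):
        return []
    def output(self) -> Path:
        raise NotImplementedError
    def run(self):
        raise NotImplementedError

def build(task: Task):
    """Depth-first scheduler: run dependencies first, then the task itself,
    skipping any step whose output already exists (the idempotency check
    that makes re-running a failed pipeline cheap)."""
    for dep in task.requires():
        build(dep)
    if not task.output().exists():
        task.run()

OUT = Path(tempfile.mkdtemp())

class ExtractClicks(Task):
    """Stand-in for pulling raw clickstream rows (e.g. from S3)."""
    def output(self):
        return OUT / "clicks.json"
    def run(self):
        rows = [{"user": "u1", "page": "/home"},
                {"user": "u2", "page": "/home"}]
        self.output().write_text(json.dumps(rows))

class AggregateClicks(Task):
    """Depends on ExtractClicks; aggregates page-view counts."""
    def requires(self):
        return [ExtractClicks()]
    def output(self):
        return OUT / "counts.json"
    def run(self):
        rows = json.loads((OUT / "clicks.json").read_text())
        counts = {}
        for row in rows:
            counts[row["page"]] = counts.get(row["page"], 0) + 1
        self.output().write_text(json.dumps(counts))

build(AggregateClicks())
print(json.loads((OUT / "counts.json").read_text()))  # {'/home': 2}
```

In real Luigi the same shape appears as `luigi.Task` subclasses with `luigi.LocalTarget`/S3 targets and `luigi.build([...])`; the dependency graph and output-existence check are the core ideas a senior engineer would be guiding the team on.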

Qualifications

  • Experience: 6 years of experience in data engineering or a related technical field, with at least 4 years on Snowflake projects.
  • Cloud Data Warehousing: Proven expertise with Snowflake data warehousing, including schema design, efficient data loading (e.g., Snowpipe, COPY INTO), performance tuning, and robust access control mechanisms.
  • Programming & Scripting: Strong programming skills in Python for automation and data workflows.
  • Data Processing & Storage: Expertise in managing SQL databases for data storage and query optimization.
  • Monitoring & Alerting Tools: Familiarity with monitoring solutions for real-time tracking and troubleshooting of data pipelines.
  • Cloud Platform: Experience with AWS and its data engineering services.
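
For context on the Snowflake loading skills listed above: bulk loads typically issue a `COPY INTO` statement against a stage, and Snowpipe wraps that same statement for continuous ingestion. Below is a hedged sketch that only assembles the SQL text — the table, stage, and file-format names are invented for illustration, and in practice a connector (e.g. snowflake-connector-python) would execute the statement:

```python
def copy_into_sql(table: str, stage: str, file_format: str) -> str:
    """Assemble a Snowflake bulk-load statement. ON_ERROR is one common
    option; a real pipeline would parameterize error handling and purging."""
    return (
        f"COPY INTO {table}\n"
        f"FROM @{stage}\n"
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')\n"
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

# All identifiers below are hypothetical examples.
sql = copy_into_sql("analytics.clickstream_raw", "clickstream_stage", "csv_gzip")
print(sql)
```

Being fluent in when to reach for a one-off `COPY INTO` versus a Snowpipe auto-ingest setup is exactly the kind of judgment the "efficient data loading" bullet is asking about.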

What you need to know about the Hyderabad Tech Scene

Because of its proximity to leading research institutions and a government committed to the city's growth, Hyderabad's tech scene is booming. With plans to establish India's first "AI city," the city is on track to become one of the world's most anticipated tech hubs, with companies like TransUnion, Schrödinger and Freshworks, among others, already calling the city home.
