
Cigna

Data Engineering Associate Advisor - HIH - Evernorth

Posted Yesterday
In-Office
Hyderabad, Telangana
Senior level

Position Summary:

Cigna, a leading health services company, is looking for data engineers/developers in our Data & Analytics organization. The Data Engineer is responsible for delivering a business need end to end, from understanding the requirements to deploying the software into production. This role requires fluency in some of the critical technologies, proficiency in others, and a hunger to learn on the job and add value to the business. Critical attributes of a Data Engineer, among others, are ownership and accountability. In addition to delivery, the engineer should have an automation-first and continuous-improvement mindset, drive the adoption of CI/CD tools, and support the improvement of the tool sets and processes. The role requires close collaboration with various teams to gather data requirements, build scalable data pipelines, and ensure the overall solution is reliable, available, and optimized for high performance.

Behaviors of an Associate Data Engineer:

Associate Data Engineers articulate clear business objectives aligned to technical specifications and work in an iterative, agile pattern daily. They take ownership of their work, engage with all levels of the team, and raise challenges when necessary. We aim to be cutting-edge engineers, not institutionalized developers.

Job Description & Responsibilities:

The engineer will work on a team of talented individuals and be responsible for the design, build, and testing (including, but not limited to, automated unit, integration, and performance testing) of Individual & Family Retail Health Insurance applications and components that interface with complex enterprise systems and external vendor systems. The engineer should have solid experience with complex distributed applications, a passion for solving data integration challenges, and the drive to stay current with the latest developments in technology.

The engineer must be a highly motivated, well-rounded team player and self-starter who works best in a collaborative, dynamic, agile environment. Excellent written and oral communication skills are essential, as this position interfaces with remote scrum teams, business owners, enterprise architects, security, infrastructure, and end users via email, phone, IM, desktop sharing, and wiki. Candidates should be able to demonstrate the qualities listed below.

Qualities:

  • Analytical Skills: Candidate must have strong analytical skills and be able to recognize the needs of customers and create simple solutions that answer those needs.

  • Communication: Candidate must be able to clearly communicate their ideas to peers, stakeholders, and management.

  • Creativity: Creativity is needed to help invent new ways of approaching problems and developing innovative applications as well as bringing experience from other industries.

  • Customer Service: If dealing directly with clients and customers, the candidate needs good customer-service skills and a consultant mentality to answer questions and fix issues.

  • Attention to Detail: Applications have many parts and all must work together for the application to function.

  • Problem-Solving: As issues come up, candidate must be able to make decisions that move the project forward.

  • Teamwork: Candidate must work well with others as part of a distributed agile (SAFe) team of engineers, analysts, product owner, and scrum master.

  • Technical Skills: Candidate must be adept in relevant programming languages and have strong technical aptitude, with solid knowledge of common design patterns and a habit of seeking out opportunities to apply them.

  • Leadership Skills: Candidate is expected to lead by example, exhibiting technical excellence and development of expert level business domain knowledge. Influences technical direction within the ART and across ARTs. Advocates for a shared technical vision, enables others to act to fulfill the vision. Challenges existing processes through relentless improvement.

 Responsibilities:

  • Develop and maintain scalable data pipelines using AWS services and Databricks. These processes are designed to extract data from multiple sources, transform it into meaningful formats, and load it into data lakes or downstream systems.

  • A key aspect of this responsibility is ensuring that data is accurately cleaned, transformed, and loaded to enable consistent and reliable analytics and reporting along with designing data storage systems that scale, perform well, and meet the organization’s purge and archive requirements.

  • Implement monitoring and alerting systems to proactively detect and resolve issues within data pipelines and oversee data storage and processing activities, ensuring optimal performance by addressing operational challenges.

  • Enable a 360-degree view of customer-centric information through integration of a multitude of internal/external systems, mobile apps, devices, and data marts.

  • Implement test-driven development with a strong focus on automated testing.

  • Support and enhance existing Individual & Family Retail Health Insurance applications used by consumers as well as operations staff.

  • Participate in all agile ceremonies effectively.
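
The extract-transform-load pattern described in the first responsibilities above can be sketched as follows. This is a minimal, illustrative pure-Python sketch with hypothetical field names (`member_id`, `enrolled`); a production pipeline at this scale would use PySpark or Databricks workflows rather than in-memory lists:

```python
import json
from datetime import datetime

def extract(raw_lines):
    """Parse raw JSON lines from a source system (e.g., an S3 landing zone)."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Clean and standardize records: drop rows missing a member_id,
    trim whitespace from names, and normalize dates to ISO format."""
    cleaned = []
    for rec in records:
        if not rec.get("member_id"):
            continue  # data-quality rule: member_id is required
        rec["name"] = rec.get("name", "").strip()
        rec["enrolled"] = datetime.strptime(rec["enrolled"], "%m/%d/%Y").date().isoformat()
        cleaned.append(rec)
    return cleaned

def load(records, sink):
    """Append transformed records to a downstream sink
    (here, a list standing in for a data lake table)."""
    sink.extend(records)
    return len(records)
```

The same three stages map onto a real pipeline: `extract` becomes a Glue or Spark read, `transform` a set of DataFrame operations, and `load` a write to Delta or Redshift.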

Skillset:

Associate Engineers should have 8 to 11 years of experience and should be very familiar with advanced concepts and have relevant, hands-on experience in many of the following areas to be a successful contributor on the team:

  • Proven experience in AWS Cloud Services such as AWS Glue, ECS/EKS, Fargate, Redshift/Aurora/DocumentDB, ECR, S3, Lambda, SNS, IAM etc.

  • Proficient in Databricks DLT (Delta Live Tables) and workflows, SQL, Spark, Python and/or PySpark, Java or other relevant programming languages for data engineering workloads.

  • Experience with Databricks orchestration frameworks – internal Databricks Workflows or external Apache Airflow (DAGs).

  • Solid understanding of data integration strategies, ELT/ETL techniques, and data warehousing principles. Prior experience with ETL/ELT tools such as Informatica would be useful in analyzing existing systems and their pain points to mitigate them in the future-state data architecture.

  • Strong analytical and problem-solving capabilities, with expertise in diagnosing complex data issues and optimizing pipeline performance and reliability.

  • Apply data transformation, quality and observability techniques for production-grade pipelines.

  • Knowledge of reporting/BI tools such as Tableau would be a plus.

  • Knowledge of data governance, lineage, and catalog tools such as Alation, Collibra, or similar would be a plus.

  • Engineering mindset with a passion for excelling in all areas of the software development life cycle such as analysis, design, development, automated testing, and DevOps.
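
The data transformation, quality, and observability techniques called out above can be sketched as declarative checks run against each batch. This is an illustrative pure-Python sketch (rule names and the null-ratio threshold are assumptions); real pipelines would typically express these rules as Delta Live Tables expectations or via a data-quality framework:

```python
def run_quality_checks(rows, required_fields, max_null_ratio=0.05):
    """Evaluate batch-level data-quality rules and return a report dict
    that a monitoring system could emit as metrics or alerts."""
    report = {"row_count": len(rows), "failures": []}
    if not rows:
        report["failures"].append("empty_batch")
        report["passed"] = False
        return report
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        ratio = nulls / len(rows)
        if ratio > max_null_ratio:
            report["failures"].append(
                f"{field}: null ratio {ratio:.2%} exceeds {max_null_ratio:.0%}")
    report["passed"] = not report["failures"]
    return report
```

Emitting the report to a metrics backend (rather than logging it) is what turns a quality check into observability: alert rules can then fire on `passed == False` or on a rising null ratio.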

Desired (but not required):

  • Hands-on experience in migrating and setting up large-scale cloud data pipelines using Databricks.

  • Knowledge of data modeling and various structured/unstructured databases.

  • Experience in performance tuning of applications that involves threading, indexing, clustering, and caching.

  • Expertise in loading billions of files across millions of partitions and resolving performance bottlenecks on AWS Databricks.

  • Strong knowledge of partitioning strategies, metadata management, and file formats (Delta Lake, Iceberg, Apache Hudi).

  • Experience with document databases (e.g., DocumentDB) and JSON/XML source data.

  • Exposure to the Lakehouse concept, dimensional modeling, and Data Vault for scalable data structures.

  • Skilled in real-time data ingestion, transformation, and data quality checks.

  • Experience in converting on-premises pipelines to Databricks DLT (Delta Live Tables) for efficient cloud processing.

  • Understanding of OLTP and OLAP concepts, normalization and denormalization, and how business use cases drive data model design and implementation.
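
As an illustration of the partitioning strategies mentioned above, the Hive-style key=value directory layout used by Delta Lake, Iceberg, and Hudi tables can be sketched as follows (the table root and partition keys are illustrative):

```python
from datetime import date

def partition_path(table_root, event_date, region=None):
    """Build a Hive-style partition path (key=value directories) so engines
    like Spark can prune partitions when a query filters on date or region."""
    parts = [table_root,
             f"year={event_date.year}",
             f"month={event_date.month:02d}",
             f"day={event_date.day:02d}"]
    if region:
        parts.append(f"region={region}")
    return "/".join(parts)
```

Choosing partition keys that match common query filters is what keeps scans over billions of files tractable: a query restricted to one day and one region reads only that directory subtree instead of the whole table.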

Qualifications:

Bachelor’s degree in Computer Science or a related discipline strongly preferred; typically eight or more years of solid, diverse work experience in IT, with a minimum of six years’ experience in application program development, or the equivalent in education and work experience. The ideal candidate will have relevant experience either as a consultant or at a start-up company.

About Evernorth Health Services

Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Top Skills

Alation
Aurora
AWS Glue
Collibra
Databricks
DocumentDB
ECR
ECS
EKS
Fargate
IAM
Informatica
Java
Lambda
PySpark
Python
Redshift
S3
SNS
Spark
SQL
Tableau

