
Optum

Software Engineer

Posted Yesterday
In-Office
Chennai, Tamil Nadu
Mid level
The Software Engineer will design and operate data pipelines, ensure data quality, develop cloud data solutions, and mentor engineers.
Requisition Number: 2355010
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Own the design, build, and operation of reliable data pipelines and curated data products that power analytics, reporting, and downstream applications.
Set engineering standards for data quality, observability, security, and cost/performance across the data platform.
Primary Responsibilities:
  • Design and implement scalable ETL/ELT pipelines (batch and/or streaming) with strong guarantees: idempotency, incremental processing/CDC, backfills, and schema evolution
  • Model and publish well-defined datasets in a cloud data warehouse/lakehouse (dimensional modeling, partitioning/clustering, performance tuning)
  • Build orchestration workflows (Airflow/Dagster/Prefect) with robust dependency management, retries, SLAs, and operational runbooks
  • Establish data quality and testing practices (unit/integration tests for transformations, validation rules, anomaly detection, and data contract checks)
  • Implement end-to-end observability (structured logging, metrics, lineage/metadata where available, alerting on freshness/volume/drift/failure modes)
  • Partner with Analytics, Product, and Engineering to translate requirements into maintainable data products and clear dataset contracts/documentation (e.g., data dictionaries)
  • Apply security and governance controls for sensitive data (PII/PHI), including least-privilege access, auditing, and retention policies
  • Drive CI/CD and infrastructure-as-code for data workloads (Git-based workflows, automated checks, environment promotion)
  • Mentor engineers, review designs/code, and lead incident response and postmortems for data reliability issues
  • Comply with the terms and conditions of the employment contract, company policies and procedures, and all directives (including, but not limited to, transfer and/or reassignment to different work locations, changes in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives at its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
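The pipeline guarantees named in the first responsibility (idempotency, incremental processing/CDC, backfills) can be illustrated in miniature. The sketch below is not from the posting: the `target` table, its columns, and the `updated_at` high-water mark are hypothetical, and SQLite stands in for a real warehouse. Re-running the load with the same input leaves the target unchanged, and late updates merge in place rather than append.

```python
import sqlite3

def incremental_load(conn: sqlite3.Connection, source_rows: list[tuple]) -> None:
    """Idempotent incremental load: only rows newer than the stored
    high-water mark are considered, and rows are merged by key (upsert,
    not append), so re-running the same batch cannot create duplicates."""
    cur = conn.execute("SELECT COALESCE(MAX(updated_at), 0) FROM target")
    high_water = cur.fetchone()[0]
    new_rows = [r for r in source_rows if r[2] > high_water]
    conn.executemany(
        """INSERT INTO target (id, value, updated_at) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET
               value = excluded.value,
               updated_at = excluded.updated_at""",
        new_rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, value TEXT, updated_at INTEGER)")
batch = [(1, "a", 100), (2, "b", 200)]
incremental_load(conn, batch)
incremental_load(conn, batch)             # re-run: no duplicates, same final state
incremental_load(conn, [(1, "a2", 300)])  # CDC-style update merges in place
print(conn.execute("SELECT id, value, updated_at FROM target ORDER BY id").fetchall())
```

The same upsert path also serves backfills: replaying an old batch whose rows fall below the high-water mark is a no-op, while a deliberate backfill can bypass the watermark filter and still converge to the same target state.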

Required Qualifications:
  • BTech / MCA
  • Proven experience building and operating data pipelines at scale (incremental loads, CDC patterns, backfills, late-arriving data handling)
  • Experience with workflow orchestration (Airflow/Dagster/Prefect) and operational ownership (SLAs, on-call readiness, runbooks)
  • Experience with CI/CD and IaC (Terraform/CloudFormation) for reproducible, auditable deployments
  • Hands-on experience with a modern cloud data stack (AWS/GCP/Azure) and a warehouse/lakehouse (e.g., Snowflake/BigQuery/Redshift/Databricks)
  • Familiarity with distributed processing (Spark or equivalent) and performance debugging (partitioning, shuffles, skew)
  • Solid understanding of data modeling (star/snowflake, SCDs) and dataset lifecycle management
  • Solid fundamentals in security/compliance for data (encryption, access control, auditing; handling PII/PHI)
  • Proven advanced SQL: complex joins, window functions, query optimization, and warehouse performance tuning
  • Proven programming skills in Python (preferred) with production engineering practices (packaging, testing, linting, code review)
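The "advanced SQL" bar above (window functions without self-joins) looks like the following in practice. The `claims` table and its columns are illustrative only, and SQLite is used here so the snippet runs anywhere; the same query shape applies in Snowflake, BigQuery, or Redshift.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id TEXT, filed_at INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [("m1", 1, 50.0), ("m1", 2, 75.0), ("m2", 1, 20.0), ("m2", 3, 30.0)],
)

# Two window functions in one pass: rank each member's claims newest-first,
# and keep a per-member running total -- no self-join required.
rows = conn.execute("""
    SELECT member_id, filed_at, amount,
           ROW_NUMBER() OVER (PARTITION BY member_id ORDER BY filed_at DESC) AS recency_rank,
           SUM(amount)  OVER (PARTITION BY member_id ORDER BY filed_at)      AS running_total
    FROM claims
    ORDER BY member_id, filed_at
""").fetchall()
for r in rows:
    print(r)
```

Filtering on `recency_rank = 1` in an outer query is the standard "latest row per key" pattern, a frequent building block for SCD-style dimension loads.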

Preferred Qualifications:
  • Experience with streaming/event-driven architectures (Kafka/Kinesis/Pub/Sub), exactly-once vs. at-least-once tradeoffs, and watermarking concepts
  • Experience with data governance/metadata tooling (catalogs, lineage, ownership) and data contract frameworks
  • Domain experience in healthcare/claims/EHR data (or other regulated environments) and associated compliance practices
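The exactly-once vs. at-least-once tradeoff mentioned in the streaming qualification can be shown with a toy model. This is not Kafka code: the message shape, `event_id` key, and in-memory state are all illustrative. The broker may redeliver under at-least-once semantics, so the consumer deduplicates on a stable message key to get effectively-once processing.

```python
# Consumer-side dedup over at-least-once delivery (illustrative names).
processed_keys: set[str] = set()
totals: dict[str, float] = {}

def handle(message: dict) -> None:
    key = message["event_id"]      # stable, producer-assigned ID
    if key in processed_keys:      # duplicate redelivery: skip
        return
    member = message["member"]
    totals[member] = totals.get(member, 0.0) + message["amount"]
    processed_keys.add(key)        # mark only after the effect lands

# At-least-once delivery: e1 arrives twice, but is applied once.
stream = [
    {"event_id": "e1", "member": "m1", "amount": 10.0},
    {"event_id": "e1", "member": "m1", "amount": 10.0},  # redelivery
    {"event_id": "e2", "member": "m1", "amount": 5.0},
]
for msg in stream:
    handle(msg)
print(totals)
```

In a real system the dedup state and the side effect must be committed atomically (or the effect must itself be idempotent); this sketch only shows the shape of the tradeoff.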

What success looks like (first 3-6 months)
  • Critical pipelines are reliable, observable, and meet freshness SLAs with clear ownership and runbooks
  • New/updated datasets ship with quality checks and documentation, and changes are managed via versioned contracts
  • Platform cost/performance improves measurably through optimization and better operational discipline

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Top Skills

Airflow
AWS
Azure
BigQuery
CloudFormation
Dagster
Databricks
ELT
ETL
GCP
Prefect
Python
Redshift
Snowflake
Spark
SQL
Terraform
