The Principal Data Engineer leads data initiatives, architects data infrastructure, optimizes performance, ensures data quality, and mentors engineers while collaborating with cross-functional teams.
Requisition Number: 2344011
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
- Delivery Leadership: Own end-to-end delivery of data initiatives, driving backlog prioritization, sprint planning, and execution in Agile/Scrum teams
- Design & Build Data Infrastructure: Architect, develop, and maintain scalable ETL/ELT pipelines for large-scale data processing
- Data Architecture Leadership: Lead architectural decisions for data warehousing, lakehouse, and big-data solutions
- Performance Optimization: Improve database performance, query efficiency, and data workflows
- Data Quality & Governance: Implement and enforce data quality, validation, monitoring, and compliance frameworks
- Cross-Functional Collaboration: Work closely with product owners, data scientists, analysts, and business teams to deliver analytics and AI/ML solutions
- Mentorship & Best Practices: Mentor engineers, conduct design reviews, and promote data engineering standards
- Automation & Deployment: Build automated testing, CI/CD, and deployment pipelines for data systems
- Reliability & Troubleshooting: Lead resolution of complex data issues and ensure platform reliability and scalability
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
- Agile Delivery: Solid hands-on experience working in Agile/Scrum teams, including sprint planning, story estimation, and delivery tracking
- Programming & Data Platforms: Azure Databricks, Python/Scala, Azure Data Factory (ADF), Airflow, SQL; Snowflake experience is a plus
- Data Modeling & Warehousing: Expertise in relational and NoSQL databases and enterprise data modeling
- Big Data & ETL: Hands-on with Spark, Hadoop, or similar frameworks
- Cloud Platforms: Azure experience (GCP exposure is a plus).
- Containerization & Orchestration: Docker, Kubernetes
- Distributed Systems: Data partitioning, parallel processing, and distributed architectures
- Data Security & Compliance: Knowledge of GDPR, HIPAA, or similar regulatory standards
Preferred Qualifications:
- Streaming technologies such as Kafka
- Snowflake optimization and advanced usage
- AI and ML enablement, including RAG development and agent creation
- CI/CD tools such as Jenkins for data pipelines
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Top Skills
Airflow
Azure
Azure Data Factory
Azure Databricks
Docker
Hadoop
Kubernetes
Python
Scala
Snowflake
Spark
SQL

