Develop and maintain data pipelines using Python and Azure technologies. Collaborate on data integration, ensure data quality, and improve data governance practices.
Requisition Number: 2346518
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone - of every race, gender, sexuality, age, location and income - deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Primary Responsibilities:
- Develop and maintain scalable data pipelines and solutions using Python, APIs, and cloud technologies (Azure, Databricks, Snowflake)
- Collaborate with cross-functional teams to implement robust data integration workflows, leveraging ADF, Azure Databricks, and GitHub for version control and automation
- Apply data modeling, ETL, and performance-optimization expertise to ensure high-quality, reliable data delivery for business needs
- Ensure adherence to security, compliance, and data governance standards across all data engineering activities
- Drive continuous improvement by evaluating and adopting new tools and technologies, such as Terraform, Kafka streaming, and Generative AI, where applicable
- Troubleshoot and resolve complex data issues, ensuring system reliability and scalability
- Actively participate in project planning, code reviews, and stakeholder communications to deliver successful project outcomes
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so
Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of hands-on experience in data engineering, with solid expertise in Python programming
- Experience with cloud platforms, especially Azure, and tools such as Azure Data Factory, Azure Databricks, and Snowflake
- Experience in designing and developing APIs and integrating data solutions across diverse systems
- Experience with Databricks, Snowflake, and data pipeline orchestration
- Experience with Generative AI, including an understanding of LLMs and hands-on knowledge of building agents
- Familiarity with version control systems, particularly GitHub
- Good understanding of data modeling, ETL processes, and data governance best practices
- Proven problem-solving, communication, and leadership skills
- Ability to collaborate effectively with other team members
Preferred Qualification:
- Experience with Generative AI technologies
Top Skills
ADF
Azure
Databricks
Git
Kafka
Python
Snowflake
Terraform

