
Oportun

Senior Software ML Engineer - R12388

Posted 11 days ago
Remote
Hiring Remotely in IN
Senior level
ABOUT OPORTUN

Oportun (Nasdaq: OPRT) is a mission-driven fintech that puts its 2.0 million members' financial goals within reach. With intelligent borrowing, savings, and budgeting capabilities, Oportun empowers members with the confidence to build a better financial future. Since inception, Oportun has provided more than $16.6 billion in responsible and affordable credit, saved its members more than $2.4 billion in interest and fees, and helped its members save an average of more than $1,800 annually. Oportun has been certified as a Community Development Financial Institution (CDFI) since 2009.

 

WORKING AT OPORTUN


Working at Oportun means enjoying a differentiated experience of being part of a team that fosters a diverse, equitable and inclusive culture where we all feel a sense of belonging and are encouraged to share our perspectives. This inclusive culture is directly connected to our organization's performance and ability to fulfill our mission of delivering affordable credit to those left out of the financial mainstream. We celebrate and nurture our inclusive culture through our employee resource groups.


Position Overview:

 

We are seeking a highly skilled Platform Engineer with expertise in building self-serve platforms that combine real-time ML deployment and advanced data engineering capabilities. This role requires a blend of cloud-native platform engineering, data pipeline development, and deployment expertise. The ideal candidate will have a strong background in designing data workflows and scalable infrastructure for ML pipelines while enabling seamless integrations and deployments.

 

Responsibilities:

  • Platform Engineering
    Design and build self-serve platforms that support real-time ML deployment and robust data engineering workflows.
    Develop microservices-based solutions using Kubernetes and Docker for scalability, fault tolerance, and efficiency.
    Create APIs and backend services using Python and FastAPI to manage and monitor ML workflows and data pipelines.
  • Real-Time ML Deployment
    Architect and implement platforms for real-time ML inference using tools like AWS SageMaker and Databricks.
    Enable model versioning, monitoring, and lifecycle management with observability tools such as New Relic.
  • Data Engineering
    Build and optimize ETL/ELT pipelines for data preprocessing, transformation, and storage using PySpark and Pandas.
    Develop and manage feature stores to ensure consistent, high-quality data for ML model training and deployment.
    Design scalable, distributed data pipelines on platforms like AWS, integrating tools such as DynamoDB, PostgreSQL, MongoDB, and MariaDB.
    Implement data lake and data warehouse solutions to support advanced analytics and ML workflows.
  • CI/CD and Automation
    Design and implement robust CI/CD pipelines using Jenkins, GitHub Actions, and other tools for automated deployments and testing.
    Automate data validation and monitoring processes to ensure high-quality and consistent data workflows.
  • Documentation and Collaboration
    Create and maintain detailed technical documentation, including high-level and low-level architecture designs.
    Collaborate with cross-functional teams to gather requirements and deliver solutions that align with business goals.
    Participate in Agile processes such as sprint planning, daily standups, and retrospectives using tools like Jira.


Required Qualifications

  • 5+ years of experience in platform engineering, DevOps, or data engineering roles.
  • Hands-on experience with real-time ML model deployment and data engineering workflows.

Technical Skills

  • Strong expertise in Python and experience with Pandas, PySpark, and FastAPI.
  • Proficiency in container orchestration tools such as Kubernetes (K8s) and Docker.
  • Advanced knowledge of AWS services such as SageMaker, Lambda, DynamoDB, EC2, and S3.
  • Proven experience building and optimizing distributed data pipelines using Databricks and PySpark.
  • Solid understanding of databases such as MongoDB, DynamoDB, MariaDB, and PostgreSQL.
  • Proficiency with CI/CD tools like Jenkins, GitHub Actions, and related automation frameworks.
  • Hands-on experience with observability tools like New Relic for monitoring and troubleshooting.
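
As one simplified illustration of the pipeline skills listed above, a small Pandas transform might clean raw records and aggregate per-member features. The column names (`member_id`, `amount`) are assumptions for the sketch; a production pipeline at this scale would more likely use PySpark on Databricks with the same shape of logic.

```python
# Sketch of an ETL-style transform with Pandas.
# Column names are hypothetical; adapt to the real schema.
import pandas as pd

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # Clean: drop rows missing the amount, normalise the dtype.
    out = raw.dropna(subset=["amount"]).copy()
    out["amount"] = out["amount"].astype(float)
    # Aggregate per member into simple features.
    return (
        out.groupby("member_id")["amount"]
           .agg(total="sum", n_loans="count")
           .reset_index()
    )
```

The same clean-then-aggregate structure translates almost line-for-line to a PySpark DataFrame job when the data outgrows a single machine.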

We are proud to be an Equal Opportunity Employer and consider all qualified applicants for employment opportunities without regard to race, age, color, religion, gender, national origin, disability, sexual orientation, veteran status or any other category protected by the laws or regulations in the locations where we operate.

 

California applicants can find a copy of Oportun's CCPA Notice here:  https://oportun.com/privacy/california-privacy-notice/.

 

We will never request personally identifiable information (bank, credit card, etc.) before you are hired. We do not charge you for pre-employment fees such as background checks, training, or equipment. If you think you have been a victim of fraud by someone posing as us, please report your experience to the FBI's Internet Crime Complaint Center (IC3).

Top Skills

AWS SageMaker
Docker
DynamoDB
FastAPI
GitHub Actions
Jenkins
Kubernetes
MariaDB
MongoDB
New Relic
Pandas
PostgreSQL
PySpark
Python

