IDT

Senior Data/ML Engineer

Sorry, this job was removed at 04:09 p.m. (IST) on Tuesday, Sep 23, 2025
In-Office or Remote
12 Locations

This is a full-time work from home opportunity for a star Data/ML Engineer from LATAM.

IDT (www.idt.net) is an American telecommunications company founded in 1990 and headquartered in New Jersey. Today it is an industry leader in prepaid communication and payment services and one of the world’s largest international voice carriers. We are listed on the NYSE, employ over 1300 people across 20+ countries, and have revenues in excess of $1.5 billion.

We are looking for a skilled Data/ML Engineer to join our BI team and take an active role in designing, building, and maintaining the end-to-end data pipelines and architecture that power our warehouse, LLM-driven applications, and AI-based BI. If you're looking for a company that will give you maximum flexibility in choosing where you work, this opportunity is for you!

Responsibilities:

  • Design, develop, and maintain scalable data pipelines to support ingestion, transformation, and delivery into centralized feature stores, model-training workflows, and real-time inference services.
  • Build and optimize workflows for extracting, storing, and retrieving semantic representations of unstructured data to enable advanced search and retrieval patterns.
  • Architect and implement lightweight analytics and dashboarding solutions that deliver a natural-language query experience and AI-backed insights.
  • Define and execute processes for managing prompt engineering techniques, orchestration flows, and model fine-tuning routines to power conversational interfaces.
  • Oversee vector data stores and develop efficient indexing methodologies to support retrieval-augmented generation (RAG) workflows.
  • Partner with data stakeholders to gather requirements for language-model initiatives and translate them into scalable solutions.
  • Create and maintain comprehensive documentation for all data processes, workflows and model deployment routines.
  • Stay informed about emerging methodologies in data engineering, MLOps, and LLM operations.
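
As context for the RAG-related responsibilities above, the core retrieval step can be sketched in plain Python. This is a minimal illustration, not IDT's actual stack: the toy bag-of-words `embed` function stands in for a real embedding model, and a production system would index vectors in a vector database rather than scoring documents in a loop.

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words "embedding". A real pipeline would call an
    # embedding model (e.g. via Hugging Face Transformers) instead.
    return dict(Counter(text.lower().split()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    # Rank stored documents by similarity to the query; in production
    # a vector store handles this indexing and search at scale.
    q = embed(query)
    scored = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return scored[:k]

docs = [
    "invoice totals by region for Q3",
    "customer churn model training data",
    "regional sales invoices and payments",
]
print(retrieve("sales invoices by region", docs, k=1))
# → ['regional sales invoices and payments']
```

The top-k passages returned here would be stuffed into the LLM prompt as grounding context, which is the "retrieval-augmented" half of RAG.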

Requirements:

  • 8+ years of experience as a Data Engineer with 2+ years focused on MLOps.
  • Excellent English communication skills.
  • Effective oral and written communication skills with the BI team and user community.
  • Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
  • Deep understanding of vector databases and RAG architectures, and how they drive semantic retrieval workflows.
  • Skilled at integrating open-source LLM frameworks into data engineering workflows for end-to-end model training, customization, and scalable inference. 
  • Experience with cloud platforms like AWS or Azure Machine Learning for managed LLM deployments.
  • Hands-on experience with big data technologies including Apache Spark, Hadoop, and Kafka for distributed processing and real-time data ingestion. 
  • Experience designing complex data pipelines extracting data from RDBMS, JSON, API and Flat file sources.
  • Demonstrated skills in SQL and PL/SQL programming, with advanced mastery of Business Intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Snowflake or Redshift.
  • Understanding of software engineering principles and skills working on Unix/Linux/Windows Operating systems, and experience with Agile methodologies.
  • Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment.
  • Interest in business operations and comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights. 
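
The pipeline-design requirement above (extracting from RDBMS, JSON, API, and flat-file sources) usually comes down to normalizing heterogeneous records into one target schema before loading. A minimal, hypothetical sketch using only Python's standard library; the source payloads and the `(id, amount)` schema are invented for illustration:

```python
import csv
import io
import json

def from_json(payload):
    # Normalize a JSON API payload into (id, amount) rows.
    return [(r["id"], float(r["amount"])) for r in json.loads(payload)]

def from_flat_file(text):
    # Normalize a CSV flat-file extract into the same schema.
    reader = csv.DictReader(io.StringIO(text))
    return [(row["id"], float(row["amount"])) for row in reader]

json_src = '[{"id": "a1", "amount": "10.5"}]'
csv_src = "id,amount\nb2,4.25\n"

rows = from_json(json_src) + from_flat_file(csv_src)
print(rows)  # [('a1', 10.5), ('b2', 4.25)]
```

At production scale the same normalize-then-union pattern is typically expressed in Spark or a warehouse-native ELT tool rather than in-process Python.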

Pluses:

  • Experience with vector databases such as DataStax AstraDB, and developing LLM-powered applications using popular open-source frameworks like LangChain and LlamaIndex, including prompt engineering, retrieval-augmented generation (RAG), and orchestration of intelligent workflows.
  • Familiarity with evaluating and integrating open-source LLM frameworks, such as Hugging Face Transformers/LLaMA-4, across end-to-end workflows, including fine-tuning and inference optimization.
  • Knowledge of MLOps tooling and CI/CD pipelines to manage model versioning and automated deployments.

Please attach CV in English.
The interview process will be conducted in English.

Only accepting applicants from LATAM.

