The Databricks Architect designs and develops data solutions using Databricks, leads technical delivery, and mentors teams in best practices.
Job Description
- We’re seeking a hands-on Resident Solution Architect (Databricks) with deep technical expertise in building and optimizing Lakehouse-based data and AI solutions. This is a contract role based in Singapore, offering flexibility to work on-site with clients or remotely within the region.
- In this role, you’ll design, develop, and operationalize Delta Lakehouse architectures using Databricks, driving real-world outcomes for enterprise customers. You’ll take ownership of implementation tasks, lead technical delivery, and mentor engineering teams in best practices across data engineering, governance, and AI.
Key Responsibilities
- Design and implement scalable data pipelines using Delta Live Tables (DLT), Spark SQL, Python, or Scala.
- Optimize ETL, streaming, and ML workloads for performance, cost efficiency, and reliability.
- Administer and configure Databricks Workspaces, Unity Catalog, and cluster policies for secure, governed environments.
- Automate infrastructure and deployments using Terraform, Git, and CI/CD pipelines.
- Implement observability, cost optimization, and monitoring frameworks using tools like Splunk, Prometheus, or CloudWatch.
- Collaborate with customers to build AI and LLM solutions leveraging MLflow, DBRX, and Mosaic AI.
Required Skills & Experience
- Strong hands-on experience with Databricks, including workspace setup, notebooks, clusters, and job orchestration.
- Expertise in Delta Lake, DLT, Unity Catalog, and SQL Warehouses.
- Proficiency in Python or Scala for data engineering and ML workflows.
- Strong understanding of AWS, Azure, or GCP cloud ecosystems.
- Experience with Terraform automation, DevOps, and MLOps practices.
- Familiarity with monitoring and governance frameworks for large-scale data platforms.
Nice to Have
- Experience developing AI/LLM pipelines and RAG architectures on Databricks.
- Exposure to Bedrock, OpenAI, or Hugging Face integrations.
- Databricks certifications (Data Engineer, Machine Learning, or Solutions Architect) preferred.
Top Skills
AWS
Azure
CI/CD
CloudWatch
Databricks
Delta Lake
DLT
GCP
Git
Prometheus
Python
Scala
Spark SQL
Splunk
Terraform