Nexaminds
Senior Databricks Engineer (Databricks, PySpark, Apache Spark, Python, SQL, Azure, DevOps)
Unlock Your Future with Nexaminds!
At Nexaminds, we're on a mission to redefine industries with AI. We're passionate about the limitless potential of artificial intelligence to transform businesses, streamline processes, and drive growth.
Join us on our visionary journey. We're leading the way in AI solutions, and we're committed to innovation, collaboration, and ethical practices. Become a part of our team and shape the future powered by intelligent machines. If you're driven by ambition, success, fun, and learning, Nexaminds is where you belong.
🚀 Are you a PRO at developing Databricks pipelines and at enhancing or supporting existing ones?
Do you have strong Databricks, PySpark, Apache Spark, SQL, and Python debugging skills, with experience handling business-critical data and extensive exposure to Azure?
Have you worked hands-on in E-Commerce or Retail domains?
Then don’t wait any further — your next big opportunity is here! 🌟
Join us at Nexaminds and be part of an exciting journey where innovation meets impact. The benefits are unbelievable — and so is the experience you’ll gain!
Role Summary
We are seeking a Data Engineer with strong Databricks expertise to design, build, and maintain scalable, high-performance data pipelines on cloud platforms. The role focuses on developing production-grade ETL/ELT pipelines, enabling data modernization initiatives, and ensuring data quality, governance, and security across enterprise data platforms.
You will work closely with data engineers, analysts, and business stakeholders to deliver reliable, cost-efficient, and scalable data solutions, primarily on Azure.
Key Responsibilities
- Design, build, and maintain scalable data pipelines using Databricks (PySpark, Apache Spark, Delta Lake, SQL, Python).
- Develop and optimize ETL/ELT workflows for performance, reliability, and cost efficiency.
- Implement data quality, data profiling, governance, and security best practices.
- Design and maintain data models to support analytics, reporting, and downstream consumption.
- Collaborate with data engineers, analysts, and business stakeholders to define and implement data requirements.
- Troubleshoot and resolve issues across data workflows, Spark jobs, and distributed systems.
- Support cloud data platform modernization and migration initiatives.
- Automate workflows using Databricks Workflows / Jobs and scheduling tools.
- Participate in code reviews and contribute to engineering best practices.
- Work within Agile/Scrum teams to deliver data solutions iteratively.
Must-Have Skills & Experience
- 10+ years of experience in Data Engineering
- Solid understanding of the Azure cloud platform.
- Strong hands-on expertise with Databricks, PySpark, Apache Spark, Delta Lake, and Databricks SQL.
- Excellent programming skills in Python and SQL.
- Experience building production-grade ETL/ELT pipelines.
- Experience in data modeling, data profiling, data warehousing, and distributed computing concepts.
- Working knowledge of Shell Scripting for automation.
- Experience with Azure Event Hubs, GitHub, and Terraform.
- Experience using JFrog Artifactory or a similar artifact repository for artifact management.
- Understanding of cloud security and access controls.
Nice-to-Have Skills
- Exposure to CI/CD pipelines for data engineering workloads.
- Knowledge of streaming data processing.
- Familiarity with Azure DevOps or similar tools.
- Experience supporting large-scale analytics or enterprise data platforms.
Soft Skills
- Strong analytical and problem-solving skills.
- Ability to work independently and in cross-functional teams.
- Excellent communication skills to interact with technical and non-technical stakeholders.
- Proactive mindset with attention to data accuracy and reliability.
What you can expect from us
Here at Nexaminds, we're not your typical workplace. We're all about creating a friendly and trusting environment where you can thrive. Why does this matter? Well, trust and openness lead to better quality, innovation, commitment to getting the job done, efficiency, and cost-effectiveness.
- Stock options 📈
- Remote work options 🏠
- Flexible working hours 🕜
- Benefits beyond statutory requirements
- But it's not just about the work; it's about the people too. You'll be collaborating with some seriously awesome IT pros.
- You'll have access to mentorship and tons of opportunities to learn and level up.
Ready to embark on this journey with us? 🚀🎉 If you're feeling the excitement, go ahead and apply!