The Data Engineer will design, build, and maintain data pipelines for Data Warehouses, utilizing AWS, GCP, and SQL, while ensuring data quality and implementing CI/CD best practices.
Job Overview:
We are looking for a skilled Data Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining high-quality data pipelines and workflows, ensuring reliable and accurate data for our Data Warehouses. This is an exciting opportunity to work with cutting-edge cloud technologies and data tools in a collaborative environment.
Start Date: Late March 2026
Time Zones: Sydney, AUS & London
Duration: 4-6 Months
Experience: 7+ years
Key Responsibilities:
- Design, develop, and maintain data pipelines for Data Warehouses.
- Build scalable solutions using AWS and GCP.
- Write efficient and optimized SQL, including analytical and window functions.
- Transform and model data using dbt and BigQuery.
- Orchestrate workflows using Airflow.
- Ensure data quality and implement testing for pipelines.
- Work in an Agile Delivery environment using JIRA and Confluence.
- Follow CI/CD best practices using GitHub or Bitbucket.
Required Skills:
- Strong proficiency in AWS and GCP.
- Hands-on experience delivering data pipelines for Data Warehouses.
- Advanced SQL skills, including analytical/window functions.
- Experience with dbt and BigQuery.
- Workflow orchestration experience using Airflow.
- Knowledge of CI/CD processes and tools.
- Familiarity with data quality checks and testing of pipelines.
- Experience working in an Agile framework.
What We Offer:
- Join a talented international team in a friendly, creative, and dynamic environment that fosters collaboration and support.
- Opportunities for professional growth and development.
- Enjoy the flexibility of working 100% remotely from anywhere in the world while contributing to cutting-edge projects.
- Work five days a week (40 hours, Monday to Friday; office hours 9 AM - 5 PM EST, flexible by one hour).
- Competitive compensation: Receive a salary package commensurate with your experience and skill set.
- Internet bill reimbursement.
- The right candidate will receive training (all training and probation periods offered at AGT are fully paid; we value all candidates' time).
Candidates must be based in India or Bangladesh.
Top Skills
Airflow
AWS
BigQuery
Bitbucket
Confluence
dbt
GCP
Git
JIRA
SQL
AGT Software Partners Hyderabad, Telangana, IND Office