Scope:
● Uses professional concepts and company objectives to resolve complex issues in creative and effective ways
● Works on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors
Your Roles and Responsibilities:
● Create and maintain optimal data pipeline architecture
● Assemble large, complex data sets that meet functional/non-functional business requirements
● Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
● Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
● Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
● Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs
● Create data tools for team members to assist them in building and optimizing analytics production
● Work with data and analytics experts to strive for greater functionality in our data systems
● Other duties as required; this list is not meant to be a comprehensive inventory of all responsibilities assigned to this position
Required Qualifications/Skills:
● Bachelor’s degree (B.S./B.A.) from a four-year college or university and 8+ years of Data Engineering experience, focusing on building and maintaining data environments
● 8+ years of experience designing and constructing ETL/ELT processes and managing data solutions within an SLA-driven environment
● Strong background in developing data products and APIs, and in maintaining testing, monitoring, isolation, and SLA processes
● Advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB)
● Proficiency in programming with Python or other scripting languages
● Experience building ELT/ETL processes using tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau
● Understanding of data modeling and warehousing concepts
● Networks with key contacts outside own area of expertise
● Determines methods and procedures on new assignments and may coordinate activities of other personnel
● Proficiency in languages: Python and JavaScript
● Data warehousing: Hadoop, MapReduce, Hive, Presto
● Operating systems: UNIX, Linux
● Follows standard practice and procedures when analyzing situations or data
Preferred Qualifications:
● SAS and MATLAB
● Basic machine learning
● Certified GCP Data Engineer
Physical Demand & Work Environment:
● Must have the ability to perform office-related tasks which may include prolonged sitting or standing
● Must have the ability to move from place to place within an office environment
● Must be able to use a computer
● Must have the ability to communicate effectively
● Some positions may require occasional repetitive motion or movements of the wrists, hands, and/or fingers
What can Astreya offer you?
● Employment in the fast-growing IT space, providing you with a variety of career options
● Opportunity to work with some of the biggest firms in the world as part of the Astreya delivery network
● Introduction to new ways of working and awesome technologies
● Career paths to help you establish where you want to go
● Focus on internal promotion and internal mobility - we love to build teams from within
● Free, 24/7-accessible professional development through LinkedIn Learning and other online courses, giving you opportunities to upskill at your own pace
● Education Assistance
● Dedicated management to provide you with on-point leadership and care
● Numerous on-the-job perks
● Market competitive compensation and insurance, health and wellness benefits
Astreya Hyderabad, Telangana, IND Office
Tower-2.3, Level-1A, Waverock Building, Survey Number 115 (Part), TSIIC IT/ITES SEZ, Nanakramguda Village, Hyderabad, Telangana, India