The Data Engineer will manage data models and architecture, ensuring data security and compliance, while collaborating with teams to mitigate risks and improve security measures.
- Work collaboratively with application development, data protection, information security, and risk management teams to understand and implement data security and management solutions.
- Continuously improve security and observability telemetry services based on input from internal and external stakeholders, technology teams, and the broader IT industry.
- Data Management: Define and manage data models, schemas, metadata, and security rules. Design, create, deploy, and manage databases and data structures on premises and in the cloud to fulfill business requirements.
- Threat Analysis: Identify and mitigate potential security risks in the organization's data architecture.
- Compliance: Ensure compliance with data privacy laws and regulations.
- Risk Management: Conduct risk assessments and take appropriate actions to mitigate the risks associated with data security.
- Training and Development: Train and educate stakeholders about our data security policies and practices.
- Collaboration: Collaborate with other IT team members, stakeholders, and executives to ensure the security of data architectures.
Requirements
- 6+ years of data ingestion, integration, ETL, or security engineering experience with large-scale, globally distributed implementations
- Extensive knowledge of globally distributed environments across multiple platforms such as AWS, Azure, and GCP
- Data-driven mindset
- Strong understanding of Data Management or Data Engineering
- Strong grounding in data analysis and related processes
- Experience with Agile methods and the Atlassian stack (e.g., JIRA) or related tools
- Ability to develop roadmaps and underlying strategies for data-centric products and services
- Experience with standard monitoring frameworks and observability products
- Experience with hybrid environment data sources, data collectors and instrumentation
- Expertise in the use of SIEM solutions for basic and advanced detection methods, including cloud-based data sources
- Experience with security monitoring and observability solutions such as Splunk, Sumo, Datadog, New Relic, or AppDynamics
- Experience working with cloud and data security in DevSecOps/IRE and agile environments
- Expertise in at least one scripting language (PowerShell, Python, Bash)
- Experience in container/container orchestration technologies - Docker and Kubernetes
- Experience with systems configuration and orchestration tools such as Ansible or Terraform
- Understanding of infrastructure-as-code concepts
- Related security certifications (e.g., CISSP, CCSP, SABSA, ITIL)
- Familiarity and experience with Splunk's SPL query language
- 3+ years of experience with Linux/Ubuntu/macOS systems
- Experience creating dashboards, queries, and alerts in Splunk, Datadog, or Sumo
- Experience troubleshooting Splunk connectivity issues, including the networking, server (Windows and Linux), and application tiers
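As a rough illustration of the scripting skills listed above, here is a minimal Python sketch of the kind of log-analysis task this role might automate before events are forwarded to a SIEM. The log format, field names, and alert threshold are all hypothetical examples, not anything prescribed by this posting.

```python
# Hypothetical sketch: count failed-login events per user from raw log
# lines and flag users who exceed an alert threshold. Field names
# (status=, user=) and the threshold are illustrative assumptions.
import re
from collections import Counter

FAILED_LOGIN = re.compile(r"status=failed\s+user=(?P<user>\S+)")
THRESHOLD = 3  # hypothetical alert threshold

def failed_login_counts(lines):
    """Count failed-login events per user from raw log lines."""
    counts = Counter()
    for line in lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group("user")] += 1
    return counts

def users_over_threshold(lines, threshold=THRESHOLD):
    """Return users whose failed-login count meets the alert threshold."""
    return sorted(
        user for user, n in failed_login_counts(lines).items()
        if n >= threshold
    )

if __name__ == "__main__":
    sample = [
        "2024-01-01T00:00:01 status=failed user=alice",
        "2024-01-01T00:00:02 status=ok user=bob",
        "2024-01-01T00:00:03 status=failed user=alice",
        "2024-01-01T00:00:04 status=failed user=alice",
    ]
    print(users_over_threshold(sample))  # alice has 3 failures
```

In practice a detection like this would typically live as an SPL search or a Datadog monitor rather than a standalone script; the sketch only shows the underlying logic.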
Top Skills
Ansible
AppDynamics
AWS
Azure
Bash
Datadog
Docker
GCP
JIRA
Kubernetes
Linux
New Relic
PowerShell
Python
Splunk
Sumo
Terraform
Ubuntu