
Job Title:

Senior Data Engineer

Company: R Systems

Location: Gurgaon, Haryana

Created: 2026-03-06

Job Type: Full Time

Job Description:

We are seeking a highly skilled Google Cloud Platform (GCP) Data Engineer with SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate combines deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role collaborates with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.

Key Responsibilities

1. Leadership & Collaboration
- Lead and mentor a team of data engineers in building ETL/ELT pipelines from SAP and other ERP sources into GCP.
- Set engineering standards, best practices, and coding guidelines.
- Provide technical direction, code reviews, and support for complex data solutions.
- Collaborate with project managers: provide estimates, track progress, and remove roadblocks to ensure timely completion of work.
- Collaborate with BI teams and data analysts to enable reporting solutions.

2. Data Architecture & Modeling
- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.

3. Google Cloud Platform Expertise
- Design data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Implement cost optimization strategies for GCP workloads.

4. Data Pipelines & Integration
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow.
- Integrate data from multiple systems, including SAP BW, SAP HANA, and Business Objects, using tools such as SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.

5. Programming & Analytics
- Develop complex SQL queries for analytics, transformations, and performance tuning.
- Build automation scripts and utilities in Python.

6. System Migration
- Lead on-premise-to-cloud migrations for enterprise data platforms (SAP BW/BObj).
- Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.

7. DevOps for Data
- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
- Apply infrastructure-as-code principles for reproducible and scalable deployments.

8. Data Modelling
- Design and develop conceptual, logical, and physical data models for enterprise systems.
- Translate business requirements into data entities, attributes, relationships, and constraints.
- Build and maintain dimensional models (star/snowflake schema) for data warehouses and BI reporting.
- Develop data models for data lake/lakehouse environments (BigQuery, Snowflake, Azure Synapse, Databricks).
- Define and document data standards, naming conventions, and data definitions.
- Collaborate with data engineering teams to ensure models are implemented accurately in ETL/ELT pipelines.
- Work with BI teams to optimize models for reporting tools such as Power BI, Tableau, SAP BW, etc.
- Support integration across multiple source systems (SAP, Salesforce, Oracle, etc.).
- Ensure data models comply with data governance, security, and compliance requirements.
- Create and maintain documentation, including ERDs, data dictionaries, and lineage diagrams.

Preferred Skills
- 4-6 years of proven experience with GCP BigQuery, Composer, Cloud Storage, Pub/Sub, and Dataflow.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and security best practices.
- Familiarity with DevOps tools for data.
- Understanding of the Google Cortex Framework for SAP-GCP integrations.
