
Job Title:

GCP DataOps Engineer

Company: Philodesign Technologies Inc

Location: Noida, Uttar Pradesh

Created: 2026-01-17

Job Type: Full Time

Job Description:

Job Title: GCP DataOps Engineer
Experience: 7+ years
Work Mode: Remote
Budget: ₹1,00,000 per month

Position Overview

We are seeking a highly experienced GCP DataOps Engineer to serve as a Subject Matter Expert (SME) in cloud data technologies for the enterprise and its subsidiaries. This individual contributor role involves designing, implementing, and optimizing Google Cloud-based data infrastructure, supporting data engineering initiatives, and driving cloud strategy across the organization.

The ideal candidate will demonstrate deep experience in GCP data engineering, DevOps, cloud-native ETL/ELT pipelines, and enterprise data platforms. Professional certifications such as Google Certified Professional Data Engineer or equivalent are strongly preferred.

Key Responsibilities

- Provide cloud data infrastructure support for data engineers, data analysts, and data scientists.
- Act as a technical thought leader for GCP Data Platform initiatives across the enterprise.
- Define best practices and automation standards for cloud data infrastructure.
- Design and build scalable solutions that integrate structured and unstructured data sources.
- Act as an escalation point for critical production and cloud infrastructure issues.
- Implement Infrastructure-as-Code (IaC) and automation to drive platform reliability.
- Lead enterprise-level cross-functional projects related to cloud data and platform modernization.
- Contribute to the long-term cloud data platform strategy and roadmap.

Required Experience & Technical Skills

- 7+ years of IT experience with hands-on work in data engineering and cloud data platforms.
- Strong experience architecting, deploying, and maintaining enterprise cloud data platforms.
- Hands-on experience building CI/CD and IaC pipelines using:
  ✔ Terraform
  ✔ ARM Templates
  ✔ Azure DevOps
  ✔ GitHub Actions or similar
- Proficient in programming languages for data and backend development:
  ✔ Python
  ✔ SQL / NoSQL
  ✔ C# or R (preferred)
- Experience writing test cases, test scripts, and performing data quality assurance.
- Expert-level experience with GCP Data Platform components:
  ✔ Dataprep
  ✔ Dataflow
  ✔ Dataproc
  ✔ Composer (Airflow)
  ✔ BigQuery & GCS
- Experience with cloud data warehouses:
  ✔ Snowflake
  ✔ Databricks
  ✔ BigQuery

Data Engineering & Architecture Skills

- Proven experience with real-time and batch ingestion pipeline frameworks.
- Strong understanding of:
  ✔ ETL/ELT design
  ✔ Data modeling
  ✔ Data lake & lakehouse architectures
  ✔ Medallion architecture patterns
- Experience with AI/ML productivity tooling such as:
  ✔ Vertex AI
  ✔ Dialogflow
- Experience supporting Data Science & Analytics teams in AI/ML model deployment workflows.
- Knowledge of OLTP & OLAP systems, schema design, and query optimization.

Governance, Security & Operations

- Working knowledge of governance & metadata tools such as:
  ✔ Informatica
  ✔ Alation
  ✔ Dataiku
- Implement lineage, monitoring, alerting, and quality frameworks across the platform.
- Ensure compliance with data security controls and regulatory standards.
- Enable cost monitoring, system reliability, and resource efficiency across cloud usage.

Educational Requirements

- Bachelor's degree in Computer Science, Engineering, or a related field, OR
- Equivalent practical training and experience in cloud data engineering (minimum 2 additional years).

Preferred Certifications

- Google Professional Data Engineer (strongly preferred)
- Google Cloud Architect
- DevOps-oriented certifications (AWS, Azure, Kubernetes, Terraform, etc.)


Copyright © 2005 to 2026 [VHMnetwork LLC]. All rights reserved. Designed, developed, and maintained by NextGen TechEdge Solutions Pvt. Ltd.