
Job Title:

Senior Data Engineer (Databricks | Insurance | Data Migration)

Company: PwC India

Location: Mumbai, Maharashtra

Created: 2025-12-13

Job Type: Full Time

Job Description:

Senior Data Engineer (Databricks | Insurance | Data Migration)
Location: Pan India
Experience: 7–12 years

Role Overview
We are looking for a highly skilled Senior Manager / Manager / Senior Data Engineer with deep expertise in Databricks data management, logical and physical data modelling, and insurance domain data workflows. The candidate will work on a strategic data migration initiative for a leading UK-based insurance company, moving data from Guidewire into Databricks Silver and Gold layers with strong governance, lineage, and scalability standards.

Key Responsibilities

Databricks Data Engineering & Management
- Design, build, and optimize Silver and Gold layer data pipelines in Databricks using PySpark, SQL, Delta Lake, and workflow orchestration (an illustrative sketch follows this description).
- Implement data quality, lineage, schema evolution, and governance controls across curated layers.
- Optimize Databricks jobs for performance, scalability, and cost efficiency.

Guidewire → Databricks Migration
- Lead the end-to-end migration of large-scale insurance data from Guidewire PolicyCenter/ClaimCenter/BillingCenter into Databricks.
- Map and transform complex Guidewire entity structures into normalized and star-schema models.

Data Modelling & Architecture
- Develop robust logical and physical data models aligned to insurance business processes.
- Build high-quality curated data marts (Gold) for analytics, reporting, pricing, underwriting, and claims.
- Define standards for metadata, naming conventions, partitioning, and model documentation.

Insurance Domain Expertise
- Understand core insurance data entities such as policy, claims, billing, customer, underwriting, rating, and product hierarchies.
- Apply domain knowledge to rationalize Guidewire data structures and create business-ready datasets.

Solutioning & Ideation
- Collaborate with client SMEs, architects, and business analysts to shape data solutions and propose design improvements.
- Ideate, simplify complex data flows, and contribute to the overall solution architecture.

Required Skills & Experience

Technical
- 7–12 years of experience in data engineering, data modelling, and data management.
- Strong hands-on experience in Databricks, Delta Lake, PySpark, Spark SQL, and ETL/ELT pipelines.
- Expertise in logical and physical data modelling (3NF, Star Schema; Data Vault preferred).
- Practical knowledge of the Guidewire data model and prior migration experience (mandatory).
- Experience working with large-scale insurance datasets.
- Strong understanding of data quality frameworks, lineage, cataloging, and governance.

Soft Skills
- Strong problem-solving and conceptualization/ideation capability.
- Excellent communication and stakeholder management in a UK client environment.
- Ability to work in fast-paced delivery tracks with cross-functional global teams.

Preferred Qualifications
- Certifications in Databricks, Azure/AWS, and data migration are an added advantage.
- Experience delivering enterprise-grade data lake or lakehouse architectures.

Why Join This Role?
- Work on a flagship insurance data modernisation project for a top UK carrier.
- Opportunity to shape enterprise-scale data models on the Databricks Lakehouse.
- High-visibility role with strong career growth in insurance data engineering.
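As a rough illustration of the Silver-to-Gold pipeline work described above (not part of the role requirements), the following is a minimal PySpark sketch: it reads a Silver-layer claims table, applies a basic quality filter, aggregates to a Gold-layer claims summary, and writes the result back as a Delta table. The table names (silver.claims, gold.claims_summary) and columns (policy_id, claim_status, claim_amount) are hypothetical placeholders, not actual Guidewire or client schema.

from pyspark.sql import SparkSession, functions as F

# Illustrative only: table and column names are hypothetical placeholders.
spark = SparkSession.builder.appName("silver_to_gold_claims_summary").getOrCreate()

# Read the curated Silver-layer claims table (stored as Delta).
silver_claims = spark.read.table("silver.claims")

# Basic data-quality filter: drop records missing key fields.
valid_claims = silver_claims.filter(
    F.col("policy_id").isNotNull() & F.col("claim_amount").isNotNull()
)

# Aggregate to a Gold-layer summary: claim totals and counts per policy and status.
gold_claims_summary = (
    valid_claims.groupBy("policy_id", "claim_status")
    .agg(
        F.sum("claim_amount").alias("total_claim_amount"),
        F.count("*").alias("claim_count"),
    )
)

# Write the Gold table as Delta, overwriting the previous snapshot.
gold_claims_summary.write.format("delta").mode("overwrite").saveAsTable("gold.claims_summary")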

