
Job Title: Senior Data Engineer

Company: United Technologies & Information Services LLC

Location: Mumbai, Maharashtra

Created: 2026-04-04

Job Type: Full Time

Job Description:

ABOUT UTIS

United Technologies & Information Services LLC (UTIS) is a boutique IT staffing and consulting firm headquartered in Irvine, California. We specialize in data engineering, ETL architecture, and enterprise analytics solutions built on the Microsoft data stack. Our consultants deliver hands-on technical work for Fortune 500 clients across the semiconductor, financial services, and real estate technology verticals.

We are investing in AI-assisted development workflows, including Claude Code by Anthropic, to accelerate delivery and set a new standard for how ETL modernization projects are executed.

ROLE OVERVIEW

We are seeking an experienced Senior Data Engineer based in India to lead the conversion of approximately 100 legacy SSIS (SQL Server Integration Services) packages to Microsoft Fabric notebooks and pipelines. This is a hands-on technical role, not a management position.

You will build and operate an AI-augmented migration framework using Claude Code that automates the repetitive structural conversion of SSIS packages, while applying your data engineering expertise to handle edge cases, validate output quality, and ensure production-grade results.

This role combines deep SSIS knowledge with modern PySpark/Fabric skills and comfort with agentic AI development tooling. You will work directly with the UTIS founder and U.S.-based client stakeholders.

KEY RESPONSIBILITIES

Migration Framework Development (Weeks 1–3)
- Analyze and inventory the existing SSIS package portfolio (~100 .dtsx files), documenting all data flows, control flows, connection managers, variables, expressions, and dependencies.
- Design and build an agentic migration accelerator using Claude Code, with a comprehensive CLAUDE.md rules file encoding SSIS-to-Fabric component mappings, coding standards, and validation criteria.
- Implement a multi-agent orchestration workflow: a Parser Agent (DTSX XML to structured JSON), a Mapper Agent (component-level translation rules), a Generator Agent (PySpark notebook / Fabric pipeline JSON output), and a Validator Agent (reconciliation checks). An illustrative Parser Agent sketch appears after this section.
- Establish a repeatable, version-controlled pipeline for processing packages in batches with confidence scoring (auto-converted vs. needs-review); see the triage sketch after this section.

Package Conversion & Validation (Weeks 4–12+)
- Execute the migration of ~100 SSIS packages to Microsoft Fabric using the accelerator framework, triaging packages by complexity tier (simple linear flows, lookup/transform-heavy, complex control flow).
- Manually convert or refine packages that require human judgment: Script Tasks (C# to Python), custom components, CDC/Change Tracking patterns, and transaction-scoped logic.
- Translate SSIS expressions, Lookup transforms, Derived Columns, Conditional Splits, and Data Conversions into equivalent PySpark operations (.join(), .withColumn(), .filter(), .cast()); see the PySpark sketch after this section.
- Map SSIS Catalog environment variables to Fabric workspace parameters, and SQL Agent job schedules to Fabric Pipeline orchestration and triggers.
- Perform reconciliation testing: row counts, data type validation, transformation logic parity, and end-to-end data quality checks against source SSIS package behavior (see the reconciliation sketch after this section).

Documentation & Knowledge Transfer
- Generate side-by-side conversion documentation for each package, with confidence scoring and manual review notes.
- Produce a migration runbook and reusable Fabric notebook templates for ongoing development post-migration.
- Participate in regular sync calls with U.S.-based stakeholders during overlapping hours (IST evenings / U.S. mornings).
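ILLUSTRATIVE CONVERSION SKETCHES

To make the responsibilities above concrete, here is a minimal sketch of the Parser Agent step, assuming the SSIS 2012+ .dtsx schema. It pulls the package name, connection managers, variables, and task executables into a JSON-friendly structure; element and attribute names should be verified against your own packages, and the file name is a placeholder.

```python
# Minimal Parser Agent sketch: .dtsx XML -> structured JSON.
# Assumes the SSIS 2012+ package schema; verify names against real packages.
import json
import xml.etree.ElementTree as ET

NS = "www.microsoft.com/SqlServer/Dts"   # DTSX namespace
NSMAP = {"DTS": NS}

def q(attr: str) -> str:
    """Qualify a DTS attribute name for ElementTree lookups."""
    return f"{{{NS}}}{attr}"

def parse_dtsx(path: str) -> dict:
    root = ET.parse(path).getroot()
    return {
        "package": root.get(q("ObjectName")),
        "connections": [c.get(q("ObjectName"))
                        for c in root.findall(".//DTS:ConnectionManager", NSMAP)],
        "variables": [v.get(q("ObjectName"))
                      for v in root.findall(".//DTS:Variable", NSMAP)],
        "executables": [{"name": e.get(q("ObjectName")),
                         "type": e.get(q("ExecutableType"))}
                        for e in root.findall(".//DTS:Executables/DTS:Executable", NSMAP)],
    }

if __name__ == "__main__":
    # "LoadCustomers.dtsx" is a placeholder file name.
    print(json.dumps(parse_dtsx("LoadCustomers.dtsx"), indent=2))
```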
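Confidence scoring for the batch pipeline could start as a simple rule over that parsed inventory. The component type name and threshold below are illustrative assumptions, not a fixed taxonomy:

```python
# Hypothetical triage rule: route a parsed package (output of parse_dtsx
# above) to auto-conversion or human review. ExecutableType strings vary
# by SSIS version; the set and threshold here are assumptions.
NEEDS_HUMAN = {"Microsoft.ScriptTask"}   # Script Tasks need C# -> Python work

def triage(parsed: dict, max_auto_tasks: int = 10) -> str:
    types = {e["type"] for e in parsed["executables"] if e["type"]}
    if types & NEEDS_HUMAN:
        return "needs-review"
    return "auto-convert" if len(parsed["executables"]) <= max_auto_tasks else "needs-review"
```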
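The component translations called out under Package Conversion & Validation map naturally onto PySpark. A sketch, with invented table and column names:

```python
# Illustrative PySpark equivalents for four common SSIS data-flow
# components. Table and column names are invented for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
orders = spark.read.table("staging_orders")     # placeholder source
customers = spark.read.table("dim_customer")    # placeholder lookup reference

# Lookup transform -> left join against the reference table
enriched = orders.join(customers, on="customer_id", how="left")

# Derived Column -> withColumn (e.g., SSIS GETDATE() -> current_timestamp())
enriched = enriched.withColumn("load_ts", F.current_timestamp())

# Conditional Split -> one filter per output path
domestic = enriched.filter(F.col("country_code") == "IN")
foreign = enriched.filter(F.col("country_code") != "IN")

# Data Conversion -> cast
domestic = domestic.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
```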
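Reconciliation testing can likewise be scripted. A minimal row-count and schema-parity check, assuming both legacy and migrated outputs are readable as Spark tables (the table names are placeholders):

```python
# Minimal reconciliation sketch: compare legacy SSIS output with the
# migrated Fabric output on row count and column data types.
def reconcile(spark, legacy_table: str, fabric_table: str) -> dict:
    legacy = spark.read.table(legacy_table)
    fabric = spark.read.table(fabric_table)
    return {
        "row_count_match": legacy.count() == fabric.count(),
        # Symmetric difference: (column, type) pairs present on one side only
        "schema_diff": sorted(set(legacy.dtypes) ^ set(fabric.dtypes)),
    }

report = reconcile(spark, "sales_legacy", "sales_fabric")
print(report)
```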
REQUIRED QUALIFICATIONS
- 5+ years of hands-on SSIS development, including complex Data Flow and Control Flow design, package deployment (SSISDB Catalog), and SQL Agent scheduling.
- Strong T-SQL proficiency and SQL Server architecture knowledge (2016+), including Change Tracking, indexed views, and stored procedure development.
- Working proficiency in Python and PySpark for data transformation and notebook-based pipeline development.
- Familiarity with Microsoft Fabric (Notebooks, Pipelines, Lakehouse, Warehouse), or demonstrated experience with Azure Data Factory / Azure Databricks as a close equivalent.
- Experience reading and parsing XML structures, specifically the SSIS .dtsx package format.
- Solid understanding of data migration methodologies, reconciliation testing, and ETL validation frameworks.
- Ability to produce clear, structured technical documentation and migration runbooks.
- Fluent English communication (written and verbal) for daily collaboration with the U.S.-based team.
- Availability for 2–3 hours of overlap with U.S. Pacific Time (e.g., IST 8:30 PM – 11:30 PM, or early-morning IST).

PREFERRED QUALIFICATIONS
- Hands-on experience using AI-assisted coding tools (Claude Code, GitHub Copilot, Cursor, or similar) in professional development workflows.
- Prior experience with at least one SSIS-to-cloud migration project (SSIS to ADF, SSIS to Databricks, or SSIS to Fabric).
- Knowledge of Change Data Capture (CDC), incremental load patterns, and SCD Type 2 implementations in both SSIS and Spark environments.
- Experience with the Fabric REST API for programmatic pipeline deployment.
- Microsoft certifications: Fabric Analytics Engineer (DP-600), Azure Data Engineer (DP-203), or equivalent.

WHAT SUCCESS LOOKS LIKE
- Week 3: Migration accelerator framework operational. First 10 packages converted and validated as a proof of concept.
- Week 8: 70+ packages converted, with reconciliation testing complete. Edge cases identified and documented.
- Week 12: Full portfolio migrated. Side-by-side documentation delivered. Fabric pipelines running in production with monitoring in place.
