Job Title: Data Engineer - Databricks
Company: Sparix Global
Location: Uttar Pradesh
Created: 2026-05-12
Job Type: Full Time
Job Description:
Job Summary:
1. Design and implement data migration solutions from SQL Server to Databricks.
2. Use Databricks, Apache Spark, and Airflow for data engineering tasks.
3. Work extensively with MS SQL and Azure SQL databases.
4. Utilize Data Lake and Delta Lake for storage and processing.
5. Implement and manage Unity Catalog for data governance.
6. Build and maintain data marts and data warehouses using star and snowflake schemas.
7. Write and optimize advanced SQL queries, stored procedures, views, and triggers (5+ years).
8. Perform query performance tuning (5+ years).
9. Handle both DDL and DML operations (5+ years).
10. Design enterprise database systems, preferably with Microsoft SQL Server/Azure SQL (5+ years).
11. Manage and execute data migration projects.
12. Apply knowledge of Lakehouse architecture (preferred).
13. Participate in rapid prototyping and proof-of-concept development.
14. Contribute to technical architecture and best-practice implementation.
15. Assist in developing and refining functional and non-functional requirements.
16. Participate in iteration and release planning activities.
17. Ensure compliance with relevant state and federal regulations.
18. Deliver high-quality, functioning solutions on time.
19. Estimate and plan tasks accurately.
20. Collaborate closely with a small, agile team.
21. Adapt to rapid iteration, tight deadlines, and changing priorities.
22. Take on diverse tasks and support the team as needed.
23. Communicate technical concepts clearly, both verbally and in writing.
24. Accept direction from leadership and work independently.
25. Apply source/version control systems, including branching and merging strategies.
26. Demonstrate knowledge of Web APIs, REST, and JSON.
27. Create unit tests for developed solutions.
28. Hold a Bachelor's Degree and/or 5+ years of relevant work experience.
29. Adhere strictly to Information Security Management policies and procedures.
Mandatory Skillset: Databricks, Unity Catalog, Data Migration, Azure Data Factory (ADF), Data Lake
Location: Remote
Notice Period: Immediate