Job Title: Senior GenAI Data Engineer
Company: Cargill
Location: Ajmer, Rajasthan
Created: 2026-03-10
Job Type: Full Time
Job Description:
Job Purpose and Impact:
The Senior Professional, Data Engineering job designs, builds and maintains complex data systems that enable data analysis and reporting. With minimal supervision, this job ensures that large sets of data are efficiently processed and made accessible for decision making.

Key Accountabilities:
- DATA INFRASTRUCTURE: Prepares data infrastructure to support the efficient storage and retrieval of data.
- DATA FORMATS: Evaluates and selects appropriate data formats to improve data usability and accessibility across the organization.
- DATA & ANALYTICAL SOLUTIONS: Develops complex data products and solutions using advanced engineering and cloud-based technologies, ensuring they are designed and built to be scalable, sustainable and robust.
- DATA PIPELINES: Develops and maintains streaming and batch data pipelines that ingest data from various sources, transform the data into information and move it to data stores such as data lakes and data warehouses.
- DATA SYSTEMS: Reviews existing data systems and architectures to identify areas for improvement and optimization.
- STAKEHOLDER MANAGEMENT: Collaborates with multi-functional data and advanced analytics teams, as well as business teams, to gather requirements and ensure that data solutions meet the functional and non-functional needs of various partners.
- DATA FRAMEWORKS: Builds complex prototypes to test new concepts and implements data engineering frameworks and architectures that improve data processing capabilities and support advanced analytics initiatives.
- AUTOMATED DEPLOYMENT PIPELINES: Develops automated deployment pipelines that improve the efficiency of code deployments with fit-for-purpose governance.
- DATA MODELING: Performs complex data modeling in accordance with the datastore technology to ensure sustainable performance and accessibility.

Qualifications:
Minimum requirement of 6 years of relevant work experience.
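For candidates unfamiliar with the pipeline work described above, the ingest-transform-load responsibility can be sketched in miniature. This is a purely illustrative Python example with hypothetical names and in-memory data, not Cargill's actual stack; a real pipeline would read from source systems and load into a data lake or warehouse such as Snowflake.

```python
# Minimal batch pipeline sketch: ingest -> transform -> load.
# All names and records here are hypothetical placeholders.

def ingest():
    """Simulate pulling raw string-typed records from a source system."""
    return [
        {"id": 1, "qty": "10", "price": "2.5"},
        {"id": 2, "qty": "4", "price": "7.0"},
    ]

def transform(rows):
    """Cast types and derive a revenue column (the transform step)."""
    out = []
    for r in rows:
        qty, price = int(r["qty"]), float(r["price"])
        out.append({"id": r["id"], "qty": qty, "price": price,
                    "revenue": qty * price})
    return out

def load(rows, warehouse):
    """Upsert transformed rows into an in-memory 'warehouse' table,
    keyed by id so re-runs are idempotent."""
    for r in rows:
        warehouse[r["id"]] = r
    return warehouse

warehouse = load(transform(ingest()), {})
print(warehouse[1]["revenue"])  # 25.0
```

The same three-stage shape scales up to the streaming case, where ingest becomes a consumer loop over a message broker rather than a one-shot read.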
TECHNICAL SKILLS REQUIRED:
- Data Platform Design: Designing scalable ELT data platforms on Snowflake supporting batch and real-time workloads
- Advanced Python Engineering: Building production-grade Python pipelines and reusable data frameworks, with working knowledge of .NET services and integrations
- Snowflake & Relational Database Expertise: Deep knowledge of Snowflake architecture, advanced SQL, and experience working with Oracle, SQL Server, and PostgreSQL
- Batch & Real-Time Processing: Designing and operating reliable batch and streaming/real-time data pipelines using Apache Kafka and Apache Pulsar
- Performance & Cost Optimization: Optimizing Snowflake queries, warehouse usage, and Python workloads for efficiency and scale
- Security & Governance: Implementing access controls, data protection, and secure data-sharing patterns across data platforms
- Reliability & Data Quality: Ensuring pipeline resilience, monitoring, and data quality across critical datasets
- GenAI Enablement: Enabling GenAI use cases through high-quality data pipelines, including preparation of structured and unstructured data, embeddings, and integration with OpenAI (e.g., RAG-style workflows)

PREFERRED COMPETENCIES:
- Proven experience working in the Trading and/or Finance industry
- Proven experience with MS Power BI and Tableau
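The GenAI enablement skill above usually means building retrieval pipelines over embedded data. A minimal sketch follows, with a toy bag-of-words vectorizer standing in for a real embedding model (in production the embed step would call an embedding API such as OpenAI's); all document texts and function names are illustrative assumptions.

```python
# RAG-style retrieval sketch: embed documents and a query, then rank
# documents by cosine similarity. The embed() below is a deliberately
# simple stand-in for a real embedding model, kept self-contained.
import math
from collections import Counter

def embed(text):
    """Toy embedding: lowercase bag-of-words counts (not a real model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query embedding."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Snowflake warehouse sizing and query optimization",
    "Kafka topic partitioning for streaming ingestion",
    "Tableau dashboard design basics",
]
top = retrieve("optimize snowflake query cost", docs)
print(top[0])  # the Snowflake document ranks first
```

The retrieved text would then be passed to a language model as context, which is the "retrieval-augmented generation" pattern the posting refers to.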