IRIS Data Engineer – (Azure & Snowflake, Python, SOX, CPLEX/Gurobi)

Job Location: 100% remote (India)
Job Duration: 6 months; Hours/Week: 40.00

The position is 100% remote; offshore or nearshore candidates are welcome as long as they can work US core business hours. As IRIS Data Engineer, you will work with Data Scientists and Data Architects to translate prototypes into scalable solutions. A degree is not required, as long as the candidate has the right skill set and can commit to the assigned work/projects.
• There will likely be 2 rounds of interviews for this position.

Key Skills Required
• Programming: Python, SQL, Spark
• Cloud Platforms: Azure, Snowflake
• Data Tools: DBT, Erwin Data Modeler, Apache Airflow, API integrations, ADF
• Governance: Data masking, metadata management, SOX compliance
• Soft Skills: Communication, problem-solving, stakeholder engagement

Key Responsibilities:
1. Data Pipeline Design & Development
Data Engineers are responsible for designing and building robust, scalable, and high-quality data pipelines that support analytics and reporting needs. This includes:
• Integration of structured and unstructured data from various sources into data lakes and warehouses.
• Build and maintain scalable ETL/ELT pipelines for batch and streaming data using Azure Data Factory, Databricks, Snowflake, Azure SQL Server, and Control-M (see the illustrative pipeline sketch after the Key Responsibilities section).
• Collaborate with data scientists, analysts, and platform engineers to enable analytics and ML use cases.
• Design, develop, and optimize DBT models to support scalable data transformations.
2. Cloud Platform Engineering
Data Engineers operationalize data solutions on cloud platforms, integrating services such as Azure, Snowflake, and third-party technologies.
• Manage environments, performance tuning, and configuration for cloud-native data solutions.
3. Data Modeling & Architecture
• Apply dimensional modeling, star schemas, and data warehousing techniques to support business intelligence and machine learning workflows.
• Collaborate with solution architects and analysts to ensure models meet business needs.
4. Data Governance & Security
• Ensure data integrity, privacy, and compliance through governance practices and secure schema design.
• Implement data masking, access controls, and metadata management for sensitive datasets.
5. Collaboration & Agile Delivery
• Work closely with cross-functional teams, including product owners, architects, and business stakeholders, to translate requirements into technical solutions.
• Participate in Agile ceremonies, sprint planning, and DevOps practices for continuous integration and deployment.
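For illustration only, here is a minimal sketch of the kind of batch pipeline described under responsibilities 1 and 4: read raw data, mask a sensitive column, aggregate, and publish a curated table. It assumes a PySpark environment such as Azure Databricks, and all paths, table names, and column names (orders, customer_email, amount, region) are hypothetical.

    # Minimal illustrative batch pipeline (hypothetical names throughout).
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_batch_pipeline").getOrCreate()

    # Ingest raw, structured data from the data lake.
    raw = spark.read.parquet("/mnt/datalake/raw/orders")

    # Governance step: mask a sensitive column before downstream use.
    masked = raw.withColumn("customer_email", F.sha2(F.col("customer_email"), 256))

    # Transformation step: daily revenue per region.
    daily_revenue = (
        masked
        .groupBy("region", F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("total_revenue"))
    )

    # Publish the curated output for analytics and reporting.
    daily_revenue.write.mode("overwrite").saveAsTable("curated.daily_revenue")

In a production setting, a transformation like this would typically be orchestrated by Azure Data Factory, Apache Airflow, or Control-M rather than run ad hoc.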
Technical Skills:
• 7+ years of data engineering or design experience designing, developing, and deploying scalable enterprise data analytics solutions, from source systems through ingestion and reporting.
• 5+ years of experience with the ML lifecycle using Azure Kubernetes Service, Azure Container Instances, Azure Data Factory, Azure Monitor, and Azure Databricks, building datasets, ML pipelines, experiments, logging, and monitoring (including drift, model adaptation, and data collection).
• 5+ years of experience in data engineering using Snowflake.
• Experience in designing, developing, and scaling complex data and feature pipelines feeding ML models and evaluating their performance.
• Experience in building and managing streaming and batch inferencing.
• Proficiency in SQL and any one other programming language (e.g., R, Python, C++, Minitab, SAS, MATLAB, VBA); knowledge of optimization engines such as CPLEX or Gurobi (see the illustrative sketch after the Professional Skills section).
• Strong experience with cloud platforms (AWS, Azure, etc.) and containerization technologies (Docker, Kubernetes).
• Experience with CI/CD tools such as GitHub Actions, GitLab, Jenkins, or similar.
• Familiarity with security best practices in DevOps and MLOps.
• Experience in developing and maintaining APIs (e.g., REST).
• Agile/Scrum operating experience using Azure DevOps.
• Experience with Microsoft cloud ML services: Azure Databricks, Data Factory, Synapse, among others.

Professional Skills:
• Strong analytical and problem-solving skills and a passion for product development.
• Strong understanding of Agile methodologies and openness to working in agile environments with multiple stakeholders.
• Professional attitude and service orientation; team player.
• Ability to translate business needs into potential analytics solutions.
• Strong work ethic; ability to work at an abstract level and gain consensus.
• Ability to build a sense of trust and rapport to create a comfortable and effective workplace.
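For illustration of the optimization-engine requirement above, here is a minimal linear-programming sketch using Gurobi's Python API (gurobipy). The variables, coefficients, and constraints are hypothetical, and the example assumes gurobipy is installed and licensed; an equivalent model could be expressed with CPLEX's Python API.

    # Minimal illustrative linear program (hypothetical data):
    # maximize 3x + 2y subject to x + y <= 4 and x <= 3, with x, y >= 0.
    import gurobipy as gp
    from gurobipy import GRB

    model = gp.Model("illustrative_lp")

    x = model.addVar(lb=0, name="x")
    y = model.addVar(lb=0, name="y")

    model.setObjective(3 * x + 2 * y, GRB.MAXIMIZE)
    model.addConstr(x + y <= 4, name="capacity")
    model.addConstr(x <= 3, name="x_limit")

    model.optimize()

    if model.Status == GRB.OPTIMAL:
        print(f"x = {x.X}, y = {y.X}, objective = {model.ObjVal}")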
Job Title: IRIS Data Engineer