
Job Title:

Data Engineer (Analyst)

Company: Canopus Infosystems - A CMMI Level 3 Company

Location: Gurgaon, Haryana

Created: 2026-02-28

Job Type: Full Time

Job Description:

Job Title: Data Engineer (Analyst)
Experience: 2.5 to 5 Years
Location: PAN India (Remote/On-site as applicable)

About the Role:
We are looking for a Data Engineer (Analyst) to build and maintain reliable data pipelines and analytics-ready datasets that power BI reporting, product insights, and business decision-making. You'll work across multiple data sources, model clean reporting layers, and ensure data quality end to end.

Key Responsibilities:
- Build and maintain scalable ETL/ELT pipelines (batch and incremental) using SQL and Python
- Integrate data from databases, APIs, SaaS tools, event data, and flat files
- Design analytics-ready data models (star schemas/marts) for self-serve reporting
- Create and optimize transformations in a cloud warehouse/lakehouse (e.g., Snowflake, BigQuery, Redshift, Synapse, Databricks)
- Partner with stakeholders to define KPIs, metric logic, and reporting requirements
- Maintain dashboards and reporting outputs in tools such as Power BI, Tableau, Looker, or Sigma
- Implement data quality checks, monitoring, alerts, and documentation to keep datasets trusted
- Tune performance and cost (incremental loads, partitioning, query optimization, file formats)

Required Skills:
- Strong SQL skills (CTEs, window functions, joins, aggregations, optimization)
- Strong Python skills for transformations and automation
- Hands-on experience with at least one cloud platform: AWS / Azure / GCP
- Experience with a modern data warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks)
- Solid understanding of ETL/ELT patterns (incremental loads, retries, idempotency, basic CDC)
- Comfort with data modeling for analytics and BI reporting
- Experience building stakeholder-friendly reporting in a BI tool (Power BI/Tableau/Looker/Sigma)

Nice to Have:
- Orchestration tools: Airflow, dbt, Dagster, Prefect, ADF, Glue, etc.
- Streaming/event data: Kafka, Kinesis, Pub/Sub
- Monitoring/logging: CloudWatch, Azure Monitor, GCP Monitoring, Datadog
- CI/CD and Git-based workflows for data pipelines
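For candidates wondering what "idempotency" means among the ETL/ELT patterns listed above, here is a minimal sketch in Python. It uses an in-memory SQLite database purely for illustration; the table and column names (orders_mart, order_id, etc.) are hypothetical, not part of the role's actual stack. The key property: replaying the same batch leaves the target table unchanged.

```python
import sqlite3

# Illustrative target table for a reporting mart (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders_mart ("
    "  order_id INTEGER PRIMARY KEY,"
    "  amount REAL,"
    "  updated_at TEXT)"
)

def load_batch(rows):
    """Upsert a batch keyed on order_id; re-running the same batch is a no-op."""
    conn.executemany(
        """INSERT INTO orders_mart (order_id, amount, updated_at)
           VALUES (?, ?, ?)
           ON CONFLICT(order_id) DO UPDATE SET
             amount = excluded.amount,
             updated_at = excluded.updated_at""",
        rows,
    )
    conn.commit()

batch = [(1, 100.0, "2024-01-01"), (2, 50.0, "2024-01-01")]
load_batch(batch)
load_batch(batch)  # replayed batch: no duplicate rows, same end state
count = conn.execute("SELECT COUNT(*) FROM orders_mart").fetchone()[0]
print(count)  # 2, not 4 -- the load is idempotent
```

The same key-based merge idea carries over to warehouse-native MERGE statements in Snowflake, BigQuery, or Databricks.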

Apply Now
