Job Title:

Analytics Engineer

Company: AARC Environmental

Location: Noida, Uttar Pradesh

Created: 2025-09-26

Job Type: Full Time

Job Description:

About the Role

We're hiring a hands-on Data Analyst to turn business questions into reliable datasets, clear visuals, and automated workflows. You'll build and maintain Power BI dashboards, shape data models, and orchestrate pipelines using Microsoft Fabric. You'll also own automation for API-based data acquisition and notifications, perform web scraping (ethically and compliantly), and write Python to automate, cleanse, and enrich data. The ideal candidate is equal parts analyst and problem-solver, comfortable moving from stakeholder requirements to production-grade dashboards and automations.

What You'll Do

Business Intelligence & Visualization
- Build, iterate on, and maintain Power BI dashboards; design robust data models (Power Query, DAX), optimize performance, and manage Row-Level Security (RLS).
- Translate stakeholder requirements into insightful visuals and clear data stories; support production refreshes and user inquiries.

Data Engineering with Power BI & Fabric
- Use Microsoft Fabric (e.g., Dataflows Gen2, Lakehouse, Pipelines) or Gen1 Power BI dataflows to build reliable ETL/ELT processes and refresh schedules.
- Query and validate data with SQL (plus KQL or PySpark where applicable for larger or log-style datasets).

Workflow Automation & Power Automate
- Design, build, and maintain automation scenarios (HTTP modules, routers, iterators, data stores) to integrate third-party REST APIs, webhooks, and internal systems; implement scheduling, logging, alerting, retries, and error handling.
- Use Power Automate for complementary internal workflow automation and notifications.

API Integration & Data Acquisition
- Connect to external services via REST/Graph APIs (OAuth or API keys), handle pagination and rate limits, and ingest JSON into curated datasets for BI and analytics (see the API-ingestion sketch following this description).
- Configure inbound/outbound webhooks between these systems and other tools to streamline processes.

Web Scraping
- Build targeted scrapers (HTTP requests plus parsing, or Python with requests/beautifulsoup4/playwright/scrapy) to collect publicly available data (see the scraping sketch below).

Python for Analytics & Ops
- Write production-minded Python for data cleaning, reconciliation, and transformations (e.g., pandas); package repeatable scripts, add unit tests where practical, and schedule jobs via Fabric or the automation platform (see the pandas sketch below).

Platform & Stakeholder Support
- Provide first-line support for BI dashboards, dataset refreshes, and workflow issues; help administer the automation platform (permissions, simple automations, troubleshooting).
- Document data models, metrics, pipelines, and automation scenarios; champion data quality and governance.

What You'll Bring

Required Qualifications
- Bachelor's degree in Computer Science (or equivalent experience).
- 2–5 years in data analytics/BI with hands-on Power BI (data modeling, DAX, refreshes, RLS).
- Practical experience building automations in a workflow-automation platform (e.g., Power Automate, Zapier, n8n), integrating REST APIs and webhooks.
- Python skills for ETL, automation, and scraping (e.g., pandas, requests, bs4, playwright/scrapy).
- Working SQL proficiency; familiarity with Microsoft Fabric concepts and ETL in Power Query.
- Strong troubleshooting, communication, and stakeholder-support skills (especially for dashboards and workflows).

Preferred Qualifications
- Experience with KQL (Kusto) or PySpark; Azure Data Factory; integrating data via REST APIs into Power BI.
- Exposure to Copilot in Fabric or Azure OpenAI to accelerate BI workflows.
- Certifications: PL-300 (Power BI Data Analyst), DP-600 (Fabric Analytics Engineer), AI-900 (Azure AI Fundamentals).
- Workflow-platform administration experience; understanding of access controls and basic governance.
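API-ingestion sketch. The API Integration responsibility above mentions pagination, rate limits, and turning JSON into curated datasets. The following is a minimal illustrative sketch of that pattern; the endpoint URL, auth header, pagination flag, and field names are hypothetical placeholders, not part of this role's actual systems.

```python
# Minimal sketch: paginated REST API pull with basic rate-limit handling.
# Endpoint, token, and pagination fields below are hypothetical examples.
import time
import requests
import pandas as pd

BASE_URL = "https://api.example.com/v1/orders"     # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}  # OAuth / API-key token

def fetch_all(url: str, headers: dict) -> list[dict]:
    """Follow pages until the API reports there are no more."""
    records, page = [], 1
    while True:
        resp = requests.get(url, headers=headers,
                            params={"page": page, "per_page": 100}, timeout=30)
        if resp.status_code == 429:                # rate limited: back off, retry
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload.get("data", []))
        if not payload.get("has_more"):            # hypothetical pagination flag
            break
        page += 1
    return records

if __name__ == "__main__":
    rows = fetch_all(BASE_URL, HEADERS)
    df = pd.json_normalize(rows)                   # flatten nested JSON for BI use
    df.to_parquet("orders_curated.parquet", index=False)
```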
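Scraping sketch. The Web Scraping responsibility calls for targeted, compliant collection of publicly available data. Below is a minimal sketch using requests and beautifulsoup4; the URL and CSS selectors are invented for illustration, and any real scraper should respect robots.txt, site terms, and rate limits.

```python
# Minimal scraping sketch with requests + beautifulsoup4.
# The target URL and selectors are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/public-listings"        # hypothetical public page

def scrape_listings(url: str) -> list[dict]:
    resp = requests.get(url, headers={"User-Agent": "data-team-bot/1.0"}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    items = []
    for card in soup.select("div.listing"):        # hypothetical selector
        title = card.select_one("h2")
        price = card.select_one(".price")
        if title and price:                        # skip malformed cards
            items.append({"title": title.get_text(strip=True),
                          "price": price.get_text(strip=True)})
    return items

if __name__ == "__main__":
    for row in scrape_listings(URL):
        print(row)
```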
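Pandas sketch. The Python for Analytics & Ops responsibility covers cleaning, reconciliation, and repeatable scripts. This is one minimal way that work can look; the file names and column names are assumptions chosen for the example.

```python
# Minimal pandas cleaning/reconciliation sketch.
# File names and columns are hypothetical; the pattern is load, standardize,
# reconcile, and write out the gaps for review.
import pandas as pd

def clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return df.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])

if __name__ == "__main__":
    source = clean(pd.read_csv("source_system.csv"))       # hypothetical extract
    target = clean(pd.read_csv("warehouse_extract.csv"))   # hypothetical extract
    # Reconciliation: rows present in the source but missing from the warehouse.
    missing = source[~source["order_id"].isin(target["order_id"])]
    missing.to_csv("reconciliation_gaps.csv", index=False)
    print(f"{len(missing)} unmatched rows written to reconciliation_gaps.csv")
```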

Apply Now
