Job Title: Data Engineer
Company: Wissen Technology
Location: Mumbai, Maharashtra
Created: 2025-12-15
Job Type: Full Time
Job Description:
About the Role
We are looking for a highly skilled Data Engineer with strong hands-on experience in SQL, Informatica (PowerCenter / IICS), and Python. The ideal candidate will design, develop, and maintain scalable ETL pipelines and database solutions while collaborating with cross-functional teams to support business data needs, analytics initiatives, and platform integration.

If you enjoy solving complex data challenges, optimizing pipelines, and working across modern data ecosystems, we would love to meet you!

Key Responsibilities
- Design, develop, and optimize SQL queries, stored procedures, views, and triggers for large and complex datasets.
- Build and maintain ETL workflows using Informatica PowerCenter / Informatica Cloud (IICS).
- Develop custom Python scripts, APIs, and automation components, and integrate data flows between systems.
- Perform detailed data validation, profiling, and troubleshooting to ensure data quality and consistency.
- Collaborate with Business Analysts, Data Engineers, and Application Teams to translate data requirements into scalable solutions.
- Apply strong performance tuning, indexing strategies, and query optimization techniques.
- Maintain high standards of data security, documentation, and compliance for all ETL and data engineering processes.
- Work within Agile/Scrum teams and version control workflows (Git).

Required Skills & Experience
- 4–7 years of hands-on experience as a SQL Developer / ETL Developer / Data Engineer.
- Strong proficiency in SQL (Oracle / SQL Server / PostgreSQL).
- Solid experience with Informatica PowerCenter and/or Informatica Cloud (IICS).
- Strong Python programming skills (preferred over Java) for automation, API integrations, data transformation, and utility scripts.
- Good understanding of ETL concepts, data warehousing, data modeling, and best practices.
- Experience with Git, CI/CD practices, and Agile project methodologies.

Good to Have
- Exposure to cloud data platforms (AWS, Azure, or GCP).
- Familiarity with Unix/Linux shell scripting.
- Knowledge of Spark / Airflow is a plus (not mandatory).

Interview Process
1️⃣ Python Assessment (Online Test)
2️⃣ Technical Round 1
3️⃣ Technical Round 2
4️⃣ Technical Round 3 (Final Round – Face-to-Face at Office)

Who Can Apply?
✔ Candidates currently serving notice
✔ Able to join within 15–30 days
✔ Looking for opportunities in Bangalore or Mumbai

How to Apply
Interested candidates can apply directly via LinkedIn or share their updated resume at: rutuja.patil@