Job Title:

Data Engineer - Pentaho Data Integration & Modernization - PERMANENT, REMOTE

Company: datavruti

Location: Bareilly, Uttar Pradesh

Created: 2025-12-18

Job Type: Full Time

Job Description:

Hiring for: A subsidiary of a global software investor with a 100B+ market capitalization across its portfolio companies.

Role: Data Engineer - Pentaho Data Integration & Modernization - On-prem & Cloud

Experience: 3 to 6 years in Pentaho; overall experience may be higher

Location: Permanent Remote, India

Salary: Based on fitment

Notice Period: 30 days preferred

Overview:

Our client is modernizing its data integration landscape and is seeking a skilled Pentaho Developer / Data Engineer who is excited about transforming legacy ETL systems into scalable, cloud-ready data pipelines using industry-leading technologies. You will help maintain and enhance existing Pentaho Data Integration (PDI/Kettle) processes while playing a key role in their migration to modern data engineering platforms. You will collaborate with technical and business teams across the client's portfolio to ensure data flows are robust, efficient, and future-ready. This role is ideal for a developer / data engineer who enjoys solving complex data problems and is passionate about modernization, performance, and continuous improvement.

Key Responsibilities:

- Develop, maintain, and optimize ETL workflows using Pentaho Data Integration (PDI/Kettle).
- Document and analyze existing Pentaho jobs, data flows, dependencies, and performance bottlenecks.
- Contribute to and execute the migration strategy from Pentaho to modern ETL / data integration platforms such as Talend / Talend Cloud, Informatica Cloud (IICS), Azure Data Factory, AWS Glue, dbt (Data Build Tool), and Snowflake pipelines (Tasks, Streams, Snowpipe).
- Work with architects and engineering leads to shape the target-state data integration architecture.
- Implement data validation, quality checks, and reconciliation processes.
- Assist in building scalable, maintainable, and secure data pipelines across on-prem and cloud environments.
- Support production workloads, resolve incidents, and ensure the reliability of critical data processes.
- Produce clear technical documentation and migration runbooks.

Required Qualifications:

- 3 to 6+ years of hands-on experience with Pentaho PDI/Kettle or similar ETL platforms.
- Strong SQL development skills and a solid understanding of relational database concepts.
- Experience with at least one modern ETL / cloud data integration tool: Talend, Informatica Cloud (IICS), Azure Data Factory, AWS Glue, or dbt.
- Understanding of cloud ecosystems (Azure, AWS, or GCP).
- Familiarity with Git-based version control, CI/CD, and ETL/job scheduling tools.
- Excellent problem-solving skills, attention to detail, and the ability to work collaboratively in a distributed team.

Preferred Skills:

- Experience with modern cloud data warehouses: Snowflake, BigQuery, Redshift, etc.
- Exposure to data modeling (dimensional, star schema, SCD patterns).
- Python scripting for automation and data engineering tasks.
- Experience supporting ETL modernization or cloud migration projects.
