
Job Title:

Data Engineer [T500-25348]

Company: Costco IT

Location: Mumbai, Maharashtra

Created: 2026-04-22

Job Type: Full Time

Job Description:

About Costco Wholesale:
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India:
At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.

Position Title: Data Engineer

Role Summary:
The Data Engineer is responsible for developing data pipelines and/or data integrations for Costco’s enterprise certified data sets, which are used for business-critical data consumption use cases (e.g., reporting, data science/machine learning, data APIs). The Data Engineer will partner with product owners, data architects, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.

Roles & Responsibilities:
- Develops and operationalizes data pipelines to bring data into Costco’s GCP landscape for the delivery of certified data sets.
- Works in tandem with Data Architects, Data Stewards, and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Designs, develops, and implements ETL/ELT/CDC processes using Data Build Tool (DBT) and other native GCP services (BigQuery subscriptions, Dataproc, Dataflow, etc.).
- Uses GCP services such as BigQuery, AlloyDB, Spanner, Dataplex, Pub/Sub, and Cloud Storage to improve and speed delivery of our data products and services.
- Identifies, designs, and implements internal process improvements: automating manual processes and optimizing pipeline delivery and support.
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Participates in off-hours 24/7 on-call support on a rotational basis.

Experience Required: 5+ years of experience.

Minimum Qualifications:
- 5+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
- 5+ years’ hands-on experience with Data Build Tool (DBT), Dataflow, and Dataproc (Spark).
- 5+ years’ experience working with BigQuery, Google Cloud Storage, AlloyDB, and Spanner.
- 5+ years’ experience with data pipelines, ETL/ELT, and data warehousing.
- Effective use of AI and/or LLMs to increase efficiency in deliverables.
- Extensive experience working with various data sources: DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, and JSON.
- Experience implementing data integration techniques such as event/message-based integration (Kafka, Google Pub/Sub).
- Advanced SQL skills; solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
- Strong understanding of database storage concepts (data lakes, relational databases, NoSQL, graph, data warehousing).
- Experience with Git / Azure DevOps / Jira.
- CDMP (Certified Data Management Professional) Certification.

Apply Now
