Job Title: GCP Architect
Company: Impetus
Location: Bijapur, Karnataka
Created: 2026-01-16
Job Type: Full Time
Job Description:
About Impetus
Impetus Technologies is a digital engineering company focused on delivering expert services and products that help enterprises achieve their transformation goals. We solve the analytics, AI, and cloud puzzle, enabling businesses to drive unmatched innovation and growth.

Founded in 1991, we are cloud and data engineering leaders providing solutions to Fortune 100 enterprises. We are headquartered in Los Gatos, California, with development centers in Noida, Indore, Gurugram, Bengaluru, Pune, and Hyderabad, and more than 3,000 global team members. We also have offices in Canada and Australia and collaborate with a number of established companies, including American Express, Bank of America, Capital One, Toyota, United Airlines, and Verizon.

Job Title- GCP Architect
Job Location- Bangalore, Noida, Pune, Gurgaon, Hyderabad, Indore

Job Description
- 10+ years of experience in the design, architecture, implementation, and optimization of data engineering solutions over large volumes of data (TB to PB scale).
- Expertise in designing and implementing end-to-end data architectures using Google Cloud Dataproc, including data ingestion pipelines, transformation logic, and data warehousing strategies to handle large-scale batch and real-time data processing.
- Proven expertise in GCP services, including Dataproc, Dataflow, Cloud Storage, BigQuery, Cloud Composer, and Cloud Functions; experience building scalable data lakes and pipelines.
- Strong hands-on experience processing large volumes of data; proficiency in PySpark, Python, and Spark SQL, and in automating workflows.
- Good exposure to implementing robust data governance using Dataplex, along with security measures.
- Proficiency in requirements analysis, solution design, development, testing, deployment, and ongoing support, including cloud migration projects for large-scale data platforms.

Roles & Responsibilities
- 10+ years of experience in the design, architecture, implementation, and optimization of data engineering solutions over large volumes of data (TB to PB scale).
- Proven expertise in GCP services, including Dataproc, Dataflow, Cloud Storage, BigQuery, Cloud Composer, and Cloud Functions; experience building scalable data lakes and pipelines.
- Proficiency in PySpark, Python, Spark SQL, and workflow automation.
- Hands-on exposure to configuring and managing Dataproc clusters, workspaces, and Dataplex for optimal performance, cost management, and scalability when processing terabyte-scale datasets.
- Good exposure to writing optimized SQL (BigQuery SQL preferred).
- Good communication and problem-solving skills.
- Able to build POCs to validate proposed solutions and to contribute to proposals and RFPs.
- Good understanding of GenAI technologies and the ability to implement solutions using them.