Job description:

About Senzcraft:
Founded by IIM Bangalore and IIEST Shibpur alumni, Senzcraft is a hyper-automation company. Senzcraft's vision is to radically simplify today's work and design business processes for the future, using intelligent process automation technologies. We offer a suite of SaaS products and services, partnering with automation product companies. Please visit our website for more details, and find our AI Operations SaaS platform, Senzcraft, on LinkedIn. Senzcraft was recognized by Analytics India Magazine in its report "State of AI in India" as a "Niche AI startup", and by NY-based SSON as a top hyper-automation solutions provider.

About the role:
We are seeking a Sr. Data Engineer to own the architecture, delivery, and operations of complex distributed systems and data pipelines. This role combines leadership, backend engineering, and data engineering expertise, with a focus on building scalable microservices, resilient data flows, and cloud-native solutions on Java/Spring Boot, the Hadoop ecosystem, and Google Cloud Platform (GCP).

Key Responsibilities:
- Architecture & Delivery: Lead design and end-to-end delivery of scalable, secure, and resilient microservices and data-intensive systems.
- Backend Development: Build robust RESTful APIs and services in Java (8+) / Spring Boot, integrating with messaging/streaming platforms.
- Data Engineering: Design and implement reliable data ingestion, transformation, and delivery pipelines (batch & streaming) using the Hadoop ecosystem (HDFS, Hive/Spark, Oozie/Sqoop) and GCP data services (BigQuery, Dataproc, Dataflow, Composer).
- Cloud & DevOps: Define and maintain CI/CD pipelines, containerized deployments (Docker/Kubernetes), and automation workflows; collaborate with DevOps and QA to ship features safely and frequently.
- Reliability & Observability: Drive root-cause analysis for complex production issues; improve system reliability, cost-efficiency, and observability (metrics, logging, tracing).
- Leadership & Mentorship: Set technical direction, review designs/code, and coach engineers to raise the bar on engineering excellence.
- Collaboration: Translate business requirements into technical solutions and partner with stakeholders across product, UX, and platform functions.

Technical Requirements:
- Strong expertise in Java (8+) and Spring Boot, with proven delivery of high-scale microservices.
- Hands-on experience with the Hadoop ecosystem (HDFS, Hive, Spark, Oozie, Sqoop) and production-grade data pipelines.
- Proficiency with Google Cloud Platform (BigQuery, Dataproc, Dataflow, Cloud Storage, IAM, monitoring/logging).
- Strong grasp of relational and NoSQL databases, data modeling, and performance optimization.
- Experience with CI/CD workflows, Git-based development, containerization (Docker/Kubernetes), and cloud deployment patterns.
- Solid understanding of system observability, reliability, and security best practices.

Nice to Have:
- Experience with schema management, data quality, lineage, and cost optimization in cloud data platforms.
- Exposure to event-driven architecture and messaging/streaming platforms (Kafka/Pub/Sub).
- Familiarity with infrastructure-as-code (Terraform) and advanced DevOps practices.
- Performance/load testing and capacity planning in distributed systems.

Leadership & Ways of Working:
- 8–12+ years of professional software engineering experience, with 3+ years in a lead role.
- Proven ability to mentor and guide engineers, foster collaboration, and set technical direction.
- Strong communication skills with experience operating in Agile environments.
- Bachelor's/Master's in Computer Science or a related field (or equivalent practical experience).

Role Details:
- Location: Bangalore (Hybrid)
- Experience: 4–12 years
- Notice Period: Immediate - 15 days preferred
Job Title
Senior Data Engineer - GCP | 4–12 years' experience | Hybrid role | Immediate - 15 days notice period preferred