IN.JobDiagnosis

Job Title:

SnapLogic Developer

Company: ValueLabs

Location: Belgaum, Karnataka

Created: 2025-12-20

Job Type: Full Time

Job Description:

Note: Looking for Immediate Joiners

Job Title: Data Engineering Specialist (Streaming & Integration)
Department: Data & Analytics
Reports To: Data Engineering Manager
Location: Remote (Global)
Employment Type: Full-Time

Overview:
We are seeking a highly skilled Data Engineering Specialist with deep expertise in real-time data streaming and integration platforms to join our growing data team. The ideal candidate will have hands-on experience with SnapLogic and Confluent Kafka, and will be responsible for designing, building, and maintaining robust, scalable data pipelines that enable real-time analytics, operational intelligence, and seamless integration across enterprise systems.

Key Responsibilities:
- Design, develop, and maintain high-throughput, low-latency data pipelines using SnapLogic and Confluent Kafka.
- Architect and implement event-driven systems using Kafka for real-time data ingestion, processing, and distribution across microservices and downstream analytics platforms.
- Configure and manage SnapLogic integration workflows for secure, reliable, and automated data movement between SaaS, on-premise, and cloud applications.
- Collaborate with data scientists, analysts, and application teams to understand data needs and deliver scalable integration solutions.
- Optimize Kafka cluster performance, monitor stream health, and ensure data durability, consistency, and fault tolerance.
- Implement data quality checks, schema evolution strategies, and observability using tools like Confluent Control Center, Grafana, and Prometheus.
- Ensure security and compliance in data flows through encryption, access control, and audit logging.
- Participate in agile ceremonies and contribute to technical documentation, release planning, and CI/CD practices.
- Stay current with evolving trends in streaming data, integration platforms, and cloud-native data architectures.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 4+ years of professional experience in data engineering with a focus on streaming data and integration.
- Proven experience with Confluent Kafka: building producers/consumers, managing topics, and handling partitioning, replication, and stream processing using Kafka Streams or KSQL.
- Extensive hands-on experience with SnapLogic, including building, testing, and deploying integrations using the SnapLogic Integration Cloud.
- Strong understanding of data modeling, ETL/ELT processes, and data pipeline orchestration.
- Experience with cloud platforms (AWS, Azure, or GCP) and containerized environments (Docker, Kubernetes).
- Proficiency in scripting languages (Python, Bash) and familiarity with infrastructure as code (Terraform, CloudFormation).
- Knowledge of data security, governance, and compliance standards (e.g., GDPR, SOC 2).
- Excellent communication skills and ability to work in a collaborative, remote-first environment.

Note: Looking for Immediate Joiners
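For candidates gauging the Kafka partitioning skills mentioned above, the core idea is that a producer maps each keyed record deterministically to one partition, so records sharing a key stay in order on a single partition. Below is a minimal, hedged Python sketch of that idea only; it is not Kafka's actual partitioner (Kafka's default uses murmur2 over the key bytes), and `pick_partition`, `orders`, and `placement` are illustrative names, not part of any Kafka API:

```python
import hashlib

def pick_partition(key: bytes, num_partitions: int) -> int:
    """Simplified stand-in for Kafka's default keyed partitioner:
    hash the key bytes and map the result onto one of the topic's
    partitions. (Real Kafka uses murmur2; MD5 here is illustrative.)"""
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records that share a key always land on the same partition,
# which is what preserves per-key ordering in Kafka.
orders = [
    (b"customer-42", "created"),
    (b"customer-7", "created"),
    (b"customer-42", "paid"),
]
placement = {}  # partition index -> list of (key, event)
for key, event in orders:
    placement.setdefault(pick_partition(key, 6), []).append((key, event))
```

In this sketch both `customer-42` events hash to the same partition bucket and retain their original order, mirroring why keyed production gives per-key ordering guarantees without any cross-partition coordination.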

Apply Now
