Job Title: Senior Data Engineer

Role Overview:
We are looking for an experienced Senior Data Engineer to design, build, and optimize data pipelines and platforms. The ideal candidate will have strong expertise in Databricks, Snowflake, Airflow, PySpark, and Python, with a proven track record of delivering scalable data solutions.

Key Responsibilities:
- Design and develop robust, scalable, and efficient data pipelines using Databricks and PySpark (an illustrative sketch appears at the end of this posting).
- Implement ETL processes and data workflows, using Airflow for orchestration.
- Manage and optimize data storage and processing in Snowflake.
- Collaborate with data scientists, analysts, and business teams to deliver high-quality data solutions.
- Ensure data quality, integrity, and security across all platforms.
- Monitor and troubleshoot data pipelines to maintain high availability and performance.
- Contribute to architecture decisions and data engineering best practices.

Required Skills & Qualifications:
- 6–8 years of experience in data engineering or related roles.
- Strong hands-on experience with Databricks, Snowflake, Airflow, PySpark, and Python.
- Solid understanding of distributed data processing and big data technologies.
- Experience with SQL and performance tuning in Snowflake or similar data warehouses.
- Familiarity with CI/CD pipelines and DevOps practices for data workflows.
- Knowledge of data modeling, data governance, and security best practices.

Preferred Qualifications:
- Experience with cloud platforms (AWS, Azure, or GCP).
- Exposure to streaming technologies such as Kafka or Spark Streaming.
- Databricks or Snowflake certifications.
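
Illustrative Example:
For illustration only, the sketch below shows the kind of work this role involves: a minimal Airflow DAG, written in Python, that orchestrates a daily PySpark aggregation. Every name in it (DAG id, schedule, file paths, column names) is a placeholder assumption, not a description of our actual pipelines.

    # Minimal Airflow DAG sketch: orchestrates a daily PySpark batch job.
    # All identifiers and paths are illustrative placeholders.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def run_daily_batch(**context):
        # Placeholder transform: runs a PySpark aggregation locally.
        # In a Databricks setup this task would typically submit a Databricks job instead.
        from pyspark.sql import SparkSession, functions as F

        spark = SparkSession.builder.appName("daily_orders_batch").getOrCreate()
        orders = spark.read.parquet("/data/raw/orders/")  # hypothetical input path
        daily = (
            orders
            .withColumn("order_date", F.to_date("order_ts"))
            .groupBy("order_date", "customer_id")
            .agg(F.sum("amount").alias("daily_spend"))
        )
        daily.write.mode("overwrite").parquet("/data/curated/daily_spend/")  # hypothetical output path
        spark.stop()


    with DAG(
        dag_id="daily_orders_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule" argument assumes Airflow 2.4+
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
    ) as dag:
        PythonOperator(task_id="run_daily_batch", python_callable=run_daily_batch)

In practice a pipeline like this would also include data quality checks, retries tuned to the workload, and a load step into Snowflake, which are part of the responsibilities described above.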