· Strong working knowledge of PySpark, Big Data, Python, and AWS services (EMR, S3, IAM, Lambda, SNS, SQS)
· Good Python and PySpark scripting skills
· Spark UI, optimization, and debugging techniques
· Intermediate SQL exposure: subqueries, joins, CTEs
· Good understanding of database technologies
· Excellent communication skills to liaise with business and IT stakeholders
· Expertise in planning project execution and effort estimation
· Understanding of Data Vault, data mesh, and data fabric architecture patterns
· Exposure to Agile ways of working