<br><br> KPMG entities in India are professional services firms affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms and are conversant with local laws, regulations, markets, and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada. <br><br>Responsibilities <br>- Experience in Data and Analytics, having overseen end-to-end implementation of data pipelines on cloud-based data platforms. <br>- Strong programming skills in Python and PySpark; some combination of Java and Scala is good to have. <br>- Experience writing SQL, structuring data, and applying data storage practices. <br>- Experience using PySpark for data processing and transformation. <br>- Experience building stream-processing applications (Spark Streaming, Apache Flink, Kafka, etc.). <br>- Maintain and develop CI/CD pipelines based on GitLab. <br>- Assemble large, complex structured and unstructured datasets that meet functional and non-functional business requirements. <br>- Experience working with cloud data platforms and services. <br>- Conduct code reviews, maintain code quality, and ensure best practices are followed. <br>- Debug and upgrade existing systems. <br>- Knowledge of DevOps (nice to have). <br><br>Qualifications <br> To be considered for this role, you should meet the following qualifications: <br>- Bachelor’s degree in computer science or a related field. <br>- Experience with Snowflake and knowledge of transforming data using dbt (Data Build Tool). <br>- Strong programming skills in Python and PySpark; some combination of Java and Scala is good to have. <br>- Experience with AWS and API integration, along with knowledge of data warehousing concepts. <br>- Excellent communication and team collaboration skills.