Skillset
- Experience: 5 to 10 years
- Must have: Snowflake (certification is mandatory), Azure Data Factory (ADF), Python, SQL, ETL and data warehousing (DWH) concepts, DevOps.
- Good to have: PySpark.
- Mandatory certification: SnowPro® Advanced: Data Engineer
Core responsibilities
- Design, develop, and maintain scalable Python-based data pipelines using Azure Data Factory, Databricks (PySpark), and Snowflake.
- Develop and orchestrate data workflows and cloud artifacts, including batch and streaming pipelines, integrating with Azure services (Event Hubs, Function Apps, EMS).
- Implement, test, document, and manage code with CI/CD practices via Azure DevOps, including pull requests and environment deployments.
- Troubleshoot and debug cloud-based data solutions, ensuring performance, security, and reliability across pipeline executions.
- Collaborate cross-functionally with data architects, analysts, data engineers, and business stakeholders to ensure successful and timely project delivery.