Requirements:
- 3–5 years of relevant professional experience in Data Engineering, BI, or Data Platform roles.
- Strong understanding of the data lifecycle and its core components: ETL/ELT, Data Warehouse, and Data Lake.
- Hands-on experience with the Azure data ecosystem, including Azure Ingestion Framework, Azure Databricks, Azure Synapse Analytics, and Delta Lake.
- Good knowledge of functional and technical architecture principles.
- Strong programming skills in Python, SQL, and DAX, along with hands-on experience in Power BI.
- Experience with Test-Driven Development (TDD) and data quality best practices.
Considered a plus:
- Experience in domains such as retail, marketing, supply chain, or finance.
- Previous experience working in a scaled agile environment (e.g., SAFe).
Responsibilities:
- Design, develop, and maintain scalable and reliable data pipelines using Azure technologies.
- Implement and optimize ETL/ELT processes to ingest, transform, and store data from multiple sources.
- Build and maintain Data Warehouse and Data Lake models, ensuring performance, consistency, and data quality.
- Collaborate closely with business stakeholders and cross-functional teams to translate business requirements into technical solutions.
- Develop and optimize Power BI reports and dashboards, leveraging DAX for analytical insights.
- Ensure data quality, reliability, and performance through automated testing, monitoring, and validation.
- Apply a DevOps mindset, taking full ownership of solutions from development to production.
- Document technical solutions, data models, and processes.