Requirements:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field.
- Minimum 3 years of experience in Data Engineering, Data DevOps, or related roles, preferably in banking or telecommunications.
- Advanced expertise in SQL, PL/SQL, Python, PySpark, and ETL tools such as Oracle Data Integrator (ODI).
- Experience in data modeling, source system analysis, and database design.
- Skilled in designing, developing, testing, optimizing, and deploying ETL pipelines (ODI packages, Databricks jobs, Oracle stored procedures); a minimal PySpark sketch of this kind of pipeline follows this list.
- Familiarity with data warehouse concepts, data cataloging, profiling, and mapping.
- Experience with databases such as Oracle, PostgreSQL, or similar large-scale systems.
- Knowledge of data visualization and exploration tools.
- Understanding of microservices architectures, cloud solutions (AWS, Databricks), and CI/CD and infrastructure-as-code practices (Git, Jenkins, GitHub Actions, Ansible/Terraform).
- Professional proficiency in English (spoken and written).
- Fast learner, proactive, and eager to explore new technologies.
- Familiarity with Agile/Scrum methodology is a plus.
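
For illustration only, here is a minimal sketch of the kind of ETL pipeline work this role involves, written in PySpark. All table and column names (raw.transactions, dwh.fact_transactions, etc.) are hypothetical placeholders, not references to any actual system:

```python
# Minimal illustrative PySpark ETL job; all table/column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read a raw source table.
src = spark.read.table("raw.transactions")

# Transform: de-duplicate, enforce types, and drop incomplete rows,
# standing in for real business rules.
cleaned = (
    src.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("transaction_date").isNotNull())
)

# Load: append into a warehouse fact table.
cleaned.write.mode("append").saveAsTable("dwh.fact_transactions")
```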
Responsibilities:
- Collaborate with business stakeholders and IT teams to understand, document, and design data warehouse processes.
- Contribute to the definition, development, and implementation of data warehouse solutions.
- Design, develop, test, optimize, and deploy ETL pipelines and related data transformations.
- Create and maintain data mapping logic to transfer content from various source systems into the data warehouse; see the mapping sketch after this list.
- Plan and coordinate ETL and database rollouts alongside project teams.
- Provide support, maintenance, troubleshooting, and resolution for ETL processes.
- Implement Data Products in a Data Mesh architecture and optimize pipelines for production-ready workflows.
- Administer and manage data environments, the supporting technology stack, and traditional databases.
- Implement automation and CI/CD practices to ensure efficient development and deployment.
- Participate in diagnosing and solving complex data-related problems, documenting configurations, and maintaining best practices.
- Collaborate effectively with cross-functional teams to deliver high-quality data solutions that meet business requirements.
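
As a sketch of the data mapping responsibility above: source-to-target mappings are often kept declarative and applied generically in code. The mapping and the table names (staging.customers, dwh.dim_customer) below are invented for the example:

```python
# Illustrative only: applying a declarative source-to-target column mapping
# in PySpark. The mapping, tables, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mapping-sketch").getOrCreate()

# source column -> (target column, target type)
MAPPING = {
    "cust_no": ("customer_id", "bigint"),
    "cust_nm": ("customer_name", "string"),
    "open_dt": ("account_opened_date", "date"),
}

src = spark.read.table("staging.customers")

# Select, cast, and rename each mapped column in one pass.
target = src.select(
    [F.col(s).cast(t).alias(tgt) for s, (tgt, t) in MAPPING.items()]
)

target.write.mode("overwrite").saveAsTable("dwh.dim_customer")
```

Keeping the mapping as data rather than hand-written transformations makes it easy to review with business stakeholders and to regenerate when source systems change.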