About
Join our team as a Data Developer and play a key role in connecting data analytics with software engineering. You will help design, develop, and optimize cutting-edge data infrastructure, with a focus on Databricks and cloud environments, while supporting and improving existing applications. You will collaborate closely with business stakeholders, IT teams, and data engineers to deliver high-quality, scalable, and efficient data solutions.
Benefits:

  • Relocation package
  • Job rotation
  • Learning through Arnia Academy
  • Attractive projects
  • Flexible working hours
  • Performance bonuses
  • Medical benefits
  • Training programs
  • Competitive compensation package
  • Referral program
  • International work experience

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical field.
  • Minimum 3 years of experience in Data Engineering, Data DevOps, or related roles, preferably in banking or telecommunications.
  • Advanced expertise in SQL, PL/SQL, Python, PySpark, and ETL tools such as ODI or similar.
  • Experience in data modeling, source system analysis, and database design.
  • Skilled in designing, developing, testing, optimizing, and deploying ETL pipelines (ODI packages, Databricks jobs, Oracle stored procedures).
  • Familiarity with data warehouse concepts, data cataloging, profiling, and mapping.
  • Experience with databases such as Oracle, PostgreSQL, or similar large-scale systems.
  • Knowledge of data visualization and exploration tools.
  • Understanding of microservices architectures, cloud solutions (AWS, Databricks), and CI/CD practices (Git, Jenkins, GitHub Actions, Ansible/Terraform).
  • Professional level of English (spoken and written).
  • Fast learner, proactive, and eager to explore new technologies.
  • Familiarity with Agile/Scrum methodology is a plus.

Responsibilities:

  • Collaborate with business stakeholders and IT teams to understand, document, and design data warehouse processes.
  • Contribute to the definition, development, and implementation of data warehouse solutions.
  • Design, develop, test, optimize, and deploy ETL pipelines and related data transformations.
  • Create and maintain data mapping logic to transfer content from various source systems into the data warehouse.
  • Plan and coordinate ETL and database rollouts alongside project teams.
  • Provide support, maintenance, troubleshooting, and resolution for ETL processes.
  • Implement Data Products in a Data Mesh architecture and optimize pipelines for production-ready workflows.
  • Administer and manage data environments, tech stack, and traditional databases.
  • Implement automation and CI/CD practices to ensure efficient development and deployment.
  • Participate in diagnosing and solving complex data-related problems, documenting configurations, and maintaining best practices.
  • Collaborate effectively with cross-functional teams to deliver high-quality data solutions that meet business requirements.
