Data Engineer

Date Posted

June 26, 2025

Location

Amsterdam

Job Description

Are you a Data Engineer and do you have experience with PySpark?

How about a project at De Nederlandsche Bank in Amsterdam?

General information
Start date:        No later than 1 August 2025, preferably by 21 July
End date:          12 months, with an option to extend
Working hours:     36 hours per week, with a minimum of 32
Location:          Amsterdam (50% in the office)
Contract type:     Payroll
At DNB, employees can use various tooling for data science-related work. These (scalable) environments include software such as MATLAB, VS Code and Databricks, and serve over 800 users. For this solution, DNB is looking for an experienced customer-facing Data Engineer who can resolve data and code issues, provide user support, and help advise on performance improvements based on usage patterns.
To strengthen this new team, we are looking for an enthusiastic, motivated team member who maintains close contact with end users. For example, you are able to advise end users on efficient programming in, among others, SQL, Python and PySpark. You will also take advantage of opportunities to optimize data science processes through scalable compute solutions (e.g. via Databricks). Furthermore, you know how to get users back on track as soon as they run into technical issues.

Requirements:

  • HBO degree in computer science or a university degree in IT
  • At least 1 year of experience as a data engineer or data scientist
  • At least 3 years of experience programming in Python (PySpark) and Pandas
  • At least 2 years of experience setting up CI/CD pipelines within DevOps
  • At least 2 years of experience deploying Databricks/Fabric
  • At least 2 years of experience with SQL
  • Experience advising on performance improvements based on usage patterns

Preferred

  • At least 3 years of experience writing complex SQL
  • At least 1 year of experience programming in R
  • At least 2 years of experience with various Azure cloud storage solutions (e.g. Azure SQL, Azure Blob Storage)
  • At least 2 years of experience with various data formats (Delta, Parquet, CSV) and knowledge of their differences
  • Knowledge of modern data architecture
  • One or more relevant certificates, for example for Databricks
  • One or more certifications for the Python programming language
  • Certification in setting up and maintaining CI/CD pipelines
  • Microsoft DevOps certification

Are you interested in the role? Send me a recent CV and your contact details so we can discuss this vacancy further in a phone call!
Also check out our website for other vacancies: Jobs – List V1 – Magno IT


Contact

Debby de Groot

Email

debby@magno-it.nl
