Data Engineer
Salary:
€60k-72k Per Annum
Locations:
Utrecht, Netherlands
Type:
Permanent
Published:
April 8, 2026
Contact:
Tana Mushambadope
Ref:
20360
Required Skills:
Python, Data Engineering, Java, Azure, AWS

Job title: Data Engineer (Fluent in Dutch)

Job type: Permanent

Salary: €60k-72k per annum

Role Location: Utrecht

Role: 

As a Data Engineer, you will work on challenging (often agile) projects for leading clients in the Netherlands. You will be responsible for designing, building, and maintaining scalable data environments and pipelines. Your responsibilities will include:

  • Developing and managing data architectures (such as data lakehouses, data warehouses, and platforms), along with associated CI/CD and ETL/ELT pipelines.
  • Integrating and analyzing data from multiple sources (both batch and streaming) to generate actionable insights and reports.
  • Collaborating within multidisciplinary teams, translating business needs into technical solutions, and keeping stakeholders informed on progress. The role requires clear communication, reliability, and a calm, team-oriented approach, with the ability to explain technical challenges effectively.
  • Continuously improving and innovating data solutions. You enjoy solving complex problems and are always seeking smart, future-proof approaches.

Job requirements:

  • Cloud & Data Platforms: Experience with AWS or Microsoft Azure (including services such as Event Hubs and Data Factory), as well as platforms like Snowflake and Databricks.
  • Infrastructure-as-Code & Orchestration: Proficiency with Terraform, Kubernetes (e.g., AKS or EKS), Helm, and tools such as Argo Workflows or ArgoCD.
  • CI/CD & Automation: Familiarity with GitHub Actions, Azure DevOps/TFS, and Docker for containerization and automation.
  • Data Engineering Tools: Strong programming skills in Python, Java, Kotlin, Go, and/or SQL for data processing. Experience with DBT for data transformations and modeling (e.g., Data Vault) and Apache Airflow for workflow orchestration.
  • Monitoring & Observability: Experience using Prometheus and Grafana to monitor and visualize data pipelines.
  • Messaging & Streaming: Knowledge of event streaming technologies such as Kafka (and optionally Azure Event Hubs).
  • Machine Learning (Optional): Experience with MLflow for model management and a basic understanding of machine learning concepts is a plus.

Accessibility Statement:

We make an active choice to be inclusive towards everyone every day. Please let us know if you require any accessibility adjustments through the application or interview process.

Our Commitment to Diversity, Equity, and Inclusion:

Signify’s mission is to empower every person, regardless of their background or circumstances, with an equitable chance to achieve the careers they deserve. Building a diverse future, one placement at a time. Check out our DE&I page here.
