How do you personally and professionally relate to water and/or space technologies?
Professionally, I am currently active in water, predominantly from the side of natural hazards. I have a part-time role as a researcher at Twente University in the Netherlands, and I founded the spinoff FastHazard about two years ago. I am also currently a consultant for the World Bank.
On a personal level, I have a great passion for using modelling and innovative model development to do something useful. It was after my studies that I was really attracted to ITC (Faculty of Geo-Information Science and Earth Observation). They have a rather unique focus on both research and capacity building, with students from all around the world and research projects that aim to solve problems for vulnerable communities. After an internship there, I was able to start my PhD research on hazard modelling. This really brought a focus on flood processes and their interactions with other hazards. I have deeply enjoyed coding since my teens, so model development was a great way to combine my passions.
The link with space technologies is inspired partly by all the work done at ITC on the subject. More specifically, it’s driven by the great opportunity for unification that space-based datasets offer. Many space-based observations are available equally across continents, scales, levels of wealth and borders. Whether you are facing flood issues in a small village on the plains of southern Ethiopia or in a busy city neighbourhood in the Netherlands, space-based datasets can be leveraged to analyze water processes and natural hazards. We have been able to use these technologies to further develop tools that can be applied both easily and rapidly for modelling all around the world. When I find people using these tools to support vulnerable communities anywhere around the globe, it means the world to me.
How have your studies in astronomy and astrophysics influenced your research?
My studies in physics definitely helped in forming a set of skills for analyzing the physics and mathematics of water-related processes. Perhaps even more generally, they helped with problem-solving and breaking complicated issues into smaller parts. It was definitely a challenge to catch up with the vast practical knowledge that is so valuable in Earth sciences, but I like to think I am getting there. In any case, whatever has aided me should be a testament to multi-disciplinary approaches to research, and I definitely don’t regret it.
Can you tell us about your current position as an Assistant Professor at Twente University?
For about four years, I have been active at Twente University as an Assistant Professor in multi-hazard modelling. I am part of the Centre for Disaster Resilience group at ITC, which features broad expertise in disaster risk, resilience and natural hazards.
You are the founder of FastHazard, a company developing the FastFlood.org model. Can you tell us about the FastFlood.org model?
FastFlood.org is an innovative flood simulation tool. At the beginning, I developed and maintained it myself, but in the FastHazard spinoff I am lucky to now have the support of the incredible David Meijvogel, Faheed Jasin Kolaparambil and Katherine van Roon. FastFlood enables fast and efficient flood scenario calculations. It was originally developed in 2022 as part of my research at Twente University, and the model has undergone continuous improvements since. In the numerous case studies so far, it has achieved over 98% accuracy compared to traditional models, while being up to 5000 times faster than traditional flood modelling methods. This speed makes it an ideal solution for quick risk assessments, interactive adaptation planning, emergency response and initial analyses. The model is available as a web-based application and offers free unlimited simulations. Users can easily download relevant data from global datasets or upload their own data for more detailed simulations. FastFlood provides features such as mitigation design, setting boundary conditions for coastal and river areas, running precipitation scenarios, and simulating infiltration and other dynamics. It also incorporates downscaled CMIP6 climate datasets to easily model future scenarios. The model has proven useful for generating initial flood scenarios and is a powerful addition to discussions on flood risk and adaptation strategies. An example simulation result for the River Rhine developed with FastFlood is provided in Figure 1 below (simulation time below 1 second on a typical laptop).

What led to the development of this model? And what differentiates it from other models for flood mapping?
The development started mostly as an escalated hobby project. I stumbled upon some spatial algorithms that produced some surprisingly fast flood-like maps. At the same time, we were working more and more with space-based datasets. I really thought it would be cool if this could be as simple as going to a website: no setup, no installation, just go there, set it up, and you get a flood simulation. Since no project existed at the time to fund this idea, I just started working on a web interface for these algorithms in the evenings. About half a year later, the first rough version of FastFlood.org came online.
So, the model is mostly unique in the way it goes about calculating the flood maps. Because of this, it’s typically able to do so within a couple of seconds (somewhere around 5000 times faster than traditional approaches). In addition, since we pre-process and make these global (often space-based) datasets available, users can also very quickly set up a model.
This has really allowed for the use of flood modelling in entirely new settings (interactive workshops), on new timelines (rapid emergency mapping) and in new markets (construction, real estate and urban planning).
Has FastFlood been used to map or simulate Glacial Lake Outburst Floods, and if so, how satisfied were you with the results?
Yes, there is an ongoing research collaboration with the Chengdu University of Technology in China to generate these kinds of flood scenarios with the model. It has been working really well. Generally, we see that when the same boundary condition is used, the model is highly accurate compared to observations and reference models. Mostly, it then comes down to how the lake outflow and breach dynamics are characterized.
The Lisem Integrated Spatial Earth Modeller (LISEM) allows users to manipulate geospatial data as well as develop complex models. Can you elaborate on when this is the right tool? What differentiates it from similar software?
LISEM was a variation on the OpenLISEM hydrology/erosion model developed by Victor Jetten. During my doctoral research, I developed a separate version of this focused on the integrated simulation of multi-hazard dynamics. Since this often comes with many data needs (and I really love tinkering with code and algorithms), I added data preparation/processing tooling. It also features scripting functionality to easily work with geospatial datasets, and a variety of algorithms to better deal with the pre-processing.
FastFlood (and the FastSlide landslide model) are definitely spiritual successors to this work, which is still receiving some minor updates.
The multi-hazard model brought a lot of innovation, but it was a complex research tool, which made it somewhat difficult to set up and use. Still, for research into and analysis of complex multi-hazard chains, there is no tool comparable to LISEM.
Many of your publications and projects focus on multi-hazard risk. What are the challenges and gaps that you encounter the most both in developed as well as in developing countries?
There have been incredible advancements in multi-hazard modelling in the last decade. Still, it has some major steps to make in going from analysis and research to operational and robust forecasting tools. Currently, uncertainty and data limitations inhibit the reliability of the outcomes of integrated multi-hazard models. Flood models, such as FastFlood, have been part of the industry standard for a long time, but multi-hazard tools still have to get there.
Perspective and collaboration – I think we need to work more with the complexity of multiple natural hazards as the starting point, both in terms of risk reduction projects and research. Due to data needs or responsible agencies, multi-hazard activities often require involvement of multiple sectors. Starting from a mindset of collaboration and shared responsibility to reduce the impact of multi-hazard events could be a major step in the coming decade.
Finally, data is always an issue for scientists: we would always like to have better or more data. Still, for multi-hazards, it is often worse. Disasters are, by definition, somewhat rare events. Besides this, the destruction they cause often limits data collection. For two specific hazards (e.g. landslides and floods), their interaction happens even less often, and their compounding impact often results in an even larger catastrophe, destroying a wider region.
What are advantages of multi-hazard disaster management as opposed to traditional disaster management?
I think there is real urgency to consider the multi-hazard angle in any disaster risk reduction project. I increasingly see the need to expect the unexpected. For example, extending simple isolated work on floods to include other processes and a multi-hazard perspective (e.g. sediments, wind, vegetation, coastal processes and more). Beyond that, we can go much further with combined natural/technical risk, covering chemicals, pollution and other types of security issues, including the social and societal dimension.
All these triggers and consequences within a disaster often cascade and impact in surprising ways. Often, these kinds of cascading impacts bring the most destruction, precisely because the preparations that were taken did not account for them. And it’s true that these complex impacts are incredibly difficult to prepare for. Scenario exploration with stakeholders can help pinpoint critical vulnerabilities, but you have to look at all these various processes and sectoral viewpoints together.
Can you explain in a nutshell how space technologies are used for water-related hazard modelling?
There are two primary methods:
- Model setup (parameterization)
Models often require a lot of data on the landscape and its properties, covering aspects like soil, land cover, crops, roads and many more. Satellites can help us generate these datasets automatically.
- Model verification (calibration)
Models always have some intrinsic uncertainty, due to model assumptions, data issues and limitations, or other reasons. It is important to look at historical events to see how well the model performs on those. This can give confidence for future applications and verify reliability. Satellites can help produce data on river water levels, flood impact regions and more, which can be used to carry out this calibration process.
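As a minimal sketch of what the calibration step can look like in practice (the arrays and values here are hypothetical, not from FastFlood), a common approach is to compare a simulated flood extent against a satellite-derived one cell by cell, using a skill score such as the Critical Success Index:

```python
import numpy as np

def critical_success_index(simulated: np.ndarray, observed: np.ndarray) -> float:
    """Compare two binary flood-extent maps (True = wet cell).

    CSI = hits / (hits + misses + false_alarms), ranging from 0 (no skill)
    to 1 (perfect agreement with the satellite-derived observation).
    """
    hits = np.sum(simulated & observed)           # flooded in both maps
    misses = np.sum(~simulated & observed)        # observed but not simulated
    false_alarms = np.sum(simulated & ~observed)  # simulated but not observed
    return float(hits / (hits + misses + false_alarms))

# Hypothetical 2x3 grids of wet/dry cells for a historical event
sim = np.array([[True, True, False], [False, True, False]])
obs = np.array([[True, False, False], [False, True, True]])
print(critical_success_index(sim, obs))  # 2 hits, 1 miss, 1 false alarm -> 0.5
```

In a real calibration loop, model parameters (e.g. roughness or infiltration rates) would be adjusted to maximize a score like this against one or more observed events.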
Models seem to be a strong focus of your work, what advice do you have for someone who has never developed a model? What do they need to focus on to get reliable results?
Calibration is key, often facilitated by local expertise and field visits! After months of modelling for the same region, I can get a bit lost in the data. Speaking to local experts, obtaining an observational dataset or just visiting the region always slaps some sense into me.
Simple examples are culverts underneath roads and bridges, or drainage systems in slums. Typically, no data is available for these, and seeing the region can be crucial in understanding the processes at work. Hearing how particular issues cause problems is also vital, like tree debris from storm winds filling culverts and causing flooding.
What do you think is the best starting point to learn about flood modeling online?
For the water principles, there are a lot of amazing open online course materials to follow; for hydrology and geodata in particular, you can find a plethora of excellent resources online.
For the physics, I would suggest online math communicators. YouTube hosts some amazing visual explanations of differential equations, the Navier-Stokes equations and the Saint-Venant equations (think the Summer of Math Exposition from a couple of years back).
Which experiences or skills do you think benefit you most in your research and professional work?
I’d have to say my passion for programming as a hobby, and in particular video games. It’s a hobby I can share with my kids, and, during my PhD, surprisingly also with my promotor!
I think the ability to face data/modelling issues and just power through to implement changes or completely new algorithms can free up a lot of avenues for research.

What do you need to innovate?
For me, and I think for many others, one of the biggest obstacles to innovation is a busy calendar with deliverables, project deadlines and other activities. It has always been crucial for me to also just have time to experiment, even without any real goal. This kind of blue-sky research has lost a lot of popularity. Don’t get me wrong: I think it’s crucial that research finds applications and has societal impact within short timeframes. But innovation requires creativity, and that’s not something you can always plan.
Besides this, I think I have been incredibly lucky with all the resources I have been given. In particular, the internet is such a wealth of knowledge and data.
What is your favourite aggregate state of water?
I would have to say liquid: my wife and kids (and I) love swimming in lakes, the sea or pools, if the weather allows for it!