Hizirwan Salim holding EEG equipment
Case study

Operating with a supercomputer on your nose: hunting for the source of epilepsy

How do you find, in a complex network like the human brain, the one spot that causes a short circuit? For neurologist and professor Maeike Zijlmans (UMC Utrecht), this is part of her daily work. Together with Erasmus MC and SURF, she worked on a method that uses augmented reality (AR) to make invisible details visible during surgery.

Key facts

Who: Prof Maeike Zijlmans
Position: professor of neurology
Institution: UMC Utrecht and SEIN (expertise centre for epilepsy and sleep medicine)
Service: Visualisation
Challenge: During brain surgery, grids with electrodes are placed on the brain to measure electrical signals, but it is difficult to determine the precise location of the electrodes relative to the brain.
Solution: Using an AR headset and intelligent software, the electrodes can be localised with millimetre precision.

Maeike Zijlmans

“Epilepsy is essentially a network disorder,” Zijlmans explains. “To achieve a cure, we need to know exactly where the fault in that network lies.” For people with severe, drug-resistant epilepsy, brain surgery is often the last resort. The aim of such an operation is to remove the piece of tissue that causes the seizures, without damaging healthy functions.

Surgeons place mats with electrodes (‘electrode grids’) directly on the brain to measure electrical signals. But there is a problem. These mats are small, can deform, and sometimes partially slide under the edge of the skull or the dura mater. Zijlmans explains: “You need to know exactly where each electrode is located in relation to the brain in order to interpret the measured signals correctly. With the naked eye, this requires a great deal of time and expertise.”

Electrode grid on brain

An example of an electrode grid placed on a patient's brain.

More accurate measurement with smart software

A possible solution lies in AR-assisted neurosurgery, as already developed by UMCU for other neurosurgical applications. The idea? A surgeon wears an AR headset, the HoloLens 2, which displays a virtual layer over reality showing precisely where the electrodes are located in relation to the brain. Correctly identifying the position of the electrode grid is essential for this.

To make this possible, Zijlmans’ research group collaborated with engineers from Erasmus MC and SURF. In a recent scientific paper, the team describes how they used super-resolution and AI for this purpose. In simple terms: smart software that sharpens blurry or incomplete images from the headset’s cameras into crystal-clear information. This made it possible to localise the electrodes with millimetre accuracy.
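As a rough intuition for what super-resolution does, classical interpolation already shows how a coarse pixel grid can be resampled onto a finer one. The sketch below is illustrative only: the team's actual system uses learned AI models, which this simple bilinear upscaler does not reproduce.

```python
# Illustrative only: classical bilinear upsampling on a tiny grayscale patch,
# to show the idea of recovering a finer grid from coarse pixels. The actual
# system uses learned super-resolution models, not this.

def bilinear_upscale(img, factor):
    """Upscale a 2D list of pixel values by `factor` using bilinear interpolation."""
    h, w = len(img), len(img[0])
    H, W = h * factor, w * factor
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            # map each output pixel back into source coordinates
            sy = y * (h - 1) / (H - 1)
            sx = x * (w - 1) / (W - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

coarse = [[0, 100],
          [100, 0]]                   # a 2x2 "blurry" patch
fine = bilinear_upscale(coarse, 2)    # 4x4 patch with interpolated values
```

Where interpolation can only smooth between known pixels, a learned super-resolution model can also infer plausible fine detail, which is what makes millimetre-level localisation of the electrodes feasible.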

HoloLens on a stand

A surgeon wears an AR headset, the HoloLens 2, which displays a virtual layer showing exactly where the electrodes are located in relation to the brain.

A data stream of more than 20 Netflix movies

It sounds like science fiction, and technically, for everyday clinical practice, it still is. The AI models required to process the images live turned out to be far too demanding for the processor of an AR headset. “You’re essentially asking a mobile phone to do the work of a large workstation,” says Hizirwan Salim, technical adviser at SURF.

The solution was found in the cloud, or more specifically: the national supercomputer Snellius. The team built a successful prototype in which the video images from the AR headset were sent directly to Snellius. The heavy AI models ran there, after which the results were immediately sent back to the headset, 30 times per second.
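To make the round trip concrete, the loop can be pictured as a client sending frames to a remote server and waiting for each result. The sketch below is a loopback demo under stated assumptions: the names, the length-prefixed wire format, and the placeholder result are all hypothetical, not the team's code.

```python
import socket
import struct
import threading

# Hypothetical sketch of the offload loop: the headset sends each camera
# frame to a remote GPU server and waits for the localisation result.
# All names and the wire format here are assumptions, not the team's code.

FRAME_BUDGET_MS = 1000 / 30  # 30 results per second leaves ~33 ms per round trip

def send_msg(sock, payload: bytes) -> None:
    # length-prefixed message: 4-byte big-endian size, then the payload
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock) -> bytes:
    (size,) = struct.unpack(">I", sock.recv(4))
    data = b""
    while len(data) < size:
        data += sock.recv(size - len(data))
    return data

def serve_one_frame(server_sock):
    # stand-in for the supercomputer side: receive a frame, run the
    # (here: fake) AI models, send the localisation result back
    conn, _ = server_sock.accept()
    with conn:
        _frame = recv_msg(conn)
        send_msg(conn, b"electrode@(12,34)")  # placeholder result

# loopback demo: run the "Snellius" side in a background thread
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=serve_one_frame, args=(server,), daemon=True).start()

with socket.create_connection(server.getsockname()) as headset:
    send_msg(headset, b"\xff\xd8...fake camera frame...\xff\xd9")
    reply = recv_msg(headset)  # what would be overlaid in the AR view
```

In the real prototype the server side would be a GPU node on Snellius running the heavy AI models; the ~33 ms frame budget is simply what 30 results per second implies.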

This application, however, brought enormous infrastructural challenges. The video streams from the headset reached more than 1,000 Mbps. By comparison, that is over 20 times the bandwidth required for a 4K Netflix stream. A stuttering film is annoying, but a delay in the operating theatre is unacceptable.
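A back-of-the-envelope check of those figures, taking 25 Mbps as an assumed bitrate for a single 4K stream (a commonly cited recommendation, not a number from the paper):

```python
# Back-of-the-envelope check of the numbers above. The 25 Mbps figure for a
# 4K stream is an assumption; the headset rate comes from the article.

headset_mbps = 1_000        # video streams from the AR headset
netflix_4k_mbps = 25        # assumed bitrate for one 4K stream
results_per_second = 30     # round trips per second in the prototype

ratio = headset_mbps / netflix_4k_mbps          # how many 4K streams
mbit_per_frame = headset_mbps / results_per_second

print(f"{ratio:.0f}x a 4K stream, ~{mbit_per_frame:.0f} Mbit per round trip")
```

With these assumptions the headset stream equals some 40 simultaneous 4K streams, comfortably "over 20 times" a single one, and each 33 ms round trip carries on the order of 33 Mbit of video.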

Tomorrow's digital highway

With the delivery of the prototype application and the publication of the paper, this specific collaboration between the three parties has come to an end. The results provide a foundation for follow-up steps. For Zijlmans, the search for better methods for detection and visualisation continues, while SURF is investigating whether the insights gained can be used for possible future applications.

The insights are not just about pure speed, but about the whole chain:

  • Computing power: How much GPU power (processing power of graphics cards) is needed to run multiple AI models in real time?
  • Network: How do you guarantee that the connection between a hospital and the supercomputer is stable and secure?
  • Data streams: How do you ensure that multiple data streams from different sources can be integrated seamlessly?

This research is crucial, because technology does not stand still. Hizirwan: “We are actually building the digital highway needed for tomorrow's applications now.”

This article is based on the publication ‘Super-resolution for localising electrode grids as small, deformable objects during epilepsy surgery using augmented reality headsets’, the final result of a collaboration between UMCU, Erasmus MC and SURF.
