How augmented reality can improve tumour removal

Title: In situ tissue pathology from spatially encoded mass spectrometry classifiers visualized in real time through augmented reality

Authors: Michael Woolman, Jimmy Qiu, Claudia M. Kuzan-Fischer, Isabelle Ferry, Delaram Dara, Lauren Katz, Fowad Daud, Megan Wu, Manuela Ventura, Nicholas Bernards, Harley Chan, Inga Fricke, Mark Zaidi, Brad G. Wouters, James T. Rutka, Sunit Das, Jonathan Irish, Robert Weersink, Howard J. Ginsberg, David A. Jaffray and Arash Zarrine-Afsar

Journal: Chemical Science

Year: 2020

Schematics of the system components. (From M. Woolman, et al., Chem. Sci., 2020, 11, 8723 – Published by The Royal Society of Chemistry, under Creative Commons Attribution-NonCommercial 3.0 Unported Licence)

Associating mass spectrometry with flying molecules (as an undergraduate student) or with the analysis of DNA traces (as a fan of crime scene investigation TV shows) is not uncommon, and definitely not wrong. However, technological advances have made it so much more than that.

In recent years, many types of hand-held mass spectrometry probes have been developed that allow fast surface analysis. They are commonly being tested to improve surgery via in vivo sampling of tissues: in just a few seconds, these probes can determine whether the analysed tissue is tumour or healthy.

They work by vaporizing molecules from the sample surface and directing them towards a mass analyser (flying molecules can very much still describe this process). The software then compares the acquired mass spectrum against a library of spectra from tumour tissues of known histological nature to determine whether there is a match. Many of these systems operate in a non-destructive way and can safely be used during surgery.
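
In the paper, tissue classification relies on multivariate statistical models built from lipid mass spectra; the snippet below is only a minimal sketch of the general idea of matching a newly acquired spectrum against reference classes, assuming (hypothetically) that all spectra have already been binned onto a common m/z axis.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two intensity vectors on a shared m/z grid."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def classify_spectrum(spectrum, library):
    """Return the library class whose reference spectrum best matches `spectrum`.

    `library` maps class labels (e.g. 'healthy', 'tumour') to reference
    intensity vectors binned on the same m/z axis as `spectrum`.
    """
    scores = {label: cosine_similarity(spectrum, ref) for label, ref in library.items()}
    return max(scores, key=scores.get), scores

# Hypothetical three-bin spectra, for illustration only.
library = {
    "healthy": np.array([0.9, 0.1, 0.3]),
    "tumour":  np.array([0.2, 0.8, 0.7]),
}
label, scores = classify_spectrum(np.array([0.25, 0.75, 0.6]), library)
print(label, scores)  # expected to report 'tumour' as the closer match
```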

Recently, a team of scientists from Canada combined a hand-held probe (a picosecond infrared laser mass spectrometry sampling probe, PIRL-MS) with an optical navigation system (ONS) based on infrared (IR) cameras and an IR sensor. A familiar example of optical navigation is the optical mouse, which houses both camera and sensor in its body.
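
Conceptually, such a navigation system tracks markers attached to the probe with IR cameras and chains rigid transforms to express the position of the probe tip in image (patient) coordinates. The sketch below illustrates that chain in a generic way; the function, variable names, and calibration steps are hypothetical and not taken from the paper.

```python
import numpy as np

def probe_tip_in_image_space(R_probe_to_cam, t_probe_to_cam,
                             tip_offset_probe, T_cam_to_image):
    """Map the tracked probe tip into navigation/image coordinates.

    R_probe_to_cam, t_probe_to_cam : pose of the probe's marker frame as
        reported by the IR cameras (3x3 rotation, 3-vector translation).
    tip_offset_probe : fixed position of the sampling tip in the marker
        frame, obtained by a one-time tip calibration.
    T_cam_to_image : 4x4 homogeneous transform from camera space to image
        space, obtained from patient registration.
    """
    tip_cam = R_probe_to_cam @ tip_offset_probe + t_probe_to_cam
    tip_homogeneous = np.append(tip_cam, 1.0)
    return (T_cam_to_image @ tip_homogeneous)[:3]

# Hypothetical identity pose: tip 120 mm along the probe axis, no registration offset.
R, t, T = np.eye(3), np.zeros(3), np.eye(4)
print(probe_tip_in_image_space(R, t, np.array([0.0, 0.0, 120.0]), T))
```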

Navigation systems are commonly employed in optical image-guided surgery; they exist in different configurations and rely on various methodologies and techniques, and they improve the outcome of tumour removal by increasing the surgeon's ability to see and identify its margins. [1]

The team had previously tested PIRL-MS on brain tissue in a murine model and shown that it is no more damaging than a surgical scalpel.

To enhance recognition further, the researchers developed an augmented reality platform (3D visualization software) that, superimposed on the optical navigation system's display, marks healthy tissue and tumour in two different colours, allowing quick and precise identification at a probe resolution of ∼1 mm².
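
The overlay itself can be pictured as colour-coded markers drawn at each sampled position on top of the navigation display. The sketch below, with invented sample points and a blank canvas standing in for the navigation view, is only meant to convey that two-colour idea, not the authors' software.

```python
import matplotlib.pyplot as plt

# Hypothetical sampled positions (x_mm, y_mm) with their predicted class.
samples = [(1.0, 2.0, "tumour"), (2.5, 2.2, "healthy"), (3.0, 1.5, "tumour")]
colours = {"tumour": "red", "healthy": "green"}  # two-colour scheme, as in the text

fig, ax = plt.subplots()
# In the real system the background would be the navigation display (e.g. an MRI view);
# here a blank canvas stands in for it.
for x, y, label in samples:
    ax.scatter(x, y, s=200, c=colours[label], alpha=0.6)
ax.set_xlabel("x (mm)")
ax.set_ylabel("y (mm)")
ax.set_title("Colour-coded tissue classification at sampled positions (sketch)")
plt.show()
```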

The researchers assessed their system for pixel errors and for identification of the correct tumour type. In the first case, an in vitro test performed on a border between cancerous and healthy tissue (with a clear separation between the two) showed correct detection. In the second, in situ analysis of two different medulloblastoma tumour cell lines yielded an identification success rate of 84%.

Ex vivo cancer border assessment. (From M. Woolman, et al., Chem. Sci., 2020, 11, 8723 – Published by The Royal Society of Chemistry, under Creative Commons Attribution-NonCommercial 3.0 Unported Licence)

Further in vivo studies were performed on mouse brains to assess any damage induced by the PIRL-MS probe, demonstrating at the same time that the presence of circulating blood in the tissue does not interfere with the measurements. Once again, the tests suggest that this method is no more aggressive than a common surgical scalpel.

These systems are not yet able to identify single cancer cells dispersed in healthy tissue; however, they may improve cancer removal and patients' survival rates. Moreover, this platform could be adapted to other types of hand-held probes with different characteristics and spatial resolutions. In this fight, we should take all the help we can get!

[1] https://www.bioopticsworld.com/bioimaging/article/16429153/imageguided-surgery-optical-surgical-navigation-group-targets-meaningful-tool-comparison

