AI in operating rooms might increase safety – and surveillance


Controversial AI technology records the surgeon’s every move to ensure safety. Critics say it might have devastating effects on healthcare workers.

Dr. Teodor Grantcharov, a Stanford University professor, has developed the OR Black Box, a platform that aims to measure and enhance efficiency, safety, and adherence to standard surgical procedures.

With multiple sensors, panoramic cameras, microphones, and anesthesia monitors installed in the operating room, the Black Box records everything that goes on, while AI analyzes the data to make sense of it.

"The OR Black Box was created to make surgery more transparent, precise, predictable, and ultra-safe. Our goal is to replace dogmas and traditions with data-driven decisions,” writes Grantcharov on the company’s website.

AI evaluates the surgeon’s performance

The OR Black Box platform uses computer vision models to produce short video clips and a dashboard of statistics detailing blood loss, instruments used, and auditory disruptions.
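Neither the company nor MIT Technology Review publishes implementation details. Purely as an illustration, here is a minimal Python sketch of how timestamped detections from a vision model could be merged into highlight clips and dashboard counts; the event format, labels, and thresholds are assumptions for the example, not OR Black Box internals.

```python
from dataclasses import dataclass

@dataclass
class FrameEvent:
    """One per-frame detection emitted by a vision model (hypothetical format)."""
    t: float      # timestamp in seconds
    label: str    # e.g. "bleeding", "instrument:grasper", "noise"
    score: float  # model confidence

def events_to_clips(events, label, min_score=0.8, gap=5.0):
    """Merge consecutive detections of `label` into clip intervals.

    Detections closer than `gap` seconds apart are treated as one clip.
    """
    times = sorted(e.t for e in events if e.label == label and e.score >= min_score)
    clips = []
    for t in times:
        if clips and t - clips[-1][1] <= gap:
            clips[-1][1] = t       # extend the current clip
        else:
            clips.append([t, t])   # start a new clip
    return [(start, end) for start, end in clips]

def dashboard_summary(events):
    """Aggregate detections into the kind of counts a dashboard might show."""
    summary = {}
    for e in events:
        summary[e.label] = summary.get(e.label, 0) + 1
    return summary

# Example: three bleeding detections close together become one highlight clip.
demo = [FrameEvent(t, "bleeding", 0.9) for t in (120.0, 122.0, 124.5)]
print(events_to_clips(demo, "bleeding"))  # [(120.0, 124.5)]
print(dashboard_summary(demo))            # {'bleeding': 3}
```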

It highlights key procedure segments, allowing surgeons to skip to critical moments like major bleeding or equipment failures. An algorithm anonymizes participants by distorting voices and blurring faces, ensuring privacy.
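To give a sense of how automated video de-identification works in general, here is a minimal face-blurring sketch using OpenCV’s stock face detector. It illustrates the technique, not the company’s actual, and likely far more robust, anonymization pipeline; the input filename is hypothetical.

```python
import cv2

# Stock Haar cascade face detector shipped with OpenCV.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def blur_faces(frame):
    """Return a copy of `frame` with every detected face heavily blurred."""
    out = frame.copy()
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = out[y:y + h, x:x + w]
        out[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (99, 99), 30)
    return out

# Usage: process a recorded video frame by frame.
cap = cv2.VideoCapture("or_recording.mp4")  # hypothetical input file
ok, frame = cap.read()
while ok:
    anonymized = blur_faces(frame)
    # ... write `anonymized` to an output video here ...
    ok, frame = cap.read()
cap.release()
```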

Another AI model evaluates performance. It checks compliance with the surgical safety checklist, which is traditionally run through verbally in the operating room.
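As a rough illustration of how spoken checklist compliance could be verified automatically from a speech-to-text transcript, here is a simple phrase-matching sketch. The checklist items and trigger phrases are invented for the example and are not the product’s logic; the real WHO surgical safety checklist is longer.

```python
# Hypothetical checklist items and the phrases that would count as completing them.
CHECKLIST = {
    "patient identity confirmed": ["confirm the patient", "patient identity"],
    "antibiotic prophylaxis given": ["antibiotics given", "antibiotic prophylaxis"],
    "instrument count correct": ["counts are correct", "instrument count"],
}

def check_compliance(transcript: str) -> dict:
    """Mark each checklist item as spoken/missing based on simple phrase matching."""
    text = transcript.lower()
    return {
        item: any(phrase in text for phrase in phrases)
        for item, phrases in CHECKLIST.items()
    }

transcript = "Let's confirm the patient and site. Antibiotics given at 8:02."
print(check_compliance(transcript))
# {'patient identity confirmed': True, 'antibiotic prophylaxis given': True,
#  'instrument count correct': False}
```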

More advanced algorithms are being trained to detect errors in laparoscopic surgery. Training each of these models takes up to six months and involves a team of 12 analysts, who manually annotate OR videos to teach the AI to recognize bleeding and specific instruments.

Grantcharov told MIT Technology Review that the algorithm is not yet fully autonomous. Reliably documenting the completion of every element of the surgical safety checklist from ceiling-microphone audio is challenging, and the system has an estimated 15% error rate.

Consequently, an analyst manually verifies adherence to the checklist before finalizing the output from each procedure.

The company now operates internationally, and its technology is used in the USA, Canada, Germany, Denmark, the Netherlands, and Belgium.

Nobody wants to be watched

While the goal of improving safety is commendable, the technology’s application remains controversial. Healthcare institutions told MIT Technology Review that the surveillance technology was making staff uncomfortable.

Northwell Health System was the first to pilot OR black boxes, in February 2019. However, staff turned the cameras around and deliberately unplugged them.

After it was installed at Faulkner Hospital in November 2023, the technology caused anxiety among employees of the Department of Surgery. “We were being watched, and we felt like if we did something wrong, our jobs were going to be on the line,” one healthcare worker told MIT Technology Review.

The sense of being watched, and the fact that everything is recorded, also raise doubts about whether employees will be protected. Despite the company’s anonymization claims, some healthcare workers are convinced the recordings will be used against them.