Industry reflection

Consequences of the data explosion

By Claes Lundström, PhD, Director of Product Management & Research, Sectra Medical

The data explosion involves two major issues. The first is the limitations of hardware and network components: improvements in IT infrastructure performance are far outpaced by the rapid increase in data volumes.

The second issue is the limitations of the human mind. Image data sizes have reached a level at which it is humanly impossible to take in all the data at once. It is neither possible nor desirable for a radiologist to analyze, for example, one GB of image data in one chunk. Instead, the focus must be on navigating to the subset of data that is actually relevant to the diagnosis, and on making this navigation as efficient as possible.
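
For a sense of scale, a single thin-slice CT study easily reaches that gigabyte figure. A back-of-the-envelope calculation, using typical but assumed acquisition parameters:

```python
# Rough size of a modern thin-slice CT study, with typical but
# assumed parameters: 512 x 512 voxels per slice, 2,000 slices,
# 2 bytes per voxel, uncompressed.
rows, cols, n_slices, bytes_per_voxel = 512, 512, 2000, 2

study_bytes = rows * cols * n_slices * bytes_per_voxel
print(f"{study_bytes / 1e9:.2f} GB uncompressed")  # -> 1.05 GB
```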

How do we handle the data explosion?

Image compression—effective but blunt from a clinical perspective

Image compression is often suggested as a solution to the data overload. Compression is, and will remain, a highly necessary technique, but it has major limitations. Compression must be applied with the clinical workflow taken into account. For example, a large portion of the data in most medical images is irrelevant—typically all regions outside the body, as well as fat and soft tissue, depending on the type of examination. Generic compression algorithms cannot take advantage of this fact, since they carry no information about how the data will be used.
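
To illustrate the point, here is a toy sketch using synthetic data, not any real codec pipeline: a generic lossless compressor cannot know that the scanner noise in the air outside the patient is diagnostically worthless, but a step that has that clinical knowledge can flatten it away, after which the very same compressor does far better.

```python
import gzip
import numpy as np

# Synthetic 512x512 CT slice in Hounsfield units (illustration only).
# Air outside the body is ~-1000 HU plus scanner noise; a circular
# region stands in for the patient.
rng = np.random.default_rng(0)
slice_hu = rng.normal(-1000, 15, (512, 512)).astype(np.int16)  # noisy air
yy, xx = np.mgrid[0:512, 0:512]
body = (yy - 256) ** 2 + (xx - 256) ** 2 < 180 ** 2
slice_hu[body] = rng.integers(-100, 200, int(body.sum()))      # tissue range

print(f"voxels outside the body: {(~body).mean():.0%}")

# A generic codec sees only bytes; the air noise looks like data.
plain = gzip.compress(slice_hu.tobytes())

# With clinical knowledge, the irrelevant region can be flattened
# first, and the very same codec suddenly does far better.
masked = slice_hu.copy()
masked[~body] = -1000
informed = gzip.compress(masked.tobytes())

print(f"generic compression:           {len(plain):,} bytes")
print(f"clinically informed + generic: {len(informed):,} bytes")
```

The exact numbers depend on the data, but the masked version compresses substantially better—and the difference comes entirely from knowledge that the codec itself does not have.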

Dynamic data management

The conclusion is that a new type of data management framework is needed. The main goal is to always provide exactly the data that the radiologist needs to see. No more, no less. Given that image review is a dynamic navigation process, it is vitally important that the data management adapts to the user's interaction with the image [2]. This is why the framework is called dynamic data management.

It’s like driving. Dynamic data management is the road system that minimizes highway congestion. Compression is like stepping on the gas, but this doesn’t help much if you’re stuck in a traffic jam.
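
As a concrete, and entirely hypothetical, sketch of the principle, consider a viewer that derives its data requests from the interaction itself: fast scrolling widens the prefetch window but drops the resolution, while dwelling on a slice narrows the window and fetches full detail. The names and thresholds below are illustrative assumptions, not an actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ViewState:
    """The radiologist's current focus (hypothetical model)."""
    slice_index: int     # slice currently on screen
    zoom_level: int      # 0 = coarse overview ... 3 = full resolution
    scroll_speed: float  # slices per second, from recent interaction

def slices_to_fetch(view: ViewState, num_slices: int, window: int = 8):
    """Pick which (slice, resolution) pairs to request next.

    Fast scrolling -> prefetch a wider window at coarser resolution;
    dwelling on one slice -> full detail for a narrow window.
    A sketch of the principle, not an actual PACS protocol.
    """
    if view.scroll_speed > 2.0:               # user is flipping through
        level = max(view.zoom_level - 1, 0)   # coarser data suffices
        span = 2 * window                     # but look further ahead
    else:                                     # user is dwelling
        level = view.zoom_level               # full requested detail
        span = window
    lo = max(view.slice_index - span, 0)
    hi = min(view.slice_index + span, num_slices - 1)
    return [(i, level) for i in range(lo, hi + 1)]

# Dwelling on slice 340 of a 2,000-slice study: a narrow window at
# full resolution, instead of shipping the whole gigabyte study.
requests = slices_to_fetch(ViewState(340, 3, 0.2), num_slices=2000)
print(len(requests), requests[0], requests[-1])  # 17 (332, 3) (348, 3)
```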

What are the PACS implications?

Conventional PACS solutions cannot adequately respond to the data explosion. To provide an efficient imaging workflow now and in the future, dynamic data management strategies must be built into the core PACS architecture.

It is important to bear in mind that there are hard limits on data transfer performance. Constraints imposed by the system environment, such as bandwidth, latency, and disk I/O capacity, are a fact, and no approach can circumvent them. Network latency, in particular, is a huge problem for conventional PACS approaches.
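
A back-of-the-envelope example shows why. With assumed but plausible numbers, the round trips of a naive one-request-per-instance protocol can cost more time than moving the bytes themselves; batching the requests removes almost all of that penalty, while the bandwidth term stays put.

```python
# Illustrative numbers, all assumed: a 1 GB study of 2,000 image
# instances over a 100 Mbit/s wide-area link, 50 ms round-trip time.
study_bytes = 1_000_000_000
instances = 2000
bandwidth = 100e6 / 8    # bytes per second
rtt = 0.050              # seconds per round trip

transfer_time = study_bytes / bandwidth    # pure bandwidth cost: 80 s
naive_latency = instances * rtt            # request per instance: 100 s
batched_latency = (instances / 100) * rtt  # 100 per request: 1 s

print(f"bandwidth-bound transfer:      {transfer_time:.0f} s")
print(f"latency, request-per-instance: {naive_latency:.0f} s")
print(f"latency, 100-per-batch:        {batched_latency:.0f} s")
```

The bandwidth term is immovable, but the latency term is a consequence of the request pattern—and thus of the architecture.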

What a PACS should do is minimize the overhead on top of these limits and deliver the best possible image transfer performance. This requires an architecture carefully designed for the data explosion era.
