Cloud-based ultrasound platform for advanced real-time image information
Beatrice Federici, Ben Luijten, Andre Immink, Ruud van Sloun, Massimo Mischi
Session: Poster Session 1 (Even numbers)
Session starts: Thursday 26 January, 16:00
Presentation starts: 16:00

Abstract:
Background & Objective. Deep learning (DL) based signal-processing methods are taking an ever more prominent role in ultrasound (US) imaging, demonstrating unprecedented opportunities for image formation and processing. Integrating these models in clinical devices poses additional challenges and increased cost due to their high computational footprint. Moreover, given the interactive nature of US imaging, such a system should reliably operate at real-time frame rates and with minimal latency. This work proposes a cloud-based US platform that allows live streaming of raw, high-bandwidth US channel data to the cloud, along with remote control of run-time parameters. The system enables inference of compute-intensive models for latency-sensitive applications. We assess the performance of the proposed framework by demonstrating real-time cloud-based US beamforming with a trained DL method.

Methods. US channel data is acquired using a widely used programmable US platform (Vantage 256, Verasonics). A linear probe (L11-4v) with 128 channels is adopted for plane-wave imaging at 100 fps. The proposed framework communicates US in-phase and quadrature channel data from the Verasonics system to Python on an on-premises server (replaceable by a public cloud). An optimized neural network for adaptive beamforming [1] processes the received channel data remotely and returns the reconstructed frames. After reconstruction, the frames can be sent back to the Verasonics system, rendered in a display window on a remote desktop, or visualized on a third-party device connected through a web application (e.g., a smartphone). A user interface enables control of transmit parameters. As a next step, parameter tuning can be made adaptive using cloud-based learned algorithms, e.g., driven by image quality.

Results & Discussion. Considering a single plane wave (243 KB) and assuming instantaneous processing, the framework introduces a latency of 20 ms and an inter-frame period of about 15 ms. When integrated with the advanced beamformer, we measured a total latency between acquisition and rendering of less than 80 ms and a display frame rate of 20 fps. This work demonstrates the feasibility of live streaming high-bandwidth US channel data over the network and opens up new opportunities for real-time cloud-based image formation and processing.
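The acquisition-to-cloud loop described in the Methods could be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: the length-prefixed wire format, the frame dimensions (128 channels, with 256 samples per channel chosen arbitrarily), and the channel-sum "beamformer" (a stand-in for the DL model of [1]) are all hypothetical, and the real system talks to the Verasonics host rather than generating random IQ data.

```python
# Sketch: stream one frame of IQ channel data to a remote server, which
# "beamforms" it and returns the result. Wire format and beamformer are
# illustrative assumptions only.
import socket
import struct
import threading

import numpy as np

FRAME_CHANNELS = 128   # matches the 128-channel L11-4v probe
FRAME_SAMPLES = 256    # assumed samples per channel (arbitrary here)

def send_msg(sock, payload: bytes) -> None:
    """Prefix each message with its 4-byte big-endian length."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exact(sock, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed")
        buf += chunk
    return buf

def recv_msg(sock) -> bytes:
    """Read one length-prefixed message."""
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)

def serve_one_frame(server_sock) -> None:
    """Server side: receive IQ data, reconstruct, send the result back."""
    conn, _ = server_sock.accept()
    with conn:
        raw = recv_msg(conn)
        iq = np.frombuffer(raw, dtype=np.complex64).reshape(
            FRAME_CHANNELS, FRAME_SAMPLES)
        # Stand-in for the DL beamformer: coherent sum across channels.
        image_line = np.abs(iq.sum(axis=0)).astype(np.float32)
        send_msg(conn, image_line.tobytes())

# --- demo over localhost standing in for the network link ---
server = socket.create_server(("127.0.0.1", 0))
port = server.getsockname()[1]
t = threading.Thread(target=serve_one_frame, args=(server,))
t.start()

# Synthetic IQ frame standing in for one plane-wave acquisition.
frame = (np.random.randn(FRAME_CHANNELS, FRAME_SAMPLES)
         + 1j * np.random.randn(FRAME_CHANNELS, FRAME_SAMPLES)
         ).astype(np.complex64)

with socket.create_connection(("127.0.0.1", port)) as client:
    send_msg(client, frame.tobytes())
    reply = np.frombuffer(recv_msg(client), dtype=np.float32)

t.join()
server.close()
print(reply.shape)  # prints (256,)
```

In the real system this exchange would run continuously at the acquisition rate, with the display (remote desktop or web client) consuming the reconstructed frames.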
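Taking the abstract's own numbers (243 KB per single-plane-wave frame at the 100 fps acquisition rate, and assuming KB = 1024 bytes), the raw-data rate the link must sustain can be checked with a quick back-of-the-envelope calculation:

```python
# Implied raw channel-data rate for continuous single-plane-wave streaming.
frame_bytes = 243 * 1024        # one plane-wave acquisition (243 KB)
acq_fps = 100                   # acquisition frame rate
rate_MBps = frame_bytes * acq_fps / 1e6
rate_Mbps = rate_MBps * 8
print(f"{rate_MBps:.1f} MB/s = {rate_Mbps:.0f} Mbit/s")  # prints 24.9 MB/s = 199 Mbit/s
```

A sustained rate on the order of 200 Mbit/s is well within gigabit Ethernet, which is consistent with the reported 20 ms latency and 15 ms inter-frame period for transport alone.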