Deep Learning-based Plane Pose Regression towards Training in Freehand Obstetric Ultrasound

EasyChair Preprint no. 8151

2 pages
Date: May 31, 2022


In obstetric ultrasound (US), standard planes (SPs) retain significant clinical relevance, but their acquisition requires the ability to mentally build a 3D map of the fetus from 2D US images. Autonomous probe navigation towards SPs remains a challenging task due to the need to interpret variable, complex images and their spatial relationships. Our work focuses on developing a real-time training platform to guide inexperienced sonographers in acquiring proper obstetric US images, which could potentially be deployed on existing US machines. To this aim, we first developed a Unity-based environment for volume reconstruction and the acquisition of synthetic images. Secondly, we trained a regression CNN for the 6D pose estimation of arbitrarily oriented US planes on phantom data and fine-tuned it on real data.
Our regression CNN reliably localises US planes within the fetal brain in phantom data and generalises pose regression to an unseen fetal brain without requiring real-time ground truth data or a prior 3D volume scan of the patient. Future development will expand the prediction to volumes of the whole fetus and assess its potential for vision-based, freehand automatic US navigation when acquiring SPs.
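The preprint itself does not include code. As a rough, hypothetical sketch of the kind of 6D pose target and loss such a regression network might use, the snippet below combines a translation error with a quaternion geodesic distance; the `[t, q]` parameterisation, the `pose_loss` name, and the weighting are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def pose_loss(pred, target, w_rot=1.0):
    """Hypothetical 6D pose loss: translation MSE plus quaternion geodesic
    distance. `pred`/`target` are length-7 vectors [tx, ty, tz, qw, qx, qy, qz].
    This parameterisation is an assumption, not the paper's actual design."""
    t_pred, q_pred = pred[:3], pred[3:] / np.linalg.norm(pred[3:])
    t_true, q_true = target[:3], target[3:] / np.linalg.norm(target[3:])
    trans_err = np.mean((t_pred - t_true) ** 2)              # translation MSE
    dot = np.clip(abs(np.dot(q_pred, q_true)), 0.0, 1.0)     # |<q1,q2>| handles quaternion double cover
    rot_err = 2.0 * np.arccos(dot)                           # geodesic angle in radians
    return trans_err + w_rot * rot_err

# Identical poses yield zero loss:
identity = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0])
```

In practice such a loss would be applied to the CNN's 7-dimensional output head, with the quaternion normalised before comparison as above.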

Keyphrases: deep learning, fetal ultrasound, pose regression, surgical training

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:8151,
  author = {Chiara Di Vece and Brian Dromey and Francisco Vasconcelos and Anna L David and Donald M Peebles and Danail Stoyanov},
  title = {Deep Learning-based Plane Pose Regression towards Training in Freehand Obstetric Ultrasound},
  howpublished = {EasyChair Preprint no. 8151},

  year = {EasyChair, 2022}}