Unity Imaging Collaborative

Open-access datasets, models, and code for the development and validation of AI in cardiology


Project 2: Left ventricular longitudinal strain [Submission 2]

Aim

This dataset (and its associated paper, under submission) is for training a neural network to make LV longitudinal strain measurements in the apical four-chamber (A4C) view.

To do this, we obtained expert labels for 3 points and 1 curve in the A4C view:

  • Mitral hinge point on septum
  • Mitral hinge point on lateral wall
  • Left ventricular endocardial apex
  • Left ventricular endocardial contour
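
These labels are exactly what a longitudinal strain measurement needs: strain is the fractional change in endocardial contour length between end-diastole and end-systole. A minimal sketch of that calculation (plain NumPy on hypothetical contour arrays, not the released code) is:

```python
import numpy as np

def contour_length(points: np.ndarray) -> float:
    """Arc length of an endocardial contour given as an (N, 2) array of (x, y) points."""
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))

def longitudinal_strain(ed_contour: np.ndarray, es_contour: np.ndarray) -> float:
    """Longitudinal strain (%) between end-diastole (ED) and end-systole (ES).

    Strain = (L_ES - L_ED) / L_ED * 100, which is negative for a normally
    contracting ventricle because the contour shortens in systole.
    """
    l_ed = contour_length(ed_contour)
    l_es = contour_length(es_contour)
    return (l_es - l_ed) / l_ed * 100.0
```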

Below is the output of Unity-GLS, after conversion of the heatmaps into discrete points and a traced endocardial border.

Unity-Frame labelling of an A4C video
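
The conversion from heatmaps to discrete points mentioned above is typically done by taking the location of the peak activation in each landmark's heatmap. A minimal sketch, assuming one 2-D heatmap per landmark (illustrative only, not the exact post-processing in the released code):

```python
import numpy as np

def heatmap_to_point(heatmap: np.ndarray) -> tuple[int, int]:
    """Return the (x, y) pixel coordinate of the peak of a 2-D heatmap."""
    y, x = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    return int(x), int(y)

# Hypothetical usage: one heatmap per landmark (septal hinge, lateral hinge, apex).
# heatmaps has shape (3, H, W); points becomes a list of three (x, y) tuples.
# points = [heatmap_to_point(h) for h in heatmaps]
```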

Paper

Under review

Easy-to-use inference code

To aid testing, new easy-to-use inference code for your own DICOMs has been added, with instructions.
The code with a README is available here: https://github.com/UnityImaging/unity-gls
This is easier to use than the code below (which is our canonical source).
Test A4C DICOM: test_ac4.dcm
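
For the exact commands, follow the README in the repository above. As a rough illustration of the kind of pre-processing involved, the sketch below reads frames from an echo DICOM with pydicom and passes them to a PyTorch model; the model loading line and file names are placeholders, not the repository's actual API.

```python
import numpy as np
import pydicom
import torch

def dicom_frames(path: str) -> np.ndarray:
    """Read an echo DICOM and return its frames as a (frames, H, W) float array in [0, 1]."""
    ds = pydicom.dcmread(path)
    pixels = ds.pixel_array.astype(np.float32)  # (frames, H, W) or (frames, H, W, 3)
    if pixels.ndim == 4:
        pixels = pixels.mean(axis=-1)  # collapse colour channels to grayscale
    return pixels / pixels.max()

# Hypothetical usage; see the repository README for the supported entry points.
# model = torch.jit.load("unity_gls.pt").eval()            # placeholder checkpoint name
# frames = torch.from_numpy(dicom_frames("test_ac4.dcm")).unsqueeze(1)  # (frames, 1, H, W)
# with torch.no_grad():
#     heatmaps = model(frames)
```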

Dataset for model development

This is a snapshot of the data and code used for this paper. You should use the "latest release" if you are training your own neural network. These snapshots are provided for reproducibility.

The dataset for model development is divided into train, tune, and internal validation sets. There are 7523 videos in this dataset, which include 2587 labelled images from 1224 A4C videos (the other videos may be of different views or not completely labelled). The anonymised PNG files are downloaded separately from the labels.
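
Because the anonymised PNGs and the labels are distributed separately, they need to be re-joined by filename before training. A minimal sketch, assuming the labels arrive as a single JSON object keyed by image filename (the actual label format is documented with the download):

```python
import json
from pathlib import Path

def build_index(image_dir: str, label_file: str) -> list[dict]:
    """Pair each anonymised PNG with its label record, skipping unlabelled images."""
    with open(label_file) as f:
        labels = json.load(f)  # hypothetical layout: {"<image>.png": {...}, ...}
    index = []
    for png in sorted(Path(image_dir).glob("*.png")):
        if png.name in labels:
            index.append({"image": str(png), "label": labels[png.name]})
    return index
```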

Dataset for model validation

The dataset for model validation, which comprises the 100 echocardiograms used in the external validation, is kept private for competition use.

However, the 600-page appendix (stored here due to its size), containing the model output on every validation image, is available here.

Models

The Unity-GLS checkpoint used for the paper was from training run 211, epoch 400.
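
To restore that checkpoint yourself, a minimal PyTorch sketch is shown below; the checkpoint filename and the model class name are placeholders, and the real network definition ships with the code.

```python
import torch

# Placeholder filename for the run 211, epoch 400 checkpoint.
checkpoint = torch.load("unity_gls_run211_epoch400.pt", map_location="cpu")

# Checkpoints are sometimes a bare state_dict and sometimes a dict wrapping one.
state_dict = checkpoint.get("state_dict", checkpoint)

# model = UnityGLSModel()            # placeholder class from the released code
# model.load_state_dict(state_dict)
# model.eval()
```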

Code

A snapshot of the exact code used for the paper is provided for reproducibility. The latest version of the code, with improvements, is available on the Unity Imaging GitHub.

Download

Please use the latest code, models, labels, and data available from the main page if you are building upon our work. A snapshot of all the materials is provided below for reproducibility purposes.

  • Unity Imaging Echocardiography Model Development Dataset Images: Download
  • Unity Imaging Echocardiography Model Development Dataset Labels: Download
  • Unity-GLS Imaging Echocardiography Model [Version 211, Epoch 400]: Download
  • Unity-GLS code for training and inference: Download

License

The model weights, labels, and images are available under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license.
The code is available under the MIT license.


Funding, support, and approvals

We are grateful to the following institutions for funding and support:

  • National Institute for Health Research (NIHR), which supports the Clinical Lectureships of Dr. Matthew Shun-Shin and Dr. James Howard at Imperial College.
  • NIHR Imperial Biomedical Research Centre (NIHR Imperial BRC), for providing start-up funding to collect pilot data to enable project / programme grant applications.

This research and the open-access release of the data have been conducted under:

  • The Imperial AI Echocardiography Dataset [IRAS: 279328, REC:20/SC/0386]

Contact details

For any questions, please contact Dr. Matthew Shun-Shin.