(times are in UTC)



09:00 – 09:10


09:10 – 09:30

Image-based Subthalamic Nucleus Segmentation for Deep Brain Surgery With Electrophysiology Aided Refinement

Varha, Bakstein, Novak, Gilmore

09:30 – 09:50

A Baseline Approach for AutoImplant: the MICCAI 2020 Cranial Implant Design Challenge

Li, Pepe, Gsaxner, von Campe, Egger

09:50 – 10:10

3D Slicer Craniomaxillofacial Modules Support Patient-specific Decision-making for Personalized Healthcare in Dental Research

Bianchi, Paniagua, Carlos De Oliveira Ruellas, Fillion-Robin, C. Prieto, Roberto Gonçalves, Hoctor, Yatabe, Styner, Li, Lima Gurgel, Maia Chaves Junior, Massaro, Gamba Garib, Vilanova, Fernando Castanha Henriques, Aliaga-Del Castillo, Janson, R. Iwasaki, C. Nickel, Evangelista, Cevidanes

10:10 – 10:30

A Radiomics-based Machine Learning Approach to Assess Collateral Circulation in Stroke on Non-contrast Computed Tomography

Aktar, Xiao, Tampieri, Rivaz, Kersten-Oertel

10:30 – 10:50

Learning Representations of Endoscopic Videos to Detect Tool Presence Without Supervision

Li, Ishii, Taylor, Hager, Sinha

10:50 – 11:10


11:10 – 11:30

Adversarial Prediction of Radiotherapy Treatment Machine Parameters


11:30 – 11:50

Optimal Targeting Visualizations for Surgical Navigation of Iliosacral Screws

Pandey, Guy, Lefaivre, Hodgson

11:50 – 12:10

Single-shot Deep Volumetric Regression for Mobile Medical Augmented Reality*

Karner, Gsaxner, Pepe, Li, Fleck, Arth, Wallner, Egger

12:10 – 12:30

Prediction of Type II Diabetes Onset with Computed Tomography and Electronic Medical Records

Tang, Gao, Lee, Wells, Spann, Terry, Carr, Huo, Bao, Landman

12:30 – 13:00

Best paper award ceremony and closing remarks

Note for presenters:
All presenting authors must pre-record a video presentation in .mp4 format, at 1080p resolution, with a maximum file size of 2 GB. Videos must be uploaded to the server specified in the notification email by 11 September 2020. Presenting authors must be available for a virtual Q&A session after their video presentation.

*Winner of the CLIP 2020 best paper award!

The CLIP 2020 proceedings are available here.