
CREMI

MICCAI Challenge on Circuit Reconstruction from Electron Microscopy Images

Leaderboard: Neuron Segmentation

Results for the neuron segmentation category, averaged over all samples.

Group | Submission | CREMI score | VOI split | VOI merge | ARAND
DIVE | BasUnet | 2.959 | 1.740 | 7.366 | 0.962
Cvlab | cvlab | 3.472 | 6.069 | 6.542 | 0.956
IIL+DIVE@TAMU | new_segm | 2.671 | 0.928 | 6.475 | 0.964
Cvlab | cvlab_2 | 2.756 | 1.526 | 6.411 | 0.957
- | Final | 2.896 | 3.805 | 5.599 | 0.895
Cvlab | cvlab_1 | 2.891 | 4.686 | 5.421 | 0.828
afish1001 | unet_wj | 2.223 | 0.648 | 5.178 | 0.858
afish1001 | wunet_wj | 2.212 | 0.641 | 5.157 | 0.854
afish1001 | ABWUnet | 2.199 | 0.696 | 5.042 | 0.856
afish1001 | OurUnet | 2.197 | 0.698 | 5.034 | 0.855
afish1001 | unet_ce | 2.246 | 0.831 | 4.987 | 0.883
afish1001 | wunet_ce | 2.244 | 0.829 | 4.982 | 0.882
- | testsubm | 2.683 | 3.083 | 4.768 | 0.918
VCG | ZWZ-0.0 | 2.675 | 3.524 | 4.193 | 0.927
HVCL | Deep Res | 2.212 | 1.672 | 4.182 | 0.837
VCG | testmala | 2.412 | 3.297 | 3.283 | 0.888
DIVE | RUseg | 2.976 | 6.003 | 3.121 | 0.972
SeungLab-eding | Seung-ed | 1.525 | 0.895 | 2.160 | 0.764
DIVE | focalZ | 1.424 | 0.793 | 1.925 | 0.748
DIVE | ag23 | 1.221 | 0.755 | 1.557 | 0.647
Rhoana | Test1 | 1.162 | 1.646 | 1.179 | 0.486
DIVE | unet1 | 1.001 | 1.436 | 0.861 | 0.443
DIVE | unet-1 | 1.001 | 1.436 | 0.861 | 0.443
DIVE | unet3 | 0.994 | 1.451 | 0.831 | 0.441
DIVE | unet-3 | 0.994 | 1.451 | 0.831 | 0.441
(Showing 25 of 162 submissions.)

Legend

CREMI score
The geometric mean of (VOI split + VOI merge) and ARAND.
VOI split and merge
The Variation of Information between a segmentation X and ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
ARAND
The Adapted Rand Error, i.e., 1.0 - Rand F-Score.

For each value, lower is better.
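
The relation between the columns can be checked directly: the score is the geometric mean of the summed VOI and ARAND. A minimal sketch in plain Python (illustrative only, not the official CREMI evaluation code; since the leaderboard values are per-sample scores averaged afterwards, the identity holds only approximately on the numbers shown above):

```python
import math

def cremi_score(voi_split, voi_merge, arand):
    # Geometric mean of the total VOI (split + merge) and the Adapted Rand error
    return math.sqrt((voi_split + voi_merge) * arand)

# Cvlab "cvlab" row above: sqrt((6.069 + 6.542) * 0.956) is approximately 3.472
print(round(cremi_score(6.069, 6.542, 0.956), 3))
```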

Leaderboard: Synaptic Cleft Detection

Results for the synaptic cleft detection category, averaged over all samples.

Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF
3DEM | BesA0.20 | 60.04 | 121311.3 | 7529.0 | 0.182 | 92.91 | 27.17
3DEM | BesA225 | 59.34 | 120053.7 | 7534.0 | 0.177 | 90.88 | 27.80
DIVE | MixUNet3 | 59.11 | 125333.0 | 7696.0 | 0.180 | 89.68 | 28.54
DIVE | MixUNet4 | 59.03 | 122122.0 | 7761.3 | 0.177 | 89.35 | 28.71
CleftKing | CleABC1 | 58.02 | 130521.0 | 7637.3 | 0.179 | 87.11 | 28.94
DIVE | MixUNet | 57.91 | 125220.0 | 7920.0 | 0.178 | 86.57 | 29.25
DIVE | Aug-UNet | 57.89 | 123985.7 | 7992.0 | 0.177 | 86.45 | 29.32
DIVE | CleftNet | 57.73 | 116095.3 | 8004.3 | 0.169 | 85.60 | 29.87
CleftKing | CleABC2 | 59.47 | 110518.3 | 10494.7 | 0.170 | 88.88 | 30.05
SDG | Unet2 | 63.92 | 202177.7 | 13156.0 | 0.256 | 97.64 | 30.19
CleftKing | CleABC3 | 59.10 | 106917.7 | 10746.3 | 0.165 | 87.22 | 30.99
DIVE | ex19t5 | 315.97 | 64192.0 | 1757.3 | 0.112 | 600.79 | 31.15
DIVE | SegCleft | 77.03 | 80978.3 | 20639.3 | 0.162 | 121.03 | 33.03
DIVE | celft_1 | 77.03 | 80978.3 | 20639.3 | 0.162 | 121.03 | 33.03
SDG | PyTC20-P | 64.59 | 215905.3 | 4962.0 | 0.227 | 96.14 | 33.05
3DEM | Part_A | 65.43 | 112964.7 | 10645.0 | 0.174 | 96.87 | 34.00
DIVE | MixUNet2 | 65.19 | 106260.3 | 13962.0 | 0.188 | 95.28 | 35.09
DIVE | cleft | 88.46 | 97220.7 | 21239.3 | 0.172 | 139.74 | 37.18
JRC | DTU-1 | 72.21 | 85169.0 | 13607.3 | 0.131 | 106.31 | 38.11
I2I | CleftB1 | 64.97 | 80004.7 | 19895.0 | 0.158 | 90.98 | 38.97
SDG | PyTC20 | 65.68 | 167107.0 | 6190.0 | 0.196 | 90.36 | 41.00
CMM | UNet50 | 69.31 | 65762.0 | 12634.7 | 0.139 | 97.45 | 41.17
VCG | FgSm0808 | 57.34 | 75120.7 | 9109.7 | 0.107 | 72.74 | 41.94
SDG | SnapX.O | 124.59 | 182577.0 | 17000.7 | 0.306 | 205.18 | 44.01
SDG | SnapF.V | 124.59 | 182577.0 | 17000.7 | 0.306 | 205.18 | 44.01

Legend

CREMI score
The mean of ADGT and ADF. We are not including the F-Score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
FP, FN, and F-Score
The False Positives, False Negatives, and the resulting F-Score in the cleft detection volume. See metrics for details.
ADGT
The average distance of any found cleft voxel to the closest ground truth cleft voxel.
ADF
The average distance of any ground truth cleft voxel to the closest found cleft voxel.

For each value, lower is better (F-Score shown as 1 - F-Score).
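
Because the cleft score is an arithmetic mean, it can be recomputed exactly from the two distance columns above, e.g. (92.91 + 27.17) / 2 = 60.04 for the top row. The sketch below shows one way ADGT and ADF could be obtained from binary cleft volumes with a Euclidean distance transform; it is illustrative only, not the official evaluation code, and assumes the anisotropic CREMI voxel size of 40 × 4 × 4 nm in (z, y, x):

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def cleft_scores(pred, gt, resolution=(40.0, 4.0, 4.0)):
    """pred, gt: binary 3D arrays marking predicted and ground-truth cleft voxels."""
    pred, gt = np.asarray(pred) > 0, np.asarray(gt) > 0
    # Distance from every voxel to the nearest ground-truth cleft voxel
    dist_to_gt = distance_transform_edt(~gt, sampling=resolution)
    # Distance from every voxel to the nearest predicted cleft voxel
    dist_to_pred = distance_transform_edt(~pred, sampling=resolution)
    adgt = dist_to_gt[pred].mean()   # penalizes false positives
    adf = dist_to_pred[gt].mean()    # penalizes false negatives
    return 0.5 * (adgt + adf), adgt, adf  # CREMI score, ADGT, ADF
```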

Leaderboard: Synaptic Partner Identification

Results for the synaptic partner identification category, averaged over all samples.

Group | Submission | CREMI score | FP | FN
NO2 | lr_affin | 0.567 | 118.333 | 450.000
DIVE | PTR_V2 | 0.453 | 162.667 | 333.667
NO2 | lr_balan | 0.447 | 175.333 | 314.000
HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333
HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667
IAL | PSP_full | 0.464 | 187.333 | 310.000
HCBS | Tr66comb | 0.451 | 219.000 | 292.333
HCBS | Tr66_80K | 0.449 | 223.000 | 286.667
AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667
IAL | PSP_unar | 0.461 | 266.667 | 281.000
PNI | asyn_at1 | 0.490 | 274.333 | 314.667
PNI | asyn_at0 | 0.468 | 293.333 | 297.000
PNI | asyn_ori | 0.493 | 310.000 | 302.333
PNI | asyn_mod | 0.423 | 367.667 | 262.333
(Showing all 14 submissions.)

Legend

CREMI score
1 - F-Score, computed from the FP and FN counts.
FP, FN
The numbers of false positive and false negative synaptic partner pairs. See metrics for details.

For each value, lower is better.
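
The underlying pair matching (described in the metrics) also yields a true positive count, from which the score follows directly. A minimal sketch of that last step (illustrative only; tp is assumed to come from the official matching procedure):

```python
def partner_cremi_score(tp, fp, fn):
    # 1 - F-score, with F-score = 2*TP / (2*TP + FP + FN)
    return 1.0 - (2.0 * tp) / (2.0 * tp + fp + fn)
```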