
CREMI

MICCAI Challenge on
Circuit Reconstruction from Electron Microscopy Images

Leaderboard Neuron Segmentation

Results for the neuron segmentation category, averaged over all samples.

Group | Submission | CREMI score | VOI split | VOI merge | ARAND
Cvlab | cvlab | 3.472 | 6.069 | 6.542 | 0.956
DIVE | RUseg | 2.976 | 6.003 | 3.121 | 0.972
DIVE | BasUnet | 2.959 | 1.740 | 7.366 | 0.962
- | Final | 2.896 | 3.805 | 5.599 | 0.895
Cvlab | cvlab_1 | 2.891 | 4.686 | 5.421 | 0.828
Cvlab | cvlab_2 | 2.756 | 1.526 | 6.411 | 0.957
- | testsubm | 2.683 | 3.083 | 4.768 | 0.918
VCG | ZWZ-0.0 | 2.675 | 3.524 | 4.193 | 0.927
IIL+DIVE@TAMU | new_segm | 2.671 | 0.928 | 6.475 | 0.964
VCG | testmala | 2.412 | 3.297 | 3.283 | 0.888
afish1001 | unet_ce | 2.246 | 0.831 | 4.987 | 0.883
afish1001 | wunet_ce | 2.244 | 0.829 | 4.982 | 0.882
afish1001 | unet_wj | 2.223 | 0.648 | 5.178 | 0.858
afish1001 | wunet_wj | 2.212 | 0.641 | 5.157 | 0.854
HVCL | Deep Res | 2.212 | 1.672 | 4.182 | 0.837
afish1001 | ABWUnet | 2.199 | 0.696 | 5.042 | 0.856
afish1001 | OurUnet | 2.197 | 0.698 | 5.034 | 0.855
SeungLab-eding | Seung-ed | 1.525 | 0.895 | 2.160 | 0.764
DIVE | focalZ | 1.424 | 0.793 | 1.925 | 0.748
DIVE | ag23 | 1.221 | 0.755 | 1.557 | 0.647
Rhoana | Test1 | 1.162 | 1.646 | 1.179 | 0.486
SCI | Submission_1 | 1.088 | 1.782 | 0.556 | 0.518
DIVE | unet1 | 1.001 | 1.436 | 0.861 | 0.443
DIVE | unet-1 | 1.001 | 1.436 | 0.861 | 0.443
DIVE | unet3 | 0.994 | 1.451 | 0.831 | 0.441
Showing the first 25 of 162 submissions.

Legend

CREMI score
The geometric mean of (VOI split + VOI merge) and ARAND, i.e., sqrt((VOI split + VOI merge) × ARAND).
VOI split and merge
The Variation of Information between a segmentation X and ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
ARAND
The Adapted Rand Error, i.e., 1.0 - Rand F-Score.

For each value, lower is better.
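As a concrete illustration of how these quantities combine, here is a minimal NumPy sketch. The function names and the use of base-2 logarithms are assumptions for illustration; this is not the official CREMI evaluation code, and the ARAND term is taken as given.

```python
import numpy as np

def voi_split_merge(seg, gt):
    """VOI split H(seg | gt) and VOI merge H(gt | seg), in bits,
    estimated from the joint label histogram of two label volumes."""
    seg = np.asarray(seg).ravel()
    gt = np.asarray(gt).ravel()
    n = seg.size
    entropy = lambda p: -np.sum(p * np.log2(p))
    # joint distribution over (seg label, gt label) pairs
    _, joint_counts = np.unique(np.stack([seg, gt]), axis=1, return_counts=True)
    h_joint = entropy(joint_counts / n)
    h_seg = entropy(np.unique(seg, return_counts=True)[1] / n)
    h_gt = entropy(np.unique(gt, return_counts=True)[1] / n)
    split = h_joint - h_gt   # H(seg | gt): penalizes over-segmentation
    merge = h_joint - h_seg  # H(gt | seg): penalizes under-segmentation
    return split, merge

def cremi_score(voi_split, voi_merge, arand):
    """Geometric mean of (VOI split + VOI merge) and ARAND."""
    return np.sqrt((voi_split + voi_merge) * arand)
```

Note that since the geometric mean is nonlinear and the leaderboard columns are averaged over samples, applying `cremi_score` to the averaged columns need not reproduce the averaged score column exactly.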

Leaderboard Synaptic Cleft Detection

Results for the synaptic cleft detection category, averaged over all samples.

Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF
SDG | SnapF.V | 124.59 | 182577.0 | 17000.7 | 0.306 | 205.18 | 44.01
SDG | Unet0 | 171.88 | 33308.3 | 108288.7 | 0.262 | 91.45 | 252.31
Kreshuk | MG | 152.52 | 129494.3 | 63958.0 | 0.278 | 157.83 | 147.21
JRC | DTU-2 | 67.56 | 70761.3 | 10069.3 | 0.123 | 109.67 | 25.46
JRC | DTU-1 | 72.21 | 85169.0 | 13607.3 | 0.131 | 106.31 | 38.11
IAL | full_mod | 137.73 | 153335.3 | 61967.7 | 0.310 | 176.37 | 99.09
IAL | unaries | 150.64 | 175624.3 | 60151.3 | 0.327 | 213.52 | 87.77
IAL | A+test | 153.91 | 196558.0 | 55962.3 | 0.339 | 232.28 | 75.53
IAL | 3d_model | 229.78 | 86415.3 | 84295.0 | 0.317 | 255.27 | 204.29
I2I | CleftB1 | 64.97 | 80004.7 | 19895.0 | 0.158 | 90.98 | 38.97
I2I | FCEval | 66.45 | 107963.7 | 21352.3 | 0.179 | 82.28 | 50.62
I2I | SparSep | 66.76 | 151781.7 | 11350.3 | 0.188 | 87.35 | 46.17
I2I | CleftB | 74.31 | 65334.0 | 21669.0 | 0.142 | 92.36 | 56.26
I2I | CSparSep | 75.79 | 71650.0 | 18918.3 | 0.140 | 72.06 | 79.52
HSS | 3DUNet | 65.34 | 43440.3 | 22443.3 | 0.106 | 63.26 | 67.43
HSS | 3DUNet04 | 79.33 | 29732.0 | 27514.3 | 0.113 | 53.22 | 105.44
HSS | HSZ | 90.11 | 65092.3 | 25807.3 | 0.154 | 99.61 | 80.61
DLST | DLST001 | 71.19 | 53908.0 | 14839.0 | 0.106 | 69.02 | 73.36
DLST | DLST002 | 75.46 | 56645.7 | 17498.3 | 0.119 | 65.93 | 85.00
DLST | DLST003 | 91.71 | 33118.7 | 35934.7 | 0.125 | 58.14 | 125.27
DKFZ-MIC | nnUNet | 74.96 | 49914.0 | 18981.3 | 0.117 | 64.46 | 85.46
DIVE | CleftNet | 57.73 | 116095.3 | 8004.3 | 0.169 | 85.60 | 29.87
DIVE | Aug-UNet | 57.89 | 123985.7 | 7992.0 | 0.177 | 86.45 | 29.32
DIVE | MixUNet | 57.91 | 125220.0 | 7920.0 | 0.178 | 86.57 | 29.25
DIVE | MixUNet4 | 59.03 | 122122.0 | 7761.3 | 0.177 | 89.35 | 28.71

Legend

CREMI score
The mean of ADGT and ADF. We are not including the F-Score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
FP, FN, and F-Score
The False Positives, False Negatives, and the resulting F-Score in the cleft detection volume. See metrics for details.
ADGT
The average distance of any found cleft voxel to the closest ground truth cleft voxel.
ADF
The average distance of any ground truth cleft voxel to the closest found cleft voxel.

For each value, lower is better (F-Score shown as 1 - F-Score).
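The two distance terms and their combination can be sketched as follows. This is a brute-force illustration (a Euclidean distance transform would be used on full volumes), and the function names, masks, and voxel-size parameter are assumptions, not the official evaluation code. The distance directions in the comments follow the legend above, and the table is consistent with the simple mean: e.g. for HSS 3DUNet, (63.26 + 67.43) / 2 ≈ 65.34.

```python
import numpy as np

def mean_nearest_distance(src_mask, dst_mask, voxel_size=(1.0, 1.0, 1.0)):
    """Average Euclidean distance from each foreground voxel of src_mask
    to its nearest foreground voxel of dst_mask (brute force)."""
    vs = np.asarray(voxel_size, dtype=float)
    src = np.argwhere(src_mask) * vs
    dst = np.argwhere(dst_mask) * vs
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def cleft_cremi_score(found_mask, gt_mask, voxel_size=(1.0, 1.0, 1.0)):
    """Mean of ADGT and ADF, per the legend above."""
    adgt = mean_nearest_distance(found_mask, gt_mask, voxel_size)  # found -> GT
    adf = mean_nearest_distance(gt_mask, found_mask, voxel_size)   # GT -> found
    return 0.5 * (adgt + adf)
```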

Leaderboard Synaptic Partner Identification

Results for the synaptic partner identification category, averaged over all samples.

Group | Submission | CREMI score | FP | FN
NO2 | lr_affin | 0.567 | 118.333 | 450.000
DIVE | PTR_V2 | 0.453 | 162.667 | 333.667
NO2 | lr_balan | 0.447 | 175.333 | 314.000
HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333
HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667
IAL | PSP_full | 0.464 | 187.333 | 310.000
HCBS | Tr66comb | 0.451 | 219.000 | 292.333
HCBS | Tr66_80K | 0.449 | 223.000 | 286.667
AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667
IAL | PSP_unar | 0.461 | 266.667 | 281.000
PNI | asyn_at1 | 0.490 | 274.333 | 314.667
PNI | asyn_at0 | 0.468 | 293.333 | 297.000
PNI | asyn_ori | 0.493 | 310.000 | 302.333
PNI | asyn_mod | 0.423 | 367.667 | 262.333
Showing all 14 submissions.

Legend

CREMI score
1 - F-Score, computed from the FP and FN pair counts.
FP, FN
The False Positives and False Negatives synaptic partner pairs. See metrics for details.

For each value, lower is better.
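The score computation implied by this legend can be sketched as below. The true-positive count `tp` is not shown in the table, so the inputs here are an illustrative assumption rather than the official evaluation code.

```python
def partner_cremi_score(tp, fp, fn):
    """1 - F1-score from true-positive, false-positive and
    false-negative synaptic partner pair counts (lower is better)."""
    f1 = 2 * tp / (2 * tp + fp + fn)
    return 1.0 - f1
```

This is equivalent to 1 - 2PR/(P+R) with precision P = TP/(TP+FP) and recall R = TP/(TP+FN).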