
CREMI

MICCAI Challenge on
Circuit Reconstruction from Electron Microscopy Images

Leaderboard Neuron Segmentation

Results for the neuron segmentation category, averaged over all samples.

| Group | Submission | CREMI score | VOI split | VOI merge | ARAND |
|---|---|---|---|---|---|
| Cvlab | cvlab | 3.472 | 6.069 | 6.542 | 0.956 |
| DIVE | RUseg | 2.976 | 6.003 | 3.121 | 0.972 |
| DIVE | BasUnet | 2.959 | 1.740 | 7.366 | 0.962 |
| - | Final | 2.896 | 3.805 | 5.599 | 0.895 |
| Cvlab | cvlab_1 | 2.891 | 4.686 | 5.421 | 0.828 |
| Cvlab | cvlab_2 | 2.756 | 1.526 | 6.411 | 0.957 |
| - | testsubm | 2.683 | 3.083 | 4.768 | 0.918 |
| VCG | ZWZ-0.0 | 2.675 | 3.524 | 4.193 | 0.927 |
| IIL+DIVE@TAMU | new_segm | 2.671 | 0.928 | 6.475 | 0.964 |
| VCG | testmala | 2.412 | 3.297 | 3.283 | 0.888 |
| afish1001 | unet_ce | 2.246 | 0.831 | 4.987 | 0.883 |
| afish1001 | wunet_ce | 2.244 | 0.829 | 4.982 | 0.882 |
| afish1001 | unet_wj | 2.223 | 0.648 | 5.178 | 0.858 |
| afish1001 | wunet_wj | 2.212 | 0.641 | 5.157 | 0.854 |
| HVCL | Deep Res | 2.212 | 1.672 | 4.182 | 0.837 |
| afish1001 | ABWUnet | 2.199 | 0.696 | 5.042 | 0.856 |
| afish1001 | OurUnet | 2.197 | 0.698 | 5.034 | 0.855 |
| SeungLab-eding | Seung-ed | 1.525 | 0.895 | 2.160 | 0.764 |
| DIVE | focalZ | 1.424 | 0.793 | 1.925 | 0.748 |
| DIVE | ag23 | 1.221 | 0.755 | 1.557 | 0.647 |
| Rhoana | Test1 | 1.162 | 1.646 | 1.179 | 0.486 |
| SCI | Submission_1 | 1.088 | 1.782 | 0.556 | 0.518 |
| DIVE | unet1 | 1.001 | 1.436 | 0.861 | 0.443 |
| DIVE | unet-1 | 1.001 | 1.436 | 0.861 | 0.443 |
| DIVE | unet3 | 0.994 | 1.451 | 0.831 | 0.441 |
Showing 25 of 162 submissions (page 1 of 7).

Legend

CREMI score
The geometric mean of (VOI split + VOI merge) and ARAND.
VOI split and merge
The Variation of Information between a segmentation X and ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
ARAND
The Adapted Rand Error, i.e., 1.0 - Rand F-Score.

For each value, lower is better.
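
These metrics can be sketched in a few lines of array code. The snippet below is an illustrative sketch, not the official CREMI evaluation script: it assumes two non-negative integer label volumes `seg` (prediction X) and `gt` (ground truth Y) of equal shape, uses base-2 logarithms, and applies no special handling of background or boundary labels, so its numbers may differ in detail from the leaderboard.

```python
import numpy as np
from scipy import sparse


def contingency_table(seg, gt):
    """Joint label distribution P(X, Y) between prediction and ground truth.

    Assumes non-negative integer labels; small consecutive label IDs keep the
    dense table manageable.
    """
    seg = seg.ravel()
    gt = gt.ravel()
    n = seg.size
    table = sparse.coo_matrix(
        (np.ones(n), (seg, gt)),
        shape=(int(seg.max()) + 1, int(gt.max()) + 1),
    ).toarray()
    return table / n


def voi_split_merge(seg, gt):
    """VOI split = H(X|Y), VOI merge = H(Y|X) (log base 2 used here)."""
    p_xy = contingency_table(seg, gt)
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)
    h_xy = -np.sum(p_xy[p_xy > 0] * np.log2(p_xy[p_xy > 0]))  # H(X, Y)
    h_x = -np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0]))       # H(X)
    h_y = -np.sum(p_y[p_y > 0] * np.log2(p_y[p_y > 0]))       # H(Y)
    return h_xy - h_y, h_xy - h_x                              # H(X|Y), H(Y|X)


def adapted_rand_error(seg, gt):
    """1 - Rand F-score, from the pair-counting view of the contingency table."""
    p_xy = contingency_table(seg, gt)
    pairs_both = np.sum(p_xy ** 2)              # pairs co-clustered in X and Y
    pairs_seg = np.sum(p_xy.sum(axis=1) ** 2)   # pairs co-clustered in X
    pairs_gt = np.sum(p_xy.sum(axis=0) ** 2)    # pairs co-clustered in Y
    precision = pairs_both / pairs_seg
    recall = pairs_both / pairs_gt
    return 1.0 - 2.0 * precision * recall / (precision + recall)


def cremi_score(seg, gt):
    """Geometric mean of (VOI split + VOI merge) and ARAND."""
    split, merge = voi_split_merge(seg, gt)
    return np.sqrt((split + merge) * adapted_rand_error(seg, gt))
```

With these helpers, a leaderboard-style score is `cremi_score(seg, gt)` averaged over the test samples.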

Leaderboard Synaptic Cleft Detection

Results for the synaptic cleft detection category, averaged over all samples.

| Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF |
|---|---|---|---|---|---|---|---|
| HSS | 3DUNet | 65.34 | 43440.3 | 22443.3 | 0.106 | 63.26 | 67.43 |
| DIVE | CleftC4 | 65.38 | 112813.3 | 14762.3 | 0.179 | 82.62 | 48.15 |
| 3DEM | Part_A | 65.43 | 112964.7 | 10645.0 | 0.174 | 96.87 | 34.00 |
| SDG | PyTC20 | 65.68 | 167107.0 | 6190.0 | 0.196 | 90.36 | 41.00 |
| DIVE | CleftC5 | 65.75 | 125395.7 | 11906.7 | 0.181 | 84.10 | 47.41 |
| DIVE | CleftC2 | 66.19 | 117135.0 | 19904.7 | 0.181 | 83.30 | 49.08 |
| DIVE | CleftC3 | 66.19 | 112868.0 | 18785.0 | 0.180 | 82.73 | 49.65 |
| I2I | FCEval | 66.45 | 107963.7 | 21352.3 | 0.179 | 82.28 | 50.62 |
| DIVE | CleftC1 | 66.50 | 149148.3 | 12971.0 | 0.187 | 86.56 | 46.43 |
| 3DEM | BC_BC | 66.71 | 82113.3 | 13347.3 | 0.126 | 79.96 | 53.46 |
| I2I | SparSep | 66.76 | 151781.7 | 11350.3 | 0.188 | 87.35 | 46.17 |
| JRC | DTU-2 | 67.56 | 70761.3 | 10069.3 | 0.123 | 109.67 | 25.46 |
| SDG | PyTC20-1 | 67.97 | 131489.3 | 8699.0 | 0.167 | 84.22 | 51.73 |
| CMM | UNet50 | 69.31 | 65762.0 | 12634.7 | 0.139 | 97.45 | 41.17 |
| DLST | DLST001 | 71.19 | 53908.0 | 14839.0 | 0.106 | 69.02 | 73.36 |
| JRC | DTU-1 | 72.21 | 85169.0 | 13607.3 | 0.131 | 106.31 | 38.11 |
| CMM | Max Test | 74.10 | 24019.3 | 44179.7 | 0.101 | 64.63 | 83.56 |
| I2I | CleftB | 74.31 | 65334.0 | 21669.0 | 0.142 | 92.36 | 56.26 |
| DIVE | Mix_ABC1 | 74.39 | 101269.0 | 16022.0 | 0.174 | 79.04 | 69.73 |
| DKFZ-MIC | nnUNet | 74.96 | 49914.0 | 18981.3 | 0.117 | 64.46 | 85.46 |
| DIVE | Mix_ABC2 | 75.02 | 131499.0 | 13795.7 | 0.200 | 78.36 | 71.67 |
| SDG | Unet1 | 75.28 | 185988.3 | 16897.3 | 0.230 | 84.34 | 66.22 |
| DLST | DLST002 | 75.46 | 56645.7 | 17498.3 | 0.119 | 65.93 | 85.00 |
| I2I | CSparSep | 75.79 | 71650.0 | 18918.3 | 0.140 | 72.06 | 79.52 |
| 3DEM | All_A | 76.12 | 157692.7 | 6808.7 | 0.256 | 130.59 | 21.66 |

Legend

CREMI score
The mean of ADGT and ADF. We are not including the F-Score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
FP, FN, and F-Score
The False Positives, False Negatives, and the resulting F-Score in the cleft detection volume. See metrics for details.
ADGT
The average distance of any found cleft voxel to the closest ground truth cleft voxel.
ADF
The average distance of any ground truth cleft voxel to the closest found cleft voxel.

For each value, lower is better (F-Score shown as 1 - F-Score).
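
ADGT and ADF are average distances that can be sketched with a Euclidean distance transform. The code below is an illustrative sketch, not the official evaluation: `pred` and `gt` are assumed to be boolean masks of cleft voxels, the (z, y, x) voxel size defaults to the anisotropic CREMI resolution of 40 × 4 × 4 nm, and edge cases such as empty masks are not handled.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt


def average_distances(pred, gt, voxel_size=(40.0, 4.0, 4.0)):
    """ADGT and ADF from boolean cleft masks (illustrative sketch).

    `voxel_size` is the (z, y, x) sampling in nm; (40, 4, 4) matches the
    anisotropic CREMI resolution. Empty masks are not handled here.
    """
    # distance_transform_edt gives, at every voxel, the distance to the
    # nearest zero voxel, so pass the complement of each mask.
    dist_to_gt = distance_transform_edt(~gt, sampling=voxel_size)
    dist_to_pred = distance_transform_edt(~pred, sampling=voxel_size)

    # ADGT: average distance of any found cleft voxel to the closest
    # ground-truth cleft voxel (grows with false positives).
    adgt = dist_to_gt[pred].mean()

    # ADF: average distance of any ground-truth cleft voxel to the closest
    # found cleft voxel (grows with false negatives).
    adf = dist_to_pred[gt].mean()
    return adgt, adf


def cleft_cremi_score(pred, gt, voxel_size=(40.0, 4.0, 4.0)):
    """Mean of ADGT and ADF, as defined in the legend above."""
    adgt, adf = average_distances(pred, gt, voxel_size)
    return 0.5 * (adgt + adf)
```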

Leaderboard Synaptic Partner Identification

Results for the synaptic partner identification category, averaged over all samples.

| Group | Submission | CREMI score | FP | FN |
|---|---|---|---|---|
| PNI | asyn_mod | 0.423 | 367.667 | 262.333 |
| IAL | PSP_unar | 0.461 | 266.667 | 281.000 |
| HCBS | Tr66_80K | 0.449 | 223.000 | 286.667 |
| HCBS | Tr66comb | 0.451 | 219.000 | 292.333 |
| PNI | asyn_at0 | 0.468 | 293.333 | 297.000 |
| PNI | asyn_ori | 0.493 | 310.000 | 302.333 |
| IAL | PSP_full | 0.464 | 187.333 | 310.000 |
| NO2 | lr_balan | 0.447 | 175.333 | 314.000 |
| PNI | asyn_at1 | 0.490 | 274.333 | 314.667 |
| HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667 |
| HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333 |
| DIVE | PTR_V2 | 0.453 | 162.667 | 333.667 |
| NO2 | lr_affin | 0.567 | 118.333 | 450.000 |
| AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667 |
Showing all 14 submissions.

Legend

CREMI score
1 - the F-Score of synaptic partner detection, computed from the matched and unmatched (FP, FN) partner pairs.
FP, FN
The numbers of False Positive and False Negative synaptic partner pairs. See metrics for details.

For each value, lower is better.
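
For completeness, this score is just one minus the F-score of partner detection. The snippet below is a minimal sketch assuming a true-positive count `tp` is available alongside the FP and FN counts shown in the table (TP is needed to evaluate the F-score but is not listed here).

```python
def partner_cremi_score(tp, fp, fn):
    """1 - F-score, where F-score = 2*TP / (2*TP + FP + FN)."""
    fscore = 2.0 * tp / (2.0 * tp + fp + fn)
    return 1.0 - fscore
```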