
CREMI

MICCAI Challenge on
Circuit Reconstruction from Electron Microscopy Images

Leaderboard Neuron Segmentation

Results for the neuron segmentation category, averaged over all samples.

Group | Submission | CREMI score | VOI split | VOI merge | ARAND
Cvlab | cvlab | 3.472 | 6.069 | 6.542 | 0.956
DIVE | RUseg | 2.976 | 6.003 | 3.121 | 0.972
DIVE | BasUnet | 2.959 | 1.740 | 7.366 | 0.962
- | Final | 2.896 | 3.805 | 5.599 | 0.895
Cvlab | cvlab_1 | 2.891 | 4.686 | 5.421 | 0.828
Cvlab | cvlab_2 | 2.756 | 1.526 | 6.411 | 0.957
- | testsubm | 2.683 | 3.083 | 4.768 | 0.918
VCG | ZWZ-0.0 | 2.675 | 3.524 | 4.193 | 0.927
IIL+DIVE@TAMU | new_segm | 2.671 | 0.928 | 6.475 | 0.964
VCG | testmala | 2.412 | 3.297 | 3.283 | 0.888
afish1001 | unet_ce | 2.246 | 0.831 | 4.987 | 0.883
afish1001 | wunet_ce | 2.244 | 0.829 | 4.982 | 0.882
afish1001 | unet_wj | 2.223 | 0.648 | 5.178 | 0.858
afish1001 | wunet_wj | 2.212 | 0.641 | 5.157 | 0.854
HVCL | Deep Res | 2.212 | 1.672 | 4.182 | 0.837
afish1001 | ABWUnet | 2.199 | 0.696 | 5.042 | 0.856
afish1001 | OurUnet | 2.197 | 0.698 | 5.034 | 0.855
SeungLab-eding | Seung-ed | 1.525 | 0.895 | 2.160 | 0.764
DIVE | focalZ | 1.424 | 0.793 | 1.925 | 0.748
DIVE | ag23 | 1.221 | 0.755 | 1.557 | 0.647
Rhoana | Test1 | 1.162 | 1.646 | 1.179 | 0.486
SCI | Submission_1 | 1.088 | 1.782 | 0.556 | 0.518
DIVE | unet1 | 1.001 | 1.436 | 0.861 | 0.443
DIVE | unet-1 | 1.001 | 1.436 | 0.861 | 0.443
DIVE | unet3 | 0.994 | 1.451 | 0.831 | 0.441
Showing 25 of 162 submissions (page 1 of 7).

Legend

CREMI score
The geometric mean of (VOI split + VOI merge) and ARAND.
VOI split and merge
The Variation of Information between a segmentation X and ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
ARAND
The Adapted Rand Error, i.e., 1.0 - Rand F-Score.

For each value, lower is better.
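
For reference, the following is a minimal Python sketch of how these quantities can be computed from two dense, non-negative integer label volumes seg and gt of the same shape. This is not the official CREMI evaluation code (which additionally handles background labels and the volume border); it only illustrates the definitions above.

import numpy as np
from scipy import sparse

def contingency(seg, gt):
    # Joint label probabilities p_ij = fraction of voxels with seg == i and gt == j.
    seg, gt = seg.ravel(), gt.ravel()
    n = seg.size
    # Duplicate (i, j) entries are summed when converting to CSR.
    return sparse.coo_matrix((np.full(n, 1.0 / n), (seg, gt))).tocsr()

def voi_split_merge(seg, gt):
    # VOI split = H(X|Y), VOI merge = H(Y|X), with X = segmentation, Y = ground truth.
    p = contingency(seg, gt)
    pxy = p.data                            # nonzero joint probabilities
    px = np.asarray(p.sum(axis=1)).ravel()  # marginal over segmentation labels
    py = np.asarray(p.sum(axis=0)).ravel()  # marginal over ground truth labels
    hxy = -np.sum(pxy * np.log(pxy))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return hxy - hy, hxy - hx               # H(X|Y), H(Y|X)

def adapted_rand_error(seg, gt):
    # ARAND = 1 - Rand F-score, computed from the same contingency table.
    p = contingency(seg, gt)
    sum_p2 = np.sum(p.data ** 2)
    sum_a2 = np.sum(np.asarray(p.sum(axis=1)).ravel() ** 2)
    sum_b2 = np.sum(np.asarray(p.sum(axis=0)).ravel() ** 2)
    precision, recall = sum_p2 / sum_a2, sum_p2 / sum_b2
    return 1.0 - 2.0 * precision * recall / (precision + recall)

def cremi_score(seg, gt):
    # Geometric mean of (VOI split + VOI merge) and ARAND.
    split, merge = voi_split_merge(seg, gt)
    return np.sqrt((split + merge) * adapted_rand_error(seg, gt))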

Leaderboard Synaptic Cleft Detection

Results for the synaptic cleft detection category, averaged over all samples.

Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF
VCG | FgSm0805 | 54.23 | 111486.0 | 4917.0 | 0.142 | 84.04 | 24.43
VCG | FgSemi08 | 55.41 | 160792.7 | 3178.3 | 0.181 | 96.54 | 14.28
VCG | FgSm0808 | 57.34 | 75120.7 | 9109.7 | 0.107 | 72.74 | 41.94
VCG | FgSemi05 | 57.63 | 221731.7 | 975.7 | 0.232 | 107.73 | 7.52
DIVE | CleftNet | 57.73 | 116095.3 | 8004.3 | 0.169 | 85.60 | 29.87
DIVE | Aug-UNet | 57.89 | 123985.7 | 7992.0 | 0.177 | 86.45 | 29.32
VCG | FgUPlus | 57.90 | 137412.7 | 4763.3 | 0.160 | 90.99 | 24.81
DIVE | MixUNet | 57.91 | 125220.0 | 7920.0 | 0.178 | 86.57 | 29.25
CleftKing | CleABC1 | 58.02 | 130521.0 | 7637.3 | 0.179 | 87.11 | 28.94
VCG | FgUPlus8 | 58.66 | 65345.7 | 11887.0 | 0.101 | 70.36 | 46.96
DIVE | MixUNet4 | 59.03 | 122122.0 | 7761.3 | 0.177 | 89.35 | 28.71
lbdl | lb_thicc | 59.04 | 107211.3 | 234.0 | 0.190 | 109.17 | 8.91
CleftKing | CleABC3 | 59.10 | 106917.7 | 10746.3 | 0.165 | 87.22 | 30.99
DIVE | MixUNet3 | 59.11 | 125333.0 | 7696.0 | 0.180 | 89.68 | 28.54
3DEM | BesA225 | 59.34 | 120053.7 | 7534.0 | 0.177 | 90.88 | 27.80
CleftKing | CleABC2 | 59.47 | 110518.3 | 10494.7 | 0.170 | 88.88 | 30.05
3DEM | BesA0.20 | 60.04 | 121311.3 | 7529.0 | 0.182 | 92.91 | 27.17
3DEM | ABC_BC2 | 61.70 | 110946.0 | 10787.3 | 0.168 | 76.67 | 46.73
3DEM | ABC_BC1 | 61.72 | 114005.0 | 10670.3 | 0.170 | 76.99 | 46.46
3DEM | ABC_BC | 62.21 | 99221.7 | 10981.0 | 0.151 | 75.79 | 48.63
SDG | Unet2 | 63.92 | 202177.7 | 13156.0 | 0.256 | 97.64 | 30.19
lbdl | lb_lst | 64.22 | 118160.0 | 0.0 | 0.209 | 121.50 | 6.93
SDG | PyTC20-P | 64.59 | 215905.3 | 4962.0 | 0.227 | 96.14 | 33.05
I2I | CleftB1 | 64.97 | 80004.7 | 19895.0 | 0.158 | 90.98 | 38.97
DIVE | MixUNet2 | 65.19 | 106260.3 | 13962.0 | 0.188 | 95.28 | 35.09
Showing 25 of 101 submissions (page 1 of 5).

Legend

CREMI score
The mean of ADGT and ADF. We are not including the F-Score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
FP, FN, and F-Score
The False Positives, False Negatives, and the resulting F-Score in the cleft detection volume. See metrics for details.
ADGT
The average distance of any found cleft voxel to the closest ground truth cleft voxel (sensitive to false positives).
ADF
The average distance of any ground truth cleft voxel to the closest found cleft voxel (sensitive to false negatives).

For each value, lower is better (F-Score shown as 1 - F-Score).
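
For reference, a minimal Python sketch of ADGT, ADF, and the resulting score, assuming boolean masks found and gt that mark predicted and ground-truth cleft voxels; the anisotropic (z, y, x) voxel size of (40, 4, 4) nm used below is an assumption. This is not the official evaluation code, which also handles border regions and ground-truth voxels labeled as uncertain.

import numpy as np
from scipy.ndimage import distance_transform_edt

def average_distances(found, gt, voxel_size=(40.0, 4.0, 4.0)):
    # Distance of every voxel to the nearest ground truth / found cleft voxel,
    # taking the anisotropic voxel size into account.
    dist_to_gt = distance_transform_edt(~gt, sampling=voxel_size)
    dist_to_found = distance_transform_edt(~found, sampling=voxel_size)
    adgt = dist_to_gt[found].mean()   # found voxels -> closest ground truth voxel
    adf = dist_to_found[gt].mean()    # ground truth voxels -> closest found voxel
    return adgt, adf

def cremi_cleft_score(found, gt, voxel_size=(40.0, 4.0, 4.0)):
    # The mean of ADGT and ADF.
    adgt, adf = average_distances(found, gt, voxel_size)
    return 0.5 * (adgt + adf)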

Leaderboard Synaptic Partner Identification

Results for the synaptic partner identification category, averaged over all samples.

Group | Submission | CREMI score | FP | FN
NO2 | lr_balan | 0.447 | 175.333 | 314.000
NO2 | lr_affin | 0.567 | 118.333 | 450.000
PNI | asyn_ori | 0.493 | 310.000 | 302.333
PNI | asyn_mod | 0.423 | 367.667 | 262.333
PNI | asyn_at1 | 0.490 | 274.333 | 314.667
PNI | asyn_at0 | 0.468 | 293.333 | 297.000
HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667
HCBS | Tr66comb | 0.451 | 219.000 | 292.333
HCBS | Tr66_80K | 0.449 | 223.000 | 286.667
HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333
AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667
DIVE | PTR_V2 | 0.453 | 162.667 | 333.667
IAL | PSP_unar | 0.461 | 266.667 | 281.000
IAL | PSP_full | 0.464 | 187.333 | 310.000
Showing all 14 submissions.

Legend

CREMI score
1 - F-Score, computed from the true positive, FP, and FN partner pair counts.
FP, FN
The numbers of false positive and false negative synaptic partner pairs. See metrics for details.

For each value, lower is better.
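
For reference, a minimal Python sketch of the score, assuming the counts of true positive (tp), false positive (fp), and false negative (fn) partner pairs have already been obtained by matching predicted pairs against the ground truth (the matching procedure is described on the metrics page and is not reproduced here).

def cremi_partner_score(tp, fp, fn):
    # 1 - F-score; equivalently 1 - 2*tp / (2*tp + fp + fn).
    precision = tp / (tp + fp) if tp + fp > 0 else 0.0
    recall = tp / (tp + fn) if tp + fn > 0 else 0.0
    if precision + recall == 0.0:
        return 1.0
    return 1.0 - 2.0 * precision * recall / (precision + recall)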