
CREMI

MICCAI Challenge on Circuit Reconstruction from Electron Microscopy Images

Leaderboard: Neuron Segmentation

Results for the neuron segmentation category, averaged over all samples.

| Group | Submission | CREMI score | VOI split | VOI merge | ARAND |
|---|---|---|---|---|---|
| Cvlab | cvlab | 3.472 | 6.069 | 6.542 | 0.956 |
| DIVE | RUseg | 2.976 | 6.003 | 3.121 | 0.972 |
| DIVE | BasUnet | 2.959 | 1.740 | 7.366 | 0.962 |
| - | Final | 2.896 | 3.805 | 5.599 | 0.895 |
| Cvlab | cvlab_1 | 2.891 | 4.686 | 5.421 | 0.828 |
| Cvlab | cvlab_2 | 2.756 | 1.526 | 6.411 | 0.957 |
| - | testsubm | 2.683 | 3.083 | 4.768 | 0.918 |
| VCG | ZWZ-0.0 | 2.675 | 3.524 | 4.193 | 0.927 |
| IIL+DIVE@TAMU | new_segm | 2.671 | 0.928 | 6.475 | 0.964 |
| VCG | testmala | 2.412 | 3.297 | 3.283 | 0.888 |
| afish1001 | unet_ce | 2.246 | 0.831 | 4.987 | 0.883 |
| afish1001 | wunet_ce | 2.244 | 0.829 | 4.982 | 0.882 |
| afish1001 | unet_wj | 2.223 | 0.648 | 5.178 | 0.858 |
| afish1001 | wunet_wj | 2.212 | 0.641 | 5.157 | 0.854 |
| HVCL | Deep Res | 2.212 | 1.672 | 4.182 | 0.837 |
| afish1001 | ABWUnet | 2.199 | 0.696 | 5.042 | 0.856 |
| afish1001 | OurUnet | 2.197 | 0.698 | 5.034 | 0.855 |
| SeungLab-eding | Seung-ed | 1.525 | 0.895 | 2.160 | 0.764 |
| DIVE | focalZ | 1.424 | 0.793 | 1.925 | 0.748 |
| DIVE | ag23 | 1.221 | 0.755 | 1.557 | 0.647 |
| Rhoana | Test1 | 1.162 | 1.646 | 1.179 | 0.486 |
| SCI | Submission_1 | 1.088 | 1.782 | 0.556 | 0.518 |
| DIVE | unet1 | 1.001 | 1.436 | 0.861 | 0.443 |
| DIVE | unet-1 | 1.001 | 1.436 | 0.861 | 0.443 |
| DIVE | unet3 | 0.994 | 1.451 | 0.831 | 0.441 |
Showing 25 of 162 submissions (page 1 of 7).

Legend

CREMI score
The geometric mean of (VOI split + VOI merge) and ARAND; a short sketch follows below this legend.
VOI split and merge
The Variation of Information between a segmentation X and ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
ARAND
The Adapted Rand Error, i.e., 1.0 - Rand F-Score.

For each value, lower is better.
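
To make the definition above concrete, here is a minimal Python sketch of the neuron segmentation score for a single sample; the function name `cremi_score` and the values in the call are illustrative, not the challenge's reference implementation.

```python
import math

def cremi_score(voi_split: float, voi_merge: float, arand: float) -> float:
    """Geometric mean of the total VOI (split + merge) and the Adapted Rand Error."""
    return math.sqrt((voi_split + voi_merge) * arand)

# Hypothetical per-sample values, for illustration only:
print(round(cremi_score(voi_split=1.5, voi_merge=2.0, arand=0.5), 3))  # 1.323
```

Note that the leaderboard reports values averaged over all test samples, so the displayed CREMI score need not equal the formula applied directly to the averaged VOI and ARAND columns.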

Leaderboard: Synaptic Cleft Detection

Results for the synaptic cleft detection category, averaged over all samples.

| Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF |
|---|---|---|---|---|---|---|---|
| SDG | PyTC20 | 65.68 | 167107.0 | 6190.0 | 0.196 | 90.36 | 41.00 |
| 3DEM | All_A | 76.12 | 157692.7 | 6808.7 | 0.256 | 130.59 | 21.66 |
| 3DEM | BesA0.20 | 60.04 | 121311.3 | 7529.0 | 0.182 | 92.91 | 27.17 |
| 3DEM | BesA225 | 59.34 | 120053.7 | 7534.0 | 0.177 | 90.88 | 27.80 |
| CleftKing | CleABC1 | 58.02 | 130521.0 | 7637.3 | 0.179 | 87.11 | 28.94 |
| DIVE | MixUNet3 | 59.11 | 125333.0 | 7696.0 | 0.180 | 89.68 | 28.54 |
| DIVE | MixUNet4 | 59.03 | 122122.0 | 7761.3 | 0.177 | 89.35 | 28.71 |
| DIVE | MixUNet | 57.91 | 125220.0 | 7920.0 | 0.178 | 86.57 | 29.25 |
| DIVE | Aug-UNet | 57.89 | 123985.7 | 7992.0 | 0.177 | 86.45 | 29.32 |
| DIVE | CleftNet | 57.73 | 116095.3 | 8004.3 | 0.169 | 85.60 | 29.87 |
| SDG | PyTC20-1 | 67.97 | 131489.3 | 8699.0 | 0.167 | 84.22 | 51.73 |
| VCG | FgSm0808 | 57.34 | 75120.7 | 9109.7 | 0.107 | 72.74 | 41.94 |
| JRC | DTU-2 | 67.56 | 70761.3 | 10069.3 | 0.123 | 109.67 | 25.46 |
| CleftKing | CleABC2 | 59.47 | 110518.3 | 10494.7 | 0.170 | 88.88 | 30.05 |
| 3DEM | Part_A | 65.43 | 112964.7 | 10645.0 | 0.174 | 96.87 | 34.00 |
| 3DEM | ABC_BC1 | 61.72 | 114005.0 | 10670.3 | 0.170 | 76.99 | 46.46 |
| CleftKing | CleABC3 | 59.10 | 106917.7 | 10746.3 | 0.165 | 87.22 | 30.99 |
| 3DEM | ABC_BC2 | 61.70 | 110946.0 | 10787.3 | 0.168 | 76.67 | 46.73 |
| 3DEM | ABC_BC | 62.21 | 99221.7 | 10981.0 | 0.151 | 75.79 | 48.63 |
| I2I | SparSep | 66.76 | 151781.7 | 11350.3 | 0.188 | 87.35 | 46.17 |
| VCG | FgUPlus8 | 58.66 | 65345.7 | 11887.0 | 0.101 | 70.36 | 46.96 |
| DIVE | CleftC5 | 65.75 | 125395.7 | 11906.7 | 0.181 | 84.10 | 47.41 |
| SDG | Synap0.0 | 118.10 | 193580.0 | 12504.0 | 0.270 | 185.62 | 50.58 |
| SDG | SynapX.0 | 118.10 | 193580.0 | 12504.0 | 0.270 | 185.62 | 50.58 |
| CMM | UNet50 | 69.31 | 65762.0 | 12634.7 | 0.139 | 97.45 | 41.17 |

Legend

CREMI score
The mean of ADGT and ADF (a short sketch follows below this legend). We are not including the F-score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
FP, FN, and F-Score
The false positives, false negatives, and the resulting F-score in the cleft detection volume. See metrics for details.
ADGT
The average distance of any found cleft voxel to the closest ground truth cleft voxel.
ADF
The average distance of any ground truth cleft voxel to the closest found cleft voxel.

For each value, lower is better (F-Score shown as 1 - F-Score).
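
As a rough illustration of the legend above, here is a minimal Python sketch of the cleft detection score; the helper name `cleft_cremi_score` is hypothetical, not the challenge's reference implementation.

```python
def cleft_cremi_score(adgt: float, adf: float) -> float:
    """CREMI score for cleft detection: the arithmetic mean of ADGT and ADF."""
    return (adgt + adf) / 2.0

# Example using the first row of the table above: (90.36 + 41.00) / 2 = 65.68
print(cleft_cremi_score(90.36, 41.00))  # 65.68
```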

Leaderboard: Synaptic Partner Identification

Results for the synaptic partner identification category, averaged over all samples.

| Group | Submission | CREMI score | FP | FN |
|---|---|---|---|---|
| NO2 | lr_affin | 0.567 | 118.333 | 450.000 |
| DIVE | PTR_V2 | 0.453 | 162.667 | 333.667 |
| NO2 | lr_balan | 0.447 | 175.333 | 314.000 |
| HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333 |
| HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667 |
| IAL | PSP_full | 0.464 | 187.333 | 310.000 |
| HCBS | Tr66comb | 0.451 | 219.000 | 292.333 |
| HCBS | Tr66_80K | 0.449 | 223.000 | 286.667 |
| AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667 |
| IAL | PSP_unar | 0.461 | 266.667 | 281.000 |
| PNI | asyn_at1 | 0.490 | 274.333 | 314.667 |
| PNI | asyn_at0 | 0.468 | 293.333 | 297.000 |
| PNI | asyn_ori | 0.493 | 310.000 | 302.333 |
| PNI | asyn_mod | 0.423 | 367.667 | 262.333 |
All 14 submissions are shown.

Legend

CREMI score
1 - F-score, computed from the FP and FN counts; a short sketch follows below this legend.
FP, FN
The false positive and false negative synaptic partner pairs. See metrics for details.

For each value, lower is better.
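
Here is a minimal Python sketch of the partner identification score, assuming the count of true positive partner pairs (`tp`, not shown in the table) is also available; the function name `partner_cremi_score` and the counts in the call are hypothetical.

```python
def partner_cremi_score(tp: float, fp: float, fn: float) -> float:
    """1 - F-score, where the F-score is the harmonic mean of precision and recall.

    Equivalently: 1 - F = (FP + FN) / (2*TP + FP + FN).
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    fscore = 2.0 * precision * recall / (precision + recall)
    return 1.0 - fscore

# Hypothetical counts, for illustration only:
print(round(partner_cremi_score(tp=500, fp=150, fn=300), 3))  # 0.31
```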