Leaderboard: Neuron Segmentation
Results for the neuron segmentation category, averaged over all samples.
| Group | Submission | CREMI score | VOI split | VOI merge | ARAND | 
|---|---|---|---|---|---|
| IAL | CMaskMC | 0.305 | 0.341 | 0.175 | 0.184 | 
| IAL | LMC V5 | 0.398 | 0.597 | 0.272 | 0.184 | 
| IAL | CMaskSP1 | 0.317 | 0.360 | 0.186 | 0.187 | 
| IAL | LMC V1 | 0.399 | 0.646 | 0.211 | 0.187 | 
| IAL | MWS | 0.392 | 0.718 | 0.100 | 0.187 | 
| JRC | MTLSD_L | 0.318 | 0.386 | 0.153 | 0.190 | 
| DIVE | CRNNGate | 0.514 | 1.191 | 0.196 | 0.194 | 
| IAL | CMaskMWS | 0.314 | 0.361 | 0.162 | 0.196 | 
| IAL | MutexWS3 | 0.362 | 0.362 | 0.312 | 0.199 | 
| IAL | CMaskSP2 | 0.327 | 0.339 | 0.205 | 0.201 | 
| JRC | LSDLR | 0.344 | 0.336 | 0.256 | 0.203 | 
| IAL | LMC V3 | 0.445 | 0.655 | 0.308 | 0.208 | 
| IAL | UnetLRMC | 0.410 | 0.442 | 0.374 | 0.208 | 
| IAL | LMC V4 | 0.447 | 0.622 | 0.295 | 0.219 | 
| IAL | LMC V2 | 0.446 | 0.629 | 0.285 | 0.219 | 
| IAL | CMaskSP3 | 0.378 | 0.349 | 0.323 | 0.220 | 
| IAL | MutexWS2 | 0.469 | 0.427 | 0.551 | 0.228 | 
| DIVE | CRunet2 | 0.566 | 1.081 | 0.389 | 0.229 | 
| IAL | LMC V6 | 0.516 | 0.672 | 0.415 | 0.245 | 
| PCH | ifn_x4 | 0.572 | 1.176 | 0.151 | 0.249 | 
| IAL | MutexWS1 | 0.514 | 0.423 | 0.626 | 0.255 | 
| VIDAR | unet3d | 0.534 | 0.833 | 0.282 | 0.256 | 
| DIVE | MPfcn | 0.625 | 1.063 | 0.486 | 0.260 | 
| DIVE | DISW | 0.557 | 1.035 | 0.149 | 0.265 | 
| PCH | ifn_x0 | 0.634 | 1.261 | 0.257 | 0.266 | 
Legend
- CREMI score: The geometric mean of the total VOI (VOI split + VOI merge) and ARAND (a short sketch of this computation follows the legend).
- VOI split and merge: The Variation of Information between a segmentation X and the ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
- ARAND: The Adapted Rand Error, i.e., 1 - the Rand F-score.
 
For each value, lower is better.
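As a worked example of how these columns relate, here is a minimal Python sketch, assuming the definitions in the legend: VOI split/merge as conditional entropies of the contingency table between the two label volumes, and the CREMI score as the geometric mean of total VOI and ARAND. The natural-log base and the absence of any background masking are assumptions of this sketch, not taken from the official evaluation code. Also note that the leaderboard averages per-sample scores, so recomputing the score from the averaged columns is only approximate.

```python
import numpy as np

def voi_split_merge(seg, gt):
    """VOI split = H(seg | gt), VOI merge = H(gt | seg).
    Natural-log entropies and no background masking are assumptions of this sketch."""
    _, seg_idx = np.unique(seg.ravel(), return_inverse=True)
    _, gt_idx = np.unique(gt.ravel(), return_inverse=True)
    # Joint label histogram (contingency table), normalized to probabilities.
    p = np.zeros((seg_idx.max() + 1, gt_idx.max() + 1))
    np.add.at(p, (seg_idx, gt_idx), 1)
    p /= p.sum()
    p_gt = p.sum(axis=0, keepdims=True)   # marginal over ground-truth labels
    p_seg = p.sum(axis=1, keepdims=True)  # marginal over segmentation labels
    nz = p > 0
    split = -np.sum(p[nz] * np.log((p / p_gt)[nz]))   # H(seg | gt): over-segmentation
    merge = -np.sum(p[nz] * np.log((p / p_seg)[nz]))  # H(gt | seg): under-segmentation
    return split, merge

def cremi_neuron_score(voi_split, voi_merge, arand):
    """Geometric mean of the total VOI and the adapted Rand error."""
    return float(np.sqrt((voi_split + voi_merge) * arand))

# Plugging in the averaged CMaskMC columns gives ~0.308, close to the reported
# 0.305 (the official score is computed per sample and then averaged).
print(cremi_neuron_score(0.341, 0.175, 0.184))
```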
Leaderboard: Synaptic Cleft Detection
Results for the synaptic cleft detection category, averaged over all samples.
| Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF | 
|---|---|---|---|---|---|---|---|
| VCG | FgDT05 | 124.93 | 15650.7 | 88933.3 | 0.165 | 41.04 | 208.83 | 
| VCG | FgDTSm08 | 434.91 | 4747.0 | 355265.0 | 0.619 | 42.86 | 826.96 | 
| HSS | 3DUNet04 | 79.33 | 29732.0 | 27514.3 | 0.113 | 53.22 | 105.44 | 
| VCG | FgDTSm05 | 232.99 | 15377.7 | 173608.0 | 0.349 | 56.70 | 409.27 | 
| DLST | DLST003 | 91.71 | 33118.7 | 35934.7 | 0.125 | 58.14 | 125.27 | 
| VCG | FgDT08 | 795.59 | 727.0 | 616222.3 | 0.834 | 60.07 | 1531.10 | 
| HSS | 3DUNet | 65.34 | 43440.3 | 22443.3 | 0.106 | 63.26 | 67.43 | 
| DKFZ-MIC | nnUNet | 74.96 | 49914.0 | 18981.3 | 0.117 | 64.46 | 85.46 | 
| timliu | mynnUNet | 75.11 | 49973.7 | 19071.3 | 0.117 | 64.50 | 85.71 | 
| CMM | Max Test | 74.10 | 24019.3 | 44179.7 | 0.101 | 64.63 | 83.56 | 
| DLST | DLST002 | 75.46 | 56645.7 | 17498.3 | 0.119 | 65.93 | 85.00 | 
| DLST | DLST001 | 71.19 | 53908.0 | 14839.0 | 0.106 | 69.02 | 73.36 | 
| VCG | FgUPlus8 | 58.66 | 65345.7 | 11887.0 | 0.101 | 70.36 | 46.96 | 
| DIVE | WeitUNet | 78.12 | 54372.7 | 24480.3 | 0.128 | 71.51 | 84.73 | 
| I2I | CSparSep | 75.79 | 71650.0 | 18918.3 | 0.140 | 72.06 | 79.52 | 
| VCG | FgSm0808 | 57.34 | 75120.7 | 9109.7 | 0.107 | 72.74 | 41.94 | 
| 3DEM | ABC_BC | 62.21 | 99221.7 | 10981.0 | 0.151 | 75.79 | 48.63 | 
| DIVE | BouUNet | 81.45 | 75985.0 | 21974.0 | 0.160 | 76.66 | 86.24 | 
| 3DEM | ABC_BC2 | 61.70 | 110946.0 | 10787.3 | 0.168 | 76.67 | 46.73 | 
| 3DEM | ABC_BC1 | 61.72 | 114005.0 | 10670.3 | 0.170 | 76.99 | 46.46 | 
| DIVE | Mix_ABC2 | 75.02 | 131499.0 | 13795.7 | 0.200 | 78.36 | 71.67 | 
| DIVE | Mix_ABC1 | 74.39 | 101269.0 | 16022.0 | 0.174 | 79.04 | 69.73 | 
| 3DEM | BC_BC | 66.71 | 82113.3 | 13347.3 | 0.126 | 79.96 | 53.46 | 
| I2I | FCEval | 66.45 | 107963.7 | 21352.3 | 0.179 | 82.28 | 50.62 | 
| DIVE | CleftC4 | 65.38 | 112813.3 | 14762.3 | 0.179 | 82.62 | 48.15 | 
Only the first 25 of the 102 submissions are shown here (page 1 of 5).
 
Legend
- CREMI score: The mean of ADGT and ADF (a short sketch of this computation follows the legend). We are not including the F-score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
- FP, FN, and F-score: The false positives, false negatives, and the resulting F-score in the cleft detection volume. See metrics for details.
- ADGT: The average distance of any found cleft voxel to the closest ground-truth cleft voxel.
- ADF: The average distance of any ground-truth cleft voxel to the closest found cleft voxel.
 
For each value, lower is better (F-Score shown as 1 - F-Score).
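For the cleft detection metrics, here is a minimal sketch of ADGT, ADF, and their mean (the CREMI score above), computed from two binary cleft masks with a Euclidean distance transform. The voxel size of (40, 4, 4) nm in (z, y, x) and the omission of any ignore/outside regions handled by the official evaluation are assumptions of this sketch.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def cleft_scores(pred, gt, voxel_size=(40.0, 4.0, 4.0)):
    """ADGT, ADF, and their mean from boolean cleft masks of the same shape.
    voxel_size is the assumed CREMI resolution in nm, ordered (z, y, x);
    ignore regions used by the official evaluation are not modeled here."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    # Distance from every voxel to the nearest ground-truth / predicted cleft voxel (nm).
    dist_to_gt = distance_transform_edt(~gt, sampling=voxel_size)
    dist_to_pred = distance_transform_edt(~pred, sampling=voxel_size)
    adgt = dist_to_gt[pred].mean()   # found voxels -> closest ground-truth cleft voxel
    adf = dist_to_pred[gt].mean()    # ground-truth voxels -> closest found cleft voxel
    return adgt, adf, 0.5 * (adgt + adf)

# Toy example on small volumes:
pred = np.zeros((8, 64, 64), bool); pred[4, 30:34, 30:40] = True
gt = np.zeros((8, 64, 64), bool);   gt[4, 32:36, 30:40] = True
print(cleft_scores(pred, gt))
```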
Leaderboard: Synaptic Partner Identification
Results for the synaptic partner identification category, averaged over all samples.
| Group | Submission | CREMI score | FP | FN | 
|---|---|---|---|---|
| PNI | asyn_mod | 0.423 | 367.667 | 262.333 | 
| NO2 | lr_balan | 0.447 | 175.333 | 314.000 | 
| HCBS | Tr66_80K | 0.449 | 223.000 | 286.667 | 
| HCBS | Tr66comb | 0.451 | 219.000 | 292.333 | 
| DIVE | PTR_V2 | 0.453 | 162.667 | 333.667 | 
| IAL | PSP_unar | 0.461 | 266.667 | 281.000 | 
| IAL | PSP_full | 0.464 | 187.333 | 310.000 | 
| HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333 | 
| PNI | asyn_at0 | 0.468 | 293.333 | 297.000 | 
| HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667 | 
| PNI | asyn_at1 | 0.490 | 274.333 | 314.667 | 
| PNI | asyn_ori | 0.493 | 310.000 | 302.333 | 
| NO2 | lr_affin | 0.567 | 118.333 | 450.000 | 
| AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667 | 
 
Legend
- CREMI score: 1 - the F-score computed from the FP and FN counts (a short sketch of this computation follows the legend).
- FP, FN: The false positive and false negative synaptic partner pairs. See metrics for details.
 
For each value, lower is better.
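The partner identification score can be restated as a one-liner. The true positive count is not shown in the table, so the rows cannot be reproduced from it directly, and the standard F1 definition used below is an assumption.

```python
def partner_cremi_score(tp, fp, fn):
    """1 - F-score over synaptic partner pairs, assuming F = 2*TP / (2*TP + FP + FN)."""
    return 1.0 - 2.0 * tp / (2.0 * tp + fp + fn)
```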