Leaderboard: Neuron Segmentation
Results for the neuron segmentation category, averaged over all samples.
Group | Submission | CREMI score | VOI split | VOI merge | ARAND |
---|---|---|---|---|---|
DIVE | unet-3 | 0.994 | 1.451 | 0.831 | 0.441 |
PCH | UNet_hw | 0.990 | 1.828 | 0.529 | 0.419 |
DIVE | unet4 | 0.987 | 1.523 | 0.775 | 0.429 |
DIVE | unet-4 | 0.987 | 1.523 | 0.775 | 0.429 |
DIVE | unet2 | 0.972 | 1.481 | 0.803 | 0.419 |
DIVE | unet-2 | 0.972 | 1.481 | 0.803 | 0.419 |
FI-NCA | UnetTest | 0.950 | 1.120 | 0.737 | 0.486 |
PCH | ori_x4 | 0.839 | 1.700 | 0.078 | 0.398 |
DIVE | ag35 | 0.804 | 0.853 | 0.647 | 0.438 |
DIVE | seg3 | 0.798 | 1.587 | 0.351 | 0.341 |
DIVE | seg030 | 0.793 | 1.584 | 0.328 | 0.337 |
DIVE | ag35-5 | 0.793 | 1.491 | 0.296 | 0.358 |
DIVE | seg4 | 0.791 | 1.565 | 0.347 | 0.340 |
DIVE | seg018 | 0.787 | 1.567 | 0.325 | 0.336 |
DIVE | ag35-3 | 0.784 | 0.829 | 0.612 | 0.433 |
DIVE | segBW3 | 0.776 | 1.110 | 0.653 | 0.346 |
DIVE | segBW2 | 0.776 | 1.111 | 0.651 | 0.346 |
DIVE | segBW5 | 0.775 | 1.112 | 0.651 | 0.345 |
DIVE | seg0316 | 0.775 | 1.536 | 0.327 | 0.331 |
DIVE | segBW4 | 0.772 | 1.113 | 0.645 | 0.344 |
DIVE | M2uD_CSR | 0.766 | 1.030 | 0.770 | 0.329 |
DIVE | ag35-4 | 0.759 | 0.899 | 0.510 | 0.416 |
DIVE | newSeg | 0.752 | 1.471 | 0.252 | 0.335 |
DIVE | CRunet22 | 0.746 | 1.522 | 0.438 | 0.298 |
DIVE | seg1 | 0.745 | 1.461 | 0.242 | 0.333 |
Legend
- CREMI score: The geometric mean of (VOI split + VOI merge) and ARAND, i.e., sqrt((VOI split + VOI merge) × ARAND).
- VOI split and VOI merge: The Variation of Information between a segmentation X and the ground truth Y. The split and merge parts correspond to the conditional entropies H(X|Y) and H(Y|X), respectively.
- ARAND: The Adapted Rand Error, i.e., 1.0 - Rand F-Score.
For each value, lower is better.
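For reference, the sketch below shows one way to compute these metrics in Python with NumPy/SciPy. It assumes `seg` and `gt` are non-negative integer label volumes of equal shape; the function and variable names are illustrative, and this is not the official evaluation code (which, for instance, also treats background labels specially).

```python
import numpy as np
from scipy import sparse

def cremi_neuron_scores(seg, gt):
    """Sketch of VOI split/merge, ARAND, and the CREMI score for two
    non-negative integer label volumes of equal shape (assumed names,
    not the official evaluation code)."""
    seg, gt = seg.ravel(), gt.ravel()
    n = seg.size

    # Normalized contingency table: p[i, j] = fraction of voxels carrying
    # segmentation label i and ground-truth label j (duplicate entries are
    # summed when converting COO -> CSR).
    p = sparse.coo_matrix((np.full(n, 1.0 / n), (seg, gt))).tocsr()

    a = np.asarray(p.sum(axis=1)).ravel()  # segmentation marginals
    b = np.asarray(p.sum(axis=0)).ravel()  # ground-truth marginals
    pij2 = p.multiply(p).sum()             # sum of squared joint entries

    # ARAND = 1 - F-score of pairwise precision and recall.
    precision = pij2 / (a @ a)
    recall = pij2 / (b @ b)
    arand = 1.0 - 2.0 * precision * recall / (precision + recall)

    # VOI split/merge as the conditional entropies H(X|Y) and H(Y|X),
    # via H(X|Y) = H(X, Y) - H(Y).
    def entropy(q):
        q = q[q > 0]
        return float(-(q * np.log(q)).sum())

    h_joint = entropy(p.data)
    voi_split = h_joint - entropy(b)  # H(X|Y)
    voi_merge = h_joint - entropy(a)  # H(Y|X)

    # CREMI score: geometric mean of (VOI split + VOI merge) and ARAND.
    cremi = float(np.sqrt((voi_split + voi_merge) * arand))
    return cremi, voi_split, voi_merge, arand
```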
Leaderboard: Synaptic Cleft Detection
Results for the synaptic cleft detection category, averaged over all samples.
Group | Submission | CREMI score | FP | FN | 1 - F-score | ADGT | ADF |
---|---|---|---|---|---|---|---|
3DEM | BesA225 | 59.34 | 120053.7 | 7534.0 | 0.177 | 90.88 | 27.80 |
3DEM | BesA0.20 | 60.04 | 121311.3 | 7529.0 | 0.182 | 92.91 | 27.17 |
3DEM | ABC_BC2 | 61.70 | 110946.0 | 10787.3 | 0.168 | 76.67 | 46.73 |
3DEM | ABC_BC1 | 61.72 | 114005.0 | 10670.3 | 0.170 | 76.99 | 46.46 |
3DEM | ABC_BC | 62.21 | 99221.7 | 10981.0 | 0.151 | 75.79 | 48.63 |
3DEM | Part_A | 65.43 | 112964.7 | 10645.0 | 0.174 | 96.87 | 34.00 |
3DEM | BC_BC | 66.71 | 82113.3 | 13347.3 | 0.126 | 79.96 | 53.46 |
3DEM | All_A | 76.12 | 157692.7 | 6808.7 | 0.256 | 130.59 | 21.66 |
3DEM | BesA275 | 252.56 | 254701.0 | 581781.3 | 0.456 | 199.02 | 306.10 |
3DEM | ABC_C | 375.91 | 289364.7 | 89797.7 | 0.619 | 385.49 | 366.34 |
CMM | UNet50 | 69.31 | 65762.0 | 12634.7 | 0.139 | 97.45 | 41.17 |
CMM | Max Test | 74.10 | 24019.3 | 44179.7 | 0.101 | 64.63 | 83.56 |
CMM | GenT | 380.86 | 167900713.3 | 0.0 | 0.990 | 761.72 | 0.00 |
CleftKing | CleABC1 | 58.02 | 130521.0 | 7637.3 | 0.179 | 87.11 | 28.94 |
CleftKing | CleABC3 | 59.10 | 106917.7 | 10746.3 | 0.165 | 87.22 | 30.99 |
CleftKing | CleABC2 | 59.47 | 110518.3 | 10494.7 | 0.170 | 88.88 | 30.05 |
DIVE | CleftNet | 57.73 | 116095.3 | 8004.3 | 0.169 | 85.60 | 29.87 |
DIVE | Aug-UNet | 57.89 | 123985.7 | 7992.0 | 0.177 | 86.45 | 29.32 |
DIVE | MixUNet | 57.91 | 125220.0 | 7920.0 | 0.178 | 86.57 | 29.25 |
DIVE | MixUNet4 | 59.03 | 122122.0 | 7761.3 | 0.177 | 89.35 | 28.71 |
DIVE | MixUNet3 | 59.11 | 125333.0 | 7696.0 | 0.180 | 89.68 | 28.54 |
DIVE | MixUNet2 | 65.19 | 106260.3 | 13962.0 | 0.188 | 95.28 | 35.09 |
DIVE | CleftC4 | 65.38 | 112813.3 | 14762.3 | 0.179 | 82.62 | 48.15 |
DIVE | CleftC5 | 65.75 | 125395.7 | 11906.7 | 0.181 | 84.10 | 47.41 |
DIVE | CleftC2 | 66.19 | 117135.0 | 19904.7 | 0.181 | 83.30 | 49.08 |
Showing 25 of 101 submissions.
Legend
- CREMI score: The mean of ADGT and ADF. We are not including the F-Score for now, as our current way of computing it can lead to unfair comparisons. We are working on a more robust measure and will update the results accordingly.
- FP, FN, and F-Score: The False Positives, False Negatives, and the resulting F-Score in the cleft detection volume. See metrics for details.
- ADGT: The average distance of any found cleft voxel to the closest ground-truth cleft voxel.
- ADF: The average distance of any ground-truth cleft voxel to the closest found cleft voxel.
For each value, lower is better (the F-Score is therefore reported as 1 - F-Score).
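The distance-based metrics above can be sketched with a Euclidean distance transform, as below. The masks `pred` and `gt` are assumed to be boolean cleft volumes on the same grid, and the anisotropic `voxel_size` default is an assumption rather than a value stated on this page.

```python
import numpy as np
from scipy import ndimage

def cremi_cleft_score(pred, gt, voxel_size=(40.0, 4.0, 4.0)):
    """Sketch of ADGT, ADF, and the CREMI score for boolean cleft masks
    `pred` and `gt` (assumed names; the voxel_size in nm for z, y, x is
    an assumption, not taken from this page)."""
    # distance_transform_edt measures the distance to the nearest zero,
    # so invert each mask to get the distance to its nearest cleft voxel.
    dist_to_gt = ndimage.distance_transform_edt(~gt, sampling=voxel_size)
    dist_to_pred = ndimage.distance_transform_edt(~pred, sampling=voxel_size)

    adgt = dist_to_gt[pred].mean()  # found voxel -> closest ground-truth voxel
    adf = dist_to_pred[gt].mean()   # ground-truth voxel -> closest found voxel
    cremi = 0.5 * (adgt + adf)      # the mean of ADGT and ADF
    return cremi, adgt, adf
```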
Leaderboard: Synaptic Partner Identification
Results for the synaptic partner identification category, averaged over all samples.
Group | Submission | CREMI score | FP | FN |
---|---|---|---|---|
PNI | asyn_mod | 0.423 | 367.667 | 262.333 |
NO2 | lr_balan | 0.447 | 175.333 | 314.000 |
HCBS | Tr66_80K | 0.449 | 223.000 | 286.667 |
HCBS | Tr66comb | 0.451 | 219.000 | 292.333 |
DIVE | PTR_V2 | 0.453 | 162.667 | 333.667 |
IAL | PSP_unar | 0.461 | 266.667 | 281.000 |
IAL | PSP_full | 0.464 | 187.333 | 310.000 |
HCBS | PrnTrn66 | 0.465 | 178.667 | 322.333 |
PNI | asyn_at0 | 0.468 | 293.333 | 297.000 |
HCBS | Tr66t-01 | 0.469 | 182.667 | 318.667 |
PNI | asyn_at1 | 0.490 | 274.333 | 314.667 |
PNI | asyn_ori | 0.493 | 310.000 | 302.333 |
NO2 | lr_affin | 0.567 | 118.333 | 450.000 |
AnonumoysGroup | PrTrn66 | 0.620 | 238.000 | 451.667 |
Showing all 14 submissions.
Legend
- CREMI score: 1 - F-Score, computed from the FP and FN pair counts.
- FP, FN: The False Positive and False Negative synaptic partner pairs. See metrics for details.
For each value, lower is better.
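As a small illustration of the score, the helper below computes 1 - F-Score from pair counts. It assumes the number of true positive (correctly matched) partner pairs is available, even though the table itself only lists FP and FN; the function name is hypothetical.

```python
def partner_cremi_score(tp, fp, fn):
    """1 - F-Score over synaptic partner pairs (sketch; `tp` is the count
    of correctly matched pairs, which the table does not list)."""
    f_score = 2.0 * tp / (2.0 * tp + fp + fn)
    return 1.0 - f_score
```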