Task 2: Wikipedia small

Results Ordered by Accuracy (descending; higher is better for all measures except MGIE, an error measure where lower is better):

Name          Acc        EBF        EBP        EBR        LBMaF      LBMaP      LBMaR      LBMiF      LBMiP      LBMiR      MGIE
arthur        0.373913   0.43322    0.463425   0.461023   0.231667   0.378553   0.273397   0.39044    0.381482   0.399829   4.17094
chrishan      0.362104   0.421758   0.478596   0.436225   0.213381   0.397427   0.260098   0.375624   0.375954   0.375294   4.36359
brouardc      0.353555   0.418927   0.474396   0.436096   0.24152    0.365177   0.269911   0.384245   0.397748   0.371629   4.07641
coolvegpuff   0.351453   0.400252   0.475049   0.396985   0.188038   0.477691   0.183537   0.388939   0.44438    0.345797   3.85817
cheshbon      0.279067   0.325088   0.430146   0.290042   0.108001   0.560974   0.0997326  0.303962   0.418782   0.238556   3.72619
anttip        0.252217   0.290403   0.372552   0.264101   0.120195   0.403376   0.11335    0.271422   0.367176   0.21528    3.85877
Knn Baseline  0.249137   0.317596   0.282953   0.41639    0.175792   0.252206   0.235399   0.29786    0.250856   0.36654    5.70073

Acronyms of the evaluation measures:
Acc = Accuracy
EBF = Example Based F1-measure
EBP = Example Based Precision
EBR = Example Based Recall
LBMaF = Label Based Macro F1-measure
LBMaP = Label Based Macro Precision
LBMaR = Label Based Macro Recall
LBMiF = Label Based Micro F1-measure
LBMiP = Label Based Micro Precision
LBMiR = Label Based Micro Recall
MGIE = Multi-label Graph Induced Error
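
For reference, a minimal sketch of the standard multi-label definitions behind these measures, written for N test examples with true label sets Y_i and predicted label sets \hat{Y}_i, and per-label counts TP_l, FP_l, FN_l over the label set L. Acc is taken here as the example-averaged Jaccard index, a common convention; the exact formulations used for scoring, and in particular MGIE, which depends on the category hierarchy and is not reproduced here, should be taken from the official task description.

\begin{align*}
\mathrm{Acc}   &= \frac{1}{N}\sum_{i=1}^{N}\frac{|Y_i \cap \hat{Y}_i|}{|Y_i \cup \hat{Y}_i|}, &
\mathrm{EBF}   &= \frac{1}{N}\sum_{i=1}^{N}\frac{2\,|Y_i \cap \hat{Y}_i|}{|Y_i| + |\hat{Y}_i|}, \\
\mathrm{EBP}   &= \frac{1}{N}\sum_{i=1}^{N}\frac{|Y_i \cap \hat{Y}_i|}{|\hat{Y}_i|}, &
\mathrm{EBR}   &= \frac{1}{N}\sum_{i=1}^{N}\frac{|Y_i \cap \hat{Y}_i|}{|Y_i|}, \\
\mathrm{LBMiP} &= \frac{\sum_{\ell \in L} TP_\ell}{\sum_{\ell \in L} (TP_\ell + FP_\ell)}, &
\mathrm{LBMiR} &= \frac{\sum_{\ell \in L} TP_\ell}{\sum_{\ell \in L} (TP_\ell + FN_\ell)}, \\
\mathrm{LBMiF} &= \frac{2\,\mathrm{LBMiP}\,\mathrm{LBMiR}}{\mathrm{LBMiP} + \mathrm{LBMiR}}, &
\mathrm{LBMaF} &= \frac{1}{|L|}\sum_{\ell \in L}\frac{2\,P_\ell R_\ell}{P_\ell + R_\ell},
\end{align*}

with per-label precision and recall $P_\ell = TP_\ell/(TP_\ell + FP_\ell)$ and $R_\ell = TP_\ell/(TP_\ell + FN_\ell)$, and the macro averages $\mathrm{LBMaP} = \frac{1}{|L|}\sum_{\ell \in L} P_\ell$ and $\mathrm{LBMaR} = \frac{1}{|L|}\sum_{\ell \in L} R_\ell$.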