Task 3: Wikipedia large

Results Ordered by Accuracy:

Name          Acc        EBF        EBP        EBR        LBMaF      LBMaP      LBMaR      LBMiF      LBMiP      LBMiR      MGIE
arthur        0.346684   0.426358   0.496413   0.457843   0.18677    0.36987    0.228236   0.348353   0.35428    0.342621   4.64739
chrishan      0.336703   0.411601   0.496175   0.41571    0.183344   0.386645   0.227537   0.334563   0.369724   0.30551    4.39208
coolvegpuff   0.282858   0.34886    0.443116   0.38105    0.147833   0.310267   0.151197   0.273732   0.260122   0.288845   6.17844
Knn Baseline  0.272438   0.347161   0.362765   0.386941   0.148602   0.303348   0.176978   0.301573   0.325611   0.280839   4.28833
anttip        0.176798   0.210144   0.312998   0.192474   0.061274   0.298351   0.0662964  0.161184   0.28751    0.111981   4.53489

Acronyms of the evaluation measures:
Acc = Accuracy
EBF = Example Based F1-measure
EBP = Example Based Precision
EBR = Example Based Recall
LBMaF = Label Based Macro F1-measure
LBMaP = Label Based Macro Precision
LBMaR = Label Based Macro Recall
LBMiF = Label Based Micro F1-measure
LBMiP = Label Based Micro Precision
LBMiR = Label Based Micro Recall
MGIE = Multi-label Graph Induced Error
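For reference, the example-based and label-based measures above can be sketched in Python on toy binary indicator matrices (rows = examples, columns = labels). This is a minimal illustration, not the official evaluation code: the matrix names and the tie-breaking for empty label sets are assumptions, and MGIE is omitted because it requires the category hierarchy graph.

```python
import numpy as np

# Toy multi-label data (assumed format): rows = examples, columns = labels.
y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1],
                   [1, 0, 0]])

def example_based(y_true, y_pred):
    """EBP/EBR/EBF: score each example, then average over examples."""
    tp = (y_true & y_pred).sum(axis=1)
    p = tp / np.maximum(y_pred.sum(axis=1), 1)   # precision per example
    r = tp / np.maximum(y_true.sum(axis=1), 1)   # recall per example
    f = np.where(p + r > 0, 2 * p * r / np.maximum(p + r, 1e-12), 0.0)
    return p.mean(), r.mean(), f.mean()

def label_based_micro(y_true, y_pred):
    """LBMiP/LBMiR/LBMiF: pool true positives over all labels, then score."""
    tp = (y_true & y_pred).sum()
    p = tp / max(y_pred.sum(), 1)
    r = tp / max(y_true.sum(), 1)
    f = 2 * p * r / (p + r) if p + r > 0 else 0.0
    return p, r, f

def label_based_macro(y_true, y_pred):
    """LBMaP/LBMaR/LBMaF: score each label separately, then average."""
    tp = (y_true & y_pred).sum(axis=0)
    p = tp / np.maximum(y_pred.sum(axis=0), 1)   # precision per label
    r = tp / np.maximum(y_true.sum(axis=0), 1)   # recall per label
    f = np.where(p + r > 0, 2 * p * r / np.maximum(p + r, 1e-12), 0.0)
    return p.mean(), r.mean(), f.mean()
```

The micro variants reward getting frequent labels right, while the macro variants weight every label equally, which is why the LBMa scores in the table are markedly lower than the LBMi scores on a dataset with many rare labels.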