Ranking Recommendation Metrics
Created: 2022-03-23 12:32
Average Precision at k (AP@K) is a measure of the average relevance score of the top-K recommendations presented. Its formula is
$$AP@K = \frac{1}{\min(m, K)} \sum_{i=1}^{K} P(i)\,\mathrm{rel}(i)$$
where m is the number of relevant items, P(i) is the precision at cut-off i, and rel(i) is just an indicator that says whether the i-th item was relevant (1 if relevant, 0 otherwise).
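A minimal Python sketch of AP@K following this definition; the function name and signature are my own, though they are close to the `benhamner/Metrics` implementation linked below. `actual` holds the relevant item ids, `predicted` the ranked recommendations:

```python
def apk(actual, predicted, k=10):
    """Average Precision at k for a single user.

    actual: collection of relevant item ids (order does not matter).
    predicted: ranked list of recommended item ids, best first.
    """
    if not actual:
        return 0.0
    predicted = predicted[:k]  # only the top-k recommendations count
    score = 0.0
    num_hits = 0
    for i, p in enumerate(predicted):
        # count each item at most once, the first time it appears
        if p in actual and p not in predicted[:i]:
            num_hits += 1
            score += num_hits / (i + 1)  # precision at cut-off i+1
    return score / min(len(actual), k)

# relevant items 1 and 2 ranked at positions 1 and 3:
# (1/1 + 2/3) / 3 ≈ 0.556
print(apk([1, 2, 3], [1, 4, 2], k=3))
```

Precision is only sampled at the positions where a relevant item appears, so ranking relevant items earlier raises the score.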
MAP@K (Mean Average Precision at K) tells us how relevant the list of recommended items is. It is the mean of the Average Precision at K taken over all users.
MAR@K (Mean Average Recall at K) tells us how well the recommender is able to recall all the items the user has rated positively in the test set. It is the mean of the Average Recall at K taken over all users.
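A sketch of both metrics as means of per-user scores, under the definitions above. All names are my own; for the recall side I use the common simplification recall@K = |relevant ∩ top-K| / |relevant| (libraries such as `recmetrics` may differ in details):

```python
def apk(actual, predicted, k=10):
    """Average Precision at k for a single user."""
    if not actual:
        return 0.0
    predicted = predicted[:k]
    score, num_hits = 0.0, 0
    for i, p in enumerate(predicted):
        if p in actual and p not in predicted[:i]:
            num_hits += 1
            score += num_hits / (i + 1)
    return score / min(len(actual), k)

def recall_at_k(actual, predicted, k=10):
    """Fraction of the user's relevant items found in the top-k."""
    if not actual:
        return 0.0
    return len(set(actual) & set(predicted[:k])) / len(actual)

def mapk(actual_lists, predicted_lists, k=10):
    """MAP@K: mean of per-user AP@K (one inner list per user)."""
    return sum(apk(a, p, k)
               for a, p in zip(actual_lists, predicted_lists)) / len(actual_lists)

def mark(actual_lists, predicted_lists, k=10):
    """MAR@K: mean of per-user recall@k (one inner list per user)."""
    return sum(recall_at_k(a, p, k)
               for a, p in zip(actual_lists, predicted_lists)) / len(actual_lists)
```

For example, with two users, relevant items `[[1, 2], [3]]` and recommendations `[[1, 3], [3, 1]]` at k=2, both `mapk` and `mark` come out to 0.75.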
References
- https://www.sciencedirect.com/science/article/pii/S0950705113001044#b0940
- http://sdsawtelle.github.io/blog/output/mean-average-precision-MAP-for-recommender-systems.html
- https://jonathan-hui.medium.com/map-mean-average-precision-for-object-detection-45c121a31173
Code
- https://github.com/benhamner/Metrics
- https://github.com/statisticianinstilettos/recmetrics
Tags
#ranking_metrics #precision #map #mar #recall #recsys