Using the Apriori algorithm for recommendations

So, a recent question introduced me to the rather cool Apriori algorithm. I see how it works, but I'm not sure about its practical use. Presumably, the main reason for computing frequent itemsets is to make recommendations to someone based on their own purchases (or their items, etc.). But how do you go from a set of frequent itemsets to individual recommendations?

The Wikipedia article concludes:

The second problem is to generate association rules from those large itemsets, subject to a minimum-confidence constraint. Suppose one of the large itemsets is Lk = {I1, I2, ..., Ik}. Association rules for this itemset are generated as follows: the first rule is {I1, I2, ..., Ik-1} β‡’ {Ik}; by checking its confidence, this rule can be judged interesting or not. Further rules are then generated by deleting items from the antecedent and inserting them into the consequent, and the confidence of each new rule is checked to determine whether it is interesting. This process repeats until the antecedent becomes empty.
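To make the quoted procedure concrete, here is a minimal sketch of generating rules from one frequent itemset, given a (hypothetical, hand-made) table of support values. It enumerates every non-empty proper subset as an antecedent and keeps rules whose confidence meets the threshold; the item names and support numbers are invented for illustration, not mined from real data.

```python
from itertools import combinations

# Hypothetical supports (fractions of transactions containing each itemset).
support = {
    frozenset({"bread"}): 0.60,
    frozenset({"milk"}): 0.70,
    frozenset({"butter"}): 0.45,
    frozenset({"bread", "milk"}): 0.50,
    frozenset({"bread", "butter"}): 0.35,
    frozenset({"milk", "butter"}): 0.40,
    frozenset({"bread", "milk", "butter"}): 0.30,
}

def rules_from_itemset(itemset, min_confidence):
    """Generate rules antecedent => consequent from one frequent itemset,
    keeping those whose confidence meets the threshold."""
    itemset = frozenset(itemset)
    rules = []
    # Start with the largest antecedents and shrink them, as in the quote.
    for size in range(len(itemset) - 1, 0, -1):
        for antecedent in map(frozenset, combinations(sorted(itemset), size)):
            consequent = itemset - antecedent
            confidence = support[itemset] / support[antecedent]
            if confidence >= min_confidence:
                rules.append((antecedent, consequent, confidence))
    return rules

for ante, cons, conf in rules_from_itemset({"bread", "milk", "butter"}, 0.6):
    print(sorted(ante), "=>", sorted(cons), round(conf, 2))
```

With these numbers, for example, {milk, butter} β‡’ {bread} survives with confidence 0.30 / 0.40 = 0.75, while {bread} β‡’ {milk, butter} is pruned at 0.30 / 0.60 = 0.5.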

I am not sure how a set of association rules helps in determining the best set of recommendations. Perhaps I'm missing something, and Apriori is not intended for this use? In which case, what is it for?

2 answers

The Apriori algorithm is no longer the state of the art for market basket analysis (aka Association Rule Mining). The methods have improved, although the Apriori principle (that the support of a subset is an upper bound on the support of the set) is still the driving force.

In any case, the way association rules are used to generate recommendations is that, given some itemset of history, we check each rule's antecedent to see whether it is contained in the history. If so, we can recommend that rule's consequent (except when the consequent is already in the history, of course).
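That matching step can be sketched in a few lines. This assumes rules are simple (antecedent, consequent, confidence) triples; the rules and item names below are hypothetical, standing in for output of a real rule miner.

```python
# Hypothetical mined rules: (antecedent, consequent, confidence).
rules = [
    (frozenset({"bread", "milk"}), frozenset({"butter"}), 0.60),
    (frozenset({"milk"}), frozenset({"eggs"}), 0.55),
    (frozenset({"beer"}), frozenset({"chips"}), 0.70),
]

def recommend(history, rules):
    """Recommend consequents of every rule whose antecedent is contained
    in the history, skipping items the user already has."""
    history = frozenset(history)
    recommendations = set()
    for antecedent, consequent, _confidence in rules:
        if antecedent <= history:  # subset test: does the rule fire?
            recommendations |= consequent - history
    return recommendations

print(recommend({"bread", "milk"}, rules))
```

Here the first two rules fire for a {bread, milk} history (yielding butter and eggs), while the beer rule does not.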

We can use various metrics to rank our recommendations, since with many rules we may get many hits when matching them against a history, yet can only make a limited number of recommendations. Some useful metrics are rule support (the support of the union of the antecedent and the consequent), rule confidence (the support of the rule divided by the support of the antecedent), and rule lift (the support of the rule divided by the product of the supports of the antecedent and the consequent), among others.
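As a minimal sketch of those three metrics, computed straight from a toy transaction list (the transactions below are invented for illustration):

```python
def metrics(transactions, antecedent, consequent):
    """Compute rule support, confidence, and lift from raw transactions."""
    antecedent, consequent = frozenset(antecedent), frozenset(consequent)
    n = len(transactions)
    supp_ante = sum(antecedent <= t for t in transactions) / n
    supp_cons = sum(consequent <= t for t in transactions) / n
    # Rule support = support of antecedent and consequent together.
    supp_rule = sum((antecedent | consequent) <= t for t in transactions) / n
    confidence = supp_rule / supp_ante
    lift = supp_rule / (supp_ante * supp_cons)
    return supp_rule, confidence, lift

transactions = [frozenset(t) for t in [
    {"bread", "milk"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
]]

supp, conf, lift = metrics(transactions, {"bread", "milk"}, {"butter"})
print(supp, conf, lift)
```

For this data, {bread, milk} β‡’ {butter} has support 2/5 = 0.4, confidence 0.4 / 0.6 β‰ˆ 0.67, and lift 0.4 / (0.6 Γ— 0.8) β‰ˆ 0.83; a lift below 1 suggests the consequent is actually slightly less likely when the antecedent holds.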


If you want to see how Apriori can be used for classification, you can read the paper on the CBA algorithm:

Bing Liu, Wynne Hsu, Yiming Ma, "Integrating Classification and Association Rule Mining." Proceedings of the Fourth International Conference on Knowledge Discovery and Data Mining (KDD-98, plenary presentation), New York, USA, 1998.
