Recursive Attentive Methods with Reused Item Representations for Sequential Recommendation

09/16/2022
by   Bo Peng, et al.

Sequential recommendation aims to recommend the next item of users' interest based on their historical interactions. Recently, the self-attention mechanism has been adapted for sequential recommendation and has demonstrated state-of-the-art performance. However, in this manuscript, we show that self-attention-based sequential recommendation methods could suffer from the localization-deficit issue. As a consequence, in these methods, over the first few blocks, the item representations may quickly diverge from their original representations and thus impair the learning in the following blocks. To mitigate this issue, we develop a recursive attentive method with reused item representations (RAM) for sequential recommendation. We compare RAM with five state-of-the-art baseline methods on six public benchmark datasets. Our experimental results demonstrate that RAM significantly outperforms the baseline methods on benchmark datasets, with an improvement of as much as 11.3%. Our analysis also shows that RAM could enable deeper and wider models for better performance. Our run-time performance comparison signifies that RAM could also be more efficient on benchmark datasets.
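To make the core idea concrete, below is a minimal sketch of a recursive attentive block that reuses item representations. It assumes one interpretation consistent with the abstract: a single attention block with shared weights is applied recursively, and the original item embeddings are reused as keys and values at every step so that the hidden states stay anchored to their initial representations. The class name RecursiveAttentiveBlock, the parameter n_steps, and all hyperparameters are hypothetical illustration choices, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class RecursiveAttentiveBlock(nn.Module):
    """Illustrative sketch (not the paper's exact implementation):
    one attention block applied recursively with shared weights,
    reusing the original item embeddings as keys/values each step."""

    def __init__(self, d_model: int, n_heads: int = 2, n_steps: int = 3):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.n_steps = n_steps

    def forward(self, item_emb: torch.Tensor, causal_mask: torch.Tensor) -> torch.Tensor:
        h = item_emb
        for _ in range(self.n_steps):  # recursion: same weights at every step
            # Reuse the original item embeddings as keys/values, so queries
            # cannot drift far from the initial item representations.
            a, _ = self.attn(h, item_emb, item_emb, attn_mask=causal_mask)
            h = self.norm1(h + a)
            h = self.norm2(h + self.ffn(h))
        return h

# Toy usage: a batch of 4 sequences of length 10 with embedding size 32.
emb = torch.randn(4, 10, 32)
mask = torch.triu(torch.ones(10, 10, dtype=torch.bool), diagonal=1)  # causal mask
out = RecursiveAttentiveBlock(32)(emb, mask)
print(out.shape)  # torch.Size([4, 10, 32])
```

Under this reading, reusing the item embeddings as keys and values is what counters the localization-deficit issue the abstract describes: a standard stacked self-attention encoder would instead feed each block's output into the next, letting representations drift block by block.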
