Reinforcement Learning (MDP) Based Recommender System

Case Study for Reinforcement Learning


Recommender systems seek to predict the “rating” or “preference” a user would give to an item. They are used in a variety of areas and are most commonly recognized as playlist generators for video and music services, product recommenders for retail services, and content recommenders for social media platforms.

The need for recommender systems has grown with the information explosion. As the number of choices available to users increases, systems that assist in decision making become increasingly important.


Recommender systems have achieved great success with collaborative filtering (CF), a popular technique in the domain. The objective of CF is to make a personalized prediction about a user's preferences, using information about other users who have similar interests in items. A disadvantage of CF approaches is that they are usually one-dimensional and static, which limits their accuracy and adaptability.
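To make the CF idea concrete, here is a minimal sketch of user-based collaborative filtering: a user's rating for an unseen item is predicted as a similarity-weighted average over other users who rated that item. The rating matrix and all numbers are illustrative, not data from the case study.

```python
import numpy as np

# Toy user-item rating matrix (rows: users, cols: items); 0.0 means unrated.
ratings = np.array([
    [5.0, 3.0, 0.0, 1.0],
    [4.0, 0.0, 0.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
    [0.0, 1.0, 5.0, 4.0],
])

def cosine_sim(a, b):
    """Cosine similarity between two rating vectors."""
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return a @ b / denom if denom else 0.0

def predict(ratings, user, item):
    """Predict a rating as a similarity-weighted average of the
    ratings given to `item` by other, similar users."""
    num, den = 0.0, 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0.0:
            continue  # skip the target user and users who haven't rated the item
        s = cosine_sim(ratings[user], ratings[other])
        num += s * ratings[other, item]
        den += abs(s)
    return num / den if den else 0.0

print(predict(ratings, user=0, item=2))
```

Note how the prediction for one (user, item) pair is fixed once the rating matrix is fixed: this is the static, one-shot behavior that the MDP approach below is meant to overcome.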


Reinforcement learning is a recent area of interest in machine learning that uses an experience-based approach to learn how to maximize an outcome. VeracityAI provides a reinforcement learning (MDP) based approach to recommender systems that can adapt to individual needs efficiently and accurately. We use a discrete-state MDP model to maximize a utility function that takes future interactions with the user into account.
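The following sketch shows what "maximizing a utility function over future interactions" looks like for a discrete-state MDP, using standard value iteration. The state encodes the user's last consumed item, the action is the next item to recommend, and the discount factor weights future interactions. The transition probabilities and rewards here are hypothetical placeholders, not the case study's actual model.

```python
import numpy as np

n_items = 3
gamma = 0.9  # discount factor: how much future interactions count

# P[s, a, s']: probability the user moves to state s' after recommendation a in state s
# (uniform here purely for illustration).
P = np.full((n_items, n_items, n_items), 1.0 / n_items)

# R[s, a]: expected immediate reward (e.g. predicted rating or click)
# for recommending item a when the user is in state s.
R = np.array([
    [0.1, 0.8, 0.3],
    [0.6, 0.1, 0.7],
    [0.4, 0.5, 0.1],
])

def value_iteration(P, R, gamma, tol=1e-8):
    """Return optimal state values and the greedy recommendation policy."""
    V = np.zeros(P.shape[0])
    while True:
        # Q[s, a] = R[s, a] + gamma * sum_s' P[s, a, s'] * V[s']
        Q = R + gamma * (P @ V)
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

V, policy = value_iteration(P, R, gamma)
print(policy)  # best next item to recommend from each state
```

Unlike the static CF prediction, the policy here is chosen to maximize discounted long-term utility, so a recommendation can trade a small immediate reward for better expected future interactions.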


Figure: Recommender-user interactions in an MDP [1]


Using a reinforcement learning based recommender system, we not only optimize efficiency but also improve recommendation quality for the cold-start problem. Moreover, the methodology enables us to identify the reasoning behind each recommendation rather than being limited to a black-box solution.

Source: [1] F. Liu et al., “Deep Reinforcement Learning based Recommendation with Explicit User-Item Interactions Modeling”, 2019. [Online]. [Accessed: 05-Sep-2019].

Contact us to learn more about our products

Recent research publications

Schedule a demo with our experts
