Ever wondered how some of your favorite shopping sites already know what you like? Curious how your most-wanted items pop up everywhere while you browse social media? Well, that is the power of AI, specifically of recommendation systems. Every now and then the recommendations come as a relief; they barely let me forget. Better still, they surface good options so I don't get lost in a pool of items spread across categories, names, quality levels and brands, which can be a little overwhelming too! Anyone who spends time on online shopping portals can relate to this.

But why do we usually associate recommender systems with e-commerce sites? AI is a vast field, deeper than our imagination. So we, the machine learning team at Znbound, thought of bringing the concept of recommendations to B2B sites. We were inspired and motivated, confused and dazzled, excited and determined. I'd like to share the journey behind this idea, a journey of U-turns, uphills, downhills and, hopefully soon enough, a "eureka moment".
When you visit a site, you often search for specific content, most probably something that interests you. Keeping this in mind, we thought of implementing page recommendations on our own website. The main reason we wanted to build this was to improve the visitor-to-lead conversion rate. It looks like a black box, but it isn't that hard to reason about. Suppose you visit our website and become a lead. What did our black box plan to do? It planned to track your behavior patterns, and those of other users: which pages they visited, how much time they spent, their location, the time of the visit and much more. This would help us offer you a better path to navigate our website and, hopefully, keep you interested.

Our aim was to identify the navigation pattern most preferred by visitors who eventually converted into leads, and then recommend the same path to other visitors. This kind of data would help users by suggesting meaningful navigation paths, saving precious time and, most importantly, keeping them from getting lost in the content. It's always easier when you know the path. The idea excited us and got us going!
To begin with anything, we need to pin down the steps to follow. The very first step was to understand how a recommendation system works. To understand the workflow, we started researching some AI-based recommendation systems. There are myriad choices, but to grasp the core basics we considered a few factors before narrowing down our options; one of the major factors was easy access to an API. After long hours of research, we concluded we would go with Recombee.

Since data is the first requirement of any model-building process, the next step was to create a dummy dataset. This dummy dataset was a record of 1,000 users, including their IDs and 20 different pages. Each user was allotted a different path of pages visited; this randomly jumbled data gave the algorithms more variation to produce better, more specific insights.
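A dummy dataset like the one described above can be sketched in a few lines. This is only an illustrative reconstruction, not our actual script; the page names and path lengths are assumptions:

```python
import random

random.seed(42)  # fixed seed so the example is reproducible

# 20 placeholder page names (hypothetical; stand-ins for real site pages)
PAGES = [f"page_{i}" for i in range(1, 21)]

def make_dummy_dataset(n_users=1000):
    """Map each user ID to a randomly jumbled path of distinct pages."""
    dataset = {}
    for user_id in range(1, n_users + 1):
        path_length = random.randint(2, 10)  # assumed range of pages per visit
        dataset[f"user_{user_id}"] = random.sample(PAGES, path_length)
    return dataset

data = make_dummy_dataset()
print(len(data))       # 1000 users
print(data["user_1"])  # one user's randomly jumbled path
```

Randomly sampling distinct pages per user is what gives the variation the algorithms need to surface non-trivial patterns.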
The reason we had to create the dataset manually was the unavailability of the required data from other sources. A recommender system needs far more data than a prediction model does, and exporting such data from Google Analytics and HubSpot was also not permitted. This was the very first issue we faced. The second issue was that, being a B2B business, we had no products to offer on our site, so validating the recommendations was not possible: we had no way to cross-check whether a user who went to a specific page actually liked it. And since the content was limited, the recommendations were highly generalized.
A Turn Towards a New Approach
Of course, the problems we faced clearly indicated that we needed to change our approach, bringing us back to square one; but refusing to drop the idea made us scratch our heads even more. It then struck us that maybe the solution wasn't as complicated as we had made it out to be. Let's see an example. Suppose you go grocery shopping and enter the shopping complex. You might have noticed that the grocery section includes just the veggies and fruits; you won't randomly come across shampoo bottles in between. My point is that the items in a supermarket are arranged in a specific manner and order for a reason. This brings us to the term Market Basket Analysis, which is based on the Apriori algorithm.
It is a classic algorithm, used mainly to dig out how frequently a particular pattern or item occurs and to find associations between items.
This algorithm can be understood on the basis of three terms namely: support, confidence and lift.
- Support: The frequency of a rule, e.g. in how many of the transactions "A" appears
- Confidence: Tells us how often the rule holds. A confidence of 80% for A => B tells us that 80% of the time, people who purchase "A" also purchase "B"
- Lift: Tells us whether the rule just happened by chance or is actually significant (a lift above 1 indicates a genuine association)
Simple Apriori rule: a visitor on the "work-order-software" page generally goes on to the "premium-onboarding-package" page
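The three metrics above can be computed directly from a handful of sessions. Here is a minimal sketch with made-up transactions (the page names are just placeholders borrowed from the rule above, not real traffic data):

```python
# Hypothetical sessions: each set is the pages one visitor saw
transactions = [
    {"work-order-software", "premium-onboarding-package"},
    {"work-order-software", "premium-onboarding-package"},
    {"work-order-software", "pricing"},
    {"pricing", "blog"},
    {"work-order-software", "premium-onboarding-package", "pricing"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Of the transactions with the antecedent, how many also have the consequent."""
    return support(antecedent | consequent) / support(antecedent)

def lift(antecedent, consequent):
    """Confidence relative to how common the consequent is on its own."""
    return confidence(antecedent, consequent) / support(consequent)

a = {"work-order-software"}
b = {"premium-onboarding-package"}
print(support(a | b))    # 0.6  -> the rule appears in 3 of 5 sessions
print(confidence(a, b))  # 0.75 -> 3 of the 4 sessions with A also have B
print(lift(a, b))        # 1.25 -> above 1, so not just chance
```

With only five toy sessions the numbers are illustrative, but the same three formulas are what Apriori evaluates at scale.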
Exploring the Apriori

Now, as a team, we are exploring our best chances of providing better recommendations for our website. Our main motive is to find patterns of pages through this algorithm. Finally, we have access to the data provided by Mouseflow.
Sample data from Mouseflow
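The first Apriori pass over session data like this boils down to counting which pages co-occur often enough to matter. A minimal sketch, assuming Mouseflow sessions have already been exported as lists of page names (the session contents and support threshold here are invented for illustration):

```python
from collections import Counter
from itertools import combinations

# Hypothetical sessions in the shape we'd expect from a Mouseflow export
sessions = [
    ["home", "work-order-software", "premium-onboarding-package"],
    ["home", "pricing", "work-order-software"],
    ["work-order-software", "premium-onboarding-package"],
    ["home", "blog"],
]

MIN_SUPPORT = 0.5  # assumed threshold: pair must occur in half the sessions

# Count every unordered pair of distinct pages seen within one session
pair_counts = Counter()
for session in sessions:
    for pair in combinations(sorted(set(session)), 2):
        pair_counts[pair] += 1

# Keep only the pairs that clear the support threshold
frequent_pairs = {
    pair: count / len(sessions)
    for pair, count in pair_counts.items()
    if count / len(sessions) >= MIN_SUPPORT
}
print(frequent_pairs)
```

Frequent pairs like these become the candidate association rules from which confidence and lift are then computed.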
This is better than our previous approach because it doesn't depend on validation: association rules define visitors' preferences by analyzing their behavior directly. We are going ahead with it, as the issues we faced earlier don't apply here. If we manage to implement this, we'll have solved another problem of recommendations for B2B websites. Last time, we solved the problem of lead scoring.