Conversion Rate Optimisation w/ Rishi Rawat
AT: Welcome to another episode of Munch & Learn.
When you work in marketing, it doesn't take long to realise that maximising your conversion rate is critically important. Minimising waste and maximising opportunities is the name of the game, ensuring all that hard-earned traffic is converted into sales.
Today, I'm joined by Rishi Rawat, and we take a closer look at how e-commerce teams can improve their conversion rate by 20% in 90 days. Rishi and his team at Frictionless Commerce specialise in radically improving product page conversion rates, and they achieve this by supercharging the product story using buyer-psychology copywriting.
AT: The most valuable thing Rishi owns is a reprint of an 1897 Sears catalogue. For those hungry for knowledge, Rishi also has a weekly conversion secrets newsletter, which you can find in the show notes. He's also quite active on LinkedIn, where he consistently shares quick examples that you can easily implement on your side. This I can 100% confirm, as it's where I came across Rishi and felt compelled to invite him onto the show to share his expertise.
AT: Rishi, welcome to the show.
RR: Andy. Great to be here. Thanks for that.
AT: No worries. Okay, so Rishi, it's 15 minutes of fast-paced learning for our audience, where we'll go through:
How teams can identify they have an issue.
What planning steps are required, and finally
What to keep in mind to ensure you achieve the outcome.
Are you ready to Munch & Learn?
RR: I am.
AT: Great, let's kick off with:
How teams can identify they have an issue.
RR: From my perspective, I am looking singularly at the product page. That's the scope I'm focused on. So how do we identify whether a product page has an issue, or which of your product pages have an issue? You look at your analytics data at a very specific metric called the look-to-book ratio: how many people came to this product page, and how many units of this item did we actually sell?
So you're not looking at the overall conversion rate, but at the very specific conversion rate of people who bought a specific product on a product page. That reveals which of your product pages, if you have 10 of them, for example, are doing a good job with the follow-through and getting users to convert, and which ones are actually struggling. That's one area you can focus on. The other thing you're paying attention to is the visibility of those product pages. Oftentimes Brands and Marketers assume that because we have 10 products, they're all equally visible. But what you find in analytics data is that you might have a product that's driving 30% of overall sales but is seen by only 22% of users. If it's driving 30% of overall sales, I want at least 30% of my users to see it. By doing this type of analysis, you're able to really see what's going on and which product page in particular has the biggest opportunity.
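For listeners who want to try this on their own data, here's a minimal sketch of that look-to-book and visibility analysis in Python with pandas. The column names and figures are illustrative assumptions; map them to whatever your analytics export actually provides.

```python
import pandas as pd

# Hypothetical per-product analytics export; swap in your real numbers.
data = pd.DataFrame({
    "product":    ["Blender A", "Blender B", "Blender C"],
    "page_views": [12000, 4000, 9000],
    "units_sold": [240, 200, 90],
    "revenue":    [24000, 20000, 9000],
})

# Look-to-book ratio: of the people who came to this product page,
# how many actually bought this item?
data["look_to_book"] = data["units_sold"] / data["page_views"]

# Visibility check: a page driving 30% of sales but seen by only 22%
# of users is under-exposed relative to its revenue contribution.
data["revenue_share"] = data["revenue"] / data["revenue"].sum()
data["view_share"] = data["page_views"] / data["page_views"].sum()
data["under_exposed"] = data["view_share"] < data["revenue_share"]

print(data[["product", "look_to_book", "revenue_share",
            "view_share", "under_exposed"]])
```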
AT: I love the laser focus. I have to admit, when I reached out to you on LinkedIn and we talked about doing this, you said, well, I only talk about one topic, and that's improving the conversion rate on product pages. I can see that comes through in how you identify insights as well, because you're so clear in mind about what you're trying to achieve. Is that, I guess, what you'd recommend to teams as well: to stay very focused on what they're trying to move?
RR: That's exactly right.
AT: Because it can get overwhelming, right? When you work in marketing and you're juggling a lot of things, you're sometimes wearing many hats, and having that ability to really dial it in isn't easy.
RR: I think this is one of the key things I would underscore again: there are 10,000 things a Marketer can do, and the marketing stack is overwhelming. I think we need to move away from efficiency and focus in on efficacy. Really ask yourself: which two or three items, if I improved them by 20%, would have the biggest impact overall? And just zero in on those things. I once worked for a client on just one page for two and a half years, and it drove them crazy. They were like, 'listen, we are a $400 million business, we have so many opportunities'. And I told them, well, this page drives 30% of your overall revenue, so I want to continue focusing on this page. It's really important to have that laser focus, to stick to it, and not get distracted by all the shiny objects in the marketing universe.
AT: That's amazing – two and a half years. Can I ask how many tests you ran in that period?
RR: We ran at least 50 statistically valid A/B tests.
AT: Okay, so I think we've covered how teams can identify that they're potentially suffering from this. Let's move into the next section:
What planning steps are required?
RR: When it comes to A/B testing, one of the limiting factors is obviously statistical significance. We can only run an A/B test on a page with relatively high transaction velocity. If there isn't any, you can't run a meaningful A/B test; in that case, you'd have to shift your focus to qualitative analysis and qualitative solutions, not quantitative ones. But I think the place to start is to identify pages, or parts of your funnel, where there is enough quantitative data to actually run a statistical A/B test. So look through your funnel to see what is influencing transactions, and also pay attention to the parts of the funnel where people are exiting.
If people are exiting at a point, what it really signals to us as Marketers is that we made a promise to them, which brought them to the page. We showed them some message, but it didn't resonate with them, which is why they exited. So we want to stop that leakage.
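As a rough guide to the "enough transaction velocity" point, here's a standard two-proportion power calculation using only the Python standard library. The 2% baseline look-to-book ratio is an assumed figure for illustration.

```python
from statistics import NormalDist

def visitors_needed_per_arm(base_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors per variant to detect a relative lift
    with a standard two-proportion z-test."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

# An assumed 2% look-to-book ratio and the 20% lift discussed here:
n = visitors_needed_per_arm(0.02, 0.20)
print(f"~{n:,.0f} visitors per arm")  # roughly 21,000 per variant
```

If the page can't supply that kind of traffic within the test window, that's the signal to shift to qualitative analysis, as Rishi suggests.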
AT: It kind of touches back on identifying if you're also a good fit, right? You need enough traffic to get stat. sig., and I love that you're balancing the qualitative and quantitative. So I'm assuming the product pages you come across mostly qualify?
RR: So that's the thing. In our case, we have a threshold that the product page needs to be driving at least $300,000 in annual sales for us to even touch it. If it's anything lower than $300,000, we don't even do the qualitative analysis, because our speciality is in getting statistical winners. So our criterion is $300,000 in annual sales. The reason revenue matters so much is that if we based it on transactions alone, you could have a product page with, say, a hundred thousand dollars in sales that has the right number of transactions because the average order value is pretty low. But a 20% lift on a hundred thousand dollars of revenue is just not high enough for us.
We want at least a $60,000 incremental increase in top-line revenue for it to be economically viable. That's why we have a $300,000 threshold.
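The economics behind that floor are simple enough to sanity-check in a few lines:

```python
# Back-of-the-envelope check of the $300,000 threshold: a 20% lift only
# clears a $60,000 incremental-revenue bar if baseline sales are big enough.
def incremental_revenue(annual_sales, lift=0.20):
    return annual_sales * lift

print(incremental_revenue(100_000))  # 20000.0 -- below the bar
print(incremental_revenue(300_000))  # 60000.0 -- meets the stated floor
```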
AT: Okay, that makes sense. So you're really looking at the gain or output they're going to get from a revenue perspective. You're not just optimising for the sake of it, chasing small gains.
RR: Exactly.
AT: Because your playing ground is very much e-commerce, you're getting that revenue attribution. So if, say, a team is thinking about going down this path, you really want them to focus on revenue rather than just improving some of the more vanity metrics.
RR: That's a really good point. I've worked with certain sophisticated clients that actually have attribution sophisticated enough to put a dollar amount on an email signup. So we know, for example, that an email signup is worth $17. If it's that quantifiable and that defensible, then it's okay to optimise for that goal. But if it's 'we want more email signups, we don't really know what an email signup is worth, but we want more of them', then I would steer away from that opportunity, because the great thing about the product page is that I am directly tracing it to revenue, and it's defensible.
AT: I love it. So are there any other planning tips you would advise, or on how teams should get together, get buy-in, or document what they're about to do?
RR: Yeah, I think one of the most effective things you can do is look at historical trend lines. As a Data Analyst, you are trying to paint a picture for the management team, and they want to see a full narrative arc. Sometimes we go to CEOs and Managers and say, here's a point of friction I found, or here's an opportunity I'm seeing. Out of context, that's not a compelling enough story. But if you draw a historical arc and say, hey, I've been looking at the data over the last three years and I'm seeing a secular trend where our efficacy on the product page is going down, then you can project the impact of that, and it gets them to take notice and shocks them into taking action.
AT: Joining dots and telling a story. So you mentioned qualitative. Do you like to complement that with the quantitative? Do you work them hand in hand, or do you like to have a certain amount of qualitative? Because I know you say you use buyer psychology. Do you treat that carte blanche, or do you carry out some interviews? How do you like to get close to the buyer?
RR: I want to tread carefully on this answer, because it's a very controversial topic, and I'm not pontificating here.
I'm just speaking from my own biased perspective. I'm not a big fan of research at all. I don't like research: it's extremely expensive, extremely time-consuming, and extremely prone to bias and errors. So unless there's a proper setup for voice-of-customer research or jobs-to-be-done research, I'm not involved in it. I really like the idea of quantitative analysis, and one thing I would mention is that you can figure out a lot about the buyer by simply thinking about the buyer.
I like to meditate on the buyer and the buyer journey, and it's really interesting how you can get these beautiful insights without having to interview hundreds of users. There are certainly certain types of insights, especially contrarian insights, that you can pick up through interviews. So I'm not saying research isn't powerful; I'm just suggesting it's very time-consuming and expensive, and there's a lot you can get to by simply meditating on the problem. Ask yourself: why would a consumer want to buy this product? What are some of the ways we are not providing them the information and the clarity they need to buy it? We take a printout of each product page we're working on, stick it on the wall, and keep staring at it passively all day long. It leads to these beautiful insights that we would never have had if we simply said, I'm going to spend an hour looking at a product page and try to get as many ideas as I can. Again, a lot of great insights come from ideas floating around in the background.
AT: That's really cool. I like printing it out and putting it on the wall, making it tangible. I think a lot of teams should do that: keep it front of mind and swirling, as you say. Awesome, Rishi, this is great. So now you've worked through those first two phases.
You've identified that you have a problem, and you're starting to work up a bit of a plan and a solution. You've got this promise of improving conversion rate by 20% in 90 days. How do you make sure you get to that point and achieve it? And I guess I'd also love to hear what can go wrong through that period.
What to keep in mind to ensure you achieve the outcome.
RR: This is a very interesting question. It's like the classic war strategy where you land on the shore where the battle will be fought and then burn the ships, because now the soldiers know they have no choice but to win the fight. By drawing a line in the sand and saying we will give you a 20% improvement in 90 days, the timer starts the moment the client engages us, so we remove all the noise. That kind of focus and clarity, and that kind of pressure, is actually what helps us get good results. What I find is that work expands to fill the time.
So if we were to go to clients and say we'll take six months to get to a 40% improvement, we'd certainly end up taking six months, but we probably wouldn't get to a 40% improvement. By keeping it very tight and saying, look, we have 90 days and we're going to give you a 20% lift, it's a really beautiful number: three months is enough time that it's not a one-shot scenario, but at the same time it's short enough that the time runs out. So I really like that focus. In our case, because we are already focused on product pages, and more specifically on just one product page, the best-seller page, we've already narrowed the scope long before we even sign on the dotted line.
AT: I'm also looking at it from the lens of the built-up knowledge you obviously have when you're on the tools and working on this directly. For teams who are trying to do some of this themselves, what can go wrong with their testing program or with trying to improve the conversion rate of their product pages?
RR: I think one of the golden rules of A/B testing, and this is of great value, is that you want to increase the contrast between the control, which is what you have, and the idea you are testing. Oftentimes I find that teams don't incorporate enough contrast.
So the variant is better, but only slightly better. This can lead to all kinds of statistical problems. Number one, the statistical tooling is fine-tuned to pick up on big differences. If what you're designing is an incremental improvement, the tool has a very hard time picking up on it, which means it takes much, much longer to know for certain whether the change was statistically significant or not.
The other thing teams struggle with, if they're concerned about bringing in too much contrast, is how to sell it to upper management. I like to tell upper management that we are intentionally bringing contrast because we want to get to a winner faster.
We're increasing the contrast specifically to get to an outcome. So one of the mistakes teams make is this: to get political buy-in from senior management we come up with bigger improvements, but then we get pushback from management saying this seems too risky.
I think we shouldn't fall into that trap, because if you do and you reduce the contrast, you're actually increasing the likelihood that you will not get a statistical outcome, which means you'd basically get a null result, and you can't do anything with that.
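To see why low contrast stalls a test, here's the same style of power calculation applied across effect sizes. The 2% baseline conversion rate is an assumption for illustration; required traffic grows roughly with the inverse square of the lift you're trying to detect.

```python
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.8):
    # Two-proportion z-test sample size, per variant.
    z = NormalDist().inv_cdf
    return ((z(1 - alpha / 2) + z(power)) ** 2
            * (p1 * (1 - p1) + p2 * (1 - p2)) / (p2 - p1) ** 2)

base = 0.02  # assumed baseline conversion rate
for lift in (0.05, 0.10, 0.20, 0.40):
    n = n_per_arm(base, base * (1 + lift))
    print(f"{lift:.0%} lift -> ~{n:,.0f} visitors per arm")
# A 5% tweak needs roughly 15x the traffic of a 20% swing, which is why
# bolder, higher-contrast variants reach significance so much sooner.
```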
AT: Interesting, so it's inconclusive and all is lost, a waste of time. So think carefully about the tests and improvements you make to ensure they have enough contrast. Do you use language like 'bets' or anything like that when you're working on these?
RR: We, we don't use the word bets, but we call them attempts.
AT: I imagine, from your experience, there would be a whole bunch of mistakes teams make that are obvious to you, whether through not enough experience or not enough exposure to the proper techniques. Is there one common mistake teams make with their product pages, or with the tests they try to run, that you wouldn't recommend?
RR: I would like to talk about something that teams don't do on product pages. I think this is going to be a very valuable lesson. This, I think, is a billion-dollar idea. I'll do a thought experiment with your listeners.
Think of any product page that you've been to on a large website; it doesn't have to be a super small website, just a fairly successful, sophisticated, large website you've visited. Now ask yourself: if your friend went to that same page, or if some random person came from Facebook, would the product description change at all? The answer is there's a 0% chance of it being different. One of the golden rules of product pages is that they are cast in stone. They are exactly the same; the description doesn't change. This to me is such a ludicrous idea, and it makes no sense, because if you really think about it, the awareness level of the person you're trying to persuade on a product page varies.
If you are selling a blender, there are people coming to this blender page who have been researching blenders for the last six months. There are people coming to this blender page who've done no research at all. There are people coming to this blender page who already have a blender at home; they don't like that blender and want to replace it, but they need a good, compelling reason to do so. Yet for these three sets of people who have come to this page, the description is identical, and I think that's a really wasted opportunity. It'd be wonderful to have some kind of conversational style, finding out from the buyer:
how long have you been struggling with this problem? Or, do you have a blender? And then personalising the copy to match those responses. That to me is the ultimate, and I've never found it on product pages except for the pages I create for my clients, but I think it's such a massive opportunity, because we're asking the wrong question.
We're saying: what is the perfect product page? That's the wrong question. The question is: what are the perfect product pages? There's a different perfect product page for different types of buyers, and you need to figure out a way to create that. Even though it's the same product, you're creating a slightly different story depending on the buyer.
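One minimal way to sketch that "perfect product pages" idea: key the copy off a single qualifying answer instead of serving one static description. The question, answer keys, and copy below are all hypothetical placeholders, not Rishi's actual implementation.

```python
# Hypothetical copy variants keyed to the buyer's awareness level.
AWARENESS_COPY = {
    "researching": "You've compared specs for months. Here's where this blender wins...",
    "first_look":  "New to high-speed blending? The three things that actually matter...",
    "has_blender": "Already own a blender you don't love? Here's what it can't do...",
}

def description_for(answer: str) -> str:
    """Return product copy matched to a self-reported awareness level."""
    return AWARENESS_COPY.get(answer, AWARENESS_COPY["first_look"])

print(description_for("has_blender"))
```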
AT: I'm sure that's got a lot of teams thinking about how they're approaching their product pages: not as static, but as dynamic, as you say, to the buyer and their journey. Awesome. Rishi, is there anything else you wanted to add on how teams can achieve this, to push them over the line and get them either (a) going down their own optimisation journey or (b) reaching out to your team?
RR: So another thing I would mention, and this is the final thing I'll share, is to pay close attention to the device type. Oftentimes when we design product pages, look at product pages, or have conversations about product pages, we're looking at the desktop version of the page.
If you look at your data, you might actually find that 85% of the people buying from you, or who have seen that page, are seeing it on their mobile devices. If you're not doing the same, you need to walk in the shoes of your potential buyers. So stop looking only at the desktop version of the page. Make it a discipline, put a reminder for yourself: every week, I'm going to spend 20 minutes just looking at the mobile version of my page and reading the description over and over again. Oftentimes we try to come up with great ideas in a snapshot, but as we mentioned earlier, a lot of times ideas permeate with time. So just read the description over and over again. If you read the description 20 times, I guarantee you'll have at least seven incredible ideas that will make you more money.
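A quick way to check whether that device split applies to your own page; the data and column names here are illustrative assumptions.

```python
import pandas as pd

# Hypothetical orders attributed to one product page.
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4, 5, 6, 7],
    "device":   ["mobile", "mobile", "desktop", "mobile",
                 "mobile", "tablet", "mobile"],
})

# Share of purchases by device; if mobile dominates, review the mobile
# rendering of the page on a regular schedule.
print(orders["device"].value_counts(normalize=True))
```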
AT: So just keep staring at it, don't rush it. These are very important pages, as you pointed out, and a lot of teams would obviously appreciate that. So give yourself the time, is what I'm hearing, and be mindful of the device. Are there any other segments you see teams missing, or that they should be aware of?
RR: Yeah. So one of the important segments is paid traffic, like Google display ads or Google product listing ads. When you do a search for running shoes, the first set of results that show up are actually ads that link you directly to the product page. Now, someone who comes directly to this product page through a paid search ad is fundamentally different from someone who navigated to it from one of the inner pages. Be aware of these distinctions, and then personalise the story for those distinctions as well.
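One possible sketch of that distinction: route a paid-ad arrival to a fuller, colder-traffic story than an internal-navigation arrival. The UTM convention, variant names, and domain below are illustrative assumptions, not a prescribed setup.

```python
from urllib.parse import urlparse, parse_qs

def story_variant(landing_url: str, referrer: str) -> str:
    """Pick a product-story variant from how the visitor arrived."""
    params = parse_qs(urlparse(landing_url).query)
    if params.get("utm_medium") == ["cpc"]:
        return "cold_paid"      # landed cold from a shopping/search ad
    if referrer and urlparse(referrer).netloc == "www.example-store.com":
        return "warm_internal"  # navigated here from an inner page
    return "default"

print(story_variant("https://example-store.com/blender?utm_medium=cpc", ""))
# -> cold_paid
```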
AT: Rishi, it is Munch & Learn, so it's fast paced.
So we have run out of time today. I wanted to thank you for your time, first and foremost; we really appreciate it. I'm sure the listeners will have taken a lot from that. If they want to follow along, they can check out your newsletter in the show notes, and we've also invited you into the Focus Groups community, Rishi, so if any of the members want to reach out to you, I'm sure they can.
But thank you again for your time, and hopefully we get to chat again soon.
RR: Andy, thanks a lot. That was wonderful. Thanks for your time.