Small Business Marketing Blog

How to Validate Pricing for Your New Product in Less Than 1 Week

Written by Blinking Eights | Aug 15, 2022 9:34:59 PM

 

✏️ Test Design: 1 day

 🗓️ Test Length: 4 Days

 👨‍🔬 Analysis: 0.5 Days

 💪 Evidence Strength: Strong

 🛠️ Tools Used: HubSpot Email, Landing Pages

Pricing can be one of the hardest things to get right for a new product. Whether you're a startup trying to launch something new or you work at a bigger company releasing a new product, the question eventually becomes "how much do we charge for this?" There are many ways to figure this out, but I thought I would share one test I ran for Strategyzer when we needed to figure out how much people would pay for an online workshop. Covid had struck and, like everyone else, we were going virtual. We needed to transition our in-person Masterclasses, an immersive multi-day event, into an online-only event: a very different experience with a very different price point. In this blog I will walk through the thinking, planning, execution and evaluation we went through. I am going to change a few of the details (the pricing options, etc.), but other than that this is exactly how I did it. Since this test, Alex Osterwalder has been using it as an example in his Testing Business Ideas Masterclass.

Step 1: Brainstorm Ideas

You have to start somewhere, so we came up with 3 different options. We started with the lowest price we could sell at that would still cover our costs and keep us at least cash positive. This will be called Option A from now on; let's say it is $300. Option B, call it $500, would give us a healthy profit if we could sell the same number of tickets to a virtual class as we could to an in-person one. (We did have the hypothesis that we could sell more because it would be global and people wouldn't have to travel, but I digress.) Finally, with Option C we wanted to test the top of the market, so we priced it much higher than the other 2 options; for this example let's say it was $900. To get to these prices we did some quick napkin math and went with our gut feeling that these were reasonable prices that reflected the value customers would get. The conversation took maybe less than 20 minutes in one of our recurring test planning meetings and we were ready to jump on it.
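To make that napkin math concrete, here is a minimal sketch of the kind of break-even arithmetic involved. The cost and seat numbers below are made-up placeholders, not Strategyzer's actual figures; only the $300 / $500 / $900 example prices come from the story above.

```python
# Rough break-even napkin math for the three example ticket prices.
# All cost and seat figures are hypothetical placeholders.
FIXED_COSTS = 6000      # assumed production cost per online Masterclass
VARIABLE_COST = 50      # assumed per-participant cost (platform fees, materials)
EXPECTED_SEATS = 30     # assumed ticket sales, roughly matching an in-person class

for name, price in [("Option A", 300), ("Option B", 500), ("Option C", 900)]:
    revenue = price * EXPECTED_SEATS
    costs = FIXED_COSTS + VARIABLE_COST * EXPECTED_SEATS
    margin = revenue - costs
    breakeven_seats = FIXED_COSTS / (price - VARIABLE_COST)
    print(f"{name} (${price}): margin ${margin:,} at {EXPECTED_SEATS} seats, "
          f"break-even at {breakeven_seats:.1f} seats")
```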

Step 2: Design the experiment

Because the Masterclass was going to be online, and so was the entire customer journey, it was pretty easy to set up a few steps that would really validate our assumptions and show us high intent from customers. The most important thing I have learned about pricing products is that if you want to KNOW how much people will pay, you have to create an experience where they think they are actually paying. Asking what they would pay, or running ads with different price points and measuring clicks, is okay for getting direction, but it doesn't show actual intent. Those types of tests might be a good place to start if you have NO idea how much to charge. However, to actually know whether people will pay for it, you need to get them to click something they think is a purchase.

This topic can often spark some debate, and for good reason. "What if we piss off customers who think they are buying something that doesn't exist yet?" is a completely valid question. I have run quite a few Wizard of Oz type tests over the years and haven't seen much backlash, if any, because I do two simple things. Number one, be honest with customers after they click to buy something. I always create a landing page that explains that this is a test and thanks them for their feedback. And number two, offer them a discount for their troubles, plus an opt-in so you can connect with them when the product is live. That's it. Pretty simple. Usually people are willing to wait a bit to get access at a discount, and they are actually kind of excited to go behind the curtain of products and companies they like.

Because we wanted to measure real intent, we needed to create a journey that seemed as real as possible for our customers. We didn't want to just run ads or measure email clicks, because we needed an exchange, that "yeah, I am buying this right now" moment. I created a 3-step process for customers to go through: an email to open, a link in that email to click, and a landing page with a "Sign Up Now" button. I created 3 identical emails, landing pages and "Thanks for participating, this was a test" pages, and changed only the price on each. We had the benefit of a good-sized list of customers and newsletter signups to email. If you don't have that luxury, you can run some quick Google or Facebook ads, like I did when testing book titles for Stefano Mastrogiacomo's book "High Impact Tools for Teams", which I will write about later.
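If it helps to picture the setup, here is a rough sketch of how the three variants could be described: everything identical except the price. The page slugs and email copy below are illustrative only; they are not the actual Strategyzer assets.

```python
# Three test variants that differ only in price.
# Slugs and copy are illustrative, not the real assets.
VARIANTS = {
    "A": {"price": 300, "landing_page": "/online-masterclass-a"},
    "B": {"price": 500, "landing_page": "/online-masterclass-b"},
    "C": {"price": 900, "landing_page": "/online-masterclass-c"},
}

EMAIL_BODY = (
    "The Masterclass is going online - join us for ${price}.\n"
    "Learn more and sign up: {landing_page}\n"
)

for label, variant in VARIANTS.items():
    print(f"--- Variant {label} ---")
    print(EMAIL_BODY.format(price=variant["price"],
                            landing_page=variant["landing_page"]))
```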

Creating a 3-step process meant I could measure a few different things (there's a quick sketch of these metrics right after the list).

 

  1. Interest in the idea: I could measure which emails were opened the most, Option A, B or C. This would show us whether people were interested in the online Masterclass at all and whether price was a deterrent right up front. 
  2. Desire to learn more: Looking at clicks in the emails would tell us whether people were intrigued enough by the idea to want to learn more. The email also had the first mention of the price, so we could see whether people were scared away by cost pretty early. 
  3. Intent: Webpage clicks, bounce rate, etc. The CTA on the page was very direct: "Sign Up Now", right beside the price. There was no mistaking what people would be intending to do.
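For anyone who wants to reproduce this, here is a minimal sketch of how those three metrics fall out of raw funnel counts. The example counts are mine, picked to roughly match the Option A rates reported later, and the choice of denominators is one reasonable convention; HubSpot reports these rates for you, but the formulas show what each step is measuring.

```python
# The three funnel metrics per variant, computed from raw counts.
# The example counts are made up; the formulas are the point.
def funnel_metrics(delivered, opened, email_clicks, page_views, signup_clicks):
    return {
        "open_rate": opened / delivered,            # 1. interest in the idea
        "email_ctr": email_clicks / delivered,      # 2. desire to learn more (price now visible)
        "signup_rate": signup_clicks / page_views,  # 3. intent: clicked "Sign Up Now" beside the price
    }

example = funnel_metrics(delivered=1000, opened=310, email_clicks=24,
                         page_views=24, signup_clicks=8)
for metric, value in example.items():
    print(f"{metric}: {value:.1%}")
```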

The assets took an afternoon to create in HubSpot, including making sure all the tracking was in place. We couldn't afford to run a test like this only to find out that our analytics were messed up and we didn't get good data. I sketched out the test in Miro and this is what it looked like. 

 



Step 3: Running the Experiment

So I had 3 emails, 3 landing pages and 3 thank-you pages. I also created thank-you emails for anyone who dropped their email in again on the thank-you page. We wanted to make sure we had the right audience, so we segmented a list in HubSpot and then created 3 totally random lists from it. We had no idea who would get which price point. We decided to let the test run for only 3-4 days because, realistically, if people don't open, click and buy something from an email within that time frame, they probably aren't going to. So we sent the emails out to the 3 groups and we waited. 
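HubSpot can do that random list split for you, but the idea is simple enough to show in a few lines. A minimal sketch, assuming your contacts are just a list of email addresses:

```python
import random

def split_into_three(contacts, seed=42):
    """Shuffle a contact list and cut it into three random, roughly equal groups."""
    shuffled = contacts[:]                 # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)
    third = len(shuffled) // 3
    return shuffled[:third], shuffled[third:2 * third], shuffled[2 * third:]

# Placeholder contacts for illustration
contacts = [f"subscriber{i}@example.com" for i in range(900)]
group_a, group_b, group_c = split_into_three(contacts)
print(len(group_a), len(group_b), len(group_c))  # 300 300 300
```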

Within 24 hours we had probably 90% of the data we were going to get back, and we started analyzing it. HubSpot has great built-in analytics for this kind of thing. You are able to see all the data for the campaign in one place and compare email and landing page performance in one view. I still like to take that data and visualize it in Miro to help tell the story. Seeing the journey mapped against the results instantly helps people understand what happened. 

Step 4: Analysis and decision making

So we had our data, and as usual the results were really interesting. Just to recap: Option A is the lowest price at $300, Option B is in the middle at $500, and Option C is the most expensive option at $900. First we looked at the email data. 

Email opens for the 3 versions were basically the same: A 31%, B 29% and C 30%. Nothing here gave us any indication on pricing, BUT it was good to see such high open rates, meaning people were interested in the online Masterclass. 

Pricing Test Example Template

 

Next we looked at email clicks. Surely people would be less likely to click on a higher price point, right? Wrong. Again, the clicks were basically the same: A 2.4%, B 3% and C 3%. It was slightly annoying to have everything so even through the first 2 steps, but we really wanted to see who would get to the landing page and actually try to buy a ticket. And THAT, my friends, is where the fun was. 

Pricing Test Example - Email Metrics

 

Wait, before you read on, what do you think happened?

You think the lowest price point would win, right? Don't lie, that's what you thought. 

That's what I thought too, but we were all wrong. While the lower two price points, A and B, came in pretty close at 32% and 28%, Option C DESTROYED them both with a 61% click-through to sign-ups. We were floored by the result and had to check it a bunch of times to make sure we weren't seeing things or that the tracking wasn't messed up somehow. In the end we determined that people were roughly 2x more likely to buy our product at the highest price point. Amazing, we are going to be rich! But wait, hold on.

Pricing Test Example - Clicks to Buy
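One sanity check worth doing on a surprising gap like this is a quick significance test. The post doesn't include the raw visitor counts, so the numbers below are hypothetical counts chosen only to roughly match the reported 61% vs 32% rates; the sketch shows a plain two-proportion z-test you could run on your own data.

```python
import math

def two_proportion_z_test(conversions_1, n_1, conversions_2, n_2):
    """Two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p1, p2 = conversions_1 / n_1, conversions_2 / n_2
    pooled = (conversions_1 + conversions_2) / (n_1 + n_2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_1 + 1 / n_2))
    z = (p1 - p2) / se
    p_value = 1 - math.erf(abs(z) / math.sqrt(2))  # two-sided, normal approximation
    return z, p_value

# Hypothetical visitor counts roughly matching Option C (~61%) vs Option A (~32%).
z, p = two_proportion_z_test(conversions_1=49, n_1=80, conversions_2=26, n_2=81)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

With sample sizes anywhere near that range, a gap of 61% vs 32% sits well beyond noise, which is the kind of check that makes a surprising result easier to trust.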

 

I presented the results to the rest of the team and it sparked a very interesting conversation. We ultimately decided that since we didn't know how good the product was going to turn out (we hadn't run a Masterclass online yet), we just didn't feel right going with the top-dollar price point. We were confident we could deliver a great product, but with all the variables we were facing in producing the Masterclass, we wanted to make sure participants would feel good about their experience and feel like they got great value. So we decided to go with the middle price point to start: enough to cover the costs and make a profit, but not so much that we risked under-delivering value for the customers. 

The total time to design, run and analyze this test was 5 days. 

If you are interested in more conversation about business testing and innovation, please join our LinkedIn Group and sign up for our Newsletter.