Traffic Source: HackerNews, link to landing page with demo video targeted at early adopters

Result: Pass. Built list of around 5,000 interested prospects. The problem description resonated strongly with their targeted early adopter audience.

Step Back: Moving to a video format helped get across the product’s story more effectively, making it accessible to everyone. In and of itself, an explainer video is just a change in format, similar to how a bestselling book becomes a candidate for a blockbuster movie. Ultimately, a landing page that tests demand must get the story across well. The medium for telling that story is secondary to the relevance and quality of the story for the target audience.

As Dropbox’s initial target audience was quite technical, this explanation mapped to tools and behaviors they already knew from software development, and they were more than happy to run with the write once, read anywhere concept. Arguably, this niche perceived other cloud storage as defective because it didn’t support this kind of functionality. As a result, they needed little convincing to hop on board.

Hypothesis: Paid search can be a profitable engine of growth for Dropbox

Test Type: Growth hypothesis

Success Criteria: Customer Value > Customer Acquisition Cost

Traffic Source: Paid search engine marketing

Dropbox experimented with using paid acquisition on a landing page. This is not an early-stage landing page MVP test. It’s an attempt to figure out what will grow the company, not whether the product idea is attractive. They hired an experienced search engine marketer, who went out and made landing pages. On those pages, they hid the free option, replacing it with a free time-limited trial.

[image: AdWords interface showing incorrect campaign conversion numbers; the difference between the numbers is computed here]


Total ad spend: approximately $3,000 in the image above


Result: Fail. As their cost of acquisition at the time was at least $233 for a $99 product, the experiment to test paid acquisition as a profitable traffic source failed. Based on the economics of paid search, pay-per-click didn’t look like a viable growth strategy for Dropbox.
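To make the arithmetic behind that verdict concrete, here is a minimal sketch of the acquisition-cost check. The spend and customer counts are hypothetical, chosen only to echo the rough figures above:

```python
def cac(total_ad_spend, paying_customers):
    """Customer acquisition cost: total spend divided by customers won."""
    return total_ad_spend / paying_customers

# Hypothetical figures, roughly in the spirit of the numbers above:
# ~$3,000 of ad spend producing a dozen customers of a $99 product.
spend, customers, price = 3000.0, 12, 99.0

acquisition_cost = cac(spend, customers)
print(f"CAC: ${acquisition_cost:.2f}")            # CAC: $250.00
print(f"Profitable? {acquisition_cost < price}")  # Profitable? False
```

Whenever each customer costs more to acquire than they ever pay you, no amount of scale fixes the channel.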

Step back: Even though PPC as a source of traffic didn’t work for them, it solidified their confidence in their ability to retain customers.

If people bought, their subscription retention rate was over 75%. In short, they had a great product their community loved, and they had product market fit. In Drew Houston’s words, this meant that “product-market fit cures the many sins of management.” After this idea failed, Dropbox created a famous explainer video which went viral, thus proving that a viral engine of growth was better for them.

Hypothesis: A simple value proposition resonates more with the target audience than a complex one

Test Type: Value hypothesis

Success Criteria: Conversion rate > initial conversion rate

Traffic Source: Paid search engine marketing

[image: redesigned Dropbox landing page — the simpler, the better]

Result: Pass. Simple and concise converts better, as does having a clear call-to-action for the next step.

Step back: This test type is taken out of the traditional toolbox of conversion rate optimization (CRO).

The idea is clear to the founders. They want to communicate it as concisely and effectively as possible to their target audience. Even if they move to a different traffic source later (as Dropbox did), a clear and powerful value proposition ensures a high conversion rate for all further marketing efforts, including free traffic sources. Moving too early into this kind of testing can be a type of premature optimization.

The Takeaways

The key tactic Dropbox used was to test both market and technical viability simultaneously. In addition, they ran a number of smaller tests, each checking a much smaller piece of the bigger puzzle. This required them to break down the overall vision into discrete tests which they built and ran.

Build -> Measure -> Learn

By running a series of experiments, Dropbox stayed with the ethos of MVP=experiment. Each cycle around the Build, Measure, Learn loop gave them greater insight. Each step they took tested something new about their target market and their product.

As a result, the product evolved very quickly, because the team gathered actionable yet counter-intuitive data. This helped them build a strong USP (unique selling proposition) in a crowded marketplace, using technology that was theoretically possible but unproven.

There is a lot more to a minimum viable product than just a beta software release. This post aims to make that clearer.


Empty pocket testing by Buffer

[image: Buffer — publish tweets or social media posts on a pre-defined schedule]

The following are a number of Lean Startup validation case studies. Some will already be well known; some will be completely unknown. A lot of landing page testing has happened since The Lean Startup was being pieced together by Eric Ries. These are retrospective reconstructions of what happened, using landing pages as vehicles for minimum viable products.

For example, Buffer did empty pocket testing with a landing page before building their product. Just for your context, Buffer is a social media sharing tool, allowing you to publish tweets or social media posts on a pre-defined schedule.

While you may have heard of some of these lean landing page case studies before, there is a lot of nuance in exactly what each test actually tested. They are typically not “traditional” A/B split tests, where a new variation of an ad or landing page is tested against the old one.

To make this more explicit, I’ve reformatted the experiments to be lightweight. Lean Startup experiments are generally not about testing the landing page or the product, but the business ideas they represent.

Hypothesis: The target audience wants this product

Test Type: Value hypothesis, confirm the problem exists and people want a “hands off” way to tweet

Success Criteria: Emails gathered > 0

Traffic Source: Social media, word of mouth


Result: Pass. A few people used it to give founder Joel Gascoigne their email. He used these to get some useful feedback and initiate a conversation with prospects.

Step back: Potential users had left their email address at a random web page promising help with this particular problem. This meant the idea itself was valuable, and there was potentially unmet demand for it. I would be careful about using only the number of emails gathered as the primary metric in all cases.

For a consumer-facing product, this is probably good enough, assuming you have enough traffic. It would be better to also include some kind of target number of sessions, to make sure you have enough “attempts to convert” for your metric to be meaningful.
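As a rough sketch of what such a session target could look like, here is the standard normal-approximation sample size for estimating a proportion. The 5% expected rate and ±2-point margin below are purely illustrative assumptions:

```python
import math

def sessions_needed(expected_rate, margin, z=1.96):
    """Sessions required to estimate a conversion rate to within +/- margin
    at ~95% confidence (normal approximation to the binomial)."""
    return math.ceil(z**2 * expected_rate * (1 - expected_rate) / margin**2)

# Illustrative: expecting ~5% conversion, wanting it pinned down to +/- 2 points.
print(sessions_needed(0.05, 0.02))  # 457
```

The point is not the exact formula; it is that a handful of emails from a handful of visits tells you very little, while the same count out of several hundred sessions starts to mean something.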

Hypothesis: The target audience is willing to pay for the product

Test Type: Value hypothesis, confirm declared willingness to pay for “a way to automate their tweeting”

Success Criteria: People would click through the additional pricing page, and still leave their email.

Traffic Source: Social media, word of mouth

[image: pricing packages on the Buffer MVP landing page]


Result: Pass. People were still clicking through this additional step. Joel was able to gather useful information about which suggested pricing plans resonated, helping him figure out pricing.

Step back: Potential users weren’t put off by the blatant pitch, and still kept leaving an email address at the far end. What Joel hadn’t tested was whether people would actually buy; however, he was able to complete a functional prototype within seven weeks, and tested this hypothesis with a functional system. He actually got his first paying customer 4 days after the “rough-around-the-edges” product launch.

I just wanted to thank Joel for contributing this fantastic case study to the Lean Startup community. It’s quite a well known one. As a result, I really wanted to cover it as an example of a line of thinking that’s worth following.

Empty Pocket Testing

At the core, the Buffer landing page MVP test was meant to address a major question for founders: will “they” buy it, if I create it? Before getting caught up in theoretical debates about what is and isn’t an MVP, Buffer just did an experiment. It just happened to be using a landing page to address a major risk factor for a new product business.

In this particular case, checking for whether early adopters

  1. had the budget and
  2. were willing to spend it

de-risked spending more time and money on the solution significantly. Even though the question was hypothetical, it helped validate their sense that their target early adopters would be willing to pay.

[image: people are willing to pay out of their own pockets for this tool]

[image:Dan Moyle]

If you’d like to see a number of case studies like the above, grab Launch Tomorrow. I’m updating it in an upcoming version with a lot of in-depth experiments that have been run.

Lean Startup 101

[image: focused learning is the fastest way to validate a product idea]

Lean Startup is based on the scientific process, albeit more business focused, centered on learning fast when you know little. It’s the fastest known way to validate a product idea. Consider this a Lean Startup 101 introduction to using it in your business.

The first step of choosing a hypothesis to test is to map out your business with canvases. Business Model Canvas for the big picture, and any of the following three to figure out problem-solution fit:

After you’ve settled on a vision you’re happy to start testing, you prioritize what will give you the strongest boost in confidence. You try to get data confirming it’s true.

Let’s say you’ve now settled on a specific assumption you’d like to test. This means you need to map a fuzzy concept down to one metric.

Choose a Metric

Choosing this one metric is the next step (and one which is largely driven by intuition).

Which number represents what you want to change? Or captures what you want to monitor? This requires some analysis, but also some subjective skill.

For example, do you want to increase sales or profits? There aren’t really clear answers, because it depends on where you are and what you want to achieve. A venture-funded startup interested in growth at all costs is happy to be unprofitable, as long as its growth is up and to the right. A bootstrapper will be monitoring cash flows and profits like a hawk.

If you are just exploring an idea and a new market, gather data to confirm that the external environment matches your assumptions. For example, one startup I worked with was testing out an idea about lending and borrowing DIY tools to strangers. They wanted to build a peer-to-peer lending platform for these tools. They found that consumers were much happier to lend out tools to strangers than to ask to borrow them.

Once you have problem-solution fit, focus on metrics that you can influence with your actions. Choose one that’s actionable. Dave McClure’s pirate metrics are useful here as a high-level starting point: Acquisition, Activation, Retention, Revenue, Referral.
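A quick sketch of pirate metrics as a funnel; every count below is made up, purely to show how step-to-step conversion rates fall out of the stage counts:

```python
# Hypothetical AARRR stage counts for one cohort of visitors.
funnel = {
    "Acquisition": 10_000,  # landed on the site
    "Activation": 2_500,    # signed up and hit the "aha" moment
    "Retention": 1_000,     # came back within 30 days
    "Revenue": 150,         # paid
    "Referral": 45,         # invited someone else
}

stages = list(funnel.items())
rates = {f"{a} -> {b}": nb / na for (a, na), (b, nb) in zip(stages, stages[1:])}
for step, rate in rates.items():
    print(f"{step}: {rate:.1%}")
```

An actionable metric is one of these rates you can move with a specific change, rather than a vanity total like cumulative signups.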

Choose a success signpost

For this particular metric, figure out what “success” means to you, before you run the experiment.

This is a common point where founders trip up. What’s “meaningful”? Conversion rate hurdles are a good example. If I choose a conversion rate that’s too high, and the experiment doesn’t pass, I’ll have to give up on my idea. If I set it too low, I’ll be building a business around a bad idea and end up with a zombie product. In this case, a meaningful conversion rate is one which gives you a customer acquisition cost which is less than the expected long term value of a customer.
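Under that CAC < LTV framing, the hurdle can be derived rather than guessed. A minimal sketch, with hypothetical cost-per-click and lifetime-value numbers:

```python
def breakeven_conversion_rate(cost_per_click, customer_ltv):
    """For paid traffic, CAC = cost_per_click / conversion_rate,
    so CAC < LTV exactly when conversion_rate > cost_per_click / LTV."""
    return cost_per_click / customer_ltv

# Hypothetical: $2.00 per click, $99 lifetime value per customer.
hurdle = breakeven_conversion_rate(2.00, 99.0)
print(f"Meaningful conversion rate: anything above {hurdle:.1%}")  # ...above 2.0%
```

Setting the signpost this way ties the pass/fail line to the economics of the business instead of to wishful thinking.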

Here’s an analogous situation I’ve seen offline. One team I worked with was gung-ho about an idea for a fitness app. They wanted 75% of the people they spoke with to confirm that the problem they wanted to solve existed. After speaking with 20 people, it turned out 30% confirmed that they had this problem. In this particular case, they’d been too aggressive about setting the success criterion, as they could still build a good business around 30% of a consumer market.

Choosing a success metric is also very much an art, one that requires a good sense of context and an awareness of other data driving the success of your startup. A great way to do this, by the way, is using the Grasshopper Herder experiment template. This is also available as a bonus for Launch Tomorrow buyers.

Intuition+imagination: the secret sauce

So even the scientific process requires intuition, to choose and formulate hypotheses with practical implications. What’s more, worthwhile hypotheses come from a vision based on what’s possible for you and your founding team.


The 80/20 of Lean Startup

[image: 80% of the results from 20% of the inputs]

Remember the 80/20 rule?

You get 80% of the results from 20% of the inputs. Well, in my opinion, 80% of the results from using Lean Startup come from running experiments, meticulously.

  1. First, focus around the product; prove your value hypotheses.
  2. Then, focus on growth; prove your growth hypotheses.

If you aren’t running experiments, you’re not learning. You’re doing what Ashley Aitken from Australia calls “Faux Lean Startup”. For example, think back to how tech startups launched in the dotcom days.

| Old Traditional Way | Faux Lean Startup |
| --- | --- |
| Business Plan | Lean Canvas |
| Market Research | Validate solution |
| Waterfall Project Management | “Agile” |
| Alpha -> Beta -> v1.0 | Alpha -> MVP -> v1.0 |
| Private Beta testing | Early Adopters |
| Launch Product | Launch Product |
| Get market feedback | Get market feedback |

Faux Lean Startup means you replaced some of the dotcom era tools with “lean approved” tools. If so, you’re missing the point of lean startup. Lean Startup isn’t only about customer development, personas, and interviewing customers. UXers have been doing that for decades before Lean Startup existed.

In my mind, the classic example of this is using “MVP” as a synonym for a beta product. Beta software means that you still expect to have missed some bugs; your unknowns are technical. An MVP is a completely different animal than a beta product. It’s created to test, initially, one business assumption. With MVPs, you test unknowns on the business side, not technical ones.


It’s easy to fall back into the old pattern of avoiding what you don’t want to hear. It’s only human. You need to be deliberate about learning, in order to really get valuable feedback. A true lean startup implementation requires you to be running a long series of experiments (at least 1 per week). Minimize your own biases. Maximize learning when you know the least. See things how they are, based on data you gather from the market.

That’s the 80/20 of Lean Startup.