GoSite Operations Team: 6 Steps to Run Data-Driven Experiments

Experiments are the thrust that propels innovation. Computers, iPhones, Gmail, refrigerators, Post-it notes: none of these would exist today without experiments. If you're in operations, it is your duty (and privilege) to experiment constantly.

Jordan Sternberg
7 min read

As operators, we're the tinkerers of the business world. The problem solvers. And we love running experiments. But without a defined process for running and controlling these tests, ops leaders run the risk of facilitating experiments that result in zero validated learning.

(Learn more about validated learning when you read a GoSite team favorite—The Lean Startup by Eric Ries.)

In an experiment, validated learning is the gasoline that keeps the engine running. Experiments can succeed or fail, but as long as you learn something important, every experiment can be a success.

At companies with young leaders, it can be all too easy for decisions to get made without the help of data. This is a byproduct of what leaders call "following their heart" or "following their gut."

Although both of these can lead to strong business outcomes, they should be reserved as a last resort unless you're incredibly short on time. So as operations leaders, it's up to us to help our teams prove whether their gut feelings and their hearts are correct or incorrect.

Follow these steps to a T and you're sure to keep your experiment engine running.

1. Start with a Goal


In any experiment you need a goal.

What the heck are you trying to prove or disprove? What are you trying to do? This should be the foundation of your experiment. Goals can come from upper management, from within the team, or from natural business needs. Some examples:

  • Increase sales conversion
  • Increase sales velocity
  • Decrease churn
  • Increase customer NPS scores

All of these are great goals. And now we need to turn them into a hypothesis.

2. Build a Hypothesis

Now that you have your goal, you need to form a hypothesis. A hypothesis can be based on your gut, what the competition appears to be doing, previous experience, etc.

(Mission-critical tip: Just because something worked at your last company DOES NOT mean it will work at your current company.)

Some hypotheses could look like:

  • If we use a new call script, it will increase our sales conversion.
  • If we offer a new lower price, it will increase our sales velocity.
  • If we have sales reps perform the initial build, it will decrease churn.
  • If we create a new service level agreement (SLA) on our support team, it will increase customer net promoter score (NPS).

Learn more about how to assign acceptance criteria (metrics) to a hypothesis in step 4.

3. Find a Baseline


Generally speaking, the younger the company, the harder this is going to be. In my personal experience, founders spend a lot more time doing than tracking. This is out of necessity: early-stage startups are at their most vulnerable, and keeping the ship afloat takes precedence over good data practices.

When finding a data baseline for the specific hypothesis you're trying to prove or disprove, I generally recommend pulling at least six months of data. This data can often be extracted from your CRM (if you have one), from Excel files, billing systems, etc. This step may take some time for your first experiments because you may be doing a lot of data appending and Excel work to get the data to reflect a meaningful baseline. However, the time and effort will pay dividends in the future because it will enable you and the rest of your team to pull data more readily and develop better data practices along the way.

If you're trying to affect sales conversion, you'll need at least the last six months of sales conversions to give yourself a reasonable baseline before you run your test. Similarly, if you're trying to increase sales velocity, you'll need the last six months or more of sales velocity.
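
To make this concrete, here's a minimal sketch of what pulling a baseline could look like in Python with pandas, assuming a hypothetical CRM export named crm_export.csv with created_date and stage columns (your own file and column names will differ):

```python
# A minimal sketch of pulling a six-month conversion baseline from a CRM export.
# The file name and column names (created_date, stage) are hypothetical; swap in
# whatever your CRM or billing system actually exports.
import pandas as pd

# Load the raw opportunity data exported from your CRM.
deals = pd.read_csv("crm_export.csv", parse_dates=["created_date"])

# Keep only the trailing six months of opportunities.
cutoff = deals["created_date"].max() - pd.DateOffset(months=6)
recent = deals[deals["created_date"] >= cutoff]

# Baseline conversion rate: closed-won deals divided by all deals in the window.
baseline_conversion = (recent["stage"] == "closed_won").mean()
print(f"6-month baseline conversion: {baseline_conversion:.1%}")
```

However you compute it, write the baseline number down. It's the yardstick every later comparison depends on.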

4. Know Your Acceptance Criteria

All too often, I've seen companies run experiments without being able to answer this question: What does success look like?

If you can't point to exactly what metric—whether qualitative or quantitative—that you're trying to change, you don't stand a chance of providing any validated learning.

With your baseline in hand, you can work with stakeholders to build lofty yet achievable targets for your experiment. Acceptance criteria will be your guiding light in determining whether your experiment was successful...or not.

Here are some examples of good acceptance criteria:

  • Increase sales conversion 15%.
  • Increase sales velocity 20%.
  • Decrease churn 50%.
  • Increase NPS scores 60%.

By definition, acceptance criteria should be tied to a specific metric. Ideally these will be quantitative, but you should make qualitative considerations here as well. If you increase sales conversion by 10% but the entire sales team hates your new process, then you haven't hit a legitimate acceptance criterion.

You don't necessarily need to spell out every negative externality when documenting your experiment, but they should be factored into the equation. For example, if sales conversion increases 10%, but churn increases 40%, then your experiment was not successful.
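
As an illustration, here's a minimal sketch of an acceptance-criteria check in Python. Every number is made up for the example: the baseline figures would come from step 3, the experiment figures from your test group, and the churn guard reflects the externality check described above.

```python
# A minimal sketch of an acceptance-criteria check, using illustrative numbers.
# baseline_* values would come from your step 3 baseline work, experiment_*
# values from your test group; the 15% target mirrors the example criteria above.

baseline_conversion = 0.20      # 20% of deals closed-won over the last six months
experiment_conversion = 0.24    # conversion observed in the test group

baseline_churn = 0.05           # monthly churn before the experiment
experiment_churn = 0.05         # monthly churn during the experiment

target_lift = 0.15              # acceptance criterion: +15% relative lift

# Relative lift over the baseline.
lift = (experiment_conversion - baseline_conversion) / baseline_conversion

hit_target = lift >= target_lift
no_churn_regression = experiment_churn <= baseline_churn

print(f"Conversion lift: {lift:.1%} (target {target_lift:.0%})")
print("Experiment accepted" if hit_target and no_churn_regression
      else "Experiment not accepted -- document what you learned")
```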

5. Begin Your Experiment


This may sound odd, but many companies find this step difficult. Whether managers are concerned about negative externalities from the new experiment, or the company is simply unaccustomed to running experiments, getting things off the ground is often a struggle.

Here are some tips to get past these problems:

  • Test by committee. The more managers, operators, and c-suite members involved with the new test, the easier it will be to make sure a test gets run.
  • Have extensive documentation. Oftentimes, managers may feel their concerns aren't being voiced. Documentation is a great way to help their concerns feel validated and offer safety nets to help prevent negative externalities.
  • Start small. You don't have to go crazy on your first test. Try it with 5-10% of your team to start, and if things are still looking good, increase your test group (see the sketch after this list).
  • Always relate findings back to your goals. The whole company should be aligned around the goals of your experiment before you run it. Reiterating these can help move tests along.
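
For the "start small" tip, here's a minimal sketch of carving out a roughly 10% test group at random. The rep names are placeholders, and the fixed seed simply keeps the assignment stable for the life of the experiment.

```python
# A minimal sketch of splitting off a small test group, assuming you have a
# simple list of rep names. A fixed random seed keeps the assignment
# reproducible so the same reps stay in the test cohort throughout the test.
import random

reps = ["Avery", "Blake", "Casey", "Devon", "Emerson",
        "Finley", "Harper", "Jordan", "Kai", "Logan"]  # placeholder roster

random.seed(42)                                  # reproducible assignment
test_size = max(1, round(len(reps) * 0.10))      # start with roughly 10% of the team
test_group = random.sample(reps, test_size)
control_group = [rep for rep in reps if rep not in test_group]

print("Test group:", test_group)
print("Control group:", control_group)
```

Random assignment keeps the comparison honest; hand-picking your strongest reps for the test group will flatter the results.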

6. Don’t Drop the Ball

Now the work really begins. Getting the data, convincing management, and starting your experiment are the easy parts. Now you have to help train your managers and reps to ensure the experiment is communicated and rolled out effectively.

If you're offering a new price point or package you'll need to work with marketing and sales to prepare the necessary collateral. You'll need to work with coaches and managers to help sell the new prices/packages. You'll need to meet with reps regularly to explain new procedures and communicate any changes that may affect the new process.

During this phase ongoing communication is essential. Schedule semiweekly (twice a week) meetings with relevant stakeholders to be sure everything is running smoothly.

Above all...Stay the course!

If you want to pivot a company, you've got to get your hands dirty. Listen to sales calls, and get acquainted with your testing team and the stakeholders. If it were easy to make this pivot, the company would likely have done it long ago. Steadfast dedication is sometimes all it takes to get your experiments across the finish line.

"All Managers of metrics have an opportunity to become leaders of people."
—Simon Sinek

Whether Success or Failure, It's Success

If you've done your homework and followed this process, you will have learned something by running any experiment. You'll have learned that adding new packages, prices, or SLAs does or doesn't affect conversion rates, velocity, churn, etc., which will open the door to further experiments and process changes.

Whether your experiment was a success or a failure, as long as you have validated learning, you'll have run a successful experiment.

Even if negative externalities came along with your experiment, they can give you valuable insights for future tests. Armed with this new knowledge, your next experiment will have a much higher chance of being a success.

That's what validated learning is all about: Forever increasing the knowledge capital for your business as a whole. No two businesses are exactly the same, so the more information you have specific to your business the faster you can succeed.

The Pivot

Eureka! You've hit your acceptance criteria and your experiment is a rousing success by all accounts! Pat yourselves on the backs, friends...

Now what?

Tell everyone.

You've successfully run an experiment. You've checked off the acceptance criteria, the positives of the experiment outweigh the negatives, and you have the data to prove it.

It's time to pivot the company around your successful experiment. Now it's time to talk to all stakeholders and roll out a new process design that brings your experimental conditions companywide. Your 5-10% test group will now be your allies in retraining the entirety of the team around this new process.

Take time here to iron out any remaining kinks and make the process as robust as possible. You may need to go through process design and debugging just as rigorous as in step six to be sure the whole team effectively transitions to the new normal.

Be patient here; oftentimes the test cohort is made up of your more capable or experienced representatives. Getting buy-in from and training the entire team can take quite a bit longer. Lean on your managers here because they know each rep best and how to articulate the new process to them as effectively as possible.

Keep an eye on your goal metrics for the next six months or so. You should see things start to trend toward your test group's results. Company-wide metrics may not quite match your test group's, but as long as they're creeping toward your goal, you're on track!
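
If you want a simple way to watch that trend, here's a minimal sketch that rolls up monthly conversion from a hypothetical CRM export and compares it to the test group's benchmark. The file name, column names, and numbers are placeholders.

```python
# A minimal sketch of watching a goal metric trend after the rollout. Column
# names (close_date, stage) and the test-group benchmark are hypothetical;
# the idea is simply to track monthly conversion against the target.
import pandas as pd

deals = pd.read_csv("crm_export.csv", parse_dates=["close_date"])
test_group_conversion = 0.24   # benchmark set by your test cohort

# Monthly company-wide conversion rate: share of deals that closed won.
monthly = (
    deals
    .assign(won=deals["stage"] == "closed_won")
    .groupby(deals["close_date"].dt.to_period("M"))["won"]
    .mean()
)

for month, rate in monthly.tail(6).items():
    gap = test_group_conversion - rate
    print(f"{month}: {rate:.1%} (gap to test group: {gap:+.1%})")
```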

Looking for an operations opportunity? Check out the GoSite careers page to see what's in store for you.

