Think back to when you were a kid and you would stand with your back to the wall to measure how much you'd grown. You'd put a mark for how tall you were, and you'd do this every few months or every year. You'd then compare the marks over time to see how much you grew. That's a micro example of benchmarking, but we apply the same basic techniques to everything we do here at Zion & Zion, and user experience (UX) is no exception.

Benchmarking is the process used to measure how much something has improved by comparing it to previous standards.

UX is the experience a user has interacting with something, whether that something is a website, mobile app, ATM, or any other product. For example, think about someone withdrawing $100 from an ATM. Whether they're able to accomplish this task easily can be due to the machine's user experience, though that isn't the only factor. At Zion & Zion, our UX team focuses mostly on the UX of websites and applications. Our mission as UX strategists is to make all interactions as seamless, intuitive, and easy to use as possible. Benchmarking gives us the foundation to continuously improve upon our work. Today we're going to focus on why benchmarking is important and how to benchmark with analytical data and user testing.

Why You Should Benchmark

Whatever service or product your company offers, benchmarking specific aspects of your business, such as year-over-year revenue, will give you the data you need to determine whether you need to improve and by how much. You've heard the saying, "set the bar high," but what benefit does that have if you don't know where the bar was before? Oftentimes, companies fail or underdeliver because they set the bar unrealistically high during the improvement process.

You can and should benchmark your business in multiple ways. For example, here at Zion & Zion, we set benchmarks based on quantitative and qualitative research for our websites and web applications. By performing research, understanding which KPIs (key performance indicators) are important to the business, and understanding which areas are most troubled, we can create a benchmarking plan that not only gives context, but also shows how much we've improved. This all ties back to UX: we can improve the experience of users based on the standards set previously.

When it comes to benchmarking, one thing to keep in mind is that you'll be much more thorough if you combine efforts. While there are countless forms of quantitative and qualitative research, below are two of our top picks. By including both analytical research and user testing, you'll have a more expansive data set to work with.

Analytical Benchmarking

Analyzing the data you get from benchmarking can be subjective, since many unforeseen variables can contribute to the changes you see. For instance, you may see that your website traffic increased this weekend. By looking at the analytics alone, you may not be able to truly understand why. However, by performing additional research and digging deeper, you may uncover something truly insightful: perhaps the traffic spike came from a new advertisement the digital marketing department was running, or from a major current event that affected a certain region of the country. When benchmarking websites, there are key analytical data points we look at, including, but not limited to:

  • Page views
  • Time on page
  • Bounce rate
  • Average number of page views per session
  • New vs. returning visitors

Client Example: Google Analytics data sample
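
As a rough illustration of how these data points turn into benchmark numbers, here is a minimal Python sketch that computes a few of them from a hypothetical export of session-level data. The file name, column names, and metric definitions are assumptions made for the example, not an actual Google Analytics schema.

```python
import csv

def compute_benchmarks(path):
    """Compute simple benchmark metrics from a hypothetical session-level CSV.

    Assumed columns: session_id, pageviews, duration_seconds, is_new_visitor.
    """
    sessions = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions.append({
                "pageviews": int(row["pageviews"]),
                "duration": float(row["duration_seconds"]),
                "new": row["is_new_visitor"] == "1",
            })
    if not sessions:
        return {}

    total_sessions = len(sessions)
    total_pageviews = sum(s["pageviews"] for s in sessions)
    bounces = sum(1 for s in sessions if s["pageviews"] <= 1)  # single-page sessions

    return {
        "page_views": total_pageviews,
        "avg_pageviews_per_session": total_pageviews / total_sessions,
        "bounce_rate": bounces / total_sessions,
        "avg_time_on_site_seconds": sum(s["duration"] for s in sessions) / total_sessions,
        "pct_new_visitors": sum(s["new"] for s in sessions) / total_sessions,
    }

# Example: benchmarks = compute_benchmarks("sessions_jan_2016.csv")
```

Run something like this once per reporting period and save the output; each saved snapshot becomes a benchmark you can compare future periods against.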

Add an annotation within Google Analytics whenever you launch a new website or make a major change to your existing site. That way, after launch and as you begin to collect data, you'll have a record of exactly when the new website or changes went live. As more time passes, you'll be able to perform date-range comparisons between similar periods to analyze what has changed on the new website, allowing you to benchmark analytically.

When picking date ranges to compare, you want to be sure you're comparing apples to apples, meaning your ranges should cover the same days of the week and the same period of the month or year (e.g., January 1-31, 2015 compared to January 1-31, 2016).


Client Example: Date range in Google Analytics
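
If you want to generate the matching comparison range programmatically rather than eyeballing the calendar, here is a minimal sketch. It assumes two common choices: the same calendar period one year earlier, or the immediately preceding period shifted by whole weeks so the days of the week line up.

```python
from datetime import date, timedelta

def same_period_last_year(start: date, end: date) -> tuple[date, date]:
    """Same calendar range one year earlier (e.g. Jan 1-31, 2016 vs. Jan 1-31, 2015).

    Note: a Feb 29 start or end date has no exact counterpart in a non-leap year.
    """
    return start.replace(year=start.year - 1), end.replace(year=end.year - 1)

def previous_period_week_aligned(start: date, end: date) -> tuple[date, date]:
    """Preceding range of equal length, shifted by whole weeks so a Monday
    is always compared to a Monday."""
    days = (end - start).days + 1
    shift = timedelta(days=((days + 6) // 7) * 7)  # round up to whole weeks
    return start - shift, end - shift

# Example: previous_period_week_aligned(date(2016, 1, 1), date(2016, 1, 31))
# returns the 31-day range ending five weeks before January 31, 2016.
```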

If you look at the figures below, you'll see what this looks like in Google Analytics. Based on this kind of data, you can determine whether your website has improved or declined relative to the benchmarks you've set. When analyzing the data against those benchmarks, you'll also want to remember that multiple variables can change at once. For instance, in a new variation of a website, you could focus heavily on improving one aspect of the site, only to find out later that in doing so, you hurt another area of the site. At a minimum, knowing where you were previously gives you context for how the site is performing now.


Client Example: Daily session data in Google Analytics


Client Example: Data overview within Google Analytics
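
One way to stay honest about that multi-variable risk is to compare every benchmarked metric between the two periods at once, rather than only the one you set out to improve. Here is a minimal sketch, building on the hypothetical compute_benchmarks() output above; which metrics count "lower as better" is an assumption you'd adjust to your own KPIs.

```python
# Metrics where a decrease counts as an improvement (adjust to your own KPIs).
LOWER_IS_BETTER = {"bounce_rate"}

def compare_benchmarks(previous: dict, current: dict) -> dict:
    """Percent change per metric, labeled as improved or declined."""
    report = {}
    for metric, old in previous.items():
        new = current[metric]
        change = (new - old) / old * 100 if old else float("nan")
        improved = change < 0 if metric in LOWER_IS_BETTER else change > 0
        report[metric] = {
            "pct_change": round(change, 1),
            "direction": "improved" if improved else "declined",
        }
    return report

# Example:
# compare_benchmarks(compute_benchmarks("jan_2015.csv"),
#                    compute_benchmarks("jan_2016.csv"))
```

Scanning the whole report makes it harder to celebrate one improved number while quietly regressing another.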

User Testing Benchmarks

While analytical benchmarking provides you with quantitative data, user testing benchmarks give you qualitative data. With user testing, you're able to witness firsthand how users interact with your website; for that reason, the value gained from testing with actual users is second to none. User testing can take many different forms, including, but not limited to:

  • Preference and click tests
  • Five-second tests
  • Recorded user tests
  • Observed user tests
  • Remote user tests
  • Heat maps
  • Scroll reach maps

Client Example: Zion & Zion Home Page Heat Map

Here are a few examples of the different types of user tests you can perform and analyze to help determine what your benchmarks should be. Above is an example of a heat map; you can see where users are moving their mouse, what they're interacting with, and what they're ignoring. With recorded user testing, you can analyze everything the user does, from how they interact with certain elements to which areas of the site cause confusion or frustration. Cognitive psychology principles such as the Hick-Hyman Law, which relates the time it takes a person to make a decision to the number of choices available, can also help you interpret the data you get from benchmarking.


Client Example: Further heat map analysis from Zion & Zion’s main navigation.


Client Example: Additional heat mapping data for user testing.
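
For reference, the Hick-Hyman Law mentioned above is usually written as T = a + b · log2(n + 1), where n is the number of equally likely choices and a and b are constants fitted from observed data. Here is a minimal sketch of the formula itself; the constants below are placeholders, not measured values.

```python
import math

def hick_hyman_decision_time(choices: int, a: float = 0.2, b: float = 0.15) -> float:
    """Estimated decision time in seconds for `choices` equally likely options.

    T = a + b * log2(n + 1). The defaults for a and b are placeholders and would
    normally be fitted from your own user-testing data.
    """
    return a + b * math.log2(choices + 1)

# Example: with these placeholder constants, growing a menu from 5 to 10 items
# raises the estimated decision time from roughly 0.59s to roughly 0.72s.
```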

In general, user testing requires much more carefully planned and defined benchmarking procedures because you're dealing with real users. When you set up these types of benchmarks from user tests, it's crucial that you ask the same questions, have users perform the same tasks, and recruit from the same demographics; otherwise you'll be comparing apples to oranges, and that is less than ideal.

We’re constantly running user tests for our clients when performing research and testing wireframes.
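
To keep those rounds of testing comparable, it helps to record every round against the same task definitions. Here's a minimal sketch of what that record-keeping could look like; the task names and fields are illustrative assumptions, not a prescribed protocol.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class TaskResult:
    task: str          # e.g. "Find the contact form"
    completed: bool
    seconds: float

def summarize_round(results: list[TaskResult]) -> dict:
    """Success rate and average time on task for one round of user testing."""
    by_task: dict[str, list[TaskResult]] = {}
    for r in results:
        by_task.setdefault(r.task, []).append(r)
    return {
        task: {
            "success_rate": sum(r.completed for r in rs) / len(rs),
            "avg_seconds": mean(r.seconds for r in rs),
        }
        for task, rs in by_task.items()
    }

# Run the identical tasks with a comparable participant group each round,
# then compare these summaries round over round against your benchmark.
```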

Conclusion

Benchmarking is a process that should be incorporated into everything you do, and user experience, web design, and development are no exceptions. It's a process that has no boundaries.