Chartbeat Dashboard

A Redesign of Chartbeat’s Home Screen Experience

Problem
Chartbeat’s main dashboard was a point of significant drop-off for new users, and wasn’t serving existing users well.

Solution
A redesigned dashboard that compels new users to explore the product and adds value for existing users.

Outcome
The redesigned page was more effective at driving users to features correlated with retention and scored higher in customer satisfaction surveys.

Role
Design Direction, Research, Prototyping

 

Background

Image showing what the Home Screen looked like for the 40% of users who had access to only one domain

Chartbeat is an analytics tool that helps publishers understand how audiences are engaging with the content on their websites and apps. The Home Screen is the first page users see when logging into Chartbeat. It is a key moment in onboarding new users, and for existing users, it’s a page they may encounter multiple times a day.

Despite this, it hadn’t been updated in years. It didn’t help new users understand what Chartbeat was for, leading to drop-off. Existing users complained about its lack of features and often found workarounds to avoid it.

When Chartbeat adopted a company-wide goal around product usage, I led an effort to look at usage for our key product surfaces.

We realized that roughly 25% of new users logged in, saw the Home Screen, and didn’t return within 30 days. The page wasn’t getting new users to engage with our product.

Discovery

Screenshot of Miro board used for the project kick-off

The goal of the discovery phase was to gain a better understanding of both user needs and stakeholder goals. To kick off the project, I facilitated a workshop with the project team to align on goals and scope.

Based on this we set a few goals for the project:

  1. Increase click-through rate (within 15 minutes) from 75% to 95%.

  2. Increase user satisfaction with the page.

  3. Increase click-through to key features linked to retention including our Historical Dashboard and email Reports.

Given our time constraints, we didn’t want to wait for our interviews to conclude before starting to prototype. We started by listing out our main assumptions and looking at existing usage data.

Our two hypotheses for why the page’s click-through rate was low were:

  1. Users don’t see data relevant to them

  2. Users aren’t aware of particular data and links because they are hidden by default (and only accessible by collapse/expand buttons or hover interactions).

When we looked at the data, roughly twice as many users used the in-page links as the main nav. This suggested that building on our existing pattern of linking to our various tools could be an effective way to drive more people to a greater variety of product surfaces.

 

User Research

PROTOTYPE FEEDBACK

As part of our user research we wanted to get feedback from users on prototypes in order to validate or invalidate our hypotheses. Our main hypothesis was that users would prefer a page focused on aggregating more data about a single site over the existing list of sites with little data.

To get that feedback, we sketched out various ways we could achieve that goal. Below are some of those sketches.

Version 1

This was a rough sketch of the concept of showing data for a single site rather than a list of sites. At this stage, the focus was on which data to show rather than how to show it.

Version 2

In this iteration I continued to experiment with which data to show, but began focusing more on layout in order to see how much information would be digestible.


Version 3

This iteration focused more on visual design, increasing the information density while reusing patterns from elsewhere in the product to make the data more recognizable.

Version 4

This was the sketch we used to get feedback from users. In addition to including historical data, we also included some features we knew we’d be unlikely to fit in scope, but which we wanted feedback on to help shape our future roadmap.

 

USER INTERVIEWS

In addition to getting feedback on prototypes, we also wanted to ask users questions to help generate ideas for how to better serve them. To start, we wrote down our research questions:

  1. How useful do users find the current version?

  2. How do users use the current page?

  3. What are the common pain points?

  4. What are users’ reactions to our prototype?

To answer these questions we:

  1. Conducted two surveys—one for new users and one for existing users

  2. Conducted six remote user interviews

TAKEAWAYS


1. The page isn’t useful if you have only 1 domain

At the time, the Home Screen was essentially just a list of sites with the number of active readers for each, each linking to the Real-time Dashboard. For accounts that had only one domain, it indeed wasn’t very useful.

  • “It has its purpose, but for users with just one Site the system should direct the user to the dashboard”

  • “Since we have only one main site, I do not actively use the homepage. I’ll go straight to the dashboard instead.”


2. The page doesn’t aggregate the data users need

In the previous version, there was only one metric: Concurrents (users engaging with your site). Since this was also the primary metric users saw once they clicked into the Real-time Dashboard, they felt the page didn’t add any value and, if anything, was an unnecessary extra step on the way to the Real-time Dashboard. Users expected it to aggregate their real-time and historical data in one place.

  • “I rarely look at the information on the home page and primarily use it to click into specific dashboards.”

  • “I hit the homepage and almost immediately drill into one of our sites. There isn’t meaningful data on the homepage”


3. Users were unaware of, or confused by, the few features on the page

In the previous version, users could expand each of the cards representing their site(s) or network of sites. The issue was that most users weren’t aware this functionality existed, and even when we made them aware, they didn’t find it particularly useful.


4. Users showed a strong preference for the new design over the current version

We used the interviews both to better understand how users currently used the page and their pain points, and to get feedback on a new version we had prototyped. Though the sample size was small, the users we spoke with unanimously and enthusiastically preferred the new version to the existing one, which we viewed as a positive signal.

Based on this we iterated on the prototype.

Outcome

Image of what was shipped to customers

In the end we shipped a version that added many new features while retaining most of the existing functionality, which was feasible because the previous version had so little functionality to begin with. This approach allowed us to ship with confidence that we weren’t removing functionality users relied on, while also reducing stakeholder concerns about changes to a key product surface.

The key changes we made were:

  1. Surfacing more data including historical data

    Previously we showed only one real-time metric. The new version showed three key real-time metrics and also let users see their historical data for yesterday. Additionally, we added top-pages lists for both today and yesterday; previously, users had to click through to two separate dashboards to see those. This change had the biggest impact on users with access to only one site. Rather than seeing a sparse page, they could now see aggregated data about their single site in one place.

  2. Adding links to more features and making existing ones more obvious

    Previously we linked only to our Real-time and Historical Dashboards. We added links to most of our other core tools, including those correlated with retention, such as our Reports. We also made the existing links always visible instead of hiding them behind hover states.

Before shipping to all users, we released the new version to a small subset and surveyed them to see whether they preferred the new or old version. The survey results conclusively showed that users preferred the new version.