Beauhurst’s most-requested collaboration feature
The story so far…
Beauhurst is IMDb for UK companies. We have profiles on companies, the people at them, investments into them, and more.
Our subscribers can save these profiles to lists that we call “collections”.
I worked with a product manager and a developer. We were asked to improve engagement with collections.
The product manager was new to the company, and wanted to try Teresa Torres’ opportunity space trees for the first time.
Metrics showed that subscribers with more collections were less likely to cancel their subscription. But we didn’t know whether this was correlation or causation.
We wanted a quick win to start
As we got stuck into bigger work, we wanted to ship something fast to get the blood flowing. We focused on these two buttons above our search results table. One of them is “Add to collection”. It’s a bit quiet. If we could make it a little louder, maybe that would increase engagement?
So I fired up the idea machine
I like to come up with a lot of options. They were all variations on the same idea: make the buttons visually heavier.
And made a very small change
We made the buttons violet, and gave them longer labels to make it clearer what they do.
Even for small changes, I like to explore lots of options. This way I feel more comfortable that we choose the right direction.
We spoke to our subscribers
We booked in interviews with relevant subscribers and spoke to them about how they use collections.
Then tagged their conversations
We used Condens for this.
And put it in an opportunity tree
What I like about this method is that it gives you a visual overview of the opportunities, and helps you to choose between options.
We decided to focus on collection visibility
Collection visibility means that subscriber A can see what subscriber B has done with their collection. This helps to avoid two people focusing on the same company, for example.
Before this project, lots of subscribers had asked for “collaborative collections”. We assumed this meant “more than one person can edit the same collection”. The interviews showed us that it didn’t. What people actually wanted was visibility of each other’s work in collections.
As we chose an opportunity to focus on, I realised that I might have pushed for certain directions because I felt more comfortable with them. This is a dangerous reason to choose a direction. Maybe it’s not the most valuable? Maybe the “uncomfortable” projects are just as feasible, once you get stuck in?
We came up with some ideas to address this
Items in collections can have tags assigned to them which help subscribers remember something about them. This works like tagging does in any CRM.
And then some assumptions we'd need to test
These tests would help us prove or disprove the important assumptions attached to each project.
Before this exercise I had not realised how hard good assumption testing was. The first idea we had was often “let’s build it and find out if it works”, which isn’t sustainable.
I thought up some broad approaches
I put together some mockups that represented different approaches to the opportunity in the platform. For each of these steps I discussed the options with the squad, then iterated based on feedback.
And built some simple prototypes
Once we’d settled on a general direction, I built prototypes so we could test them with our subscribers.
I refined the best option
Test results from subscribers and feedback from people at the company helped a lot.
And handed it off at the last minute
At this point we started another product squad to work on a central feature of the platform, and I was needed elsewhere. I handed this project off to a junior designer, along with my notes from the most recent round of feedback.
The “polishing” phase might be my favourite, so it was a shame to hand off the project just before the finishing touches.
The results are good
Metrics have shown that collection engagement is up. Our account management team has passed on positive feedback from our subscribers.
We also wanted this feature to increase subscribers’ use of tags. But tag metrics weren’t added to our analytics platform until this project started, so we have no baseline to measure against.