MEETUP

Designing an authentic reputation system

In 2018, there was no good way to gauge the quality of a Meetup event before attending. The content on the platform had evolved organically, without many measures in place to monitor quality. Leadership decided that establishing trust in the platform as a whole, as well as in individual events, was critical to growth.

Role: Product strategy and research (IC) for v1.0; design direction (Manager) for v2.0. Team: 1 designer, 1 content strategist

FeedbackCollection.png

THE CHALLENGE

Increase engagement with high-quality events, and set the bar for quality on Meetup.

At the outset of the project, our team wasn’t given sharply defined objectives, but our mission was to increase engagement with high-quality events by allowing attendees to learn from others’ experiences.

Our secondary mission was to inspire organizers to create better events by signaling that quality was being monitored.

CONSIDERATIONS

Building the right system for Meetup

We started with a competitive audit of existing reputation systems. It was clear that feedback loops were essential to creating trust in attending events, but also that this had to be the right system for Meetup.

Multi-layered user relationships

Meetup’s ecosystem was more complicated than most, with multiple organizers per group and multiple attendees per event. We decided to simplify and focus on the relationship between a single attendee and a group.

Ecosystem.png

Regular attendees

How valuable would this ratings system be for people who regularly attend events from the same group? How would the experience change over time as they used it? Our ecosystem differed from others in that it relied on members repeatedly engaging with the same groups.

Organizers don’t get paid

How would the exchange of money factor into a rating system? Organizers are our paying customers, and most of them don’t make money from hosting. How might we make sure this system is valuable for them?

Impactful moments

As a starting point, we used qualitative and quantitative data to identify the moments in which users would be most influenced by a reputation system.

MemberJourney.png
OrganizerJourney.png

RESEARCH

Understanding what quality means

Since our initial focus was members, we began by interviewing them to understand how they judged an event’s quality and which signals would influence their decision to RSVP. Here are some insights from our first round of interviews.

Unfamiliar signals caused friction

We showed users a range of different ways to say an event was “good”; the unfamiliar ones were confusing and caused friction. Although people agreed that returning to an event was the strongest signal of quality, it took them a moment to read and understand that signal.

QualitySignals.png

Quality was subjective based on fit

Several users responded that they would rate an event “5 stars” even if they didn’t enjoy it, because they thought others might. 

Collecting negative feedback was a challenge

People were hesitant to leave negative feedback publicly. This is true of many reputation systems, but Meetup members also worried about running into the organizer in their local community.

After the initial round of research, it was clear that members would use ratings to decide whether an event was the right fit for them, not to choose between competing events. The consolidated number rating helped build overall trust in the platform, but the decision to attend rested heavily on fit.

REFRAMING THE PROBLEM

A reputation system optimized for first-time attendees

We took some time to reframe the problem and narrow in on what we were actually trying to do. It was clear that the rating system would be most useful for attracting first-time attendees to events. We identified the two main jobs the system had to do for members and for organizers.

Member jobs

  1. Proof of activity: Show that others attend the events, averting the common fear of being the only one to show up.

  2. Manage expectations: Give people an idea of what to expect at an event and highlight the qualities that made it unique.

Organizer jobs

  1. Attract new members: Show off what was special or unique about their events so that new members join.

  2. Private feedback: Get private kudos to feel good about their continued efforts. 

We defined success metrics across the ecosystem, so that everyone was aligned on what success looked like.

Goals.png

THE RESULT

A 5-star rating system that allowed organizers to show off the best qualities of their groups 

Recognizable 5-star rating system

The pros of the 5-star system, being easily scannable and recognizable, outweighed the cons of it feeling too transactional. Because most events were rated 4.7 or higher, members used the count of ratings, rather than the score itself, to assess relative quality.

5starRatings.png

Event attributes

We collected attributes of each event to display in a “highlights” section, giving members a sense of what attending one of the group’s events would be like.

Highlights.png

Private event feedback

Attendees could give private kudos or thank the host without it displaying on event pages. While the first version only sent an email to the host, the next iteration would start a conversation between host and attendee.

PrivateFeedback.png

THE IMPACT

Mixed results and next steps

The initial beta rollout saw positive results on the member side, with the intended improvements in first-time members’ engagement with events.

  • Increased first-time RSVPs to high-quality groups

  • Increased joins to high-quality groups

However, as we scaled up, we began to see negative effects on the organizer side of the marketplace. Because of this, we never rolled out to 100%.

Although we had identified risks to the organizer experience early on, a decision was made to continue forward to capture the increased engagement on the member side.

I consider the design and product strategy work from this initiative a good demonstration of my ability to think through complex systems, but the business outcome was ultimately unsuccessful. I certainly learned from this experience and would do many things differently next time!
