2021 / iOS + ANDROID + WEB / SHIPPED

User Safety Controls

Defining safety for a community of 90 million+ monthly active users.

Overview

During COVID, Wattpad saw a large spike in new users, and with it came a large spike in hate & harassment tickets that operationally overloaded our Support team. We took this as an opportunity to build out a long-overdue feature we'd been advocating for: the ability to Block users. But we quickly realized that Block couldn't be the only solution for tackling our user and business problems.

We needed to take a holistic approach to defining safety on Wattpad that protected not only our users, but our internal teams on the frontline as well.

Team: Audience Trust - Product manager, Frontend + Backend engineers, Trust & Safety champions, Community Wellness team

Role: Product Strategy, UX flows & Designs, Interaction design, Stakeholder management, QA

Our team's mission for 2021 was to help users protect themselves from unwanted interactions. 


The success metric:
The proxy for a reduction in harassment on the platform is a reduction in the # of unactionable hate & harassment tickets that our Support team receives. 

Understanding the problem space: Harassment on Wattpad

To kick off the discovery process, we first needed a good understanding of the current landscape of harassment on Wattpad. We set out to learn which negative interactions pose the greatest risk to our community and business, and which interactions generate the most unactionable tickets.

1. Consulting the experts: We worked very closely with our stakeholders from our Trust & Safety (T&S) and Community Wellness (CW) teams to understand the current landscape of user safety on Wattpad. Some things we wanted to know:
  • What does Wattpad consider to be harassment?
  • Which user segment receives the most bullying/toxic interactions: writers or readers?
  • Operationally, what issues was the team facing in dealing with hate & harassment tickets?


After discussing with Trust & Safety, we decided to use unactionable hate & harassment tickets as our metric for harassment on Wattpad. This metric was a signal for issues that users could have resolved on their own if the right safety tools were provided.
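To make the metric concrete, here's a minimal sketch of how such a count could be computed. The ticket shape and tag name are assumptions for illustration, not our actual Zendesk schema.

```typescript
// Hypothetical ticket shape; field names and tags are illustrative only.
interface Ticket {
  id: string;
  tags: string[];        // e.g. a "hate_harassment" tag applied in Zendesk
  actionTaken: boolean;  // did T&S take enforcement action on this ticket?
}

// The success metric: hate & harassment tickets closed without enforcement.
// These are a proxy for conflicts users could have resolved themselves
// if the right safety tools (Mute, Block, etc.) existed.
function unactionableHarassmentCount(tickets: Ticket[]): number {
  return tickets.filter(
    (t) => t.tags.includes("hate_harassment") && !t.actionTaken
  ).length;
}
```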


2. Working with data: We worked with the Analytics team to understand which currently unprotected areas of user interaction see the most traffic. Story Comments came first with ~700 million interactions/month; Profile conversations were a distant second with ~6 million/month.



An overview of the safety tools we currently offer users and how each gets tagged in Zendesk. From this map we could explore opportunity areas to help users stay safe.

3. Survey + interview data: We analyzed data from our monthly Sentiment Survey and App Reviews to pull out users' top pain points around safety. In addition, we interviewed 6 of our top writers to see if our most valuable user segment was facing any specific pain points. The major themes of users' pain points with platform safety were:


What prevents users from feeling safe?

  • Users are frustrated that our current tools don't sufficiently protect them from harassment (Mute doesn't fully work)
  • Users are frustrated by the lack of resolution when they submit a ticket
  • Users feel unheard, as if their concerns aren't taken seriously and there's no urgency in addressing their reports
  • Writers want ways to disallow certain users from accessing their stories


With a better understanding of our users' pain points, we soon realized that Block is not a feature where shipping it is the outcome.

What we are really building is a series of safeguards that enable users to feel safe and reduce their exposure to negative interactions. These safeguards can be reactive (a Block feature) or proactive (an anti-bullying marketing campaign).

After we presented our research at the project kickoff, the team agreed the goal isn't building a Block feature. We need to be grounded in our use cases and prioritize them based on where the biggest opportunities to solve the problem are. The outcome is higher user sentiment around platform safety and a reduction in bullying tickets.

Understanding the Opportunities

After aligning the team on the core user problem areas, I walked everyone through the current landscape of safety tools we offer: the ability to report story comments, private & public messages, and users. We also have a Mute feature, BUT it isn't fully built out and doesn't sufficiently protect users from harassers once they're muted. This was one of the core issues users raised in our sentiment survey.


Ideation Session

With a good understanding of the core user problems around safety and the gaps in our current safety tools, I organized a cross-functional workshop to ideate and prioritize solutions. Our engineers participated, along with our Trust & Safety and Community Wellness managers.


The prioritized solutions:
  • Building out our Mute functionality to cover Comments
  • Adding a new option within the Report flow to funnel unactionable harassment tickets away from T&S
  • Introducing in-context educational tips into the reporting flow
  • Story Block: disabling certain users from accessing a writer's stories

 

Solutions we shipped! 🎉

Comment Muting

  • Now, when you mute a user, you won't see their comments on any stories, and they won't see yours.

  • Users can now protect themselves in the largest area of interaction on Wattpad (~700M+ comments/month)

  • Mute now covers all interaction areas (Private Messages, Public Conversations, Comments)

  • What happens when you unmute a user? All their previous comments reappear, since comments are hidden rather than deleted (see the sketch below)
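As a rough illustration of that hide-not-delete behavior, here's a minimal sketch of read-time comment filtering. All types and names here are hypothetical, not Wattpad's actual implementation.

```typescript
// Hypothetical types; illustrative only.
interface Comment {
  id: string;
  authorId: string;
  body: string;
}

// Mute is symmetric: a comment is hidden if the viewer muted its author OR
// the author muted the viewer. Because comments are filtered at read time
// and never deleted, unmuting simply makes them reappear.
function visibleComments(
  comments: Comment[],
  viewerId: string,
  mutes: Map<string, Set<string>>, // userId -> set of userIds they've muted
): Comment[] {
  const viewerMuted = mutes.get(viewerId) ?? new Set<string>();
  return comments.filter((c) => {
    const authorMuted = mutes.get(c.authorId) ?? new Set<string>();
    return !viewerMuted.has(c.authorId) && !authorMuted.has(viewerId);
  });
}
```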

Process highlights:
• Conducting competitive analysis to see what other large social media platforms were doing
• Mapping out all user stories/use cases 
• Working with engineers to tweak interaction for hiding muted comments


"I don't like what I'm seeing" Report Option

  • This option lets users report comments they don't like seeing but that don't violate guidelines. A big portion of our unactionable harassment tickets came from use cases like this.
    • Ex. "This person doesn't like my HarryxLiam ship, I'm going to report them"
  • Adding this option helps reduce unactionable tickets for our Support team by funnelling this use case away from filing a Harassment ticket (see the routing sketch after this list).

  • We added an updated confirmation screen that includes a link to the Code of Conduct, for more info on what is and isn't allowed on the platform, plus the ability to Mute the user as well.

  • Previously, the report flow designs weren't unified across platforms; designs and copy differed between the iOS and Web flows. As part of this solution, the reporting flow's designs and copy were brought to parity across all platforms (iOS, Android, Web).
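To illustrate the funnelling described above, here's a minimal sketch of how the report flow could route this option. The reason values, tags, and outcome fields are hypothetical, not the production implementation.

```typescript
// Hypothetical report reasons and outcome shape; illustrative only.
type ReportReason = "harassment" | "i_dont_like_what_im_seeing";

interface ReportOutcome {
  routeToTrustAndSafety: boolean; // does this land in the T&S ticket queue?
  zendeskTag: string;             // how the report is tracked in Zendesk
  showCodeOfConductLink: boolean; // educate on what is/isn't allowed
  offerMute: boolean;             // surface the self-serve safety tool
}

function routeReport(reason: ReportReason): ReportOutcome {
  if (reason === "i_dont_like_what_im_seeing") {
    // Funnelled away from filing a Harassment ticket: still tagged for
    // tracking, but resolved in-product with education plus a Mute prompt.
    return {
      routeToTrustAndSafety: false,
      zendeskTag: "i_dont_like_what_im_seeing",
      showCodeOfConductLink: true,
      offerMute: true,
    };
  }
  // Potential guideline violations still go to Trust & Safety for review.
  return {
    routeToTrustAndSafety: true,
    zendeskTag: reason,
    showCodeOfConductLink: false,
    offerMute: false,
  };
}
```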

Process highlights:
• Competitive analysis
• Working with our Community Wellness/Support team to set up how it's tracked in Zendesk

 
