Universal Code of Conduct/2021 consultations/Enforcement/Wikidata community

From Meta, a Wikimedia project coordination wiki
This is a general summary of the Phase 2 consultation that was run on Wikidata in the first quarter of 2021.

Introduction to Wikidata

Wikidata started on October 29, 2012, and is the youngest Wikimedia project so far. It rapidly became one of the most edited Wikimedia projects, along with English Wikipedia and Wikimedia Commons. As of April 2, 2021, Wikidata has more than 93.2 million items and around 26.4k active users.[1]

As it is a multilingual project, editors come from many different backgrounds and geographical areas. English is the default language for basic communications among members, although all languages are theoretically eligible for communications: all of the important policies and guidelines and several of the community hubs are available in many languages. There are currently 62 administrators, 3 bureaucrats, 5 checkusers, plus several other classes of users with defined rights.

Outline of Wikidata behavioural policies

Over the years, Wikidata has established a set of very basic guidelines for behaviour on the project, mainly based on Wikipedia’s Five Pillars. The most important issues covered are deletions, blocks, personal attacks, assuming good faith, use of alternate accounts, and biographies of living people.

Given the particular nature of Wikidata, which is even more content-driven than other communities, this basic code has so far proved more than sufficient to deal with virtually all problematic behaviours.

All violations are dealt with on the Administrators’ noticeboard (for all «matters requiring administrator attention») or on the Requests for checkuser page (in cases of abuse of multiple/alternate accounts). Broader issues are discussed through a request for comment.

In addition to the existing guidelines, the Wikidata community has produced two more pages: “Speaking rightly, acting rightly,” a general essay about how to behave politely (and how difficult this can sometimes be); and “Wikidata:Behavior norms,” an abandoned draft about which behaviours the community wants to see and which are to be shunned or banned.

Facilitation process on Wikidata

The main hub for discussion on Wikidata was structured as a set of dedicated pages: a general landing page that summarised the scope and duration of the consultation, as well as the actions every user could take during the period; a general discussion page, which served as «the general space for discussion about the consultation»; and a FAQ page, adapted from Meta’s FAQ page to cover the particular needs of the Wikidata consultation.

Most of the communications and solicitations of feedback were conducted on-wiki, through Wikidata’s Project Chat or user talk pages, and through the Wikidata mailing list. Communications occasionally also took place in the unofficial Wikidata Telegram channel and in Wikidata-related Facebook groups.

Throughout February, three different rounds of questions were held, each with a particular focus. From February 10 to February 28, a sitenotice was set up to advertise both the consultation and an online survey. A “gamified” version of the consultation, similar to the Italian initiative, was also launched in the same period.

The local consultation on Wikidata was officially closed on March 5, 2021. In total, 21 users participated in on-wiki discussions, while 5 others interacted off-wiki (e-mail, Telegram/Skype chats, Facebook comments). The online survey proved, by far, to be the most successful way of collecting feedback, with 337 individual responses (92.8% of the total feedback received).

Community’s feedback

General summary

  • There was no clear consensus on preferring a “local” solution over a global one. Generally speaking, it can be inferred that Wikidatans might prefer a solution that involves the community directly and transparently, but with a “fail-safe” option of resorting to an external solution in cases where admins are involved.
  • It was easier to observe consensual suggestions about the enforcement process than about who should be in charge of it – the latter was a more polarising question, also in light of the point above.
  • The current reporting system on Wikidata works fairly well, but can be improved – especially in terms of visibility for users who are less experienced with Wikidata. There is reasonable consensus about establishing a private channel for reporting harassment and unwanted behaviour, but no clear consensus about who should be in charge of it.
  • Harassment that happens outside of the projects should be reported to the community or to the Foundation, but any resulting sanctions might still depend on its severity and its potential link to on-wiki issues.

Feedback about enforcement pathways and escalation

About the current on-wiki reporting system

Wikidata is perceived as a welcoming community by those who provided feedback: of the 337 respondents to the survey, 196 (58.2%) stated they had «suffered or witnessed any of the unacceptable behaviours listed in the Code of Conduct» within the Wikimedia community, but this number drops to 95 (28.2%) when interviewees answered about their experience on Wikidata.

Have you ever suffered or witnessed any of the unacceptable behaviors listed in the Code of Conduct within the Wikimedia community?
     Yes 196 (58.2%)
     No 141 (41.8%)
Total respondents 337 (100%)
Have you ever suffered or witnessed any of the unacceptable behaviors listed in the Code of Conduct on Wikidata?
     Yes 95 (28.2%)
     No 242 (71.8%)
Total respondents 337 (100%)

However, the answers also clearly indicate that the current system for reporting harassment is simply unknown to many people: 56.1% of the people interviewed (189 out of 337) said they would not know «how to report such case(s) on Wikidata».

If necessary, would you know how to report such case(s) on Wikidata?
     Yes 148 (43.9%)
     No 189 (56.1%)
Total respondents 337 (100%)

This was also echoed by a user during the on-wiki consultation:

I would suggest adding something to account talk pages to make it really clear how to make a report. I’ve been on the site a fair while and (happily) I know nothing about these procedures. That’s a privilege but it does mean that if there was some issue, I wouldn’t know where to begin. I imagine that’s common to many users. [...]

— User:Mr_impossible

Moreover, of the 95 people who said they suffered or witnessed harassment on Wikidata, only 51 (53.7%) decided to go on and report it, either publicly or privately.

“If so, did you report such behavior in a project page, to an admin or to another user?”[2]
     “Yes, publicly on a project page” 28 (35.4%)
     “Yes, publicly or privately to an admin/user” 23 (29.1%)
     “No, I decided not to report it” 28 (35.4%)
Total respondents 79 (100%)

Even though it is not possible to determine exactly why some users chose not to report, some of the follow-up answers provide interesting points for discussion (all quotes are from anonymous users answering the survey):

Two concerns I’ve encountered are that only a few volunteers [are] involved in the project page conversation to evaluate the case, or that the volunteers participating have prior involvement with the individuals involved that affects their judgment.

— Anonymous user

Some users have so much influence, that they can get away with literally anything. What I often hear is “yes, this user is harassing and misbehaving, but we can not afford to lose such a valuable contributor”. [...]

— Anonymous user

they take a long time to process, there is no psychological support for victims and reporting pictures you as problematic in the eyes of the community

— Anonymous user

About establishing a private reporting tool to enhance the current reporting system

A private reporting tool was explicitly raised as a possible way of enhancing the current reporting system on Wikidata. Most of the users who intervened, both on-wiki and off-wiki, seem to be in favour of the general idea, but differ – sometimes substantially – on its implementation.

More specifically, three of the four users who intervened on-wiki on the topic are open to setting up «a complementary procedure» to report harassment privately. In the survey, 195 of the 337 respondents (57.9%) agreed that «a private reporting method [would] be useful in addition to the existing (public) ones». It is worth noting, though, that 60 respondents (17.8%) did not agree with the idea.

In your opinion, could a private reporting method be useful in addition to the existing (public) ones?
     Yes 195 (57.9%)
     No 60 (17.8%)
     Don’t know 82 (24.3%)
Total respondents 337 (100%)

Using e-mail to report harassment was the main suggestion of the respondents. At least two of the users who interacted on-wiki referred to an e-mail address as a potential means for private communication, as did 287 (85.2%) of the 337 survey respondents. The main suggestions from the survey ranged from using the native Special:EmailUser function, to establishing a separate e-mail account, to setting up an OTRS-like system. Again, it is worth noting that a handful of survey participants (5, 1.5%) expressed their opposition to a private reporting system in itself.[3]

If there was consensus for a private reporting tool, it should be easy to use. Which tool do you think could be more apt?[4]
Special:EmailUser – Wikimedia’s function for sending emails 75 (22.3%)
A dedicated e-mail address to send reports to 113 (33.5%)
An OTRS-like service 99 (29.4%)
Other answers[5] 40 (11.9%)
I don’t know 10 (3.0%)
Total respondents 337 (100%)

As for the functioning of the system, respondents’ answers varied too widely to provide clear preferences. There is consensus, though, about a number of general principles, such as:

  • fairness in the investigation process;
  • clear definition of who is in charge of it;
  • transparency in the publication of the results;
  • possibility for the accused user to express and defend themselves.

These principles were echoed by two users in the on-wiki consultation:

[...] I believe that wiki’s can only work if information is freely shared, unless it really is too bad to be posted openly, then it should be locked behind “need-to-know” pages. [...]

— User:Donald Trung

If we want someone to change their behavior we have to explain to them what actions they took that go against our norms. The same goes for banning a person. Confidential reports can still be used to bring awareness about problematic behavior of users and afterwards admins can investigate the conduct of the person about whom someone complained confidentially.

— User:ChristianKl

By contrast, one of the anonymous respondents to the survey was quite clear in their opposition to setting up a private reporting system:

Is this really necessary? It seems like lot of administrative busy work to pamper people whose feelings got hurt. I get that this is part of the broader culture, and that implementing these “gotcha” hit jobs is something that is done nowadays to make some people feel better.

— Anonymous user

About cases of harassment that go beyond Wikimedia projects

If a Wikimedia user suffers harassment outside of the Wikimedia projects in relation to their Wikimedia work and/or affiliation, there is general consensus from the consultation that the community and/or Trust & Safety should be notified. Of the 337 survey respondents, 162 (48.1%) said the community should be notified, and 167 (49.6%) said T&S should be made aware of such unwanted behaviour.

“In your opinion, how can we intervene against or manage harassing behaviors that involve wikimedians, but happen outside of the Wikimedia community?”[4]
“The Wikimedia community should be notified about them” 162 (48.1%)
“The Trust & Safety Office should be notified about them” 167 (49.6%)
“They should be reported to the social network/platform administrators” 166 (49.3%)
Other answers[5] 74 (22.0%)
Total respondents 337 (100%)

However, no conclusive answer could be drawn about the effects of such reporting. This depends on a series of possible factors, such as the severity of the offence and the lack of precise on-wiki rules for such cases. It should also be noted that there is a general unspoken view among Wikimedia users, well summarised by this anonymous comment posted in the survey:

Off-wiki harassment that has no on-wiki relevance or relation to on-wiki issues must not have any on-wiki consequences. If the behaviour in question has on-wiki relevance and involves no or easily redactable sensitive information, it is to be handled through normal community process. If the behaviour in question involves sensitive information that makes it unfit to be handled by the community at large, it is to be handled by trusted users (such as ArbCom, if available). In no [case] must any on-wiki action be taken without meaningful community involvement.

— Anonymous user

Feedback about enforcement body

About a local enforcement body

The discussion about how to enforce the UCoC on Wikidata surprisingly did not see much participation on-wiki; therefore, large parts of the data in this section come from off-wiki discussions. In general, there is no clear-cut consensus about who should deal with UCoC violations. A general majority of survey respondents indicated a preference for a “local” solution, i.e. either establishing a locally-elected enforcement body or trusting Wikidata administrators to deal with the violations. There was, though, a substantial minority of respondents who would prefer stewards or a global body to deal with them.

In your opinion, who should be in charge of handling reports and dealing with violations of the Code of Conduct on Wikidata?[4]
Wikidata administrators 126 (37.4%)
A local team of users, elected by the Wikidata community 160 (47.5%)
Stewards 72 (22.0%)
A new global body to be established within Wikimedia community 140 (41.5%)
Other answers [5] 44 (13.0%)
Total respondents 337 (100%)

In the case that «the violator is an admin on Wikidata», though, this ratio is slightly inverted: respondents made clear their preference for someone from outside the project to take care of the violation, in order to ensure neutrality in the decision.

In your opinion, who should be in charge of handling reports and dealing with violations of the Code of Conduct, if the violator is an admin on Wikidata?[4]
Other Wikidata administrators 116 (34.4%)
A local team of users, elected by the Wikidata community 136 (40.4%)
Stewards 100 (29.7%)
A new global body to be established within Wikimedia community 165 (49.0%)
Total respondents 337 (100%)

As for the functioning of the enforcement body, conclusions are similar to those for the private reporting tool. In addition to those considerations, though, there was one more recurrent request among respondents: the body should be composed of people who are somewhat “professional”, or at least explicitly trained, in dealing with harassment. Some respondents were also worried about the potential inclusion of volunteers, due to the possibility of them shying away from the task.

A user, who asked to remain anonymous, outlined a couple more principles about such an enforcement body in a one-on-one private discussion:

  • The body should not have just a punitive or repressive function, but should also have “positive” functions, i.e. community building or light surveillance (in order to act more quickly and decisively before problems arise).
  • The body should be, without any doubt, controlled by the community, but at least one member can be WMF staff (with community building and legal backgrounds), and at least another member should be a steward because of their experience (maybe they can rotate among themselves in occupying that seat).
    — Anonymous user

The comment that probably best sums up all of the considerations above is the following:

It seems the main question is not asked: how do we make sure that the people that judge complaints are not biased, not prejudiced, without preconceptions, without prejudgement, being impartial, not a direct colleague/team member/etc of any of the involved people, well informed (read: understanding) about all kinds of diversity matters and disabilities in the movement, taking all sides of an issue seriously and not only from one side, understanding social differences from different areas around the world and not just think that what one individual learned is common practise around the world, that they do actual efforts to work towards a solution, that they fact check what happened by asking other (uninvolved) people about what happened, and they must understand mistakes can happen unintentionally. This all went wrong multiple times in the past years, and this was often kept indoors under treats, power abuse, and more. Further, people that feel hurt have a need for comfort, [this] should also be taken care of. Just as the ombuds commission they must be independent and work with the greatest care.

— User:Romaine

About a potential global enforcement body

The potential establishment of a global enforcement body to deal with violations of the UCoC on Wikidata was thoroughly discussed in the survey and in some off-wiki private discussions. The results of the latter are generally consistent with the survey’s results, and relevant feedback on the matter is reported in the “Interesting ideas” section.

The questions about the global enforcement body can be divided into two groups. The first concerns some of the more general aspects, such as community approval of its establishment, the definition of the body’s scope of action, and the possibility for all users to address the body regardless of language barriers. In two cases, users overwhelmingly agreed with the statement they were presented with: 286 users (89.9%) supported a very clear definition of the body’s scope of action, and 264 users (78.3%) concurred with the idea that it should be possible for all users to address such a global body.

The idea that the global community should have the final word on the creation of the global enforcement body was also fundamentally agreed upon, but it is noteworthy that its support was somewhat lower than for the other two statements (209 users, 64.9%), while the disagreeing minority was more marked than in the other two cases (61 users, 18.1%).

Statement      1 (Disagree)      2      3 (Neutral)      4      5 (Agree)      Total
The decision to create this “enforcement body” must be approved by the global Wikimedia community 40 (11.9%) 21 (6.2%) 57 (16.9%) 80 (23.7%) 129 (41.2%) 337 (100%)
The scope of action of this “enforcement body” must be defined very clearly 17 (5.0%) 3 (0.9%) 31 (9.2%) 63 (18.7%) 223 (66.2%) 337 (100%)
It must be possible for every user to address such “enforcement body”, regardless of any language barrier 25 (7.4%) 11 (3.3%) 37 (11.0%) 52 (15.4%) 212 (62.9%) 337 (100%)

The second group was composed of four different and competing statements, drafted to explore the various possibilities for the scope of action of such a body, from including all of the Wikimedia projects, to including only those that lack sufficient procedures, to other middle-ground possibilities. In all cases, opinions were far more polarised, and a substantial number of users chose to position themselves on a “neutral” stance.

The hypothesis relatively preferred by interviewees was to include all Wikimedia projects and communities within the scope of action of the (potential, yet-to-be-established) global enforcement body: 202 users (59.9%) expressed their agreement, while 85 (25.2%) disagreed with the idea. This seems coherent with the results of the second hypothesis presented, i.e. limiting the global body’s capabilities to «Wikimedia communities who do not have sufficient/proper established procedures»: in this case, opposing users form a plurality (145 people, 43.0%), with a noteworthy minority of 108 users (32.1%) agreeing with the idea.

The last two statements, which basically deal with the idea of leaving the decision about the adoption of the body to the individual communities, show even more polarisation: the idea of leaving out those communities that do not wish such a body to be established was met with a substantial tie between those in favour and those against (both groups account for 133 users, 39.4%), with slightly more than a fifth of the interviewees choosing a neutral stance. The idea of letting larger communities opt in, on the other hand, received somewhat more support (152 users, 45.1%), though still with a substantial minority opposing the idea (96, 28.5%).

Question      1 (Disagree)      2      3 (Neutral)      4      5 (Agree)      Total
The scope of action of this “enforcement body” must include ALL of Wikimedia projects and communities 60 (17.8%) 25 (7.4%) 50 (14.8%) 52 (15.4%) 150 (44.5%) 337 (100%)
The scope of action of this “enforcement body” must include only Wikimedia communities who do not have sufficient/proper established procedures 88 (26.1%) 57 (16.9%) 84 (24.9%) 40 (11.9%) 68 (20.2%) 337 (100%)
The scope of action of this “enforcement body” must be limited only to the Wikimedia project or community who choose to establish it 84 (24.9%) 49 (14.5%) 71 (21.1%) 40 (11.9%) 93 (27.6%) 337 (100%)
Larger communities should have the possibility to opt-in the scope of action of such “enforcement body”, should there be consensus about it 64 (19.0%) 32 (9.5%) 89 (26.4%) 58 (17.2%) 94 (27.9%) 337 (100%)

Feedback about support for targets of harassment

More concrete indications were given about the expected timeframe for dealing with reports of harassment and other violations. Generally speaking, 72.7% of survey respondents clearly indicated their preference for a quick “initial response”: 110 people (out of 337, 32.6%) expressed a preference for being contacted within the first 24 hours, with an additional 135 (40.1%) who said they could “wait” up to three days from their initial report.

In your opinion, how long should it take to get you an initial message that your case has been taken into consideration?
     24 hours at the very most 110 (32.6%)
     Between 1-3 days 135 (40.1%)
     Between 3-5 days 43 (12.8%)
     In a week 49 (14.5%)
Total respondents 337 (100%)

The trend is broadly confirmed by the follow-up question about the timing of updates on reports: a plurality of respondents (120, 35.6%) would wait up to a week, but an almost equivalent number of people (123, 36.5%) would like to receive an answer in a shorter time.

In your opinion, how long should it take to receive an update on your report?
     Between 1-3 days 62 (18.4%)
     Between 3-5 days 61 (18.1%)
     Within a week 120 (35.6%)
     Within 2 weeks 94 (27.9%)
Total respondents 337 (100%)

“Outlier” responses

Among the very first comments about the UCoC text, a user raised the issue that the UCoC would forbid pointing out that a user might have a “conflict of interest” because of their ethnicity, if that user engages «in non-neutral editing of pages» that might be of importance for the ethnicity in question. The whole discussion was held on-wiki in January.

Some of the vandalism on Wikidata is due to users wanting to advocate for a particular interest. In conflict between different ethnicities it frequently happens that users who are involved in the conflict because they belong to one of the ethnicities engage in non-neutral editing of pages that are relevant for the content. Being able to say that those users engage in conflict of interest edits is valuable for the goal of having a neutral Wikipedia and currently it seems the draft intends to forbid speaking about ethnicities.

— User:ChristianKl

Despite the whole issue stemming from a personal interpretation of a portion of the UCoC, the point raised is definitely a tough one to address, also because it might have repercussions on content. While reflecting the facts of a dispute accurately is the most rational, most pillars-aligned, and most common answer, we might need to take into consideration that sometimes identification with and/or belonging to a group does play a role in such discussions, and that stifling discussion on this point might be a mistake.

Statistical representation of the data

The first main takeaway from the consultation is, of course, the demographic breakdown. Of the 363 people who took part in the consultation, 239 (65.8%) self-identify as male, 41 (11.3%) as female, and 14 (3.9%) as “non-binary or other gender identities.” The remaining 69 users (19%) declined to self-identify.

This breakdown, however, changes dramatically if we consider the survey and the other channels separately: the survey itself accounts for 337 of the 363 users reached overall – a staggering 92.8% of the feedback received. Even more interesting is the fact that, of the 26 users reached through the on-wiki pages and other channels (private mail, Telegram, Skype, or Facebook), only 3 self-identify as female, while the remaining 23 are all male.

                              Total      Male      Female      Other      N/A
On-wiki consultation            21        19         2           0         0
Other means of communication     5         4         1           0         0
Survey                         337       216        38          14        69
Total feedback                 363       239        41          14        69

The consultation saw large participation from established users, especially on-wiki: almost all of the people who took part in the on-wiki consultation have contributed substantially to Wikidata; in addition, seven people who expressed their opinions (on-wiki or through private means) are administrators, and one was also a bureaucrat; only two users had been active for less than one year or had fewer than 500 edits. This was substantially confirmed in the survey: around half of the people who took it (164, 49.8%) revealed they have been active on Wikimedia projects for more than 10 years, while only 44 (13.4%) had been active for less than a year.

Wikidata is also not necessarily the primary project of many of the participants – which is not surprising, given the peculiarity of the project. Unsurprisingly, the project on which participants are most active is Wikipedia, followed by Wikidata and Wikimedia Commons. It is curious to note that these three projects are also the most edited in the community, and arguably the most closely connected to each other.

How many years have you been active on Wikimedia projects?
     More than 10 years 164 (49.8%)
     Between 5 and 10 years 43 (13.1%)
     Between 3 and 5 years 36 (10.9%)
     Between 1 and 3 years 42 (12.8%)
     Less than 1 year 44 (13.4%)
     Didn’t answer 8 (2.4%)
Total feedback from survey 337
How do you participate in Wikimedia projects?
     As a registered user 221 (67.4%)
     As an administrator 46 (14.0%)
     As a member of other user groups 30 (9.1%)
     As an unregistered user 31 (9.5%)
     Didn’t answer 9 (2.7%)
Total feedback from survey 337
How many projects are you active on?[4]
Wikidata 259 (79.4%)
Wikipedia 295 (90.5%)
Wikimedia Commons 227 (69.6%)
Meta-Wiki 104 (31.9%)
Wiktionary 76 (23.3%)
Wikisource 66 (20.2%)
Wikiquote 39 (12.0%)
Wikibooks 37 (11.3%)
Wikivoyage 32 (9.8%)
Wikiversity 25 (7.7%)
Wikinews 24 (7.4%)
Wikispecies 24 (7.4%)
Didn’t answer 11 (3.6%)
Total feedback from survey 337

Conclusions

The main datum that stands out is the marked difference, both quantitative and qualitative, in turnout between the “public” on-wiki consultation and the “private” off-wiki consultation. There is no conclusive evidence about the reasons for such a difference – even if it might safely be assumed that the sensitivity of the topic calls for a more private, and therefore “safe,” environment for users to freely share their opinions. Apart from this, the Wikidata consultation was conducted in relative tranquility, and no major incidents were registered.

As for the results of this consultation, it was easier to observe consensual suggestions about the enforcement process than about who should be in charge of it. The latter, as one might easily have guessed, was a more polarising question, mostly due to its potential effects on the internal equilibrium of the community. This is even more important in a multilingual community such as Wikidata, which also has to deal with the difficulty of describing the basic data about reality and its many manifestations.

There was no clear consensus on preferring a “local” solution over a global one. Generally speaking, it can be inferred that Wikidatans might prefer a solution that involves the community directly and transparently, but with a “fail-safe” option of resorting to an external solution in cases where admins are involved. It was even more complicated to outline any kind of consensus about a potential global enforcement body within the Wikimedia community: there is a slight preference for it to encompass all of the Wikimedia projects, but a consistent share of the people who intervened argue otherwise.

A clearer consensus was, instead, easy to find on the principles of such a process, in particular fairness in investigation (including the possibility for the accused to share their point of view) and transparency to the community, which are clearly core principles of the larger Wikimedia community.

References

  1. Statistics from Special:Statistics, last consulted on April 2, 2021. “Active users” are those users who have performed at least one action in the 30 days prior to April 2, 2021.
  2. This answer was optional and conditional to another question.
  3. These five users are grouped in the “Other answers” row in the table below.
  4. Respondents could check more than one answer.
  5. Respondents could provide personalised feedback. All answers that do not fit the other fixed answers are collected here. Qualitative evaluation of such answers will be left to the Phase 2 drafting committee.