Core Concepts
Volunteer moderation styles on WhatsApp vary widely between public and private groups, shaped by factors such as social ties and cultural norms; effective design must therefore move beyond a one-size-fits-all approach.
Abstract
This research paper investigates the moderation practices of administrators in public and private WhatsApp groups in India and Bangladesh. The authors conducted semi-structured interviews with 32 admins and observed 30 public groups to understand the challenges of content moderation on an end-to-end encrypted platform.
Research Objective:
The study aimed to explore how WhatsApp group admins exercise care and control when dealing with problematic content and identify potential improvements for volunteer moderation on the platform.
Methodology:
The researchers employed a qualitative approach, conducting semi-structured interviews with admins of 23 private and 9 public groups in India and Bangladesh. They also observed user activity and admin responses to problematic content in 30 public WhatsApp groups. The study used Baumrind's typology of parenting styles as a lens to analyze the observed moderation practices.
Key Findings:
- Admins in private groups, particularly family and friends groups, often adopted a permissive moderation style, prioritizing social harmony over strict content control.
- Authoritative moderation, characterized by clear rules and communication, was more prevalent in private groups with weaker social ties, such as educational or professional groups.
- Public groups exhibited either authoritarian moderation, with admins tightly restricting group interaction, or an uninvolved approach, with admins neglecting their moderation responsibilities.
- The study highlighted the influence of cultural factors, particularly in collectivist societies like India and Bangladesh, where offline relationships significantly impact online moderation decisions.
Main Conclusions:
The authors argue that a one-size-fits-all approach to moderation is inadequate for WhatsApp, given the diverse range of group dynamics and cultural contexts. They recommend designing tools that empower admins while ensuring accountability and a balance of power.
Significance:
This research provides valuable insights into the complexities of volunteer moderation on end-to-end encrypted platforms, particularly within the context of the Global South. The findings have implications for designing more effective moderation tools and policies that consider the diversity of user needs and cultural norms.
Limitations and Future Research:
The study acknowledges limitations in its demographic scope and its focus on India and Bangladesh. Future research could explore moderation practices in other geographical regions and cultural contexts, and evaluate the effectiveness of the proposed design recommendations.
Stats
WhatsApp is the largest social media platform in the Global South.
WhatsApp has the second largest active social media user base globally.
India has the world’s largest WhatsApp userbase.
The researchers interviewed admins of 32 diverse WhatsApp groups.
The researchers reviewed content from 30 public WhatsApp groups in India and Bangladesh.
Quotes
"Although my aunt created the group, she became busy with household chores and kids and made me an admin instead."
"Earlier only the elders in our housing society could become admins. But, they were not tech-savvy and couldn’t understand all the features of WhatsApp. Then they recruited us because I always have Internet connectivity and check the group actively."
"I made others admin so that they could add their acquaintances to my group instead of forming new groups. This will help my group grow bigger and popular."
"A Hindu colleague left our office’s WhatsApp group due to hate speech. When we noticed, we decided to apologize to him in person instead of contacting him online given the severity of the matter. After meeting, we requested him to rejoin the group."
"We don’t add any unknown people to our group. We only add those who are affiliated with the press or are able to provide news materials."
"When somebody wants to join the group we ask for their building and apartment numbers. We have a list of contact details for all residents in our housing society. We dial the corresponding apartment to verify if the person actually lives there."
"In our group, people share religiously charged posts that would blame the Muslims for the 2019 Delhi riot, the 2020 COVID pandemic, the 2023 Odisha train collision, or almost anything that might go wrong in this country."
"Every year during Durga Puja [Bengali Hindu religious festival] there are posts with anti-Hindu sentiment, that would blame the Hindus for disrespecting the Quran, the Prophet, or Muslim women to justify violence against them."
"People think writing on WhatsApp is safe. I doubt if WhatsApp’s encryption would work in Bangladesh given the country’s strict digital law against anti-government content. The government might trace such messages on WhatsApp and accuse admins."
"After clicking on a spam link shared in the group, many group members’ Facebook accounts got hacked. Some girls’ sensitive photos were leaked and when we informed our teachers, they filed a cybercrime police complaint. The police interrogated everyone, recovered the hacked accounts, and asked the admins to disable the group."
"Recently while everyone was paying respect to a deceased colleague, someone shared a joke without paying attention to the ongoing conversation. This is insincere and displays a lack of common sense."
"Communal hate speech has been normalized in India over the years and none has the time or energy to protest such content. Most group members just care about staying connected with college friends instead of constantly arguing with them."
"During COVID many group members blamed Muslims for the rise of COVID in India. This triggered not only Muslim but other considerate group members from different religions, who decided to give up networking opportunities instead of being in groups that discriminated against people for their religion."
"Sometimes people intentionally share phishing links in the group. If I don’t notice, other classmates will call them out as spams and question the group member who shared that content."
"Previous admins had fights because the elderly admin argued that non-blood relative should not send too many messages. But, the younger admin disagreed and was forced to leave the group."
"Since all members in our group are women healthcare workers and the other co-admins are men, it’s appropriate that I [a female admin] deal with the group affairs."