- But that relentless focus on private, easy sharing did not account for second- and even third-order effects at scale: What happens when there are more than a billion people using the service? What happens when some of those people have a limited understanding of the technology they’re using, of the perfidy of the broader internet? And what happens when an incitement to violence can be shared instantly with hundreds of people who can each share it with hundreds more?
I'm pretty certain I disagree with this notion. Still, it got me thinking: can you put the blame on anyone when tech meets people who don't understand said tech (or are too gullible for it)?
WhatsApp groups practically encourage rumours, gossip, and outright lies. It's the secrecy and insulation of being in an enclosed group with others that makes it feel almost like you have anonymity from the rest of the social media world.
In July, residents of a rural Indian town saw rumors of child kidnappers on WhatsApp. Then they beat five strangers to death. People spread rumors. People destroyed a village. People beat five strangers to death. The people involved in this need to take responsibility for their actions, or this will never improve. Pointing the blame at a foreign technology company is a cop-out.

Vicious Rumors Spread Like Wildfire On WhatsApp — And Destroyed A Village
Immediately reminded me of the Blue Whale challenge, a VK-based hoax that led to alleged suicides, arrests, and moral panic across Russia. Compare also RTLM, a Rwandan radio station that played a central role in the propagation of genocidal calls to action. I'd argue that whatever communication technology prevails can (and sadly probably will) be abused to incite violence. This is not a WhatsApp-specific problem.

And what happens when an incitement to violence can be shared instantly with hundreds of people who can each share it with hundreds more?
It’s not a technology-specific problem, I completely agree. But I don’t think communication methods are entirely neutral either. The design of the app or website or whatever plays a big role in how people interact with it, and that responsibility does lie with the company or organization, up to a reasonable point. I’m just not entirely sure where that point is.
Here's the problem: They didn't fail to foresee. They failed to be responsible. There's a world of difference between "to give people the power to build community and bring the world closer together" and "to build community and bring the world closer together." Facebook was created so that Harvard boys could rate Harvard girls without the girls knowing. This whole "empower community" thing is a speculative side benefit of the core mission of social media apps, which is monetizing jealousy. By saying, "well, people could use it for good if they wanted," those collecting the checks absolve themselves of all responsibility for any actions or behavior other than "good" (for which they can claim credit). Meanwhile, the organizations tasked with enforcing "good" have no more handle on the platform than anybody else. Generally they have a lot less. And here we are.

In attempting to fulfill Facebook’s current mission — to “give people the power to build community and bring the world closer together” — Zuckerberg and his team of Silicon Valley–based executives failed to foresee its malignant applications: misinformation, propaganda, rumor, hate.
Agreed. That, and... Since when do we take the word of any reporter who has absolutely no stake in the game? Encryption absolves the reporter of any culpability. What the hell?

The videos, whose origins are impossible to trace because of WhatsApp’s strong encryption, had been making the rounds in WhatsApp groups in India months before the incident in Rainpada.