Reading Zuck’s post actually made me quite sad. I’m in perhaps the minority of folks that don’t really care too deeply about FB tracking and advertising to me, but I feel strongly about the social implications of FB and the fake news problem.
FB has had an undeniably negative effect on society as a whole because of its ability to let people dive deeper into their own bubbles of reality (whether truthful or not).
Putting curtains on the blinds only means the problem will get WORSE, not better. People won’t have that silly emotion called shame (not that they have much of it anyway) getting in the way of a good racist rant or news story about the end of the world or how people with darker skin are so scary.
At least now there’s a chance someone with rationality will chime in if a story gets spread far enough. In the new private FB our divide is going to turn into a canyon :(
I'm curious to hear more from you on this, because I've always felt like Facebook-imposed censorship would be far worse than open dialogue on the site. People already wall themselves off inside of echo chambers that only reinforce already-held opinions, so is your thought that the worst of these echo chambers can be eliminated via FB filters?
Honestly, the most toxic content I've seen on FB hasn't been gore or racism or sexism or anything else distasteful, it's been blatantly false clickbait/opinion pieces that flare up existing divides.
I've always thought that the best way to make FB a decent place again would be to simply ban sharing of links, so that the feed could become what it was originally meant to be: a place where I can see what my friends are up to. But that would also mean FB would have to dial back its targeted advertising, which is all links (seriously, would contextual advertising really be that bad? Bad enough to justify the insanity that is FB data collection?), and stop scooping up all the data they can get... which doesn't really seem to be their MO.
Here's the thing. Facebook already editorializes in two ways: first through its community standards, and second through its algorithm. THEY CHOOSE what to put at the top (first thousand) of your feed.
They don't need to censor what people post to make BETTER decisions about what floats to the top. They just need to do a better job of prioritizing and placing better content. It's out there, it exists; surface it. They want to APPEAR impartial, as if your newsfeed is just what your friends post, BUT they are still deciding which posts are more important than others. They are using the wrong signals to float the cream.
The question is less "is censorship bad" and more "can/should FB better own that it is NOT impartial, and IS injecting its voice into people's conversations?" If Facebook wants to be impartial, it should step back and act like infrastructure; if it wants to have a say in community standards, it should fucking own it and do a better job of judging what is worthy of the top of a feed.