[dupe] Zuckerberg’s So-Called Shift Toward Privacy (nytimes.com)
88 points by pulisse on March 8, 2019 | hide | past | favorite | 27 comments



Some companies just have business models that only work if the quality of life for the world as a whole declines. They focus on extracting value for a few at the expense of the many (many of whom don't realize what's happening).

Facebook is one of them. Its entire business revolves around collecting and selling information about its users. One day, the information released in this way will be recognized as the obviously harmful stuff that it was all along. We're not there yet, though.


Seems like society is taking the position that if you can't prove it does harm then it must be fine. I'm starting to understand where conservatism comes from.


Facebook shows ads, but it’s not fair to say that it sells user information. Working in ad tech, I can tell you there are tons of companies that actually do sell user information — it’s a different business model.


Excerpt: "Here are four pressing questions about privacy that Mr. Zuckerberg conspicuously did not address: Will Facebook stop collecting data about people’s browsing behavior, which it does extensively? Will it stop purchasing information from data brokers who collect or “scrape” vast amounts of data about billions of people, often including information related to our health and finances? Will it stop creating “shadow profiles” — collections of data about people who aren’t even on Facebook? And most important: Will it change its fundamental business model, which is based on charging advertisers to take advantage of this widespread surveillance to “micro-target” consumers?

Until Mr. Zuckerberg gives us satisfying answers to those questions, any effort to make Facebook truly “privacy-focused” is sure to disappoint."


FB cannot be privacy focused; its business model prevents it, and its history belies any such claim. How many times do you bang your head on a wall before you say no more?


FB's average revenue per user is $7.37.

Unless they move to a fully subscription-based service (good luck), I don't see any way for them to make that kind of money without using carefully targeted ads.


I don’t believe Facebook will ever offer true end-to-end encryption. They will hold the master keys and allow themselves to inject whatever they want into “your private living room” while giving users the perception of privacy and secrecy.

Or they will try to monetize the metadata but keep the message body encrypted.

This feels like Zuckerberg embracing the backlash he’s ignored and dismissed for so long and now trying to use it as a selling point because his traditional growth model is beginning to show signs of slowing.


They could also do "real" e2e encryption but perform keyword analysis on the message's content on the client
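To make the idea concrete, here is a minimal, purely hypothetical sketch of what client-side keyword analysis alongside end-to-end encryption could look like: the plaintext never leaves the device, but the client sends the server salted hashes of extracted keywords, which the server can match against its own hashed targeting list. All names, the salt scheme, and the tokenizer here are illustrative assumptions, not anything Facebook has described.

```python
import hashlib
import hmac

# Assumed: a per-user salt provisioned by the server (illustrative only).
SERVER_SALT = b"per-user-salt"

def keyword_digests(text: str) -> set[str]:
    """Tokenize plaintext on the client and return salted keyword hashes.

    The server would see only these digests, never the plaintext, yet it
    could still learn which targeting keywords a message mentions.
    """
    tokens = {t.strip(".,!?").lower() for t in text.split() if len(t) > 3}
    return {
        hmac.new(SERVER_SALT, t.encode(), hashlib.sha256).hexdigest()
        for t in tokens
    }

# "Server side": precompute digests for an ad-targeting keyword list.
targeting = keyword_digests("hiking camera mortgage")

# "Client side": message body gets E2E-encrypted separately (not shown);
# only the digests below would accompany the ciphertext.
outgoing = keyword_digests("Thinking about a new camera for our hiking trip")

matches = targeting & outgoing
print(len(matches))  # number of targeting keywords the server learns about
```

The point of the sketch is that "real" E2E encryption of the message body is compatible with the server still extracting ad signal — which is exactly why such a design would only give the perception of privacy.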


Did a web search to fill in that number: it's $7.37 per user in Q4 2018, and that figure is averaged over the whole world. In the US, it's >$30.


No one I've talked to, and no article I've read about Zuckerberg's privacy statement, believes it. Facebook and its CEO have created major trust issues for themselves.


Reading Zuck’s post actually made me quite sad. I’m in perhaps the minority of folks who don’t really care too deeply about FB tracking me and advertising to me, but I feel strongly about the social implications of FB and the fake news problem.

FB has had an undeniably negative effect on society as a whole because of its ability to let people dive deeper into their own bubbles of reality (whether truthful or not).

Putting curtains on the blinds only means the problem will get WORSE, not better. People won’t have that silly emotion called shame (not that they have much of it anyway) getting in the way of a good racist rant or news story about the end of the world or how people with darker skin are so scary.

At least now there’s a chance someone with rationality will chime in if a story gets spread far enough. In the new private FB our divide is going to turn into a canyon :(


I'm curious to hear more from you on this, because I've always felt like Facebook-imposed censorship would be far worse than open dialogue on the site. People already wall themselves off inside of echo chambers that only reinforce already-held opinions, so is your thought that the worst of these echo chambers can be eliminated via FB filters?

Honestly, the most toxic content I've seen on FB hasn't been gore or racism or sexism or anything else distasteful, it's been blatantly false clickbait/opinion pieces that flare up existing divides.

I've always thought that the best way to make FB a decent place again would be to simply ban sharing of links, so that the feed could become what it was originally meant to be: a place where I can see what my friends are up to. But that would also mean FB would have to dial back its targeted advertising, which is all links (seriously, would contextual advertising really be that bad? Bad enough that we can justify the insanity that is FB data collection?), and stop scooping up all the data they can get... which doesn't really seem like their MO.


You're saying censorship is bad, right?

Here's the thing. Facebook already editorializes in two ways: first, its community standards; second, its algorithm. THEY CHOOSE what to put at the top (the first thousand results) of your feed.

They don't need to censor what people post to make BETTER decisions about what floats to the top. They just need to do a better job of prioritizing and placing better content. It's out there and it exists; surface it. They want to APPEAR impartial, as if your newsfeed is just what your friends post, BUT they are still deciding which posts are more important than others. They are using the wrong signals to float the cream.

The question is less "is censorship bad" and more "can/should FB better own that it is NOT impartial, and IS injecting its voice into people's conversations." If Facebook wants to be impartial, then it should step back and act like infrastructure; if it wants to have a say in community standards, it should fucking own it and do a better job of judging what is worthy of the top of a feed.


"Things can get better if we want them to — through regulatory oversight and political pressure"

aka Regulatory capture. Raise the bar high so no other social network startup can succeed.


I've always taken exception to this way of thinking. Assuming for a second that the regulations are _good_, for instance that companies must DELETE (really delete) all data after 90 days, why should I care if a Johnny-come-lately can't compete? The regulation is good for the consumer right?

You may then say that the assumption is big, and the regulations will be of a different (worse) nature. Well, then it's the regulation that is the problem, not the fact that only moneyed entities can afford to implement them.


What I find aggravating about debates over regulatory capture is that there's no space for saying that both government and private industry are bad, because concentration of power is always bad no matter where it occurs.

Regulatory overreach is real, market failure is real, and regulatory capture represents a failure of all parties.


If the regulation prevents a new startup from trying to emulate Facebook's damaging business model, that's a good thing. We don't need any more surveillance/manipulation corporations, the market is already thoroughly saturated.


This would be perfect. Prevent this horrible business model completely.


He wants to build a western WeChat right? Isn't that the core idea of this supposed shift currently in progress?


If Facebook really wanted to shift to privacy, they would add a paid, ad free, no tracking option.


Would that really fix anything? Nobody would trust that Facebook isn't tracking them anyway.


They already have shadow profiles of users not on the platform, so I'm sure they'd still track users even if they paid. There would just be some cleverly worded language about not using the data to show you ads, but that leaves the door open for other uses of the data.


That few would opt for. I don't see how this would actually solve anything.


More quotations from the article: "The plan, in effect, is to entrench Facebook’s interests while sidestepping all the important issues...Will it change its fundamental business model, which is based on charging advertisers to take advantage of this widespread surveillance to “micro-target” consumers? ... We don’t want to end up with all the same problems we now have with viral content online — only with less visibility and nobody to hold responsible for it...I think the problem is the invasive way it makes its money and its lack of meaningful oversight."


Is any big company willing to pull off the FB ad platform on grounds of privacy/ethics? I doubt even Apple, the so-called Epitome of Privacy, can afford that.

Is there any organization where you can pledge to stay off FB? I'll consider buying my products from members of such an organization.


Basecamp started a “Facebook Free” movement for businesses that pledge to not use Facebook in any way. Their original announcement: https://m.signalvnoise.com/become-a-facebook-free-business/

Some press coverage of it: https://www.cnbc.com/2019/03/06/some-advertisers-are-quittin...



