How To Change Your Facebook Settings To Opt Out of Platform API Sharing | EFF

With the Facebook scandal casting a shadow on anything even remotely tech related, we're not short on opinion. What has surprised me most about the whole situation is that anyone is surprised at all. What's more, I can't see how the proposed changes will do much. The most expedient thing right now seems to be sharing information like this guide from the Electronic Frontier Foundation on locking your profile down, insofar as it can be locked down. And while you definitely should lock it down, sadly the horse has bolted, and with your data.

Over the weekend, it became clear that Cambridge Analytica, a data analytics company, got access to more than 50 million Facebook users' data in 2014. The data was overwhelmingly collected, shared, and stored without user consent. The scale of this violation of user privacy reflects how Facebook's terms of service and API were structured at the time. Make no mistake: this was not a data breach. This was exactly how Facebook's infrastructure was designed to work.

My point exactly: this is how it was designed to work. Nobody should be the least bit surprised at this situation. If you're similarly cynical about the efficacy of the plan to address it, and at the same time caught in a bind, like most people, on the question of whether to keep using the service, the minimum requirement is another look over those settings.

You shouldn't have to do this. You shouldn't have to wade through complicated privacy settings in order to ensure that the companies with which you've entrusted your personal information are making reasonable, legal efforts to protect it. But Facebook has allowed third parties to violate user privacy on an unprecedented scale, and, while legislators and regulators scramble to understand the implications and put limits in place, users are left with the responsibility to make sure their profiles are properly configured.

Not only should you not have to do it, you shouldn't have to expect that the settings will change so routinely that maintaining your desired level of privacy means checking them over every time Facebook rearranges the furniture.

 

The Case Against Retweets | The Atlantic

For all those people abandoning Twitter, I am preparing to share some thoughts on micro.blog. In the meantime, here is a modest proposal for those of you still holding on to the bow.

Somewhere along the line, the whole system started to go haywire. Twitter began to feel frenetic, unhinged, and—all too often—angry. Some people quit. Others, like Schulz, cut way back. I felt the same urge, but I wanted to do something less extreme, something that would allow me to keep the baby, even as I drained the bathwater. So I began to take note each time I experienced a little hit of outrage or condescension or envy during a Twitter session. What I found was that nearly every time I felt one of these negative emotions, it was triggered by a retweet.


Do Not, I Repeat, Do Not Download Onavo, Facebook’s Vampiric VPN Service

Old news, yes, I know. But if anything bears repeating, this is over-qualified. If clarification is needed, the Onavo VPN does not enable any new practice from Facebook. It simply makes it dramatically more efficient for Facebook to do what it always does: track everything. What's particularly nauseating in this instance is how they're taking advantage of the general misunderstanding around security and privacy. To my mind, that meets the modern definition of a lie. Onavo is spyware.

If you’re someone who can’t live without Facebook, or simply can’t find the courage to delete it, Onavo appears under the “Explore” list just above the “Settings” menu. I’d recommend you never tap it. Facebook is already vacuuming up enough of your data without you giving it permission to monitor every website you visit.

Gizmodo Australia | I Can’t Believe How Stupid Facebook’s News Feed Update Is

That I’m linking to this instead of the actual post has everything to do with Facebook’s relentless assault on web standards. The idea that Facebook doesn’t know what it’s doing here is far-fetched at best. The bubble effect is not some esoteric theory; it’s common knowledge that Facebook is a confirmation-bias machine. This move addresses the problem by doubling down on the effect. Nobody should be surprised.

From the horse’s mouth,

We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you – the community – and have your feedback determine the ranking. We decided that having the community determine which sources are broadly trusted would be most objective.

You could read this as a cop-out, but that would be too kind. Copping out in this instance is conveniently intentional. Gizmodo asks the obvious misanthropic question this move invites,

If people cannot tell truth from garbage, why are those same people being used to rank publications on a scale of trustworthiness?

Other supposedly sober outlets seem to be missing the point here. From Slate,

At first blush, it looks like Facebook is doing exactly what I and other critics have long been calling for it to do: acknowledge that its algorithm plays a crucial role in determining what news people read, and take some responsibility for its profound effects on the media and the spread of information. It’s about time, right?

Except that, based on its announcement, Facebook’s approach to a notoriously difficult problem—figuring out which media to trust—appears to be painfully simplistic and naïve.

I think it’s naïve to think they are being naïve. This approach gives the appearance of doing something and achieves exactly what they want. Welcome to hyper-reality.