Frequent thinker, occasional writer, constant smart-arse

Category: Privacy

Platform growth over user privacy

Facebook announced that data about yourself (like your phone number) would now be shared with applications. Since the announcement, they’ve backed down (and good work to ReadWriteWeb for raising awareness of this).

I’ve been quoted in RWW and other places as saying the following:

“Users should have the ability to decide upfront what data they permit, not after the handshake has been made where both Facebook and the app developer take advantage of the fact most users don’t know how to manage application privacy or revoke individual permissions,” Bizannes told the website. “Data Portability is about privacy-respecting interoperability and Facebook has failed in this regard.”

Let me explain what I mean by that:

This first screenshot shows what users can do with applications. Facebook offers you the ability to manage your privacy, including revoking individual data authorisations that are not considered necessary. It's not as granular as I'd like (my "basic information" is not something I share equally with "everyone", since apps can show that data outside of Facebook, where "everyone" really does mean everyone), but it's a nice start.

http://www.facebook.com/settings/?tab=applications

This second screenshot is what it looks like when you initiate the relationship with the application. Again, it's great because of the disclosure, and it communicates a lot very simply.
Request for Permission

The problem is that the first screenshot should be what you see in place of the second. While Facebook gives you the ability to manage your privacy, it is really paying lip service to it: not many people are aware that they can manage their application privacy, as it's buried in a part of the site people seldom use.

Facebook doesn't offer this ability upfront for a very simple reason: people wouldn't accept apps. When given a yes-or-no option, users think "screw it" and hit yes. But what if, during this handshake, they were able to tick off what data they allowed or didn't allow? Why are all these permissions required upfront, when I can later deactivate certain permissions anyway?

Don't worry, it's not that hard to answer. User privacy doesn't help with revenue growth as much as application growth does, which creates engagement. Being a company, Facebook can hardly be blamed for pursuing this approach. But I do blame them when they pay lip service to the world, and they rightfully should be called out for it.

Another scandal about data breaches shows the unrealised potential of the Internet as a network

The headlines today show a data breach of the Gawker media group.

Separately, today I received an email from a web service that I once signed up to but don't use. The notice says my data has been compromised.

Deviant Art community breach

In this case, a partner of deviantART.com had been given users' information, and that data was compromised. Thankfully, I used one of my disposable email addresses, so I will not be affected by the spammers. (I create unique email addresses for sites I don't know or trust, so that I can shut them off if need be.)
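That trick is easy to automate. Here's a minimal sketch in Python (all names are made up) of the idea: derive a per-site tag using plus-addressing, which many mail providers support, so a leaked address tells you exactly who leaked it and gives you a tag you can filter or kill.

```python
import hashlib

def disposable_address(mailbox: str, domain: str, site: str) -> str:
    """Derive a unique, revocable address for a given site.

    Uses plus-addressing (mailbox+tag@domain), which many mail
    providers deliver to the base mailbox. The tag is a short hash
    of the site name, so it's stable but opaque.
    """
    tag = hashlib.sha256(site.lower().encode()).hexdigest()[:8]
    return f"{mailbox}+{tag}@{domain}"

# The same site always yields the same address; a different site
# yields a different one. If spam arrives on a tag, you know who
# leaked it, and you can filter that tag without changing anything else.
addr = disposable_address("elias", "example.com", "deviantart.com")
print(addr)
```

If your provider doesn't honour plus-addressing, the same scheme works with a catch-all domain (tag@yourdomain.com) instead.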

But this once again raises the question: why did this happen? Or rather, how did we let this happen?

Delegated authentication and identity management
What was interesting about the Gawker incident was the comment that if you logged in via Facebook Connect, "you'll be safe."

Why safe? For the simple reason that when you connect with Facebook Connect, your password details are not exchanged and used as a login. Instead, Facebook will authenticate you and notify the site of your identity. This is the basis of the OpenID innovation, and related to what I said nearly two years ago that it’s time to criminalise the password anti-pattern. You trust one company to store your identity, and you reuse your identity in other companies who provide value if they have access to your identity.
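To make that concrete, here's a toy sketch in Python of why delegated login keeps you safe. All the names here are hypothetical, and this is a drastic simplification of the OpenID/OAuth idea rather than Facebook's actual protocol: the point is simply that the password lives with one identity provider, and the relying site only ever sees a token it can verify.

```python
import secrets

class IdentityProvider:
    """The one party that ever sees your password (e.g. Facebook)."""
    def __init__(self):
        self._passwords = {}   # username -> password
        self._tokens = {}      # token -> username

    def register(self, user, password):
        self._passwords[user] = password

    def authenticate(self, user, password):
        # The password is checked HERE only; relying sites never see it.
        if self._passwords.get(user) != password:
            raise PermissionError("bad credentials")
        token = secrets.token_hex(16)
        self._tokens[token] = user
        return token

    def verify(self, token):
        return self._tokens.get(token)

class RelyingSite:
    """A site like Gawker: it stores tokens, never passwords."""
    def __init__(self, idp):
        self.idp = idp
        self.sessions = {}

    def login_with_token(self, token):
        user = self.idp.verify(token)
        if user is None:
            raise PermissionError("unknown token")
        self.sessions[token] = user
        return user

idp = IdentityProvider()
idp.register("alice", "s3cret")
site = RelyingSite(idp)
token = idp.authenticate("alice", "s3cret")
# The site learns who you are, not your password -- so if the site's
# database is compromised, there is no password to steal.
print(site.login_with_token(token))
```

If this relying site had been the one breached, the attacker would get a pile of revocable tokens, not a pile of reusable passwords.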

Scandals like this remind us of the need for data interoperability and for building out the information value chain. I should be able to store certain data with certain companies; have certain companies access certain types of my data; and have the ability to control the usage of my data should I so decide. Gawker and deviantART don't need my email: they need the ability to communicate with me. They are media companies wanting to market themselves, not technology companies that can innovate on how they protect my data. And they are especially not entitled to some things, like "sharing" my data with a partner whom I don't know or trust, which subsequently puts me at risk.

Facebook Connect is not perfect. But it's a step in the right direction, and we need to propel the thinking behind OpenID and its cousin OAuth. That's it, simple. (At least, until the next scandal.)

Let’s kill the password anti-pattern before the next web cycle

Authenticity required: password?

I've just posted an explanation on the DataPortability Blog about delegated authentication and the open standard OAuth. I give poor Twitter a bit of attention by calling them irresponsible (which their password anti-pattern is; a generic example is sites that force people to give up the passwords to their e-mail accounts to get functionality like finding your friends on a social network), but with their leadership they could become a pin-up example we promote going forward, well placed in this rapidly evolving data portability world. I thought the news would have calmed down by now, but new issues have come to light, further highlighting the importance of some security.

With the death of Web 2.0, the next wave of growth for the Web (other than ‘faster, better, cheaper’ tech for our existing communications infrastructure) will come from innovation on the data side. Heaven forbid another blanket term for this next period, which I believe we will see the rise of when Facebook starts monetising and preparing for an IPO, but all existing trends outside of devices (mobile) and visual rendering (3D Internet) seem to point to this. That is, innovation on machine-to-machine technologies, as opposed to the people-to-machine and people-to-people technologies that we have seen to date. The others have been done and are being refined: machine-to-machine is so big it’s a whole new world that we’ve barely scratched the surface of.

But enough about that, because this isn't a post on the future – it's on the present – and how pathetic current practices are. I caught up with Carlee Potter yesterday – she's a young Old Media veteran who, inspired by the Huffington Post, wants to pioneer New Media (go support her!). Following on from our discussion, she writes in her post that she is pressured by her friends to add applications on services like Facebook. We started talking about this massive cultural issue that is now being exported to the mainstream, where people freely give up personal information: not just via the apps accessing it under Facebook's control, but via their passwords to add friends.

I came to realise how pathetic this password anti-pattern is. I am very aware that I don't like various social networking sites asking me for private information like my e-mail account credentials, but I had forgotten how accustomed I've become to this situation that's forced on us (ie, giving up our e-mail account password to get functionality).

Arguments that 'make it ok' claim these types of situations are low risk (ie, communication tools). I completely disagree, because reputational risk is not easily measured (unlike financial risk, which has money to quantify it). But that's not even the point: it contributes to a broader cultural acceptance that if we have some trust in a service, we will give it personal information (like passwords to other services) to get increased utility out of that service. That is just wrong, and whilst the data portability vision is about getting access to your data from other services, it needs to be done whilst respecting your own privacy and that of others.

Inspired by Chris Messina, I would like to see us all agree on making 2009 the year we kill the password anti-pattern. Because as we now sow the seeds for a new evolution of the web and Internet services, let's ensure we've got things like this right. In a data web where everything is interoperable, the password anti-pattern is not a culture that bodes well for us.

They say privacy is dead. Well, it is only dead if we let it die, and this is certainly one simple thing we can do to control how personal information about ourselves gets used by others. So here's to 2009: the year we seek the eradication of the password anti-pattern virus!

Thoughts on privacy – possibly just a txt file away

The other week, a good friend of mine from my school and university days dropped me a note. He asked that, now that he is transitioning from professional student to legal guru (he's the type I'd expect to become a judge of the courts), I pull down the website that hosts our experiment in digital media from university days. According to him, it's become "a bit of an issue because I have two journal articles out, and its been brought to my attention that a search brings up writing of a very mixed tone/quality!".

In what seems like a different lifetime for me, I ran a university Journalists' Society and we experimented with media as a concept. One of our successful experiments was a cheeky weekly digital newsletter that held the student politicians in our community accountable. Often our commentary was hard-hitting, and for $4 in web hosting bills a month and about 10 hours of work each, we became a new power on campus, influencing actions. It was fun, petty, and a big learning experience for everyone involved, including the poor bastards we massacred with accountability.

control panel

Privacy in the electronic age: is there an off button?

This touches all of us as we progress through life, however: what we thought was funny at a previous time may be awkward now that we are all grown up. In this digitally enabled world, privacy has come to the forefront as an issue, and we are now suddenly seeing the scary consequences of having all of our information available to anyone at any time.

I've read countless articles about this, as I'm sure you have. One story I remember is about a guy who contributed to a marijuana discussion board in 2000 and now struggles to get jobs, as that drug-taking past of his is the number one search engine result. The digital world can really suck sometimes.

Why do we care?

This is unique and awkward, because it's not someone defaming us. It's not someone taking our speech out of context and menacingly putting it in a way that distorts our words. This is 100% us, stating what we think, with full understanding of what the consequences of our actions were. We have no one but ourselves to blame.

nice arse

Time changes, even if the picture doesn’t: Partner seeing pictures of you – can be ok. Ex seeing pictures of you – likely not ok.

In the context of privacy, is it our right to determine who can see what about us, and when? Is privacy about putting certain information in the "no one else but me" box, or is it more dynamic than that, varying according to the person consuming the information?

When I was younger, I would meet attractive girls quite a bit older than me, and as soon as I told them my age, they suddenly felt embarrassed. They either left thinking how could they let themselves be attracted to a younger man, treating me like I was suddenly inferior, or they showed a very visible reaction of distress! Actually, quite memorably when I was 20 I told a girl that I was on a date with that I was 22 – and she responded "thank God, because there is nothing more unattractive I find, than a guy that is younger than me". It turned out, fortunately, she had just turned 22. My theory about age just got a massive dose of validation.

The point of sharing this story is that certain information about ourselves can have adverse effects on us (in this case, my sex life!). I normally could not care less about my age, but with girls I met when I went out, I did care, because it affected their perception of me. Despite nothing changing, that single bit of information about my age would totally change the interaction I had with a girl. Likewise, when we are interacting with people in our lives, the sudden knowledge of one bit of information can adversely affect their perception of us.

Bathroom close the hatch please

Some doors are best kept shut. Kinky for some; stinky for others

A friend of mine recently admitted to his girlfriend of six months that he's used drugs before, which had her break down crying. This bit of information doesn't change him in any way; but it shapes her perception of him, and the clash of that perception with the truth creates an emotional reaction. Contrast this with two party girls I met in Spain during my nine months away, who found out I had never tried drugs at the age of 21. I disappointed them, and in fact, one of them (initially) lost respect for me. These girls and my friend's girlfriend have two different value systems. And that piece of information generates a completely different perception: taking drugs can be seen as a "bad person" thing or an "open minded" thing, depending on who you talk to.

As humans, we care about what other people think. It influences our standing in society, our self-confidence, our ability to build rapport with other people. But the issue is: how can you control your image in an environment that is uncontrollable? What I tell one group of people for the sake of building rapport with them, I should also be able to keep from being repeated to others who may not appreciate the information. If I have a fetish for women in red heels which I share with my friends, I should be able to prevent that information from being shared with my boss, who loves wearing red heels and might feel a bit awkward the next time I look at her feet.

Any solutions?

Not really. We’re screwed.

Well, not quite. To bring it back to the e-mail exchange with my friend: I told him that the historian and technologist in me couldn't pull down a website for that reason. After all, there is nothing we should be ashamed of. But when he insisted, I made him a proposal: what if I could promise that no search engine would include those pages in its index, without having to pull the website down?

He responded with appreciation, as that was the real issue. It was not that he was ashamed of his prior writing, but that he didn't want academics of today, reading his leading-edge thinking about the law, to come across his inflammatory criticism of some petty student politicians. He wanted to control his professional image, not erase history. So my solution of adding a robots.txt file was enough to give him his desired sense of privacy, without fighting a battle with the uncontrollable.

Who knew that privacy could be achieved with a text file containing just two lines:

User-agent: *

Disallow: /

Those two lines are enough to turn the search engines from a beast that ruins our reputation into a mechanism for enforcing our right to privacy. Open standards across the Internet, enabling us to determine how information is used, are what DataPortability can help us achieve so we can control our world. The issue of privacy is not dead; we just need some creative applications, once we work out what exactly it is we think we are losing.
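If you want to convince yourself what those two lines do, Python's standard library ships a robots.txt parser. This little check (the URL is just a placeholder) shows that every well-behaved crawler is blocked from every path:

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt from the post, parsed directly as lines
# rather than fetched from a server.
rules = ["User-agent: *", "Disallow: /"]
rp = RobotFileParser()
rp.parse(rules)

# No user agent, no path: the wildcard rule blocks them all.
for bot in ("Googlebot", "Slurp", "AnyBotAtAll"):
    print(bot, rp.can_fetch(bot, "http://example.com/old-newsletter.html"))
    # each line prints False
```

Of course, robots.txt is a polite request, not an access control: compliant crawlers honour it, but nothing technically stops a rogue one. For my friend's purposes (keeping old writing out of the major search indexes), polite is enough.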