Friday 12 February 2016

Fixing The Algorithm

In the wake of the Tweetstorm caused by Twitter's quasi-introduction of its algorithmic timeline - with entirely foreseeable consequences - I took a quick spin around the various services at Goodwins Central which already enhance my life through automated decisions made on my data. There are lots of them: Facebook, eBay, Amazon, and more Google AIs than you can shake a broomstick at.

They are all terrible. Terrible in different ways, to be sure: the shopping stuff assumes that because I've searched for or bought something, I must want more like it; Google Now assumes I work a 9-5 and commute between home and office, and am fixated on stock prices and the news headlines. Facebook... ah, who knows what Facebook thinks. But I miss most of the things my pals say unless I tell it to show me most recent, a setting I can't make stick. I want to know the news that isn't making the news yet. And when I've searched for something, I either don't buy it or I do - either way, I've moved on with my life.

But the algorithms have nothing else to go on, so they go on going on, like toddlers who think because something was good to say or do just now, it'll be better and better the more they repeat it.

The law wisely if tiresomely forbids infanticide. Algorithms lack this protection and so, merely tiresomely, live in clouds beyond our reach. Because they know and I know and you know what we'd do to them if we could reach in there.


You know how it is. You have to use the same stuff I do. None of us likes it. These services have taken on the responsibility of running our lives, and they have become essential. They are proud to do this and be this; their marketing crows about it to excess.  They have become our representatives in cyberspace, providing - and crucially regulating - access and actions.

And they are all run on an authoritarian dictatorship model. You cannot tell them what to do; they might choose to do something you like or they may not, but you cannot do that which they deny, and you must do that which they compel.

It's all extremely corporate, and this is how companies work of course, internally and externally. You don't get a vote on corporate strategy, whether you work for a place or use its services. The boss knows best. If they don't, then there are various forms of codified coups available, but you'll never see a CEO leave because customers or employees have voted them out.

The only people who exert control over companies from outside are the shareholders, the VCs. And they want to see numbers, they want ROI, they want blood and growth, quarter by quarter. It doesn't matter how. Twitter is no good unless it is becoming Facebook - the idea that it could be perfectly fine just being Twitter, with numbers naturally smaller than Facebook's and a service set circumscribed to doing one thing well, is inadmissible. Facebook is huge. It has algorithms. Therefore...

It doesn't matter that Twitter and Facebook are actually doing very different jobs; in the eyes of Wall Street they are both 'social media' and only one can win at it. (The real media is just as bad: half the time I go on telly or the radio, it's to talk about Apple v Google in one form or another. But Google's an advertising company and Apple is a manufacturer, I say: nope, they're both 'tech'. The fact that in a sane world they'd be as symbiotic as Ford and Ogilvy is not on the radar.)

You see how it works. It is no bloody good for anyone, of course, but authoritarian dictatorships rarely survive because they're good for people; they survive through instinctive and relentless control.

And right now, that control is being expressed by brain-dead algorithms that get in the way of the very things the companies ostensibly want to provide for us, on the grounds that they'll drive 'engagement' and 'exposure' and advertising. And they're doing it in the most personal ways, tampering with how we communicate, how we interact, how we learn from and educate our friends. I don't really care that Amazon assumes that because I've bought some audio patch leads and read about Tarantino breaking a priceless antique guitar I must want to buy a new guitar myself; it's a waste of its time and my bandwidth, but it doesn't stop me searching for socks. But if Twitter doesn't show me the things I want to see, which are surprising things or the things my friends have been surprised by, then it does matter. Algorithms don't like surprises.

We have no say in this. You can't argue with an algorithm. You can, however, tell it what to do.

What we need is control, something that scales, something a billion of us can do, whether we're technical or not, whether we use phones or PCs or whatever. We need a good old-fashioned control panel where we can say - do not mix this data with that. Do not prioritise this sort of data event over that. Do let me know if something from here happens.

All this has to be is a file that sits on the Internet somewhere safe and which is accessible to anything, with permission. It has to be in a standard format, and the only thing in the world that can change it has to be you, and it has to tell you who looks at it and when, but anything that proposes to use or filter your personal data can only do so if it reads that file first and agrees to act on it. It's a bit like robots.txt, but for people - it's terms and conditions, but for them when they want to use us.
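
To make that a bit more concrete, here is a rough sketch of what such a file might look like. Every directive name here is invented purely for illustration - the point is the shape of the thing, not the actual vocabulary:

    # persona.txt - hypothetical robots.txt-for-people preferences
    # Only the owner can change this; every read gets logged back to them.
    Agent: *
    Deny: mix purchase-history with browsing-history
    Deny: reorder timeline
    Allow: notify when friends post
    Audit: report-reads yes

One canonical place where the deny rules live, readable by any service willing to play along.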

Sounds only fair to me.

This presupposes we have a language that can describe personal data, and what algorithms do with it, in ways that people can understand and use. That doesn't really exist yet, but by Toutatis it should. Deciding how we want to control our personal data and the algorithms that use it is a very necessary step to take, for without it we lack the basic tools to express our anger and frustration - and desires and joy - to the companies that think they know these things better than we do.
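
For the technically minded, here is a toy sketch in Python of what honouring such a file might involve on the service side. The format, the directive names and the 'deny wins' rule are all assumptions of mine, not any existing standard:

    # Toy sketch: a service parses a hypothetical persona.txt and checks an
    # action against it before running the relevant algorithm. Nothing here
    # is a real standard or API; the directives are invented for illustration.

    def parse_preferences(text):
        """Split Allow:/Deny: directives into two lists of action strings."""
        allow, deny = [], []
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition(":")
            if key.strip().lower() == "allow":
                allow.append(value.strip())
            elif key.strip().lower() == "deny":
                deny.append(value.strip())
        return allow, deny

    def permitted(action, allow, deny):
        """Deny wins outright; anything unmentioned defaults to allowed here.
        A real scheme would need to pick its default far more carefully."""
        return action not in deny

    persona = """
    # persona.txt - hypothetical personal data preferences
    Deny: reorder timeline
    Deny: mix purchase-history with browsing-history
    Allow: notify when friends post
    """

    allow, deny = parse_preferences(persona)
    print(permitted("reorder timeline", allow, deny))          # False
    print(permitted("notify when friends post", allow, deny))  # True

The parsing is trivial; the hard part, as ever, is getting the companies to read the file and mean it.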

It will also provide a vocabulary for regulators and lawmakers, should they ever decide to use it.

That done, the tools to effect our retaking of control will be trivially cheap and easy to implement, and their use by companies made effectively compulsory.

It'll mean introducing a small level of democracy into the dictatorships. They'll find it shocking, annoying and a terrible intrusion into their affairs, and I can live with that, but who knows - actually doing what customers want may yet turn out OK after all.

(It becomes even more fun if you apply such ideas to actual politics and real governments, turning the whole electoral concept on its head, but that's for another time...)