Monday, 27 April 2009

The Singularity is far from near

There are lots of grand theories in science fiction, and that’s what makes it fun to read and watch. Some of those ideas even make it from fiction to reality, such as Arthur C Clarke’s prescient vision of geosynchronous satellite orbits. But one concept which came from science fiction and has become much discussed in less entertainment-oriented circles is The Singularity.

The concept is explained very nicely here, but in a nutshell it’s the point where computers get smart enough to design themselves, so they can begin to evolve without our input. From that moment forward the human race will be extinct. Except that the whole theory is based on a massive philosophical preconception.

The problem with theories about artificial intelligence is that almost all of them treat human beings like static machines, which will eventually be understood in the same way we now understand problems in the physical sciences. But we don’t know that humans are purely machines, even if parts of us clearly are. We could be. However, theories like that of The Singularity assume we are.


It all comes down to that infamous ancient Greek concept, hubris. We are arrogant enough to believe that the world is the way we see it. And it never is. No scientific theory is ever finished – there is always a bit more detail to fill in when you look a little closer. Underneath atoms, we found subatomic particles, and then quarks, and so on. The voyage of discovery never stops.


So if and when The Singularity finally arrives, it will be based on an incomplete picture of what human intelligence is. That’s not to say that evolving machines won’t happen. But they won’t necessarily mean the whole human race is obsolete. Just as pocket calculators have meant we don’t need to know how to add up as well as we used to, future machines will increasingly enhance our faculties, without having to replace us completely.


In particular, the Turing test so often quoted as the litmus test of artificial intelligence is highly misleading. It argues that computers will have become humanly intelligent when they can fool humans into believing that they are. But that doesn’t necessarily mean computers will have become human, just that humans can be fooled.


To cut a long story short, talk of The Singularity sidelines whole areas of human activity which are still a very long way from being understood, like creativity. It’s something I feel so strongly about that I’ve written a whole book, Can Computers Create Art, arguing against such complete theories of human intelligence. Digital technology will continue to create some amazing things over the next few decades, many of which we simply cannot predict now. But human beings will still be playing a central role, albeit a dramatically enhanced one.

Sunday, 26 April 2009

The true genius of George Lucas

If you grew up in the 1970s or later, your life will have been affected by Star Wars in some way or other. The movie franchise is the most successful in US history, and third most successful worldwide. But that isn’t George Lucas’s most significant legacy.

In my last post, I argued that one of the side effects of piracy in online video distribution is that it will drive a much closer synergy between content and advertising. But George Lucas saw this coming in the 1970s, and his dominance of the movie industry is testament to this. When the original Star Wars was made, nobody was sure it would do as well as it did, especially not the people who stumped up the cash so it could be made. So they let George Lucas retain sole rights to any merchandise resulting from the movie.

As someone who lives in a house where it’s sometimes hard to move because of Star Wars Lego (it’s all owned by my two sons, honest), I can vouch for how smart George Lucas was to make that deal. The movies and cartoon spinoffs (ie, the content) clearly make a packet in their own right, but they also sell an absolute truckload of material goods. So they are very effective advertising too.

Content which has been created purely to sell merchandise is often shallow and lacking in real entertainment value. Many die-hard Star Wars fans reckon the current Clone Wars cartoons on Cartoon Network, and the movie which served as a pilot for them, go a bit too far in that direction. But this has always really been the subtext of Star Wars. Even if they do have their faults, the movies are far more than mere marketing tools – and yet they do that job extremely well too.

So as you ponder where media is headed in the free-for-all of online video distribution, spare a thought for George Lucas. He worked out how to build an audience and make a profit from them beyond the content itself 30 years ago. That’s why he looks so kindly on Star Wars fan movies, and even judges a category in The Official Star Wars Fan Film Awards (now known as the Star Wars Fan Movie Challenge). Because if you can make content that people not only want to watch, but become so obsessed with that they go and buy stuff, you’ve really cracked how to earn money in the brave new digital age.

Friday, 24 April 2009

Pirate generation

The conviction of the founders of The Pirate Bay last week has brought the debate about file sharing systems back into the headlines. With appeals now in progress and accusations that the judge had a conflict of interest, the story is going to run and run. As with Twitter, everyone has an opinion about it. But I can’t help feeling a lot of those opinions are still missing the fundamental implications of unrestricted file sharing. Piracy is not just changing the way we get our content. It’s going to alter the nature of content itself.


Many people on both sides of the argument now realise that piracy is a competing business model, not just a matter of freedom (if you’re a pirate) or a heinous crime (if you’re a content owner). For example, in his column on Gizmodo, Brian Lam explains how he uses legal services when they provide a better experience than BitTorrent. He mentions Hulu.com, but here in the UK we could easily substitute the increasingly excellent BBC iPlayer, although Hulu.com could be on its way over here as well.


However, the use of BitTorrent to download films and TV programmes has become so much a part of life that many on the pro-piracy side of the argument no longer give it a second thought. Just look at the backtracking update Lam puts at the end of his column, when he realises he has just openly admitted to persistent copyright infringement. It was so natural to him, he didn’t even notice at first.


The usual defence of piracy revolves around how the existing broadcasting and DVD distribution systems are too restrictive, and can’t deliver content when, where and how it is desired. Some on the pro-piracy side justify themselves a little further by explaining that they subscribe to premium channels which carry the content they have downloaded via BitTorrent, and that they will buy the TV show or film on DVD when it becomes available. So they are paying for the content eventually. It’s just that they want to watch the content now, and the current system won’t allow them to do that.


Now, I don’t mean to judge the people I’m referring to here. Their arguments have some validity, and I don’t think they are particularly criminal, nor indeed are many of the people who regularly use BitTorrent. It’s a classic case of the Tragedy of the Commons – the opportunity is there, and people take it. The technology has enabled modes of content consumption which legal distribution systems just haven’t caught up with yet. Just as the original Napster let the genie out of the bottle for music downloads, BitTorrent and streaming sites have enabled modes of video distribution which many people prefer to the more rigid, structured approach of scheduled broadcasting.


However, the very content which pirates are downloading is at risk, because piracy breaks the economic system which makes it possible for that content to exist in the first place, at least in its current form. We’re lucky in the UK to have a unique form of television economics in the shape of the BBC. We pay an annual licence fee for BBC content. In return, we receive content which doesn’t just satisfy the masses, but has a remit to serve minority interests as well. Sharing this content doesn’t necessarily even have a negative effect, because it was already paid for by our licence fees, although the BBC does make an increasingly important income from international syndication. The more people watch it, the better.


This is a very unusual system, though, and nowhere else in the world has anything like it, bar totalitarian regimes with national TV services which are essentially vehicles for government propaganda. Most of the time, in most parts of the globe, TV is made for commercial reasons, with advertising revenue in mind. Film, on the other hand, makes the majority of its income from theatrical release, although there is also significant money to be made from DVD sales.


When you BitTorrent either type of content, you deprive its makers of this return on their investment. At bottom, though, it’s not just the loss of income which is a problem. In an advertising model, the underlying threat is the inability to track and predict viewing figures. This means the content cannot be ‘monetized’, because the makers can’t demonstrate to their funding organisations what the audience for their content will be, nor can they control how advertising is sold alongside it.


This situation cannot be solved by trying to push the genie back in the bottle, as cases like the Swedish one against The Pirate Bay are attempting to do. And you can’t blame people for using BitTorrent sites when they’re easier to operate than iTunes, whilst the punishments for doing so are so completely disproportionate and indiscriminate that they are no deterrent at all. Legitimate services need to provide the same easy user experience as BitTorrent, or even better. Lots of people realise this now, including mainstream content owners. Hulu and BBC iPlayer are heading in that direction, and Spotify is shaping up nicely for music too.


But more radical changes are in the wings. It’s just too easy to avoid advertising nowadays, so advertising needs to adopt a different strategy in order to remain a relevant business model. It’s very significant that ITV is currently trying to obtain permission from Ofcom for product placements in its programmes. That sacred separation between advertising and content is what is really under threat. The two will be increasingly intertwined. TV will become more like MTV, which was the genius result of realising that music videos weren’t just adverts for songs, but could be content in their own right. The indirect result of piracy will be this – content which is advertising, so you can’t have one without the other. Then it won’t matter if you steal it or not.

Saturday, 18 April 2009

My two pence about Twitter (or Late developer, part 2)

Twitter has hit the mainstream news headlines in the UK over the last few months. Opinion columns discussing its merits or vacuousness have been flowing thick and fast. I even overheard some middle-aged ladies on the train last week asking each other if they knew what it was (they didn’t). As with other “social media” sites, opinions about Twitter are bitterly divided. A good friend today joked that it was a sackable offence for his employees to use Twitter or Facebook. Others reckon Twitter is the second digital coming, Jesus Christ reborn in bits and bytes. I’m sure there will be plenty in this camp attending the Media140 event on May 20th.


The problem with Twitter is that if you look at it from outside it really can seem like a complete waste of time – another pointless fad for geek fashion victims. This was definitely my opinion of it for some years. One or two friends had been urging me to join in since it launched. It was the persistence of one of these which eventually convinced me to give it a try. Before that, though, it was just wibble – like the worst aspects of blogging, only shortened and crystallised into gleaming gems of pointlessness. Who wants to know if you’ve just made yourself some tea, and you put too much sugar in it? Maybe a few close friends will be amused, but turning your daily life into a constant aphoristic stream of consciousness has little obvious value, unless you’re really famous and have a fascinating life. That certainly doesn’t apply to the majority of Twitterers.


The problem is that Twitter is harder to get your head round than Facebook. It’s a much more raw, single-purpose tool, and its description as “microblogging” is very misleading. Facebook is very obviously about friends and acquaintances. Once you’re linked (or relinked) on Facebook, the connection is permanent, at least for the life of the service. Your Facebook page is directly connected to your identity, and all the rest of the paraphernalia is built on that. It’s a tool for keeping in touch – as frequently or infrequently as you want. It’s a record of your life, or the bits you want to present publicly. If you post an album of pictures on Facebook it stays there, for friends to take a look at months later.


Twitter, on the other hand, is a constant barrage of updates and very little else. A Twitter posting can be lost in the stream within just a few hours. You could probably track it down if you focused on just the postings from a particular Twitter user. But that is not what Twitter is all about – and herein lies the reason why it’s hard to understand. It’s not about checking up on what your friends have been up to. It’s more like a mixture of RSS feed, online forum and instant messaging. Unlike blogging (and Facebook status updates), where comments are secondary, with Twitter they’re almost as important as original postings. The entire point of it is that insubstantial “buzz” of what people are talking about – be they close friends, people you hardly know but have a professional link to, or official news sources. Maybe even complete strangers, whom you feel like “Following” on a pure whim.


I’ve only been using Twitter a few months myself – like this blog, I’m not exactly leading the curve with this particular technological development. But as someone who spends more time working from home than in an office environment, I find Twitter provides something very special. It’s like the banter I was used to during the near-decade I spent working at Dennis Publishing. Not the same – no digital medium can yet come close to replacing real in-the-flesh experience. But as a means of keeping a finger on the pulse of the global conversation which builds our culture, it’s extraordinarily effective.


If you don’t get Twitter, it’s probably because you have enough banter in your social environs already, or just don’t want that much. Certainly, if you Twitter instead of talking to real people that’s a clear sign of digital addiction – just as bad as spending your time down the pub with your mates sending texts to other mates who aren’t there. But used as an extension of existing social life, rather than a replacement, Twitter is a truly revolutionary tool, up there with email, the World Wide Web, the printing press and breathing. Well, maybe not as important as the last of these, but close. Just make sure you avoid the Fail Whale.

Late developer

Starting a new blog in 2009 seems like a pretty futile thing to do. There are literally hundreds of thousands of dormant blogs out there – including a couple of my own. But I have to admit to not really “getting” blogs until quite recently. This will seem very weird if you know me. I was well ahead of the curve for online culture, taking a course online in 1991 (in philosophy at the New School for Social Research in New York City), reading Mondo 2000 and Wired US since issue one around the same time, and now in my fourteenth year as a technology journalist.

This is far from my first blog, too. I tried to create one about digital video, which died after about three posts and no longer exists. Another was entirely personal, but very self-promotional. That one hasn’t been much more productive, although I haven’t deleted it yet. My most successful blog is my collection of satirical fake news stories, Tehwinquirer.net, most of which came from my regular spot in Custom PC magazine. I ran out of steam on it because I ran out of steam with the magazine articles. I may go back to that one, although The Onion does it so much better and with a seriously huge budget.

So now we have Mediology, a rather pretentious and pseudo-scientific title. Maybe I should clear that one up straight away. Mediology is a particular strain of cultural theory which focuses on how ideas are transmitted across society and through time, and is usually attributed to a somewhat obscure French academic by the name of Régis Debray. I like the spirit of his endeavour, and I like the name given to his theory, because it sounds like a more serious area of research than “media studies”. But my focus, as it has been for the last couple of decades, will be on how digital technology translates itself into contemporary, hyper-mediated culture.

What I have realised about blogs, rather late (and thanks to teaching an online class on Geert Lovink's book Zero Comments at the European Graduate School), is that they are not the democratic form of publishing I had imagined – your own little magazine which might even make you a bit of money. The chances of making money are extremely slim. Only a handful of blogs are financially successful, and even fewer earn enough for their contributors to work on them full time. Look at what has happened to Shiny Media here in the UK recently. One of the best home-grown blogs we have, and still not profitable enough to be the publishing revolution it was originally intended to be.

Nevertheless, blogs can be a valid business. They can be all kinds of things. But mostly what they are is an outlet for thoughts, like a diary you deliberately leave lying around hoping someone will get nosy and read it. They are part of the burgeoning surveillance culture we are building for ourselves, alongside the self-imposed voyeurism of social networks like Facebook and the millions of cameras invading our cities for our “safety and wellbeing”. But there’s no guarantee anyone will bother to read them. In other words, they’re written primarily for ourselves, and if other people read them that’s a secondary bonus.

I’m writing this blog, therefore, to work through some of my own thoughts. The main focus will be ideas for a book I am currently labouring on, bringing together research over my whole career and in particular from the courses I teach at Ravensbourne College of Design and Communication and experiences producing videos for TrustedReviews. The book is intended to trace the rise of online video distribution, placing it in historical context with TV broadcasting, film, and Internet technology. Massive change is already underway, and we’re still only at the beginning.

Let’s see if I can keep this one going.