Monday, 27 April 2009
The Singularity is far from near
The concept is explained very nicely here, but in a nutshell it’s the point where computers get smart enough to design themselves, so they can begin to evolve without our input. From that moment forward, so the theory goes, the human race will be extinct. Except that the whole theory is based on a massive philosophical preconception.
The problem with theories about artificial intelligence is that almost all of them treat human beings like static machines, which will eventually be understood in the same way we now understand problems in the physical sciences. But we don’t know that humans are purely machines, even if parts of us clearly are. We could be. However, theories like that of The Singularity assume we are.
It all comes down to that infamous ancient Greek concept, hubris. We are arrogant enough to believe that the world is the way we see it. And it never is. No scientific theory is ever finished – there is always a bit more detail to fill in when you look a little closer. Underneath atoms, we found subatomic particles, and then quarks, and so on. The voyage of discovery never stops.
So whenever The Singularity finally takes off, it will be based on an incomplete picture of what human intelligence is. That’s not to say that evolving machines won’t happen. But they won’t necessarily mean the whole human race is obsolete. Just as pocket calculators have meant we don’t need to know how to add up as well as we used to, future machines will increasingly enhance our faculties, without having to replace us completely.
In particular, the Turing test so often quoted as the litmus test of artificial intelligence is highly misleading. Its premise is that computers will have become humanly intelligent when they can fool humans into believing they are human. But that doesn’t necessarily mean computers will have become human, just that humans can be fooled.
To cut a long story short, talk of The Singularity sidelines whole areas of human activity which are still a very long way from being understood, like creativity. It’s something I feel so strongly about that I’ve written a whole book, Can Computers Create Art?, arguing against such complete theories of human intelligence. Digital technology will continue to create some amazing things over the next few decades, many of which we simply cannot predict now. But human beings will still be playing a central role, albeit a dramatically enhanced one.
Sunday, 26 April 2009
The true genius of George Lucas
In my last post, I argued that one of the side effects of piracy in online video distribution is that it will drive a much closer synergy between content and advertising. But George Lucas saw this coming in the 1970s, and his dominance of the movie industry is testament to this. When the original Star Wars was made, nobody was sure it would do as well as it did, especially not the people who stumped up the cash so it could be made. So they let George Lucas retain sole rights to any merchandise resulting from the movie.
As someone who lives in a house where it’s sometimes hard to move because of Star Wars Lego (it’s all owned by my two sons, honest), I can vouch for how smart George Lucas was to make that deal. The movies and cartoon spinoffs (ie, the content) clearly make a packet in their own right, but they also sell an absolute truckload of material goods. So they are very effective advertising too.
Content which has purely been created to sell merchandise is often very shallow and lacking in real entertainment value. Many die-hard Star Wars fans reckon the current Clone Wars cartoons on Cartoon Network, and the movie which served as a pilot for them, go a bit too far in that direction. But this has really always been the subtext of Star Wars. Nevertheless, even if they do have their faults, the movies are far more than mere marketing tools – and yet they do that job extremely well too.
So as you ponder where media is headed in the free-for-all of online video distribution, spare a thought for George Lucas. He worked out how to build an audience and make a profit from them beyond the content itself 30 years ago. That’s why he looks so kindly on Star Wars fan movies, and even judges a category in The Official Star Wars Fan Film Awards (now known as the Star Wars Fan Movie Challenge). Because if you can make content that people not only want to watch, but become so obsessed with that they go and buy stuff, you’ve really cracked how to earn money in the brave new digital age.
Friday, 24 April 2009
Pirate generation
The conviction of the founders of The Pirate Bay last week has brought the debate about file sharing systems back into the headlines again. With appeals now in progress and accusations that the judge had a conflict of interest, the story is going to run and run. As with Twitter, everyone has an opinion about it. But I can’t help feeling a lot of those opinions are still missing the fundamental implications of unrestricted file sharing. Piracy is not just changing the way we get our content. It’s going to alter the nature of content itself.
Saturday, 18 April 2009
My two pence about Twitter (or Late developer, part 2)
Twitter has hit the mainstream news headlines in the last few months.
The problem with Twitter is that if you look at it from outside it really can seem like a complete waste of time – another pointless fad for geek fashion victims. This was definitely my opinion of it for some years. One or two friends had been urging me to join in since it launched. It was the persistence of one of these which eventually convinced me to give it a try. Before that, though, it was just wibble – like the worst aspects of blogging, only shortened and crystallised into gleaming gems of pointlessness. Who wants to know if you’ve just made yourself some tea, and you put too much sugar in it? Maybe a few close friends will be amused, but turning your daily life into a constant aphoristic stream of consciousness has little obvious value, unless you’re really famous and have a fascinating life. That certainly doesn’t apply to the majority of Twitterers.
The problem is that Twitter is harder to get your head round than Facebook. It’s a much more raw, single-purpose tool, and its description as “microblogging” is very misleading. Facebook is very obviously about friends and acquaintances. Once you’re linked (or relinked) on Facebook, the connection is permanent, at least for the life of the service. Your Facebook page is directly connected to your identity, and all the rest of the paraphernalia is built on that. It’s a tool for keeping in touch – as frequently or infrequently as you want. It’s a record of your life, or the bits you want to present publicly. If you post an album of pictures on Facebook it stays there, for friends to take a look at months later.
Twitter, on the other hand, is a constant barrage of updates and very little else. A Twitter posting can be lost in the stream within just a few hours. You could probably track it down if you focused on just the postings from a particular Twitter user. But that is not what Twitter is all about – and herein lies the reason why it’s hard to understand. It’s not about checking up on what your friends have been up to. It’s more like a mixture of RSS feed, online forum and instant messaging. Unlike blogging (and Facebook status updates), where comments are secondary, with Twitter they’re almost as important as original postings. The entire point of it is that insubstantial “buzz” of what people are talking about – be they close friends, people you hardly know but have a professional link to, or official news sources. Maybe even complete strangers, whom you feel like “Following” on a whim.
I’ve only been using Twitter a few months myself – like this blog, I’m not exactly leading the curve with this particular technological development. But as someone who spends more time working from home than in an office environment, Twitter provides something very special. It’s like the banter I was used to during the near-decade I spent working at Dennis Publishing. Not the same – no digital medium can yet come close to replacing real in-the-flesh experience. But as a means of keeping a finger on the pulse of the global conversation which builds our culture, it’s extraordinarily effective.
If you don’t get Twitter, it’s probably because you have enough banter in your social environs already, or just don’t want that much. Certainly, if you Twitter instead of talking to real people that’s a clear sign of digital addiction – just as bad as spending your time down the pub with your mates sending texts to other mates who aren’t there. But used as an extension of existing social life, rather than a replacement, Twitter is a truly revolutionary tool, up there with email, the World Wide Web, the printing press and breathing. Well, maybe not as important as the last of these, but close. Just make sure you avoid the Fail Whale.
Late developer
This is far from my first blog, too. I tried to create one about digital video, which died after about three posts and no longer exists. Another was entirely personal, but very self-promotional. That one hasn’t been much more productive, although I haven’t deleted it yet. My most successful blog is my collection of satirical fake news stories, Tehwinquirer.net, most of which came from my regular spot in Custom PC magazine. I ran out of steam on it because I ran out of steam with the magazine articles. I may go back to that one, although The Onion does it so much better and with a seriously huge budget.
So now we have Mediology, a rather pretentious and pseudo-scientific title. Maybe I should clear that one up straight away. Mediology is a particular strain of cultural theory which focuses on how ideas are transmitted across society and through time, and is usually attributed to a somewhat obscure French academic by the name of Regis Debray. I like the spirit of his endeavour, and I like the name given to his theory, because it sounds like a more serious area of research than “media studies”. But my focus, as it has been for the last couple of decades, will be on how digital technology translates itself into contemporary, hyper-mediated culture.
What I have realised about blogs, rather late (and thanks to teaching an online class on Geert Lovink's book Zero Comments at the European Graduate School), is that they are not the democratic form of publishing they are often made out to be – your own little magazine which might even make you a bit of money. The chances of making money are extremely slim. Only a handful of blogs are financially successful, and even fewer earn enough for their contributors to work on them full time. Look at what has happened to Shiny Media here in the UK recently. One of the best home-grown blogs we have, and still not profitable enough to be the publishing revolution it was originally intended to be.
Nevertheless, blogs can be a valid business. They can be all kinds of things. But mostly what they are is an outlet for thoughts, like a diary you deliberately leave lying around hoping someone will get nosy and read it. They are part of the burgeoning surveillance culture we are building for ourselves, alongside the self-imposed voyeurism of social networks like Facebook and the millions of cameras invading our cities for our “safety and wellbeing”. But there’s no guarantee anyone will bother to read them. In other words, they’re written primarily for ourselves, and if other people read them that’s a secondary bonus.
I’m writing this blog, therefore, to work through some of my own thoughts. The main focus will be ideas for a book I am currently labouring on, bringing together research from my whole career, and in particular from the courses I teach at Ravensbourne College of Design and Communication and my experience producing videos for TrustedReviews. The book is intended to trace the rise of online video distribution, placing it in historical context with TV broadcasting, film, and Internet technology. Massive change is already underway, and we’re still only at the beginning.
Let’s see if I can keep this one going.
