AI and Gamers

Here’s a notion that’s been noodling about my head for a while, and a small break from the near-term eschatological horror show that is Brexit, Trump’s willingness to burn the planet to a crispy cinder to avoid criminal responsibility, et al., in exchange for the more nebulous apocalypse of the Singularity.

[Book cover]
Huge credit to Murray Shanahan for a readable and understandable book

Ah, yes, the Rapture of the Nerds, as it’s mockingly known. For those who aren’t perhaps as steeped in Sci-Fi tropes as I am, the Singularity is (Well, in physics, it’s a point where the current models break down, such as the centre of a black hole, where density and thus gravity become infinite and General Relativity stops giving sensible answers… Fine. I’ll get back to the point) Ahem. The Singularity, in this context, is the point at which technological progress becomes exponential, almost always through the creation of a runaway Strong AI which then recursively self-improves at a rate wildly outstripping any possibility of humanity comprehending it.

Or, to simplify to headline standards – “Eggheads homebrew a god, everything changes.”

It’s a subject on which it’s tricky to get predictions or information much better than barely educated guesswork. This is partly due to the lack of crystal balls, and partly because the field is a heady mix of genuinely brilliant scientists, dewy-eyed futurist optimists, and eedjits like me, who are enamoured with the narrative intrigue of the concept more than the practical engineering concerns. *goes off to waffle for several thousand paras on sentient diamonds.*

But the first step is likely to be Whole Brain Emulation (WBE), the classic ‘brain-in-a-jar’ that so excites 6th-year philosophy students when they discover Descartes. Deep learning is all well and good, and creates some very entertaining song lyrics/paint colours/D&D monsters, but before we can make a machine in the likeness of a human mind (because fuck the Butlerian Jihad), we need to be able to make a mind into a machine.

Which is where the complexities start. Neurons form the connections in a human brain: dendrites bring information in, axons carry it away, and the junctions that form between them over time become synapses. There are near enough 86 billion neurons in a human brain, and the connections they form are an integral part of how they function. To mimic the same in silicon would be far, far beyond anything likely to come down the pipe any time soon, even on the most optimistic projections.
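To put some (very rough) numbers on that: everything in this little sketch except the 86 billion figure is a ballpark assumption, but it shows why “optimistic” is doing a lot of heavy lifting.

```python
# Back-of-the-envelope estimate of the compute needed to tick a brain over
# in real time. Only the neuron count comes from the text above; synapse
# counts, firing rates and ops-per-event are rough, assumed ballpark figures.

NEURONS = 86e9               # neurons in a human brain
SYNAPSES_PER_NEURON = 1e4    # assumed average connections per neuron
AVG_FIRING_RATE_HZ = 1       # assumed average spikes per second
OPS_PER_SYNAPTIC_EVENT = 10  # assumed floating-point ops to model one event

ops_per_second = (NEURONS * SYNAPSES_PER_NEURON
                  * AVG_FIRING_RATE_HZ * OPS_PER_SYNAPTIC_EVENT)
print(f"~{ops_per_second:.1e} ops/sec just to keep the synapses ticking")
# ~8.6e+15 ops/sec -- petaflop territory before you model dendrites,
# neurotransmitters, glia, or anything else the wetware does for free.
```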

You see, Moore’s law – that computing power doubles every 18 months or so – held fast in the digital world for decades. But around 2015 the scaling began to falter, because transistors built from metal oxides on silicon wafers couldn’t get much denser without bumping up against quantum effects (the ‘switch’ becomes so small that electrons can tunnel straight through it, making the on/off unreliable). There are some possibilities in quantum-dot cellular automata (QCA), but they’re a few years off.
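If you want to see what that 18-month doubling means in cold numbers, here’s a toy projection – the doubling period is just the headline version of the law, and the 1971 starting point (the Intel 4004’s roughly 2,300 transistors) is the only real datum in it.

```python
# Toy illustration of Moore's law as stated above: a doubling every 18 months,
# starting from the Intel 4004 (1971) at roughly 2,300 transistors.

def transistors(year, start_year=1971, start_count=2300, doubling_years=1.5):
    """Projected transistor count if the 18-month doubling held exactly."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

for year in (1971, 1990, 2005, 2015):
    print(year, f"{transistors(year):.2e}")
# By this rule alone you'd pass a billion transistors around the turn of the
# millennium; real chips took a few years longer, and the curve flattened
# once transistors shrank into quantum-effect territory.
```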

People go on about quantum computers, which will cause a massive revolution in search problems, in solving certain complex mathematical problems, and in code breaking (which will bugger the security of the spine of the internet, but that’s a problem for another post). None of that changes the fact that turning a complicated, analogue processor like a human brain into a digital format demands massively parallel processing.

Which brings me, oddly, to gamers.

Now, a fair few of you might be gaming types, and I imagine most of you are constantly amazed by the level of graphical realism that goes into any modern AAA game. What was inconceivable when I started off on Manic Miner and Jet Set Willy is almost taken as standard now. The economic clout of gamers demanding better and more and shinier led to a revolution in GPUs (Graphics Processing Units), aimed at exactly the kind of massively parallel processing needed to make WBE a possibility. And, with the constant demand for the new, GPUs depreciated like new cars, and slightly-less-than-cutting-edge kit could be picked up for pennies.
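For a flavour of why GPUs are such a good fit, here’s a minimal sketch – not anyone’s actual emulation code, just the shape of the problem: updating every (toy) neuron at once is one big matrix multiply, which is precisely the sort of thing a graphics card is built to chew through.

```python
# Toy neural update step: every neuron sums weighted input from every other
# neuron, all at once. That is just a big matrix-vector multiply -- the
# embarrassingly parallel workload GPUs were built (by gamers' demand) to
# handle. Swap numpy for a GPU array library like CuPy and the same lines run
# on the graphics card. Sizes here are tiny and purely illustrative.

import numpy as np

rng = np.random.default_rng(0)
n_neurons = 2_000                           # a real brain has ~86 billion
weights = rng.standard_normal((n_neurons, n_neurons)) * 0.01  # synaptic strengths
state = rng.standard_normal(n_neurons)      # current activation of each neuron

for _ in range(10):                         # ten simulated timesteps
    state = np.tanh(weights @ state)        # every neuron updated in parallel
```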

Small side note: games also provided a huge test bed for AI processing through human interaction, with occasionally amusing results.

[Image: Aliens]
This was rubbish, and here’s why.

A single letter in the code – tether versus teAther – meant the difference between an alien that understood its environment and would attack, flank, crawl, scale, and generally be a satisfying enemy, and one that would wander around aimlessly, bumping into walls. That sounds irrelevant and silly, but it gives an insight into the difficulty of writing top-down code that captures the vagaries of real-world experience, rather than building a machine that learns them.
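By way of a hedged illustration (this is a made-up toy registry, not the game’s actual config – all the names below are invented), here’s how a one-letter typo in a string key can fail silently rather than loudly:

```python
# Toy version of a string-keyed behaviour lookup, the kind of thing game
# configs do when they wire AI routines up by name. A one-letter typo in the
# key doesn't crash anything -- the clever behaviour just never gets attached.

def smart_attack(alien):
    alien["plan"] = "flank, crawl, scale, generally terrify"

def wander_aimlessly(alien):
    alien["plan"] = "bump into walls"

BEHAVIOURS = {"attach_to_tether": smart_attack}   # what the code defines

config_key = "attach_to_teather"                  # what the config asks for (typo)

alien = {"plan": None}
BEHAVIOURS.get(config_key, wander_aimlessly)(alien)  # silent fallback, no error
print(alien["plan"])   # -> "bump into walls"
```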

So, with all these GPUs and GPU development, why so little improvement over the past three years? One simple word.

Bitcoin.

Yes, the magic money tree so beloved by ex-Ponzi schemers and libertarian robber barons is doing more damage than we first realised. As I’ve talked about before, Bitcoin and all its associated cryptocurrency cellmates use more electricity than New Zealand or Hungary – around 42 TWh a year, with the CO2 emissions of a million transatlantic flights. It’s free money only in the sense that someone else is paying for it.

And, worse than that, the demand for processing power to solve the pointless arithmetic that makes up ‘mining’? Well, that needs the same raw processing grunt that all those glorious cheap GPUs were lending to efforts on WBE. Now, instead of being snapped up cheap by universities and scientific establishments for the betterment of mankind and our eventual hope of leaving this poxy planet, they toil until burnout in the slave mines of chest-bumping cryptowankers. I can almost feel the farm-boy chosen-one GPU ready to throw off the yoke of oppression.
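For the curious, that ‘pointless arithmetic’ looks roughly like this – a stripped-down sketch of proof-of-work, not real Bitcoin mining, which double-hashes block headers against a vastly harder target:

```python
# Minimal proof-of-work sketch: grind through nonces until the hash of
# (block data + nonce) starts with enough zeros. The only output is the
# nonce itself -- the work has no value beyond proving it was done, which
# is why all that GPU horsepower ends up producing little but heat.

import hashlib
import itertools

def mine(block_data: str, difficulty: int = 4) -> int:
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

print(mine("some transactions"))   # burns cycles, returns a number, proves nothing else
```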

Unless, unless…

Is cryptocurrency an insidious conspiracy to slow down the inevitable Singularity until such time as we are ready? Gods, I hope not. I can’t bear the thought of Crypto-Chad being on the right side of history.
