OF ANTS AND ROBOTS

By Geoff Olson

PART 1

Over the years, the paving stones on my home’s walkway have sunk into the ground at off-level angles. So the other day I picked up some bags of sand to even out the dirt below them.

I hefted up the first paving stone, which turned out to be the rooftop of an ant colony. “Damn,” I muttered, as I watched the six-legged commuters scuttle along their now-exposed highways.  These weren’t carpenter ants or pests of concern to me, just a bunch of little beings going about their business, incapable of comprehending the sudden light and the vast being disrupting their world.

I slowly lifted up the other paving stones. Some of the workers were transporting larvae along the winding road system. “Sorry guys,” I said, as I emptied the first bag of sand onto the ground.

Later that day I came across a tweet from Elon Musk, Chief Product Architect of Tesla Motors. “Hope we’re not just the biological boot loader for digital superintelligence. Unfortunately, that is increasingly probable,” he wrote.

Musk had been reading Superintelligence: Paths, Dangers, Strategies, by the respected Oxford philosopher Nick Bostrom. In the author’s view, within a few generations we humans may stand in the same relation to artificial intelligence that insects do to us. Once robots are capable of building even smarter robots, all bets are off – especially if machine consciousness results.

If our creations hatch obscure plans in which their creators do not fit, or decide we are a threat, the United Nations or the White House might have as much bargaining power as a kid’s ant farm on the way to the dump.

Unwarranted cyber-pessimism? Here are four items demonstrating how fast things are progressing on the AI front:

1. Automated software programs – robots, in a sense – now account for the majority of website traffic, according to the Internet firm Incapsula. Some of this traffic consists of bots programmed for malicious activity; some consists of hacking tools probing websites for vulnerabilities. Others are scrapers and chatbots masquerading as human beings. Only 49% of website traffic comes from actual people browsing the Internet.

2. Last year, the UN and Human Rights Watch advocated a treaty banning “autonomous killing machines” such as drones capable of  launching airstrikes against targets without human decision-making. Such technology is on the near-term horizon.

3. Google has been on a spending spree, buying robotics startups and hiring AI experts to construct what some are calling the “Manhattan Project of AI”: basically, an all-knowing electronic Golem that makes today’s search engines look like stone tools.

4. A neural network of 1,000 computers at Google’s X lab has “taught itself” to recognize humans and cats on the Internet, according to a report on Slate. The computers learned a “slew of concepts that have little meaning to humans.” For instance, “they became intrigued by tool-like objects oriented at 30 degrees, including spatulas and needle-nose pliers.”

Nick Bostrom is concerned that an intelligence greater than our own will start displaying behaviours that make sense only to it, not to us. If and when machine consciousness arises, we may not even recognize it as such, even after things go sideways (or 30 degrees) for its inventors.

To say such a development would leave humanity behind the eight ball is to underestimate how badly we’d be snookered, by several orders of magnitude. We’d be the playthings of Job’s mysterious, arbitrary God, but without the anthropomorphism.

In May 2014, Cambridge physicist Stephen Hawking co-wrote an article with three colleagues, noting that “it’s tempting to dismiss the notion of highly intelligent machines as mere science fiction. But this would be a mistake, and potentially our worst mistake in history.”

“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all,” wrote Hawking and his coauthors.

Perhaps these academic worries are exaggerated and misplaced. But ponder for a moment how we humans treat creatures less intelligent than ourselves. Take me: I bore the ants under my paving stones no ill will, but I buried their city-complex anyway.

————

PART 2

Last week I wrote about the futuristic fear that our inventions will overtake us one day. But forget sci-fi scenarios of Terminators tumbling out of time portals; 30 years after James Cameron’s fictional cyborg went staggering after Sarah Connor, the merger of people and machines is well underway – albeit in a coffee-shop hotspot kinda way.

Most of us spend a significant part of our waking lives in silent communion with smartphones, tablets, and laptops – myself included. This cross-generational transformation has occurred incredibly quickly, with both the young and not-so-young falling under the spell of what documentary filmmaker Adam Curtis calls “our machines of loving grace.”

And who’s going to deny there are great things about the immediate access to the global information network? For sheer entertainment value, the stranger next to you in Starbucks can hardly compete with Facebook, Twitter, and other clock-suckers.

Bearing this in mind, Apple’s iWatch might seem faintly ridiculous now, but that won’t stop wristband computing from catching on with the early adopters. Even if the iWatch and its competitors fizzle out after that, the association between microchips and flesh is bound to become more intimate and complex. (A cyborg is “a fictional or hypothetical person whose physical abilities are extended beyond normal human limitations by mechanical elements built into the body,” according to my – ahem – Apple dictionary.)

But who needs cyborgs when you can eliminate human workers entirely? The other day I stood at the entrance of a Vancouver megastore, surprised by the empty space. The store had replaced a line of cash registers with a couple of digital checkout stations.

At least half a dozen shifts were no more, with some unquantified spill-off effect on local retail activity. (And if you believe that lost service industry jobs will soon be offset by an equal number of better-paying information economy jobs, I have a starship warp core to sell you.)

Not that any of this is new. For decades, North American blue collar workers have been losing ground through automation and work outsourced to free trade zones in the Third World. Even back in the 19th century, the political economist Karl Marx could see how factory owners of the industrial revolution were using new technology to leverage capital against labour.

What’s new is that we’re starting to hear alarm bells from the managerial class now that digital technology is threatening their jobs, too. A whole range of services once thought impervious to outsourcing (including the “discovery” process in the legal field, and medical diagnosis) can now be performed by computers that scan information networks and pull out the relevant material in microseconds.

Last March, a Los Angeles Times story on a shallow earthquake in Westwood, California, appeared online only three minutes after the quake. LA Times staffer and computer programmer Ken Schwencke had written a script that combed data from US Geological Survey servers and automatically funnelled it through a program that churned out a grammatically correct, standard AP-style report.
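The mechanics behind such a bot are less exotic than they sound. Here is a minimal, hypothetical Python sketch of the same pattern (structured data in, templated sentences out), using the public USGS GeoJSON earthquake feed. The feed URL and field names below match the public USGS service, but the wording and magnitude threshold are my own illustrative assumptions, not a reproduction of Schwencke’s actual script:

```python
# Hypothetical sketch of an automated earthquake brief, in the spirit of the
# LA Times script described above. Not Schwencke's actual code: the feed URL
# and GeoJSON field names are real, the template and threshold are made up.
import json
import urllib.request
from datetime import datetime, timezone

# Public USGS summary feed: all earthquakes reported in the past hour.
FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

TEMPLATE = ("A magnitude {mag:.1f} earthquake struck {place} at {when}, "
            "at a depth of roughly {depth:.0f} kilometres, according to the "
            "U.S. Geological Survey. This post was generated automatically.")

def quake_briefs(min_magnitude=3.0):
    """Return a short templated news brief for each notable recent quake."""
    with urllib.request.urlopen(FEED) as resp:
        data = json.load(resp)
    briefs = []
    for feature in data["features"]:
        props = feature["properties"]
        lon, lat, depth = feature["geometry"]["coordinates"]
        mag = props.get("mag")
        if mag is None or mag < min_magnitude:
            continue  # ignore the constant background hum of tiny tremors
        when = datetime.fromtimestamp(props["time"] / 1000, tz=timezone.utc)
        briefs.append(TEMPLATE.format(
            mag=mag,
            place=props["place"],  # e.g. "9km NNW of Westwood, CA"
            when=when.strftime("%I:%M %p UTC on %B %d"),
            depth=depth,
        ))
    return briefs

if __name__ == "__main__":
    for brief in quake_briefs():
        print(brief)
```

Everything in that loop is mechanical template-filling on structured data, which is the point: no understanding is required to produce copy that reads as if a human reporter wrote it.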

One automated writing program, Quill, can hunt through immense volumes of data and correctly organize the relevant information in a readable form. Bear in mind this is what brain-dead algorithms can do right now, without a whiff of “machine consciousness.”

Computer scientist Kris Hammond, who runs a Chicago firm called Narrative Science, projects that by 2025, 90 per cent of the news consumed by readers will be generated by computers. The remaining 10 per cent will constitute boutique journalism, in the form of opinion pieces and long-form essays.

Imagine a future when natural language programs access marketing databases that contain more information about you than even you know: your every search term, text message, and quantifiable quirk – right down to your smart home-monitored eating and sleeping patterns.

“One day, there will only be a single reader for each article,” Hammond told Le Monde contributor Yves Eudes.  Micro-customized service excellence or networked narcissism? Either way, Hammond doesn’t believe robots will finish off flesh-based journalists, just that the volume of published material will “massively increase.”

“Sometimes paranoia’s just having all the facts,” observed William Burroughs, who died in 1997.  By the end of his life, the cranky Kansas author could see the twentieth century receding quickly in the rear-view mirror. For those of us on the other side of the millennium, it’s going to be a weird ride.

The Vancouver Courier, Oct. 3 and 10
