TAR-THING

The Vancouver Courier, July 25


IMMANUEL KANT WAS A REAL PISS-ANT WHO WAS VERY RARELY STABLE…

…Heidegger, Heidegger was a boozy beggar who could think you under the table…

Last March, Encyclopedia Britannica announced the final publication of its print edition. After 244 years, the 32-volume set will abandon its shelf-swallowing format for an exclusively online presence. Unlike Wikipedia’s crowdsourced commons, Britannica’s gated community in the cloud will offer fact-freaks access for $70 a year, although some content will be free for all users.

So it’s with perverse pleasure that I recently staggered away from a library book sale with a 1972 edition of The Encyclopedia of Philosophy, after talking a staffer down from 50 to 25 bucks.

“That’s the thing about philosophy,” she said as I heaved the heavy volumes onto the counter. “It never goes out of date.” Absolutely. For example, the ancient Greek philosopher ARISTOTLE (Volume 1, Abbagnano to Entropy) was flat wrong when he said the brain is a device for cooling blood. But it’s a cold, hard fact that he promoted this innovative notion in his essay “On Sleep and Sleeplessness.”

This magisterial set from Collier-Macmillan has been described online as “the highest achievement of 20th century philosophy,” and so far no one’s turned it into an app—4,300 pages, 1,500 contributors, one myopic reader.

So why would someone with over 2,000 books, dozens unread, want to weigh down his nonexistent shelf space with more obscure knowledge? Glad you asked. As a scribbler for the press, I may not find anything directly relevant to Rupert Murdoch’s misadventures in the entry on THE CORRESPONDENCE THEORY OF TRUTH, but the entries on DEMOCRACY and SOVEREIGNTY might prove useful for a future rant about Canada’s current prime monster.

There’s also such a thing as knowledge for knowledge’s sake, although that’s an increasingly quaint idea in the digital age, when people are using their iPhones and Android devices as back-up brains. (Who needs personal memory when you have Apple’s Siri to answer any question, from the status of the God particle to the location of your parcel?)

When it comes to philosophical digressions, whether it’s rambling in print or buttonholing dinner companions, I can always use some printed help. But don’t get me wrong. As a fully qualified nerd, I can recite “The Philosopher’s Drinking Song” by Monty Python from memory. It’s just that I’ve always considered my knowledge of western philosophy a bit spotty. I’m more likely to fill in the blanks with a tome and table lamp than I am with a tablet and power cord.

Plus, I can safely say the chances are precisely zero I will come across anything in these volumes about Simon Cowell or Snooki. You can’t get that kind of certainty from digital media.

The other night in bed I pored over a long entry on GNOSTICISM, a pre-Christian belief system. The article wasn’t written in a breezy, page-turning way, but I managed to learn some interesting things about the ancient world. This doesn’t mean I’ll be studying these spleen-squashing cinderblocks from end to end. That would be madness. This encyclopedia set is for quick dips, not marathon swims.

Alas, the irresistible force of my book collecting sometimes collides with the immovable object of my partner. She’s no slouch at higher reasoning herself. In so many words, she recently offered this gem of a syllogism:

1. All hoarders obsessively collect things, and some hoarders collect books.
2. Geoff obsessively collects books.
3. Therefore, Geoff is a hoarder.

I’m hoping The Encyclopedia of Philosophy will act as a microscope to magnify little holes in my partner’s logic into lunar-sized craters. Perhaps I’ll counter with the entry on EDDINGTON, ARTHUR STANLEY, a knighted British physicist who argued that the table he wrote upon was 99.99999 percent empty space, due to the vacuum-like structure of the atoms composing it. From that perspective, all material objects—books included—are will-o’-the-wisps, hardly there at all. (It’s a risky line of defence, so a word of caution to any guy with a cluttered man-cave and a neat-freak partner: your excuse mileage may vary.)

Or I may go for a more practical defence: I’ve managed to eke out a published column from a cheaply acquired, out-of-print encyclopedia set that’s unavailable online, with used editions going for upwards of $158 on Amazon.ca.

Fine print deadweight or steal of a deal? Really, it’s all about PERCEPTION (Volume 3: Logic to Psychologism).

The Vancouver Courier, July 18

SUPERHERO SUMMERS OWED TO JACK “KING” KIRBY

With Spider-Man’s retooled origin and the renewal of the Batman franchise, the summer of the big-screen superhero is upon us. Actually, most summers have come with computer-generated clobbering since 2003, when Ang Lee’s Hulk first stomped its way through the Cineplex.

You can put it all down to Marvel comic artist Jack “King” Kirby, whose four-decade output has become Hollywood’s version of blood diamonds. The majority of comic-to-film superheroes – Iron Man, Thor, Captain America, The Fantastic Four, The X-Men, The Hulk, and Spider-Man – originally sprang from the fevered imagination of a guy who never held the rights to his own work.

When I say fevered imagination, I mean it literally. Born into the violent precincts of New York’s Lower East Side in 1917, the young Jacob Kurtzberg literally had to fight to survive. Ray Wyman’s biography recounts how the boy once lay dying of pneumonia, a disease that often proved fatal among the poor immigrant families of the time. In a last-ditch effort to save his life, rabbis performed an exorcism, demanding the names of the demons possessing him.

I have a pet theory that Kurtzberg/Kirby’s youthful illness and exorcism was his shamanic initiation, which cracked open his creative unconscious like an egg. Years later, a bottomless pantheon of superheroes, supervillains, sidekicks, monsters and freaks flowed freely from his pencil.

My second theory is that twentieth century superhero mythology is as Jewish as lox and bagels. Superman was the Depression-era brainchild of Jerry Siegel and Joe Shuster at Detective Comics, while Kurtzberg/Kirby’s most inventive work came from his Marvel partnership with Stan Lee (born Stanley Lieber) in the sixties. One of the prime motifs of the Marvel/DC universe is the ‘secret identity’, a fantasy of the assimilated Jew moving undetected through a society of goyim.

An American friend recently told me how Kirby sometimes showed up at his synagogue in Southern California in the seventies, to give talks to kids. “What did he talk about?” I asked, fascinated. “Morality,” he replied.

Kirby wasn’t just mining Yiddish fables for his two-fisted morality tales, however. He reinvented Norse mythology with The Mighty Thor, and whipped up space operas in which the Fantastic Four confronted ambiguous, God-like entities and interdimensional terrors. No one in comics could match Kirby for composition and dynamism. Yet to read of his dealings with Marvel is to weep. To support his family, for years the comic artist churned out four or more complete comic books a month – 80 pages of material – and as his output increased, his drawing style became more rushed and eccentric-looking.

Tired of his restrictive treatment at Marvel, the comic artist relocated to DC in the early seventies. There he came up with a whole new collection of characters called The New Gods. The presiding supervillain was a helmet-wearing, caped figure called “Darkseid” (pronounced Dark Side). Six years later, a young filmmaker called George Lucas revealed a similar-looking villain called Darth Vader, who siphoned his power from the “Dark Side.”

The creative bulk of the Kirby/Lee partnership lay mostly on the artist’s side, but it was the editor who became wealthy through the marketing of their ideas. Kirby died in 1994, without royalties or rights to his own work. In August 2011, Kirby’s heirs lost their court battle to reclaim the copyright on his creations. (In the first two weeks of its release, The Avengers made a billion dollars worldwide.)

The comic artist is still getting the shaft in another way. Before the rise of computer generated imagery, it was impossible to translate his epic visions into film. Unfortunately, with increasing processing power, the studios’ storyboard artists and software programmers have gone crazier than Dr. Doom on bath salts. With the exception of the first Iron Man film and perhaps an X-Men or two, the raid on Kirby’s catalogue has mostly resulted in big-screen video games, in which greenscreened, A-list actors fail to emote appropriately to the chaos happening around them.

I know we’re talking about fictitious characters in unitards, but Hollywood is playing fast and loose with the childhood fantasy worlds of boomers and Gen-Xers. As a guy who still owns several boxes’ worth of Kirby comics from his childhood, I say the King deserves better than ham-fisted, paint-by-digital-numbers productions based on his work. At the very least, his estate is owed a piece of the action.

The Vancouver Courier, July 13

MANIACAL FINANCE: Algorithms from Wartime to Wall Street

IS IT TRADING, OR TRON?

The Hungarian-born émigré and mathematician John von Neumann is remembered as an urbane and witty man. His colleagues admired his finely tailored clothes and superhuman capacity to handle liquor, to say nothing of his important contributions to a wide range of fields, from game theory to quantum physics. The man’s bald dome housed one of the most powerful brains of the twentieth century.

Von Neumann’s career arc took him from the top secret Manhattan Project to the Princeton Institute for Advanced Study, where Albert Einstein also worked. The hawkish Hungarian did not share the frizzy-haired German’s fears about nuclear weapons. After the horrors of Hiroshima and Nagasaki, von Neumann approached the US government with a proposal for a new computer that would overtake its hand-wired predecessor ENIAC (Electronic Numerical Integrator and Computer) in running the mathematical simulations essential to atomic testing.

Einstein petitioned against building the machine at Princeton, but the US government approved the proposal. In 1951, von Neumann’s team unveiled the Mathematical and Numerical Integrator and Computer, known by the acronym MANIAC – meaning something crazy and uncontrollable.

“Because city-destroying bombs couldn’t be built by trial and error, computers were required to simulate the physics of detonation and blast waves. A computer helped build the bomb and the bomb necessitated ever more advanced computers,” observes author William Poundstone in the New York Times. MANIAC was the bulky, slow-moving granddaddy of today’s supercomputers, desktop systems, notebooks and smart phones. Most of our computing gadgets owe their attention-fracturing existence to John von Neumann and the Cold War’s nuclear stalemate.

A Punch cartoon from 1959 portrays two scientists in lab coats standing next to a huge mainframe computer that has been programmed to answer the question, “Is there a God?” They are holding a printout of the computer’s response: “There is now.” A mere half-century after that unnerving punch line, computers have come to dominate our lives in uncanny ways. Who would have thought that the humdrum telephone – a Canadian inventor’s century-old brainchild – would break free from the home to become a beeping, buzzing vector for social connection/disconnection? Or that the architecture of the processors in smart phones is traceable back to von Neumann’s MANIAC, a device designed to crank out estimates for nuclear blast range and fallout?

Yet even while our phones and computers went mobile, other changes were happening on the digital front, far below the public radar. These changes make for a story with mythic dimensions, which we are only partway through. Just as the Biblical creation myth involved a Tree of Knowledge, there’s also a tree of knowledge in this unfinished tale and, as an added bonus, it has a colourful selection of serpents.

Over 200 years ago, businessmen in breeches and buckled shoes regularly gathered at a buttonwood tree at the foot of Wall Street to wheel and deal. On May 17, 1792, under the tree’s dappled shade, 24 stockbrokers signed “The Buttonwood Agreement,” initiating the New York Stock & Exchange Board at 68 Wall Street.

Proximity to the area was key. Firms set up broker offices near the exchange so their employees could run over as fast as possible to buy and sell. Today, subatomic particles perform the legwork on Wall Street. Consider this: it takes you approximately 350,000 microseconds to blink and 500,000 microseconds to click a mouse. Computers can now perform trades in the span of just a few microseconds. Needless to say, even a bipolar, coked-up day trader can’t touch the cheapest Chinese-made netbook in response time. So what has this meant for Wall Street, where timing is critical in the buy-and-sell process?

It’s meant that Wall Street has given decision-making over to the machines. Every day, electrical signals race across electronic networks at near the speed of light to perform financial transactions. It’s been called “black box trading” or “high-frequency trading” (HFT). The decisions to buy and sell are made by computer programs. These automatic computing procedures are called algorithms: coded sets of rules that define a precise sequence of operations.

These procedures are often massively repetitive. In consumer-grade software, algorithms repeat their routines over and over, until, for example, they have compressed a high-res photographic image into a web-friendly jpeg, or shoehorned a CD track into a tinny-sounding mp3. In high-frequency trading, algorithms continually sniff out stocks according to pre-programmed criteria. Amazingly, 70 percent of US equity trades in 2010 were executed by HFT.

That’s right. Most of the market activity on Wall Street is performed by machines, not human beings.

Here’s how it works: in the morning, a firm decides on a trading strategy for the day. The firm then releases its computational hounds into the secure, electronic backbone of the exchange. Millions of shares may be bought and sold throughout the day, but a young manager manning one of the HFT ‘special desks’ may know little to nothing about the value of the companies involved. He or she is only there to watch the screens and pull the plug if market activity gets out of hand. At the ring of the closing bell, the firms close out their positions and rake in the bucks.
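For readers who want to see the moving parts, here is a minimal sketch of that trading day in code. Everything in it is a hypothetical stand-in: the random price feed, the one-cent 'criterion', the five percent kill switch. No real desk runs anything this simple, but the shape of the loop is the one described above: scan by a pre-programmed rule, trade, watch for trouble, go flat at the bell.

import random

# A toy sketch of the trading day described above. Every name, number
# and the random price feed is hypothetical, for illustration only.
SPREAD_CRITERION = 0.01   # pre-programmed rule: act on a one-cent-wide spread
KILL_SWITCH_MOVE = 0.05   # the human overseer halts on any 5 percent price jump

def price_feed(ticks=1_000_000):
    """Fake market data: (symbol, bid, ask) tuples, arriving microseconds apart."""
    mids = {"AAA": 50.0, "BBB": 20.0, "CCC": 80.0}
    for _ in range(ticks):
        symbol = random.choice(list(mids))
        mids[symbol] *= 1 + random.gauss(0, 0.0005)       # gentle random walk
        half_spread = random.uniform(0.001, 0.02) / 2
        yield symbol, mids[symbol] - half_spread, mids[symbol] + half_spread

def run_trading_day():
    position = {}      # shares held per symbol
    last_ask = {}
    for symbol, bid, ask in price_feed():
        # The human at the 'special desk' watches only for runaway prices.
        prev = last_ask.get(symbol)
        if prev and abs(ask - prev) / prev > KILL_SWITCH_MOVE:
            print(f"Kill switch: {symbol} is moving too fast, halting.")
            break
        last_ask[symbol] = ask
        # The algorithm's only notion of 'value' is its pre-programmed criterion.
        if ask - bid > SPREAD_CRITERION:
            position[symbol] = position.get(symbol, 0) + 100   # buy a lot
        elif position.get(symbol, 0) > 0:
            position[symbol] -= 100                            # flatten a lot
    position.clear()   # at the closing bell, liquidate everything
    print("Closing bell: portfolio flat, takings booked.")

run_trading_day()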

The HFT traders aren’t after a few big fish; they’re targeting an ocean’s worth of minnows. The gains and losses are tiny per trade, just a fraction of a penny per share or currency unit. Yet with algorithms darting in and out of short-term positions millions of times a day, and the firms liquidating their entire portfolios daily, the takings add up. Goldman Sachs netted $300 million in 2009 and Citadel hedge fund made $1 billion in 2008 from high-speed strategies, notes Sarah Anderson of the Institute for Policy Studies.
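A back-of-the-envelope calculation shows how quickly minnows add up to a whale. All of the figures below are invented, chosen only to suggest the order of magnitude:

# Hypothetical figures, for a sense of scale only.
profit_per_share = 0.001      # a tenth of a cent skimmed per share
shares_per_trade = 200        # a modest lot size
trades_per_day = 2_000_000    # an algorithm darting in and out all day

daily_take = profit_per_share * shares_per_trade * trades_per_day
print(f"${daily_take:,.0f} per trading day")                       # $400,000
print(f"${daily_take * 250:,.0f} over roughly 250 trading days")   # $100,000,000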

According to a study by TABB Group, 48 percent of HFT has been traced to a few hundred proprietary trading houses, 46 percent to investment banks and six percent to a dozen or more hedge funds.

Paper currency originated as a virtual representation of precious metals and electronic currency acts as a virtual representation of paper currency. HFT, which gambles on electronic currency free of human oversight, is virtualization run amok. It’s not so much trading as Tron. Critics say it has created “dark pools” and a kind of “Shadow Wall Street” where transparency and fairness are trumped by cabbalistic code.

HFT first hit the news after May 6, 2010, when nine percent of the Dow Jones Industrial Average momentarily vanished, almost plunging the world into chaos. The 1,000-point drop was later traced to a mutual fund that dumped $4.1 billion of securities in a 20-minute period, which were gobbled up and sold within microseconds by algorithms. The incident became known as the “Flash Crash,” after the US Securities and Exchange Commission suggested some blame lay with a variant of high-speed trading called flash trading. (This is when selected players are allowed to see incoming orders to buy or sell securities a few microseconds earlier than the general market participants, in exchange for a fee.)

Could such untracked activity leverage a market mood swing into another financial meltdown? A 2011 survey of global financial firms found that 67 percent of executives believe that “rogue algorithms” are inescapable; among the US financial executives surveyed, the figure was 78 percent.

It’s not like Wall Street code hasn’t got us into trouble before. The granddaddy of financial algorithms, the so-called Black-Scholes equation, won its creators the 1997 Nobel Prize in Economics. Unfortunately, the equation makes no allowance for “Black Swan” events like market crashes. Ian Stewart, a respected science writer and professor of mathematics at the University of Warwick, insists the Black-Scholes equation is dumb as dirt in dealing with real world market gyrations, and may even amplify them.

“Any mathematical model of reality relies on simplifications and assumptions. The Black-Scholes equation was based on arbitrage pricing theory, in which both drift and volatility are constant. This assumption is common in financial theory, but it is often false for real markets,” Stewart observed in the Guardian. Financial managers who used ever more complicated derivatives – bets on bets on bets – eventually became prisoners of their instruments, like Mickey Mouse and his out-of-control brooms in the Disney film Fantasia.
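For the curious, the equation at the centre of the controversy is usually written as a partial differential equation for the price V of an option on an underlying asset S at time t, where r is the risk-free interest rate and σ is the volatility, the quantity the model treats as constant:

$$\frac{\partial V}{\partial t} + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + rS\,\frac{\partial V}{\partial S} - rV = 0$$

Feed it a constant σ and it returns a tidy option price; feed it a real market, whose volatility lurches around, and the tidiness evaporates.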

Herd-driven market crashes are “virtually impossible under the model’s assumptions,” notes Stewart. The professor doesn’t blame the great subprime scam and credit crisis of ‘08 on bad math alone. “Black-Scholes may have contributed to the crash, but only because it was abused. In any case, the equation was just one ingredient in a rich stew of financial irresponsibility, political ineptitude, perverse incentives and lax regulation.”

The mathematician sees only more problems in hyper-speed finance. “The facility to transfer billions at the click of a mouse may allow ever-quicker profits, but it also makes shocks propagate faster,” he observes.

Critics accuse some HFT traders of ‘front running’ and other illegal activities. Andy Brooks, head of US stock trading at the mutual fund manager T. Rowe Price, hinted at the scale of the problem. “We know that some high-frequency trading strategies have cancellation rates in the 95 percent range,” he told the Baltimore Sun this year. “So that means that 95 percent of the time that you say you want to buy 100 shares of IBM, you don’t really buy it. And that begs the question: Why have you said you want to buy? Are you trying to influence someone to do something else? And is that manipulative?”

In 2010, the Financial Services Authority in Britain fined one firm and froze the assets of another for HFT abuses on the London Stock Exchange. The US Securities and Exchange Commission has proposed monitoring HFT in real time with a consolidated audit trail, but for its part, the Obama Administration has not signalled much interest in doing anything to alienate the incumbent’s few remaining campaign benefactors on Wall Street. The European Commission’s proposal for a financial transaction tax on speculators, which might reduce HFT volume on world exchanges, has gone nowhere.

A 2010 study of the NYSE flash crash commissioned by Britain’s Government Office for Science contends the world narrowly avoided a “true nightmare scenario,” in which the market contagion would have spread to exchanges around the globe. The report concludes, “On the afternoon of May 6, 2010, the world’s financial system dodged a bullet.” (Concerned that HFT is playing chicken with a “Black Swan” event, exchanges have reportedly installed post-flash-crash “circuit breakers” to halt trading in the event of extreme volatility.)
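The logic of a circuit breaker is simple enough to sketch in a few lines of code. The 10 percent threshold and five-minute window below are hypothetical stand-ins rather than any exchange's actual rule; the point is only that a dumb, fast tripwire now sits between the algorithms and the abyss.

from collections import deque

# A hypothetical circuit breaker: halt trading if the price falls more
# than 10 percent from its high within a rolling five-minute window.
# Real exchange rules differ; these numbers are for illustration only.
WINDOW_SECONDS = 5 * 60
MAX_DROP = 0.10

recent = deque()   # (timestamp, price) pairs inside the window

def should_halt(timestamp, price):
    """Return True if this tick should trip the circuit breaker."""
    recent.append((timestamp, price))
    # Discard ticks that have aged out of the window.
    while timestamp - recent[0][0] > WINDOW_SECONDS:
        recent.popleft()
    window_high = max(p for _, p in recent)
    return price < window_high * (1 - MAX_DROP)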

In a must-see 2011 talk on algorithms delivered at the TED (Technology, Entertainment, Design) conference, entrepreneur and new media maven Kevin Slavin insists the flash crash of 2010 indicates we have written code “we can no longer read. And we’ve rendered something illegible. And we’ve lost the sense of what’s actually happening in this world that we’ve made.”

This goes far beyond Wall Street. As an example, Slavin cites an anecdote from e-commerce, when a book for sale at Amazon.com, The Making of a Fly: The Genetics of Animal Design, rose from $1.7 million to $23.6 million in the space of a few hours. “When you see this kind of behaviour, what you see is the evidence of algorithms in conflict, algorithms locked in loops with each other, without any human oversight, without any adult supervision to say, ‘actually, $1.7 million is plenty,’” Slavin dryly observes.
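It takes only a few lines to reproduce that feedback loop in miniature. The two markup factors below are invented for illustration; the real bots in the Amazon incident followed their own rules, but the runaway behaviour is the same, each program's 'reasonable' rule compounding on the other's output without limit.

# Two hypothetical repricing bots locked in a loop. Seller A always
# undercuts seller B by a hair; seller B always marks up well above A.
# Both multipliers are invented for illustration only.
price_a, price_b = 35.00, 40.00
rounds = 0

while price_b < 1_000_000:
    price_a = 0.998 * price_b   # A: price just below the rival
    price_b = 1.27 * price_a    # B: price 27 percent above the rival
    rounds += 1

print(f"After {rounds} repricing rounds, the book lists at ${price_b:,.2f}")
# Each round compounds the pair by roughly 27 percent, so the listed
# price grows exponentially. No human ever decides that it should.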

UK software engineers now offer story algorithms to Hollywood. A company called Epagogix can run a script through its proprietary code to quantify whether it portends a $30 million movie or a $200 million movie. This is no longer about finance; it’s about “the physics of culture,” notes Slavin. “And if these algorithms, like the algorithms on Wall Street, just crashed one day and went awry, how would we know, what would it look like?”

We already know that high-speed trading is affecting the hard-edged world of people and property. In a weird replay of the buttonwood tree era, trading distance is more crucial than ever. A distance of 20 miles from a high-speed trader to the exchange means a couple of hundred extra microseconds, a critical amount of travel time for algorithms. Financial firms are locating their offices as close as possible to the NYSE hub to shave off these microseconds, and the NYSE itself has installed supercomputers in its basement for high-paying clients.

In a world of whizzing financial code, what is the “market value” of slow, squishy human beings, with their circadian rhythms that evolved in tandem with a slowly turning planet? Not much. In his TED talk, Slavin recalls meeting “an architect in Frankfurt who was hollowing out a skyscraper – throwing out all the furniture, all the infrastructure for human use, and just running steel on the floors to get ready for the stacks of servers to go in” – just so algorithms could be closer to the financial electronic networks. For the same area of leased space, a human being squeezes out far less profit than a supercomputer firing off digits at the stock market.

Over the past few years, a company called Spread Networks has built an 825-mile trench between New York City and Chicago. This massive project is for a cable to transport algorithms 37 times faster than you can click a mouse. One of the newer projects in this field is a $300 million fibre-optic line beneath the Atlantic Ocean, intended to shave a few milliseconds off the data transmission time between London and New York markets. Algorithms have “a kind of manifest destiny” that will always seek out a new frontier, Slavin observes. Along with nature and man, there is now a “third co-evolutionary force,” he says: the algorithm.
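The arithmetic behind those cables is easy to check. Light in optical fibre travels at roughly two-thirds of its vacuum speed, about 200,000 kilometres per second, so the round trip over an 825-mile route works out to about 13 milliseconds, which squares with the claim of being 37 times faster than a mouse click. A quick sanity check, with the route length and fibre speed as rough approximations:

# Rough sanity check on the "37 times faster than a mouse click" figure.
# Route length and fibre speed are approximations.
ROUTE_MILES = 825
KM_PER_MILE = 1.609
FIBRE_SPEED_KM_PER_S = 200_000     # light in glass fibre: about 2/3 of c
MOUSE_CLICK_SECONDS = 0.5          # the 500,000 microseconds cited earlier

one_way_s = ROUTE_MILES * KM_PER_MILE / FIBRE_SPEED_KM_PER_S
round_trip_s = 2 * one_way_s
print(f"New York to Chicago and back: {round_trip_s * 1000:.1f} ms")              # ~13.3 ms
print(f"Mouse clicks per round trip: {MOUSE_CLICK_SECONDS / round_trip_s:.0f}")   # ~38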

Many of us feel life is going faster and faster these days. The intuition isn’t without a real-world basis. Financial market time operating in microsecond cycles is driving our stock-obsessed 24-hour news channels and Internet blogs, which in turn demand a manic, megahertz pace from political campaigns and pop culture. And God help the hapless consumer if he or she is not fully wired and connected at all times to “the Cloud.”

The coming out party for John von Neumann’s MANIAC arrived in the summer of 1951, “with a thermonuclear calculation that ran for 60 days nonstop,” writes George Dyson in the 2012 book Turing’s Cathedral. The author quotes von Neumann’s second wife, Klari, who recalls his anxiety over what his invention might mean for the world. One evening in 1945, the mathematician proclaimed, “What we are creating now is a monster whose influence is going to change history, provided there is any history left.” Von Neumann wasn’t so much concerned about the atomic bomb per se as “the growing powers of machines,” Dyson observes.

“Is there a God?” scientists asked a huge mainframe computer in an ancient Punch cartoon. No real-world device could answer such a question in the era of von Neumann, who passed away in 1957. But perhaps the algorithms racing through the world’s electronic networks will one day give our machines a voice. Actually, in a sense, they already have, through Apple’s smart phone app, Siri. Not surprisingly, the interactive voice-recognition software originated as a project funded by DARPA (the Defense Advanced Research Projects Agency), a Pentagon nursery for classified high technology.

I’m afraid to ask Siri if there’s a God.

Common Ground, July