Future flash crashes
Remember a few years back when a bogus AP tweet instantly wiped $100bn off the US markets? In April 2013 the Associated Press’ Twitter account was compromised by hackers who tweeted “Breaking: Two Explosions in the White House and Barack Obama is injured.”
[Chart omitted. Source: The Washington Post, 04/23/13; Bloomberg L.P., 04/23/13.]
The tweet was quickly confirmed to be an alternative fact (as we say in 2017), but not before the Dow dropped 145 points (1%) in two minutes.
Well, my view is that we are heading into a far more ‘interesting’ era of flash crashes caused by confused, or deliberately misled, algorithms. In a concise paper titled “Deceiving Google’s Cloud Video Intelligence API Built for Summarizing Videos,” researchers from the University of Washington demonstrate that by periodically inserting still images of a plate of noodles (among other things) into an unrelated video, they could trick Google’s video-annotation algorithm into concluding that the video was about a completely different topic.
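To make the mechanics concrete, here is a minimal sketch of that style of insertion attack using OpenCV. The file names and the one-frame-per-second insertion rate are my own illustrative assumptions, not the paper’s exact setup:

```python
import cv2

# A sketch of the insertion attack: splice a still "decoy" image into an
# unrelated video at regular intervals. File names and the insertion rate
# below are illustrative assumptions, not the paper's actual parameters.
VIDEO_IN = "original.mp4"    # hypothetical input video
IMAGE_IN = "noodles.jpg"     # hypothetical decoy image
VIDEO_OUT = "deceptive.mp4"

cap = cv2.VideoCapture(VIDEO_IN)
fps = cap.get(cv2.CAP_PROP_FPS)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
writer = cv2.VideoWriter(VIDEO_OUT, cv2.VideoWriter_fourcc(*"mp4v"),
                         fps, (width, height))

# Resize the decoy once so it matches the video's frame dimensions.
decoy = cv2.resize(cv2.imread(IMAGE_IN), (width, height))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Replace roughly one frame per second with the decoy. An annotation
    # model that samples frames sparsely may then label the entire video
    # by the decoy's content rather than the real footage.
    if frame_idx % int(fps) == 0:
        writer.write(decoy)
    else:
        writer.write(frame)
    frame_idx += 1

cap.release()
writer.release()
```

The unnerving part is how little it takes: the video still looks completely normal to a human viewer, because a single swapped frame per second is nearly imperceptible.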
Digital Darwinism
I’m not sure I totally buy the asserted causality on this one, but the headline story is just irresistible: “Music Streaming Is Making Songs Faster as Artists Compete for Attention.” Paper abstract:
Technological changes in the last 30 years have influenced the way we consume music, not only granting immediate access to a much larger collection of songs than ever before, but also allowing us to instantly skip songs. This new reality can be explained in terms of attention economy, which posits that attention is the currency of the information age, since it is both scarce and valuable. The purpose of these two studies is to examine whether popular music compositional practices have changed in the last 30 years in a way that is consistent with attention economy principles. In the first study, 303 U.S. top-10 singles from 1986 to 2015 were analyzed according to five parameters: number of words in title, main tempo, time before the voice enters, time before the title is mentioned, and self-focus in lyrical content. The results revealed that popular music has been changing in a way that favors attention grabbing, consistent with attention economy principles. In the second study, 60 popular songs from 2015 were paired with 60 less popular songs from the same artists. The same parameters were evaluated. The data were not consistent with any of the hypotheses regarding the relationship between attention economy principles within a comparison of popular and less popular music.
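For a sense of how simple some of those five parameters are to operationalize, here is a toy sketch of two of them in Python. The first-person pronoun list is my own illustrative proxy for “self-focus in lyrical content,” not the paper’s actual coding scheme:

```python
import re

# Toy versions of two of the study's five parameters. The pronoun set is
# an illustrative proxy for "self-focus", not the paper's methodology.
SELF_WORDS = {"i", "me", "my", "mine", "myself"}

def title_word_count(title: str) -> int:
    """Parameter 1: number of words in the song title."""
    return len(title.split())

def self_focus(lyrics: str) -> float:
    """Parameter 5 (proxy): share of words that are first-person pronouns."""
    words = re.findall(r"[a-z']+", lyrics.lower())
    return sum(w in SELF_WORDS for w in words) / max(len(words), 1)

print(title_word_count("Shake It Off"))  # -> 3
print(round(self_focus("I knew you were trouble when you walked in"), 2))  # -> 0.11
```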
Meanwhile, in other evolutionary news, apparently robots have been ‘mating’ and evolving in an evo-devo stylee. DTR? More formal translation: Researchers have added complexity to the field of evolutionary robotics by demonstrating for the first time that, just like in biological evolution, embodied robot evolution is impacted by epigenetic factors. Original Frontiers in Robotics and AI (dense!) paper here. Helpful explainer article here.
The resurgence of hardware
As we move from a Big Data paradigm of commoditized and cheap AWS storage to a Big Compute paradigm of high performance chips (and other non-silicon compute methods), we are discovering step-change innovation in applied processing power driven by the Darwinian force of specialization, or, as Chris Dixon recently succinctly tweeted: “Next stage of Moore’s Law: less about transistor density, more about specialized chips.”
We are seeing the big guys like Google develop specialized chips custom-built for their specific big-compute needs, delivering speeds up to 30 times faster than today’s conventional processors while using far less power, too.
We are also seeing more real-world applications developed for true evolutionary-leap technologies like quantum computing. MIT Technology Review article on implementing the powerful Grover’s quantum search algorithm here.
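For the curious, the headline property of Grover’s algorithm is that it finds a marked item among N unsorted possibilities in roughly √N steps, versus the ~N/2 a classical search needs. Here is a minimal textbook state-vector simulation in plain NumPy that shows the mechanics; it is just an illustration, not the hardware implementation the article describes:

```python
import numpy as np

# Textbook state-vector simulation of Grover's search on n qubits,
# finding one marked item among N = 2**n in ~sqrt(N) iterations.
n = 3         # qubits
N = 2 ** n    # search-space size
marked = 5    # index of the item we are searching for (illustrative)

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(round(np.pi / 4 * np.sqrt(N)))
for _ in range(iterations):
    # Oracle: flip the phase of the marked item's amplitude.
    state[marked] *= -1
    # Diffusion: reflect every amplitude about the mean amplitude.
    state = 2 * state.mean() - state

print(f"P(marked) after {iterations} iterations:", state[marked] ** 2)
# -> roughly 0.95 with N = 8 and only 2 iterations
```

With just 8 items the advantage looks modest, but the √N scaling is the point: for a database of a trillion entries, that is about a million oracle queries instead of half a trillion.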
And, finally, because it just wouldn’t be a week in big compute-land without a machine beating a talented group of humans at one game or another: Poker-Playing Engineers Take on AI Machine – And Get Thrashed.
Key points:
- People have a misunderstanding of what computers and people are each good at. People think that bluffing is very human, but it turns out that’s not true. A computer can learn from experience that if it has a weak hand and it bluffs, it can make more money.
- The AI didn’t learn to bluff from mimicking successful human poker players, but from game theory. Its strategies were computed from just the rules of the game, not from analyzing historical data; a toy illustration of this kind of self-play follows below.
- Also evident was the relentless decline in price and increase in performance of running advanced ‘big compute’ applications; the computing power used for this poker win can be had for under $20k.
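To see how bluffing can fall out of the rules alone, here is a toy sketch of counterfactual regret minimization (CFR), the family of self-play, game-theoretic techniques behind the headline poker bots, run on Kuhn poker, the standard three-card teaching game. This follows the textbook formulation of vanilla CFR, not the actual system from the article. The program is given nothing but the rules, yet the averaged strategy it converges on bets with the worst card some of the time, i.e., it bluffs:

```python
import random
from collections import defaultdict

ACTIONS = ["p", "b"]  # p = pass/check/fold, b = bet/call

class Node:
    """Regret-matching state for one information set."""
    def __init__(self):
        self.regret_sum = [0.0, 0.0]
        self.strategy_sum = [0.0, 0.0]

    def strategy(self, reach_weight):
        s = [max(r, 0.0) for r in self.regret_sum]
        total = sum(s)
        s = [x / total for x in s] if total > 0 else [0.5, 0.5]
        for i in range(2):
            self.strategy_sum[i] += reach_weight * s[i]
        return s

    def average_strategy(self):
        total = sum(self.strategy_sum)
        return [x / total for x in self.strategy_sum] if total > 0 else [0.5, 0.5]

nodes = defaultdict(Node)

def cfr(cards, history, p0, p1):
    """Returns expected utility for the player to act; updates regrets."""
    player = len(history) % 2
    opponent = 1 - player
    # Terminal payoffs (each player antes 1; bets are 1).
    if len(history) >= 2:
        if history[-1] == "p":
            if history == "pp":  # check-check showdown
                return 1 if cards[player] > cards[opponent] else -1
            return 1             # opponent folded to a bet
        if history[-2:] == "bb": # bet-call showdown
            return 2 if cards[player] > cards[opponent] else -2
    info_set = str(cards[player]) + history
    node = nodes[info_set]
    strat = node.strategy(p0 if player == 0 else p1)
    util = [0.0, 0.0]
    node_util = 0.0
    for i, a in enumerate(ACTIONS):
        if player == 0:
            util[i] = -cfr(cards, history + a, p0 * strat[i], p1)
        else:
            util[i] = -cfr(cards, history + a, p0, p1 * strat[i])
        node_util += strat[i] * util[i]
    # Regrets are weighted by the opponent's probability of reaching here.
    opp_reach = p1 if player == 0 else p0
    for i in range(2):
        node.regret_sum[i] += opp_reach * (util[i] - node_util)
    return node_util

cards = [0, 1, 2]            # Jack, Queen, King
for _ in range(50000):       # self-play from the rules alone
    random.shuffle(cards)
    cfr(cards, "", 1.0, 1.0)

# Holding the Jack (worst card) and acting first, the averaged strategy
# bets -- i.e., bluffs -- a meaningful fraction of the time, with no
# human hand histories anywhere in the training loop.
print("P1 holding Jack, acting first [P(check), P(bet)]:",
      [round(x, 2) for x in nodes["0"].average_strategy()])
```

That is the punchline of the second bullet above: bluffing here isn’t a psychological trick learned from watching humans, it is simply part of the equilibrium strategy implied by the game’s rules.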