Information technology, gadgets, social media, libraries, design, marketing, higher ed, data visualization, educational technology, mobility, innovation, strategy, trends and futures. . . 

Posts suspended for a bit while I settle into a new job. . . 

Artificial Intelligence

There's nothing like a hit movie to reanimate popular interest in a formerly hot/hyped technology. (I haven't seen the movie Her and likely won't anytime soon; for me, it's just a little too uncomfortably portentous.) 

Here's a nice little piece in Wired's Innovation Insights by Charles Silver about the current direction in artificial intelligence.

Artificial Intelligence Is No Longer a Four-Letter Word -- And Could Even Win an Oscar 

January 16, 2014

Despite enormous excitement when the field of artificial intelligence was established back in the 1950s, AI has repeatedly given itself a bad name. Again and again, leading researchers vastly underestimated the difficulties, overpromised the outcomes, and mainly succeeded at burning through billions of dollars at a time when that was real money.

By the late 80s, things had gotten so bad that the surest way to scare off technology investors from a promising project was even to hint at artificial intelligence. “AI” had become a digital dirty word, the pre-millennium equivalent of “WTF.”

Not any more. Just a few months ago, the phrase “artificial intelligence” suddenly started being tossed around presentations, blogs, headlines, seminars -- even a Facebook earnings meeting -- as if it were the most benign concept in the world. AI could actually win an Oscar, thanks to Scarlett Johansson’s riveting voice-only performance as Samantha, the AI-enabled OS in the new movie "Her."

One reason for AI’s new respectability: Big steps have been made in solving the problems of artificial intelligence, especially in speech recognition and concept communication. Just think about how casually we now accept machines that can understand and talk, from Apple’s Siri to IBM’s “Jeopardy”-winning Watson.

Yet both these systems are starting to seem like old news. The age of super-smart computers that can read, infer, and “think” isn’t around the corner. It has already started -- not least because someone has to read and make sense of the world’s galactic masses of Big Data. And it sure isn’t going to be humans.

Consider the CIA’s investment in Narrative Science, a young Chicago-based company that’s developed what one jaded reporter called “artificial intelligence that actually works!” Narrative Science uses complex AI algorithms and “robot writers” to plow through heaps of data, extract key facts and insights, and create story-like summaries and reports. Its software does exactly what the CIA may need more than anything: It makes easy-reading sense of overwhelming quantities of data.

IBM claims Watson is beginning to do this kind of work, pointing out that it can ingest and process 20-page medical reports in one gulp. But there are persistent questions about how well Watson executes the critical next step -- delivering accurate, useful information to its financial and health clients.

Article continues at link. 

Silver goes on to review AI's central importance to robotics and operating systems.