Artificial Intelligence – More Brawn than Brain?

The IT world is getting faster all the time. Telcos are upgrading to high-speed broadband NGNs; the processing power of ever smaller chips in ever smaller devices continues its exponential growth; and apps and services rely upon ever faster searches over ever larger databases and so-called ‘Big Data’. In sum: large data, fast processing and transmission speeds, and small devices. But is this the intelligent way to go?

Most of the ‘intelligence’ consists of algorithms that seek out relational data sets and plug the results into ‘models’, for example, financial models to buy or sell securities, or advertising models to capture customer shopping patterns and preferences. If speed is cheap, then why not? One of the pioneers of AI had other ideas.

Professor John McCarthy, who died in October 2011, was a pioneer of Artificial Intelligence, a term he coined in 1955 when he proposed a research study that would “proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it. An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves.” (http://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html)

He invented the programming language LISP, which used symbolic expressions rather than numbers. (http://www-formal.stanford.edu/jmc/recursive/recursive.html) According to Noel Sharkey, Professor of Artificial Intelligence at the University of Sheffield, “He believed that this was the best approach to developing intelligent machines and was disappointed by the way the field seemed to have turned into high speed search on very large databases.” (http://www.bbc.co.uk/news/technology-15444222)
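To make the symbolic-expression idea concrete, here is a minimal sketch, written in Python rather than LISP, of what McCarthy’s s-expressions amount to: both data and programs are nested symbolic lists, evaluated recursively. The `evaluate` function, its tiny operator table, and the example expression are all illustrative inventions, not anything from McCarthy’s papers.

```python
# Illustrative sketch (Python, not LISP) of the s-expression idea:
# programs are nested symbolic lists -- symbols and structure, not just numbers.

def evaluate(expr, env):
    """Recursively evaluate a tiny s-expression: a symbol, a literal,
    or a list whose head names an operation."""
    if isinstance(expr, str):           # a symbol: look up its binding
        return env[expr]
    if not isinstance(expr, list):      # a literal, e.g. a number
        return expr
    head, *args = expr
    if head == "quote":                 # return the expression itself, unevaluated
        return args[0]
    vals = [evaluate(a, env) for a in args]
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    return ops[head](*vals)

# The LISP form (* x (+ 2 3)) with x bound to 4:
print(evaluate(["*", "x", ["+", 2, 3]], {"x": 4}))   # prints 20
```

The point of the sketch is the contrast Sharkey draws: intelligence here comes from manipulating symbolic structure, not from searching a large database quickly.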

This sounds like an interesting contrast between economics and engineering: the substitution of a resource that is getting cheaper all the time (processing power) for a methodology and computational architecture that requires much more basic research time and money. There is no question that the economic route is (by definition) the short-term profitable route, as the US$100 billion cash mountain run up by Apple amply demonstrates. And if ‘computational intelligence’ (John McCarthy’s preferred term for AI, according to Prof Sharkey) attracts some of those profits, then the industry (and society) may yet have its cake and eat it too. But therein lies a more general problem. Basic R&D is historically the province of universities, IBM and a few others notwithstanding. It is important that industry and academia work more closely together, especially in an age of austerity, to ensure a good balance between short-termism and long-termism. It may well be that our IT is more brawn than brain, and not as intelligent as it should be.
