Artificial General Intelligence – are we seeing it now?


Written by Barry O'Gorman

Independent Business Advisor - Business Advantage through Technology (Strategy, Commercials, Transformation).

Post Date 20/10/2023

Artificial General Intelligence (‘AGI’) yet?

Are there any signs of Artificial General Intelligence in what we are seeing now in Generative AI, LLMs etc?  Any emerging signs of the holy grail? Or do we need to rethink or confirm what we mean by AGI?

AGI not yet – it seems

Interesting piece in Techcentral last week (13 Oct 2023) – featuring an interview with Prof. Barry O’Sullivan, one of Ireland’s leading AI experts. He is quoted as saying: ‘These systems are still extremely limited. The primary challenges of AI still stand. While it has been a great year, it’s not a solved problem, most certainly. Far from it. These systems can’t reason, they can’t really do mathematics, and they don’t really have an understanding of the world’. The discussion leads on to some of the irresponsible argument about existential threats, arising from a lack of understanding of the current tools and their capacity. I posted previously on the contrasting views of Hinton and LeCun on existential threats from AI.

AGI now? – or what does it mean?

Almost the same day I read an interesting paper published by the Berggruen Institute, authored by Blaise Aguera y Arcas of Google and Peter Norvig of Stanford University. And their headline:

‘Artificial General Intelligence Is Already Here – Today’s most advanced AI models have many flaws, but decades from now, they will be recognized as the first true examples of artificial general intelligence’.

They argue that the new ‘frontier’ models ‘have achieved artificial general intelligence in five important ways: topics, tasks, modalities, languages, instructability’ – and flesh out each of these in the paper.

They attribute the reluctance of many commenters to acknowledge this to any or all of four reasons:

  1. A healthy skepticism about metrics for AGI
  2. An ideological commitment to alternative AI theories or techniques
  3. A devotion to human (or biological) exceptionalism
  4. A concern about the economic implications of AGI

So – what does this mean for society and/or users of these tools?

I have a sense that we are now using tools which, even with their limitations, are offering huge opportunities for progress and change. The reality of most technological change to date will continue, I suspect: there will be winners and losers. Will we come to another hiatus, another ‘AI winter’? I don’t think so this time. And we need the likes of Prof. Barry O’Sullivan, Blaise Aguera y Arcas and Peter Norvig – and many others – looking down the road to see where we are headed.
