Greg Robison

Artificial General Intelligence (AGI)

THE GOALPOSTS OF HUMAN INTELLIGENCE

PART 1


Artificial General Intelligence (AGI) is being discussed by leading AI researchers and companies as the next leap in artificial intelligence. But what is AGI? What does it mean to be intelligent and how close are current AI models to human-level performance on a variety of subjects? When will AGI be achieved and what could it mean for our society?


We’ll be exploring these questions and more in a three-part series on AGI, starting with the basics – what is intelligence? It’s commonly misunderstood as just a score on an IQ test, but it’s much more complex than that. We will discuss the basics of intelligence to set up the comparison with artificial versions (and hopefully dispel a few myths along the way).


The idea of intelligence first brought me to psychology where I studied and researched cognitive development – how do our brains and abilities develop over time to become what we’re using right now? It’s a fascinating topic and helps build the foundation for thinking about human intelligence – how did this lump of brain cells come to write this article? How did billions of years of evolution and our early lives as children shape our current brain? My brain tells me this is an important topic (but it is biased on the matter).


Let’s start with the basics – what is intelligence? I consider it the ability to learn from experience and adapt to new situations, which applies to bugs, crows, dogs, and neural networks, all at differing levels. Human-level intelligence also includes the ability to understand and handle abstract concepts like truth and justice as well as use knowledge to manipulate our environment. However, my very general definition doesn’t clearly depict the many facets of our intelligence, so here’s a better one:


Intelligence is the capacity for abstraction, logic, understanding, self-awareness, learning, emotional knowledge, reasoning, planning, creativity, critical thinking, and problem-solving. It can also be described as the ability to perceive or infer information; and to retain it as knowledge to be applied to adaptive behaviors within an environment or context. -Wikipedia

Howard Gardner suggests eight distinct forms of human intelligence – because our intelligence is so multi-faceted, we have a greater advantage in learning from and adapting to our environments. While there may be different dimensions of intelligence, there is also evidence that the dimensions are correlated and that a central g-factor of intelligence underlies them.

Gardner's different types of intelligence
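To make the g-factor idea concrete, here is a minimal sketch in Python (using NumPy and entirely hypothetical subtest scores) that extracts a single shared component from correlated test scores – the loadings show how strongly each subtest reflects the common factor.

```python
import numpy as np

# Entirely hypothetical scores for five people on four correlated subtests
scores = np.array([
    [110, 105, 112, 108],   # verbal, spatial, memory, speed (illustrative)
    [ 95,  98,  94,  97],
    [130, 124, 128, 126],
    [ 88,  92,  85,  90],
    [102, 100, 105, 101],
], dtype=float)

# Correlations between subtests, then the largest shared component ("g")
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)       # eigenvalues in ascending order
g_loadings = eigvecs[:, -1]                   # loadings on the biggest component
g_loadings *= np.sign(g_loadings.sum())       # fix the eigenvector's arbitrary sign
share = eigvals[-1] / eigvals.sum()           # variance explained by that component

print("g loadings per subtest:", np.round(g_loadings, 2))
print("share of variance explained:", round(share, 2))
```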

Since we like to think and talk about AI, let’s first talk about the hardware that your mind runs on. The human brain is a structured collection of approximately 86 billion neurons, with trillions of connections between them. Your brain is a massively parallel computing device. Individual neurons, like tiny circuits, transmit electrical and chemical signals across synapses to other neurons, exchanging and modulating information at an amazing microscopic scale. The complex interplay between neurons and their connections enables the brain to process and store information and allows us to be smart.
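As a loose analogy (not a biological model), here is a minimal Python sketch of the artificial counterpart: a single unit that integrates weighted inputs from upstream "neurons" and produces a firing rate through a nonlinearity – the building block that artificial neural networks replicate billions of times in parallel. The inputs and weights are illustrative.

```python
import numpy as np

def artificial_neuron(inputs: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Weighted sum of upstream signals passed through a nonlinearity,
    a crude stand-in for synaptic integration and neural firing."""
    activation = np.dot(inputs, weights) + bias   # integrate incoming signals
    return 1.0 / (1.0 + np.exp(-activation))      # sigmoid "firing rate" between 0 and 1

# Three upstream signals with different synaptic strengths (illustrative values)
signals = np.array([0.9, 0.2, 0.7])
strengths = np.array([1.5, -0.8, 0.4])
print(artificial_neuron(signals, strengths, bias=-0.3))
```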


The brain’s organization also improves our intelligence – it’s organized into distinct regions, each with specific roles in supporting our multi-faceted intelligence. For example, the prefrontal cortex is involved in higher-order cognitive functions such as planning, decision-making, and problem-solving. The cerebellum plays an important role in coordinating movements, while the occipital lobe in the back of your head is responsible for visual processing. Moreover, the interconnectedness of these regions – the sensory and motor cortices sit right next to each other, for example – allows for the integration of information and the emergence of complex cognitive functioning.

Major sections of the human brain

Our cognitive processes are the basis of our intelligence, allowing us to perceive, attend to, process, and manipulate information. Perception allows the integration of sensory information from our surroundings, which is then processed and interpreted by the brain. Attention enables us to focus selectively on relevant stimuli while filtering out irrelevant information (a constant battle for those of us with ADHD). We have memory systems, including working memory and long-term memory, which are essential for storing and retrieving information. Our problem-solving and decision-making abilities involve the application of knowledge and reasoning skills to generate solutions and make informed choices. Our emotions serve as motivational drivers, guiding attention, memory, and decision-making. They help us prioritize information, make meaningful connections, and respond appropriately in social settings. Creativity enables us to generate new ideas and find innovative solutions to our problems. We pull all of these systems together for decisions like whether to scroll to the next video, which chess move to play, or whether I should purchase the newest VR headset (the answer is “yes”).
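The attention piece in particular has a direct parallel in modern AI: softmax attention turns raw relevance scores into weights so that processing concentrates on the most pertinent inputs while the rest are downweighted. A minimal sketch with made-up relevance scores:

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max())          # subtract max for numerical stability
    return e / e.sum()

# Made-up relevance scores for four competing stimuli
stimuli = ["phone buzz", "speaker's voice", "window noise", "hunger"]
relevance = np.array([1.2, 3.0, 0.3, 0.8])

weights = softmax(relevance)         # amplify the relevant, suppress the irrelevant
for stimulus, weight in zip(stimuli, weights):
    print(f"{stimulus:16s} attention weight: {weight:.2f}")
```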


How did all this impressive cognitive machinery get on top of our shoulders? The evolution of brains is a fascinating story that spans over 500 million years, beginning with the emergence of the first nervous systems in simple organisms. In the early Cambrian period, around 520 million years ago, the first primitive brains appeared in simple animals, such as flatworms. These early brains were clusters of nerve cells that allowed for more complex behaviors and increased chances of survival. As animals evolved and diversified, so did their brains. In vertebrates, the hindbrain, midbrain, and forebrain began to differentiate, with the hindbrain controlling basic functions like breathing and heart rate, while the midbrain and forebrain enabled more advanced sensory processing and decision-making. The evolution of the forebrain, particularly the telencephalon, was a significant step in the development of complex cognition. In mammals, the telencephalon gave rise to the neocortex, which is responsible for higher-order cognitive functions such as perception, spatial reasoning, and even conscious thought. Primates underwent a rapid expansion of the neocortex, enabling the development of advanced social cognition, language, and problem-solving skills. And our human brains are much bigger and more complex than those of the other great apes.

Average weight and number of neurons of various species' brains

Another way to think about the question of how we got here is ontogenetically; that is, through the development of our brain from the first cells to what is up there now. From conception to adulthood, the human brain undergoes significant growth and change. During the prenatal period, the brain develops at an amazing rate, with neurons forming and migrating to their designated locations. At birth, a baby's brain contains nearly all the neurons it will ever have, but it is still only about a quarter of its adult size. Over the first few years of life, the brain experiences rapid growth and development, with synapses forming at an incredible pace. This period of synaptic overproduction is followed by a pruning process, where unused connections are eliminated, allowing for the refinement of neural networks and speedier processing along commonly used pathways. As children engage with their environment, their brains continue to develop, with experiences shaping the structure and function of neural circuits. The brain's flexibility and plasticity enable it to adapt and learn throughout childhood and adolescence, with the prefrontal cortex, responsible for executive functions and decision-making, being one of the last regions to fully mature, well into early adulthood. Our brains need a lot of time to grow and experience the world before becoming fully formed.

Human brain development from 29 days to adulthood
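This overproduce-then-prune pattern has a rough echo in artificial networks, where weak connections are removed to leave leaner, faster pathways. Here is a minimal sketch of magnitude-based pruning (illustrative random weights, not a model of biology):

```python
import numpy as np

def prune_weak_connections(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Zero out the weakest connections, keeping only the strongest fraction,
    loosely analogous to synaptic pruning after early overproduction."""
    threshold = np.quantile(np.abs(weights), 1.0 - keep_fraction)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

rng = np.random.default_rng(0)
synapses = rng.normal(size=(4, 4))        # an "overproduced" block of connections
pruned = prune_weak_connections(synapses, keep_fraction=0.25)
print(f"connections before: {np.count_nonzero(synapses)}, after: {np.count_nonzero(pruned)}")
```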

Earlier, I mentioned intelligence isn’t as simple as an IQ test – it’s a complex and controversial topic. Historically, intelligence quotient (IQ) tests have been the most widely used method for assessing cognitive abilities. These tests typically measure a range of skills, including verbal comprehension, perceptual reasoning, working memory, and processing speed. However, critics argue that IQ tests are often culturally biased, focusing primarily on skills valued in Western societies, and fail to capture the full spectrum of human intelligence. Moreover, IQ scores can be influenced by factors such as education, socioeconomic status, and test-taking experience. Measures of the other dimensions in Gardner's Theory of Multiple Intelligences, such as emotional intelligence (EQ) and practical intelligence, emphasize the importance of non-cognitive skills in overall intellectual functioning. Still, measuring intelligence remains an imperfect science, and no single test or theory can fully capture the complexity and diversity of human cognitive abilities. Whether we’re measuring across people or across species, using multiple measures of intelligence is important.


Given this explanation of human intelligence, what happens when computers can reach our levels of intelligence? That’s AGI. Although there is some disagreement, Artificial General Intelligence (AGI) refers to an artificial intelligence that can understand, learn, and perform any intellectual task that a human being can. Unlike narrow AI systems that are designed for specific tasks, such as image recognition or language translation, AGI would possess the flexibility and generality of human cognition. An AGI system would be able to reason abstractly, solve novel problems, learn from experience, and adapt to new situations without requiring explicit programming or training for each specific task. Like us, it would have the ability to combine knowledge from multiple domains, draw insights, and make decisions based on incomplete or uncertain information. AGI could also be capable of exhibiting human-like qualities such as creativity, emotional intelligence, and even self-awareness. Its feasibility, timeline, and potential implications for society remain a matter of debate among experts. Some researchers believe that AGI could lead to transformative breakthroughs in science, technology, and human progress, while others caution about the potential risks and challenges associated with creating machines that can match or surpass human intelligence.


Check out Part 2, where we examine how we’ll know AGI if/when it happens – AGI Recognition through Benchmarks and Evaluations – and Part 3, which discusses the Potential Benefits of AGI to Society.


