Are We Close to Creating AGI?
AI, or artificial intelligence, has been in the news constantly for much of the last few years. It has transformed the business world in both good and bad ways. People have lost their jobs to AI. Companies can get more done, using AI to handle much of the ‘grunt work’ they would once have outsourced to freelancers or handed to junior staff. AI, then, is here to stay.
But at some stage, it is assumed, AI will develop. Evolve into something new. Something better. Something that can genuinely reach the levels of autonomous productivity that AI marketers claim current generative AI systems already deliver. To many, it’s known as AGI – Artificial General Intelligence.
If you read excited AI pushers and promoters, they’ll tell you that AGI is just around the corner. Yet we heard the same thing a few years ago. Even recently, big names in the AI world have been telling politicians that it will arrive in the near future. Search Google for ‘AGI’ and you can find people saying it’s imminent. Some even claim it’s already here, or will be by the end of the current year.
So, is that true? Hmm. It’s hard to say. You can take the optimistic view of AGI developers who say that the next big breakthrough is around the corner. Or, you can listen to people who seem to be a bit more realistic. If standard generative AI is still miles away from what some of its biggest pushers claim, then how realistic is it that we’re about to break through into the era of AGI?

Why Is AGI So Important?
People have fallen head over heels for AI. It can quickly turn around basic projects and deliver increasingly convincing responses to all kinds of prompts. From role-playing and chatbots to generating images and content, AI has come a long way in a matter of years. However, AI is limited and constrained by what we ask it to do: when AI is left to work freely, mistakes happen.
AGI, though, is the concept of an AI with human-level intelligence. It would be able to do the kinds of jobs that we pay people thousands of dollars a year to do. It could work with only minimal feedback from a human user, and would produce work far beyond anything current generative AI systems can manage.
However, it’s also true that AGI is not even properly defined. Listen to the big pushers of the AI field; most of them seem to view AGI differently. In a way, it’s a little like religion: everyone has their own interpretation of what AGI would involve.
The AGI Reality: The Big Sell
The problem is, of course, that these people are all competitors. And they are involved in the 2020s version of the dot-com bubble. They all have a much-vaunted solution to sell, right? And they all want to attract more investment – they want people to listen to their vision and buy into their products. That helps to explain why an ever-growing list of AI developers are saying AGI is just around the corner.
It’s like any industry that grows at a rapid pace and garners a lot of interest. The initial developments are enough to keep investors turning up with dollars in their hands, ready to buy into the next big thing. The problem is that development slows. Changes and improvements become more incremental and may lack the ‘Big Bang’ leap that keeps investors falling over themselves to invest more.
So, developers need a convincing way to get people to buy into their vision even when the pace of development slows. The best way to do that? Promise the next big thing. AI 2.0. Offer a vision that means that if investors walk away, they might miss out on being part of the next tech revolution. That alone is a big reason why AGI is so regularly in the news: developers need to keep investors excited about their progress, because standard AI development has settled into a steadier pace.
How Close Is AGI To Fruition?
Again, it depends on who you listen to. Most people who are more grounded in the AI field, working behind the scenes on the systems themselves, seem far less convinced that AGI is about to arrive.
Current AI systems essentially rely on us to provide an input, and then they predict what comes next. That is very different from how a human operates, and well below the limits of what a human is capable of. AGI is supposed to clear that hurdle and create self-thinking, innovative machines that can essentially replace human thinking and ingenuity. That work is not close to being complete – not yet.
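To make that ‘predict what comes next’ idea concrete, here is a deliberately tiny, hypothetical sketch in Python. It is not how any vendor actually builds its models; it simply counts which word tends to follow which in a scrap of text, then guesses the most frequent follower.

from collections import Counter, defaultdict

# Toy 'next word' predictor: count which word follows which in some
# training text, then predict the most frequent follower. This only
# illustrates the statistical idea, not any real production system.
training_text = (
    "the cat sat on the mat the cat chased the mouse "
    "the dog sat on the rug"
)

followers = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    followers[current_word][next_word] += 1

def predict_next(word):
    # Return the most common word seen after `word` in the training text.
    if word not in followers:
        return "<unknown>"
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))   # prints 'cat' in this tiny corpus
print(predict_next("sat"))   # prints 'on'

A real large language model replaces these raw counts with billions of learned parameters, but the core loop is the same: given what came before, output a likely continuation. Nothing in that loop sets its own goals or checks its own work, which is the gap AGI is supposed to close.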
Yet, as the AI industry has shown, developments come at an incredible pace. Go back and look at AI help bots for customer support, and look at how much they have grown and developed. Compare AI-made art and content; it’s leaps and bounds ahead of the generic, dry material that was being produced even a year ago. Who’s to say that AGI won’t experience a similar breakthrough?
Of course, there’s also the fact that to create true AGI, we would need a metric that lets us assess whether a system has actually reached it. Given that humans still cannot agree on how to define our own intellect, how can we possibly define an AI’s intellect in a way that would allow us to say it meets a human benchmark?
Most specialists believe that AGI is simply a term that sounds exciting, and that we’re still a long way off reaching that level. However, AI developers can’t exactly tell investors that the pace of the latest tech revolution is not going to be quite as drastic as originally thought, can they?
So, expect to see more AGI promotions in the weeks, months, and years to come. The reality is likely that we’re still a long way away from human-level intelligence being produced by AI systems. AGI goes far beyond simply finding data patterns – the breakthrough will likely happen, but not quite as soon as those trying to sell a dream like to claim.