You may be surprised to learn that many A.I. systems struggle with basic mathematics
By Steve Crilley
8 times 8
Remember back in school, maybe you were shown how to arrive at the answer to a simple maths problem (like 8 x 8); and having learned how to do that, you could probably infer the answer to 8 times 9. You used a bit of reasoning to get to the new answer of 72. But most machine learning programmes lack a process of reasoning for arriving at a correct answer.
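That schoolroom reasoning step can be sketched in a couple of lines of Python (a toy illustration, not how any A.I. system actually works):

```python
# The memorised fact: 8 x 8
known = 8 * 8          # 64

# The inference: 8 x 9 is just "one more eight" on top of 8 x 8
inferred = known + 8   # 72

print(known, inferred)
```

A calculator (or any conventional program) follows this exact, deterministic procedure every time, which is precisely what a purely pattern-matching language model does not do.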
In fact, ChatGPT’s accuracy on maths problems is reportedly below 60%. So if you are relying on it for your next quantum physics research paper, you may want to retrieve your old school calculator from the bin.
Maths requires precision
A.I. is very good at using large amounts of data to learn the patterns and structures of human language. It learns, it predicts, and through trial and error it can give you something that equates to a human response. But mathematics (even very simple school maths like 8 x 8) is based on a very precise process yielding an exact result. It requires a degree of reasoning and inference which many machine learning systems currently lack.
Having said that, accuracy has been expected to grow, since most A.I. systems (by definition) have a learning mechanism and are constantly improving themselves. So basically: watch this space!
The revolving door at OpenAI
Which brings us to recent developments at OpenAI. You may remember that CEO Sam Altman was forced to resign and then was suddenly back in his old job. So, that was kind of unusual; why would the wunderkind of the tech industry be abruptly given his marching orders and then, even stranger than that, get his old job back within days?
What is Q-Star?
Of course, every technology website and social media feed went wild with rumours. It has been widely reported that he was working on an advanced system before the sacking. The Reuters news agency went a step further. It reported that the new system, called Q-Star, was able to solve basic maths problems it had not seen before, and this pace of development had alarmed some safety researchers. If the rumours are true, the new system triggered such alarm bells with some OpenAI researchers that they wrote to the board of directors before Sam Altman’s dismissal.
So what are some technology experts making of the events and the possibility of Q-Star (or a Q-Star-like system in A.I.)? I put this to Dr Andrew Rogoyski at the Surrey Institute for People-Centred Artificial Intelligence, who told me:
“Well, I think the truth is that there’s a lot of speculation at the moment. The wires are alive with different ideas floating around. The thing that seems to have caught people’s attention is Q-star. But it is speculation. We don’t know what’s going on and maybe over time that will become clear as they release new models. So I think we need to be quite cautious about what we’re inferring from this, but at the same time potentially quite excited. The interesting thing seems to be that it can solve unseen maths problems. It could be doing more advanced maths (in other words) - things that it hasn’t got direct training and experience (from) and that’s quite exciting.”
Daniel Sokolov from heise.de also gave me his take on what he thinks is going on with OpenAI & Q-Star:
“We know very little about Q-Star. I have the impression it’s something leaked by OpenAI to create more of a positive hype, after all the turmoil they had at the company where the board fired their CEO, and then there were negotiations about bringing him back. And in the end, he came back, but the board resigned. What is currently happening in A.I. is a bit reminiscent of what happened with the dot com bubble about 20 years ago. Everything was going online and companies were trying to get investor money. Some had extremely high valuation without actually having a business model. And similar things are happening here (in the present-day world of A.I.). There are all these A.I. companies who all pretend to have found the Holy Grail and they want investor money. Many don’t really have a business model and some of those will not survive. If you remember the dot com bubble, the (beginnings of) online shopping, these things revolutionized a lot of industries. It really changed the economy of the globe. But a lot of companies were vastly overvalued and went bust. And my personal opinion is that there are similar things happening in A.I. with companies trying to attract attention or trying to attract investor money, hoping to be the one left standing when the tide goes out.”
Published on 28.11.2023