Artificial Intelligence May Already Be Plateauing

Progress may be more incremental … while resource use continues to boom.

By Rick Richardson

According to a recent story from The Information, OpenAI's next-generation Orion model of ChatGPT, which is both rumored and denied to be arriving before the end of the year, may not live up to the hype when it does.

According to the report, which quotes unnamed OpenAI staff, the Orion model has shown a “far smaller” improvement over its predecessor, GPT-4, than GPT-4 showed over GPT-3. The same individuals say Orion “isn’t reliably better than its predecessor (GPT-4) in handling certain tasks,” particularly coding, even though the new model is noticeably better at general language functions such as drafting emails or summarizing documents.

According to The Information's report, one of the main reasons for the new model's marginal improvement is the “dwindling supply of high-quality text and other data” available to train new models. Simply put, the AI sector is rapidly approaching a training-data bottleneck, having already exhausted the easy sources of social media data from sites like X, Facebook and YouTube (the latter twice). Pre-release training is slowing significantly because these organizations are finding it increasingly difficult to source the kinds of complex coding problems that would push their models beyond their existing limits.
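To see why the bottleneck bites, a rough back-of-envelope sketch helps (the figures below are illustrative assumptions, not numbers from the article): the widely cited Chinchilla scaling result suggests on the order of 20 training tokens per model parameter, so multi-trillion-parameter models quickly outgrow any plausible stock of high-quality public text.

```python
# Back-of-envelope sketch (all inputs are assumptions for illustration):
# the Chinchilla scaling work suggests ~20 training tokens per parameter
# for compute-optimal training; public estimates of usable high-quality
# text cluster in the low tens of trillions of tokens.

TOKENS_PER_PARAM = 20          # Chinchilla rule of thumb
HIGH_QUALITY_TOKENS = 30e12    # assumed stock of usable public text

for params in (1.8e12, 5e12, 10e12):  # hypothetical frontier model sizes
    needed = params * TOKENS_PER_PARAM
    print(f"{params / 1e12:>4.1f}T params -> "
          f"{needed / 1e12:,.0f}T tokens needed "
          f"({needed / HIGH_QUALITY_TOKENS:.1f}x the assumed supply)")
```

Under these assumed numbers, a 10-trillion-parameter model would want several times more high-quality text than exists, which is exactly the wall the report describes.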

This decreased training effectiveness has significant business and environmental ramifications. As frontier-class large language models expand and push their parameter counts into the high trillions, the energy, water and other resources they consume are predicted to increase sixfold over the next 10 years. The country's current power infrastructure cannot supply what these expanding networks of artificial intelligence data centers require, which is why Google is purchasing the output of seven nuclear reactors, AWS is buying a 960 MW plant and Microsoft is attempting to restart Three Mile Island.
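For a sense of scale, here is a rough energy calculation (the capacity factor is an assumption for illustration; only the 960 MW figure comes from the article):

```python
# Rough energy arithmetic: convert nameplate capacity (MW) to annual
# energy (TWh) at an assumed capacity factor.

HOURS_PER_YEAR = 8760

def annual_twh(capacity_mw: float, capacity_factor: float) -> float:
    """Annual energy output in terawatt-hours."""
    return capacity_mw * capacity_factor * HOURS_PER_YEAR / 1e6

baseline = annual_twh(960, 0.90)  # the 960 MW plant cited above
print(f"960 MW plant: ~{baseline:.1f} TWh/year")
print(f"Sixfold growth from that baseline: ~{6 * baseline:.0f} TWh/year")
```

A single 960 MW plant running most of the year delivers roughly 7 to 8 TWh; multiply demand sixfold and the appeal of buying whole reactors becomes obvious.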

To get around the shortage of suitable training data, OpenAI has established a “foundations team.” Its methods might include synthetic training data, like that produced by Nvidia's Nemotron family of models. The group is also investigating ways to enhance models' performance after training.
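The article doesn't describe how such synthetic data gets made. A common pattern, sketched below with the OpenAI Python client, is to have a strong “teacher” model generate candidate training examples and then filter them; the model name, prompts and quality filter here are illustrative assumptions, not OpenAI's or Nvidia's actual pipeline.

```python
# Minimal sketch of a synthetic-training-data pipeline (illustrative only).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_example(topic: str) -> str:
    """Ask a teacher model to write one training example on a topic."""
    response = client.chat.completions.create(
        model="gpt-4o",  # hypothetical choice of teacher model
        messages=[
            {"role": "system",
             "content": "Write one hard coding problem with a worked solution."},
            {"role": "user", "content": f"Topic: {topic}"},
        ],
    )
    return response.choices[0].message.content

def keep(example: str) -> bool:
    """Crude quality filter; real pipelines verify solutions by executing them."""
    return "def " in example and len(example) > 200

topics = ["graph search", "dynamic programming"]
synthetic = [ex for t in topics if keep(ex := generate_example(t))]
```

The filtering step is where such pipelines succeed or fail: unverified teacher output can quietly recycle the same model's weaknesses back into the training set.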

Initially believed to be the code name for OpenAI's GPT-5, Orion is now expected to arrive in 2025. It's unclear whether we'll have enough electricity to see it through to completion without browning out our municipal electrical grids.
