{"id":136103,"date":"2024-11-26t11:55:36","date_gmt":"2024-11-26t16:55:36","guid":{"rendered":"\/\/www.g005e.com\/?p=136103"},"modified":"2024-11-26t16:10:24","modified_gmt":"2024-11-26t21:10:24","slug":"artificial-intelligence-may-already-be-plateauing","status":"publish","type":"post","link":"\/\/www.g005e.com\/2024\/11\/26\/artificial-intelligence-may-already-be-plateauing\/","title":{"rendered":"artificial intelligence may already be plateauing"},"content":{"rendered":"
progress may be more incremental … while resource use continues to boom.<\/strong><\/p>\n by rick richardson<\/em><\/p>\n according to a recent story from the information<\/a>, openai\u2019s next-generation orion model of chatgpt, which is both rumored and denied to be arriving before the end of the year, might not live up to its advance billing when it does arrive.<\/p>\n according to the report, which quotes unnamed openai staff, the orion model has shown a \u201cfar smaller\u201d improvement over its predecessor, gpt-4, than gpt-4 did over gpt-3. the same individuals say orion \u201cisn\u2019t reliably better than its predecessor (gpt-4) in handling certain tasks,\u201d particularly coding applications, even though the new model is noticeably better at general language functions like creating emails or summarizing documents. this decreased training effectiveness has significant business and environmental ramifications. the amount of energy, water and other resources consumed by ai is predicted to increase sixfold over the next 10 years<\/a> as frontier-class large language models expand and continue to push their parameter counts into the high trillions. the country\u2019s current power infrastructure cannot supply the power required for the industry\u2019s expanding networks of artificial intelligence data centers. for this reason, google is purchasing the output of seven nuclear reactors, aws is purchasing a 960 mw plant<\/a> and microsoft is attempting to restart three mile island.<\/a><\/p>\n to get around the shortage of suitable training data, openai has established a \u201cfoundations team\u201d to develop new training methods. these might entail the use of synthetic training data, like that produced by nvidia\u2019s nemotron family of models<\/a>. the group is also investigating ways to enhance the model\u2019s performance after training.<\/p>\n initially believed to be the code name for openai\u2019s gpt-5<\/a>, orion is now expected to arrive in 2025. it\u2019s unclear whether we\u2019ll have enough electricity to see it through to completion without causing our municipal electrical grids to brown out.<\/p>\n","protected":false},"excerpt":{"rendered":"more on ai:<\/b> thomson reuters brings ai to audit solutions<\/a> | ai displacing more jobs in banking than other sectors<\/a> | what is an ai pc, and should i get one?<\/a> | ai-engineered enzyme could be solution to plastic pollution<\/a> | educators can benefit from new generative ai course<\/a> | ai named the highest-paying in-demand tech skill for 2024<\/a> | ai generates revolutionary new battery design<\/a> | chatgpt is getting humanlike memory<\/a>
\nexclusively for pro members. <\/span><\/strong>log in here<\/a> or subscribe today<\/a>.<\/span><\/h4>\n
\n
\naccording to the information report, one of the main reasons for the new model\u2019s negligible improvements is the \u201cdwindling supply of high-quality text and other data\u201d that can be used to train new models. simply put, the ai sector is rapidly approaching a training-data bottleneck, having already mined the easy sources of social media data from websites like x, facebook and youtube (the latter twice<\/a>). pre-release training is slowing significantly because model developers are finding it increasingly hard to source the kinds of complex coding problems that would push their models beyond their existing limits.<\/p>\n
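\nto illustrate the synthetic-data workaround mentioned above, here is a minimal python sketch of the general idea: a \u201cteacher\u201d model generates candidate training examples and a crude filter keeps the usable ones. the model name, prompts and filter below are illustrative assumptions, not openai\u2019s or nvidia\u2019s actual pipeline.<\/p>\n <pre><code>from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def generate_candidates(topic: str, n: int = 5) -> list[str]:
    # ask a teacher model for n question-answer pairs on a topic
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice of teacher model
        messages=[
            {"role": "system",
             "content": "you write question-answer pairs for training other models."},
            {"role": "user",
             "content": f"write {n} question-answer pairs about {topic}, "
                        "one per line, in the form 'q: ... a: ...'"},
        ],
    )
    text = response.choices[0].message.content or ""
    return [line.strip() for line in text.splitlines() if line.strip()]

def keep(example: str) -> bool:
    # toy quality filter: keep only well-formed, non-trivial pairs
    lowered = example.lower()
    return lowered.startswith("q:") and " a:" in lowered and len(example) > 40

# build a small synthetic dataset on a topic where scraped examples are scarce
synthetic_dataset = [ex for ex in generate_candidates("lease accounting") if keep(ex)]
print(f"kept {len(synthetic_dataset)} synthetic examples")<\/code><\/pre>\n the interesting engineering lives in the filter: nemotron-style pipelines pair the generator with a trained reward model so that only high-quality synthetic examples ever reach training.<\/p>\n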