BrikoX@lemmy.zip (mod) to Technology@lemmy.zip · English · 6 months ago

**Researchers upend AI status quo by eliminating matrix multiplication in LLMs** (arstechnica.com)

Cross-posted to: machine_learning@programming.dev, singularity@lemmit.online, technology@lemmy.world
FaceDeer@fedia.io · 6 months ago

> Let's pop that bubble

I don't think that making LLMs cheaper and easier to run is going to "pop that bubble", if bubble it even is. If anything, this will boost AI applications tremendously.