What's Missing Between LLMs and AGI - Vishal Misra & Martin Casado

48 minutes

Vishal Misra returns to explain his latest research on how LLMs actually work under the hood. He walks through experiments showing that transformers update their predictions in a precise, mathematically predictable way as they process new information, and explains why this still doesn't mean they're conscious. He then describes what's actually required for AGI: the ability to keep learning after training, and the move from pattern matching to understanding cause and effect.

 

Resources:

Follow Vishal Misra on X: x.com 
Follow Martin Casado on X: x.com

 

Stay Updated:

Find a16z on YouTube: YouTube

Find a16z on X

Find a16z on LinkedIn

Listen to the a16z Show on Spotify

Listen to the a16z Show on Apple Podcasts

Follow our host: twitter.com

 

Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.


Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.