Jing Hu
1 min read · Sep 30, 2024


Hey Russell, I appreciate you sharing your thoughts here!

I agree that scenario three is the most likely path forward. That said, I don't think it will be LLMs that play the part in scenario three.

My suspicion is that we won't achieve AGI simply by giving LLMs more contextual knowledge or "memory."

No matter how long the context window gets, in the end it's still a guessing game based on probability.

I used to be quite optimistic about progress toward AGI, but the more I learned, the less certain I became.

We can now "see" how our neurons connect and know some of the chemistry between them. But we still struggle to explain how we form memories, why we dream, and so on.

If we are to go with scenario three, then based on your comment ("AI developers need to learn from human biology to even approach AGI. AGI might evolve in ways that we might not understand but it still needs to understand us and work with us."), the premise would be that we can explain the link between the brain's physical processes and subjective experience.

Then the question becomes: how long do you think it will take until we know enough about our own brains? :)
