Discussion about this post

Anatoly Karlin

This is the main thing motivating me to aggressively prioritize and accelerate across all areas of my life. "Ideacels" are safe for now - if anything, GPT currently helps them be much more productive - but the flip side is that the clock is ticking: we may not have much time left to put any interesting ideas or concepts we have on the record before this too is swept away by superintelligent AIs and we lose the capability to make any further original contributions to the noosphere. The default guess would be a decade, but it could be far sooner; even 2026, which I whimsically suggested in my AI takeover story WAGMI https://akarlin.com/wagmi/, no longer seems entirely fantastical.

My strong recommendation to writers, thinkers, content creators, etc. is to go forward with anything you believe in strongly now, instead of dallying any longer. The gap between the smartest humans and the dullest humans is fairly minor. This window, however it ends, is very unlikely to last long.

https://twitter.com/powerfultakes/status/1599545967052673024

UndeservingPorcupine

I tend to think that if AI gets that good (and doesn't hit an unseen wall, as self-driving cars seem to have recently), then it shouldn't be *that* long before we can successfully ask it for a gene therapy that shifts the human IQ distribution 30 points to the right, which might then make room for some human specialists again.
