Neophilia vs. Neophobia: Navigating the AI Revolution in Organizations
The funniest irony of the foundation models being trained right now is that anyone who isn't a super genius in a given field will have no way to evaluate how good they are (beyond "it's smarter than me").
The neophilia (worship of what is new above all else) of technologists who can't evaluate their own companies' products will lead to absurdities that make the "digital transformation" era of the enterprise look like the smoothest implementation in history. (For those without this arcane understanding of the enterprise: most digital transformation schemes inside large organizations fail miserably, in funny ways that reveal the realities of human nature.)
Even with traditional automation technologies, engineers face strong incentives to learn the latest generation of tools, even when that generation offers nothing meaningful, because the entire industry is caught up in "the new".
Don't get me wrong: "the new" holds surprises for everyone and should not be discounted. But I'm also aware of the timeless persistence of human nature; it rarely changes, and it's almost always absurd to the outside observer.
Imagine when the new $10-100 billion models come out in 2025 and 2026 (as predicted by Dario Amodei of Anthropic), while organizations still haven't figured out how to implement what we have now. These models offer an insane level of expert detail in arcane subjects.
The vast majority of people inside organizations can't even evaluate current models (beyond "it's smarter than me"). Human nature, in general, despises things that are smarter than the group; they're a threat that people accurately predict will take their jobs.
Then you have IT organizations looking at the recent hack of OpenAI, which OpenAI did not disclose. IT leaders in big companies don't have neophilia; they have neophobia. What is new can cost them their jobs.
I'm not a pessimist at all; I think this technology will lead to massive changes. But one thing it won't ever change is human nature. Human nature is remarkably sticky, and despite what the modernists say, we aren't modern. What we call civilization is just one step out of the jungle (but armed with nuclear weapons and supercomputers).
In my own investing and operating, I see two factors that create a tension, and that tension is the one to watch to see how human nature tries to resolve it:
1. Human nature rarely changes
2. Technological acceleration is no longer a prediction, but a reality
What do you think? Am I right? Am I wrong? Some secret third thing?