I agree with you; technology is outpacing culture, and culture will suffer. The Matrix will be less a movie and more a foreshadowing of our cultural demise.
My job has me positioned between programmers / magicians who are constantly creating and medical users / audience members who are still processing the first trick of the evening. I have daily internal discussions where I say we're building Teslas for people driving Edsels. Our reach has exceeded the grasp of most users because there is not enough time to process and integrate. The operational cadence is so quick that the class I build today to train users will be obsolete in weeks, not years.
Personally, I advocate for smaller, built-for-purpose language models rather than general-purpose LLMs, because my clinical assistant doesn't need to know how to make a margarita, reinvent a Shakespearean sonnet in the style of Stephen King, or scrub a nuclear reactor. Having all of that knowledge in one LLM is unnecessary, expensive, and creates more opportunity for hallucination / confabulation. If the user's discernment isn't there, the falsehood served to the user gets propagated. It's a point of failure waiting to spew like a college freshman at their first kegger.
Giving any AI / ML platform access to the world's knowledge without compassion and empathy as buffers will not end well for humans. Exhibit A is the push / pull between The Architect and The Oracle in The Matrix. Or the little boy in the movie Looper. Power without comprehension or compassion is hazardous to our collective health.