Some may consider this blog post hyperbolic, and if that’s the lens you use to make a judgment, that’s your prerogative. However, these thoughts are about the business of higher education and, in particular, about how that business may be at risk.
Education as a product is something I’ve written about before (see my LinkedIn articles On the Disruption of Higher Education, and a follow-up called Dad Weighs In . . . ). Some of what I discussed there, concerning both politics and purpose, and especially the cost of higher education, continues to unfold as expected. What has changed since those earlier pieces is the speed of the advancement of artificial intelligence (AI) in higher education, combined with a rather turbulent political landscape that’s eroding the very foundation of a true liberal arts education.
These two ideas, AI and politics, are not as unrelated as they may seem, especially within the commodified version of higher education toward which we are drifting. With large language models now capable of holding massive amounts of data, it’s tempting to think that everything can be loaded into the machine and the ghosts therein will first learn to speak for us and eventually learn to think for us. If that’s true, then the value of access to AI as a tool might seem to call into question the value of a liberal arts education as an idea.
This becomes even more concerning when the word “liberal” is willfully misinterpreted to serve political goals rather than to communicate its purpose: developing within human beings the sentient intellectual capacities of reason and judgment. The real political objection to the “liberal arts” has never really been about the word liberal, but instead about its heterodox challenges to various conventions. Such challenges are uncomfortable for those who wish to control various constructs (including content) for profit or other reasons. Suspension of the aforementioned intellectual capacities is therefore convenient politically, and maybe even economically; but it ultimately creates a closed-loop system of orthodoxy from which we may not recover.
Such is the backdrop for a new administration. It’s always been worrisome that the party that supports the current President has a long history of anti-intellectualism, and as Julian Zelizer notes in his recent post in FP (Foreign Policy Magazine), “attacks on expertise have been an essential element of Republican politics for decades”, going back to George W. Bush, and even as far back as Richard Nixon. But this newest version of Republican anti-intellectualism, spearheaded by Trump, is particularly troublesome in that basic facts are now being surrendered to alternative ones in order to maintain popular, if factually challenged, political support.
This last part is the most concerning, especially within the context of the transition of higher education’s purpose that I’ve written about in past articles. As we continue to wrestle with the concept of basic human rights in an increasingly commodified world, where our very existence seems more and more dependent on some external financial agency (think “too big to fail”), we are at the limits of what we had heretofore assumed was a foundation of our democracy: the idea that education is a public good that serves our long-term interests as a country.
The barrier to the erosion of this idea of education as a public good has often been the faculty of various institutions, holding fast to the conviction that a liberal arts education is the best way to prepare future generations for their roles and their leadership. Now we are at a crossroads of sorts, where students are being encouraged (mostly at the administrative level) to use AI in their studies as part of a “competitive” or “productivity” push, and faculty expertise is being supplanted. As we continue to trade privacy and other basic human needs for the sake of various electronic conveniences, the real worry we should be considering is the advancement of AI.
So, for me, it’s not so much the worrisome Terminator scenario, where machines are becoming sentient and taking over; it’s that we’re gradually surrendering generational sentience and giving up without a fight.