READ MY NOTICE TO EXPRESS MYSELF
AS A FREE ADULT CITIZEN OF THE STATE OF TEXAS

Most days, Karl Jaspers’ “state and society” is not just an abstract chapter heading in The Idea of the University (1959); I feel it as an uncanny pressure in my shoulders each time I must log into the many platforms before I can greet a single colleague. As a fairly new PhD (May 2024) now doing both academic and administrative work in Texas public universities, I live that tension from three directions at once: as a scholar pursuing existential critique of the present world (as on this blog), as an adjunct teaching philosophy inside a massive state apparatus at several schools, and as a low‑level administrator obligated to keep these very platforms humming.[1][2]

State and society as platform order

Jaspers says the university exists only “through the good graces of the body politic,” only where the state decides to tolerate a space dedicated to truth that does not immediately serve its own power. Sitting here in the twenty‑first century, I cannot pretend “the state” is just the superstructure ordered from DC, Austin, or even the local system regents; it is a mesh of federal/state agencies, funding formulas, corporate vendors, and ranking systems—all converging as a platform order that grants or withholds the conditions of my work.[2][1]

When Jaspers writes that society wants “pure, independent, unbiased research” somewhere within its orbit, he presumes a society capable of desiring something beyond short‑term utility. I serve a society that desires dashboards. Our “good graces” arrive as performance‑based funding tied to graduation metrics, as compliance audits run through vendor portals, as “student success” regimes that render my classroom in the same idiom as an e‑commerce site: clicks, logins, time‑on‑task.[1][2]

So when he warns that the university “can only live where and as the state desires,” I now hear: it can only live where and as the data pipelines desire. The will of “society” has been coded into algorithms, and my task as existential phenomenologist is to read those interfaces as symptoms of a deeper metaphysics of control.[1]

The university as non‑state inside the platform‑state

If we once spoke of “bearing witness to truth,” we are now asked to “evidence outcomes.”

Jaspers’ phrase “state within the state” has always struck me as both hopeful and perilous. He imagines a place where some of us are excused from responsibility for daily politics precisely so that we can bear “unlimited responsibility for the development of truth.” In my own institution the arrangement is stranger: we have become a platform within the platform‑state.[1]

Although it ultimately will happen, the state legislature does not really need to tell me what to teach when the learning management system already formats how “teaching” must appear: discrete modules, quantifiable “engagement,” automated nudges. Nor do the regents have to issue explicit directives about scholarly value when citation databases and grant dashboards silently rank each of us by impact. Of course they will; they just do not need to. The techno‑systems are already clicking along to do it, especially with the help of over‑compliant, suit‑averse legal counsel who can overrule even an executive administrator. However and by whomever it is done, my academic freedom is increasingly exercised inside user agreements and terms of service.[2][1]

Jaspers insists that the university should meet politics as an intellectual conscience, not as another party machine. In my setting, the risk is not so much crude propaganda as slow capture: a thousand small “best practices” that encourage me to surrender judgment to predictive analytics and student‑risk scores. The conscience of the university is being refactored into a set of metrics. If we once spoke of “bearing witness to truth,” we are now asked to “evidence outcomes.”[2][1]

From the standpoint of the improvising self (Existenz), this is a new boundary situation. The question is no longer only whether we submit to a party line, but whether we consent to let the platforms define what counts as reality, risk, and success.

Supervision as algorithmic care

…at a certain point, more data simply means more surveillance.

Jaspers distinguishes sharply between necessary state supervision—protection of independence, enforcement of standards—and illegitimate interference when governments demand propaganda or ideological conformity. My daily experience is of a third thing: algorithmic supervision masked as care.[1]

As an administrator, I sit in meetings where we are urged to “leverage data” to identify at‑risk students, intervene early, and prove to the state that we are good stewards of public money. On my faculty days, I watch the same tools reduce students to red‑yellow‑green risk icons. This is not neutral. It is a reconfiguration of responsibility: I am answerable not to living students as co‑Existenzen but to the dashboard that claims to know them better than they know themselves.[2][1]

For Jaspers, exams and grades already threatened to become a “busy routine of tests and marks” that obscures real formation of mind. In my world, that routine has been naturalized as continuous assessment, real‑time analytics, and AI‑generated feedback. The supervision he wanted to keep in the background has moved into the foreground of every click, and it speaks in the language of benevolence: we only want them to succeed.[2][1]

Here my queer, existential suspicion comes alive. I have long argued that we must queer the binaries that govern sex, gender, and family; now I must queer the binary between “care” and “control” in educational technology. The platforms insist that more data equals more care. My phenomenology of the classroom tells me that at a certain point, more data simply means more surveillance.[2]

The “technical faculty” in the age of AI

Whose ease is being purchased? Whose struggle is being erased? Whose purpose is being pre‑written by predictive models?

When Jaspers takes up the “technical faculty,” he is thinking about technology’s emergence as a fourth great faculty, alongside theology, jurisprudence, and medicine. He sees technology as a new power that reshapes our environment and threatens to become a “giant” beyond human control, yet he also imagines that the university might integrate this power into a renewed “cosmos of knowledge.”[1]

I write as someone whose everyday labor is already mediated by that giant. Our “technical faculty” today is not just engineering or computer science; it is the entire layer of IT, ed‑tech vendors, AI‑development labs, and “innovation” offices that run the platforms upon which all other faculties now depend. The technical faculty is everywhere and nowhere: hired out to contractors, embedded in cloud services, and only dimly visible in the help‑desk ticketing system.[1][2]

Jaspers insists that a new faculty deserves its place only if it connects to “universal ideas,” if it becomes a basic science rather than a pile of skills. That criterion gives me a way to triangulate my own position. As a recent PhD, I still carry the memory of long, slow philosophical study; as an instructor, I watch AI tools creep into course design; as an administrator, I am asked to approve pilots of the next big platform. The question I must ask in each role is simple and terrible: does this particular deployment of AI join the universal conversation about truth, freedom, and co‑existence—or does it merely optimize our ability to sort, rank, and predict human beings?[2][1]

To respond honestly, I have to resist the temptation to treat “technology” as a neutral addition. Jaspers already saw that technology “traps” us in a planned yet chaotic transformation of life. In my milieu, this trap is glossy and gamified. It rewards quick adopters, punishes skeptics with accusations of being “anti‑student,” and wraps the whole thing in the rhetoric of access and equity. My queer phenomenology must strip this rhetoric down to its dyads: Struggle–Ease, Chance–Purpose. Whose ease is being purchased? Whose struggle is being erased? Whose purpose is being pre‑written by predictive models?[1][2]

Intellectual aristocracy vs. metric nobility

The temptation is always to translate my own project—queer faith, wyrd hope, loving struggle—into the language of outcomes, deliverables, and “impact.”

Jaspers defends something he calls an “intellectual aristocracy”: a minority of people, from any social background, who are called to the “highest will” for truth and whose presence should shape the university. He is equally clear that faculty self‑governance tends to favor the “competent, the second‑rate” and that state supervision may sometimes be needed to prevent mediocrity from reproducing itself.[1]

In my world, there is still an aristocracy, but it is metric, not intellectual. Those who attract large grants, high citation counts, and respectable rankings are treated as our natural leaders, even if their work never once touches the boundary situations of guilt, struggle, or death that define the human condition. The old departmental clique has been supplemented by an algorithmic clique: recommendation systems, journal impact factors, and AI‑assisted search engines that elevate the already visible and render queer, experimental, or community‑rooted work nearly invisible.[2][1]

As someone who inhabits teaching, administration, and new‑PhD precariousness at once, I experience this as a three‑sided pressure. I am supposed to model intellectual seriousness for students, justify expenditures and workloads to supervisors, and build a “profile” that will be legible to metrics that have no place for Wyrding, loving struggle, or transistance. The temptation is always to translate my own project—queer faith, wyrd hope, loving struggle—into the language of outcomes, deliverables, and “impact.”[2]

From a Jaspers‑inflected queer standpoint, that temptation is itself a boundary situation. Either I allow the metric aristocracy to define what counts as thinking, or I wager that there is still room, even now, for an aristocracy of Existenz: those who merge their personal and intellectual existence, who accept the loneliness of thinking beyond the lifeworld’s current scripts, who insist on co‑existence rather than mere connectivity.[1][2]

Politics, propaganda, and AI publicness


Propaganda is not only crude slogans; it is any configuration of language and image that forecloses boundary situations, that makes guilt, struggle, suffering, chance, and death appear as mere technical challenges to be managed by experts.

Jaspers is careful with politics. He accepts that politics must be studied at the university but warns that when active partisan struggle invades the institution, the “idea of the university itself” suffers. He defends academic freedom, yet he also warns professors not to cloak casual political opinions in the authority of scholarship.[1]

My context is saturated with politics and depoliticization at once. State‑level culture wars dictate what can be said about race, gender, and sexuality, even as our internal communications are smoothed into neutral “messaging” fit for social media. The platforms through which we speak—learning systems, email, video conferencing, AI chatbots—are not neutral carriers. They are businesses with their own stake in what is visible, profitable, and safe for advertisers.[2]

As an existential philosopher working in Texas, I cannot pretend that this is an external problem. My own queer, trans, and abolitionist commitments are risked every time I speak plainly in the wrong meeting, every time I push back against a “content‑neutral” policy that in practice punishes already vulnerable students. At the same time, I am told that any discomfort AI systems cause—say, in proctoring disabled or gender‑nonconforming bodies—is simply a technical issue to be patched in the next update.[2]

Here Jaspers’ distinction between truth and propaganda helps me. Propaganda is not only crude slogans; it is any configuration of language and image that forecloses boundary situations, that makes guilt, struggle, suffering, chance, and death appear as mere technical challenges to be managed by experts. An AI‑saturated university risks becoming a propaganda machine precisely by offering frictionless, “personalized” experiences that never let students or faculty encounter the shock of their own freedom.[1][2]

My tripartite triangulation


Jaspers wrote that when the state no longer wants the idea of the university, the university must “keep alive its ideal in secret” and await the fall of the regime.

Because I sit in this institution as teacher, as administrator, and as only‑recently‑disciplined scholar, I occupy a tripartite vantage point that I can no longer ignore.[2]

  • As teacher, I feel the pull of loving struggle with students who show up exhausted by work, trauma, and algorithmically‑curated distraction. I want to protect their boundary situations from being harvested as training data.[2]
  • As administrator, I am implicated in decisions that will either deepen or resist platform capture: which vendors we sign with, which metrics we elevate, which “pilot projects” we normalize.[1][2]
  • As scholar, I am responsible to a tradition—from Jaspers and Ortega to Freire, bell hooks, and Graeber—that compels me to name control where it masquerades as progress, and to experiment with queering, Wyrding, and transistance as small, stubborn refusals of the given.[1][2]

Jaspers wrote that when the state no longer wants the idea of the university, the university must “keep alive its ideal in secret” and await the fall of the regime. My wager is slightly different and very local. Here, in this Texas public institution where platform capitalism, AI, and the datafication of learning already frame our days, I seek not secrecy but a different kind of visibility: small publics of co‑Existenz where we can speak frankly about dashboards as destiny, about surveillance as care, about metrics as aristocracy.[2][1]

This is my present Wyrding of Jaspers: to take his “state within the state,” his “technical faculty,” and his “intellectual aristocracy,” and re‑live them as boundary situations in a platform university—queering their binaries, mapping their dyads, and refusing to let the data have the last word on what it means to teach, to learn, and to become oneself.[1][2]

More to come soon.


SOURCES
[1] K. Jaspers. (1959). The Idea of the University. Boston: Beacon Press.
[2] K. Brown. (2024). Queer Faith, Wyrd Hope, Loving Struggle. Doctoral dissertation. Denton, TX: University of North Texas.

Published by Keith "Maggie" Brown
