A Cynical Argument for the Liberal Arts (Parts 7-12)
David Eubanks
2014-06-29 00:00:00


A Cynical Argument for the Liberal Arts (Parts 0-6)

7

These individuals, however, are bound by the rules of the system, and recent history shows that state juggernauts are capable of enormities. The epistemological and ontological constraints of system-work were described by George Orwell in his 1946 essay "Politics and the English Language."



When one watches some tired hack on the platform mechanically repeating the familiar phrases -- bestial atrocities, iron heel, bloodstained tyranny, free peoples of the world, stand shoulder to shoulder -- one often has a curious feeling that one is not watching a live human being but some kind of dummy: a feeling which suddenly becomes stronger at moments when the light catches the speaker's spectacles and turns them into blank discs which seem to have no eyes behind them. And this is not altogether fanciful. A speaker who uses that kind of phraseology has gone some distance toward turning himself into a machine.

On the other hand, our species has a rich history of struggle within and without, and higher education--particularly the liberal arts--is in a position to help new humans adopt an intentional stance toward the systems they find themselves in, as well as to create an awareness of, and response to, the internal biases resulting from our evolutionary and social history. Classical Cynicism, read closely through its charge to "debase the coin of the realm," is a solvent for the calcification of system-rules. It is also too dangerous an idea to administer to young minds without care.



My argument has been that education that resembles jobs-training lacks the most essential ingredient that young minds need in order to be free. Liberal arts education does not have a single definition that everyone will agree to, but broadly speaking it encompasses intellectual discovery of the "big questions" of what it means to be human. The distinction "liberal arts versus jobs training" is described by Noam Chomsky as [source]:

The first kind of education is related to the Enlightenment -- the highest goal in life is to inquire and create; search the riches of the past; try to internalize what is significant; carry the quest further -- help people learn how to learn on their own; it's you the learner; it's up to you what you will master.


The second kind of education is related to Indoctrination - from childhood young people have to be placed into a framework where they will follow orders that are quite explicit.

I have framed the "jobs" versus "liberal arts" dichotomy as it often appears in the media (part of their epistemology is to make things that are similar seem more different so that they can create controversy and sell web-clicks), but in reality those who work in higher education are more nuanced about it. For example, "Bentley University Tries to Make Business and Liberal Arts Pay Off" at wsj.com:

Colleges are in the cross hairs of a debate over the relative value of a liberal arts education versus a business degree. But Gloria Cordes Larson, the president of Bentley University in Waltham, Mass., says students don't need to choose just one path.


"It's not about pitting lifelong learning skills against professional skills," says Ms. Larson, a former lawyer and economic policy adviser at the state and federal levels, who took the helm at Bentley in 2007. "A college degree should reflect both."

There are also examples of the opposite. One of our projects at my institution is creating pedagogy that engages students with external audiences in meaningful ways (see "21st Century Liberal Arts"). My math classes do a survey of campus traffic each semester and report to the safety council, for example. In networking with other colleges, we found that some professional programs, for example in nursing and accounting, eschew a portfolio-like approach because all they really care about is the pass rates on standardized tests. This is an example of how system-reward-seeking produces Cynicism as a matter of course. Presumably what we really care about is having nurses who can perform their many varied tasks with professionalism, skill, and humanity. The system signal for this (the coin of the realm) is a test score, which is not a very good way to assess what we want to know. Optimizing test scores is always a slippery slope, and it inherently adds noise to whatever (little) signal was there to begin with. See "Former Atlanta schools superintendent reports to jail in cheating scandal" for a full-blown example. This is Cynicism on two levels: first accepting a poor epistemology (using tests instead of demonstrations) and then the secondary noise resulting from "optimizing" this easily-debased signal.



Broadly speaking, liberal arts education is our hope for injecting critical analysis into society. This presupposes that we care about things like preventing atrocities and maintaining individual freedoms. If you buy that, the next question is how are we doing with current methods, and what can we improve? 




8

If we want to claim that liberal arts education is as important as jobs training, it's fair to ask for specifics. Even employers say they want 'critical thinkers', so what's the real distinction in practice? A definitive answer is well beyond my scope here (and beyond my ability to deliver), so I will stick to the single idea under consideration: in what ways does liberal arts education, as typically practiced in private colleges, inculcate Cynicism? It's easiest for the moment to take the (straw man) position that jobs training does nothing in this regard, but we can revisit that later.



At first glance, the picture isn't very encouraging. Cynicism, you'll recall, is a lived philosophy, not given to theory. The academy is deeply steeped in theory, and for the most part asks students to demonstrate accomplishment by writing things down. There are exceptions: labs and performances, visual arts, practica and internships, travel, and so on. Compare this list to Henry A. Giroux's recent op-ed "Noam Chomsky and the Public Intellectual in Turbulent Times," where he writes:



Chomsky is fiercely critical of fashionable conservative and liberal attempts to divorce intellectual activities from politics and is quite frank in his notion that education both in and out of institutional schooling should be involved in the practice of freedom and not just the pursuit of truth.


 [...]


On higher education, Chomsky has been arguing since the '60s that in a healthy society, universities must press the claims for economic and social justice and that any education that matters must not merely be critical but also subversive.


Criticism leans on (little-c) cynicism, impugning sources of information in order to construct new ones. The deprecation of "pursuit of truth" is particularly biting. The "practice of freedom" and "subversion" require action, and (big-C) Cynicism is a perfect tool for both.



I think that the homeopathic Cynicism in liberal arts experiences is superior to the straw person "jobs training," but it's hard for me to muster much enthusiasm for that argument. Chomsky's charge hits home there. However, as I mentioned before, I think the liberal arts provide a safe haven for students who come to us already subversive, who already want to practice freedom and not just write papers about it.



Internal practice of freedom is the second of the two domains we identified back in Part Two. We can subvert the realm by debasing its coins, but we can also use Cynicism introspectively. Here, liberal arts education is obviously superior to jobs training. For example, the study of the history and practice of philosophy might be compared to "technology for the mind." Yet, for some reason, it is one of the liberal-artsy targets that business newspaper writers find irresistible:



Most college presidents would love to find a practical use for philosophic studies and for the rest of the liberal-arts curriculum. Colleges are expensive. Reading Kant is hard -- and he doesn't seem to be the perfect preparation for a competitive job market. [source: wsj.com]


The article quoted above is more nuanced than this excerpt suggests, but the sentiment is one often echoed with regard to the study of humanities subjects: how can I convert this into dollars? There is a deep irony here, because the facets of material ambition are richly illustrated in philosophy, literature, art, and history. Yes, we want things. Why? What can we do about it? What are the consequences?



If the realm just needs workers, and the purpose of education is to produce them, there's no advantage over simply employing robots instead. With robots, we can engineer their internal signals so they don't become subversive (Asimov's visions aside). If we want, we can program them to go home after work, drink a beer and watch TV. The point is that if the quality of mind is not a consideration, we don't need people at all. This is the image I attempted to conjure in Part One. The jobs argument is simply defeated by asking "why" until the answers stop coming. If you try the same attack on a humanities-based education, you just get the humanities recapitulated.



Next time: mind-diving to see how Cynicism works inside the head, and how this is a liberal arts thing.



Finally, here's your daily dose of real-world Cynicism: "Cisco CEO warns Obama NSA 'load stations' threaten the entire tech industry."





9

The awareness we have of the world is mediated through signals from sensory organs and the meaning we make of these. As a practical matter, the information we receive has to be compressed in order to make sense of it. For example, we receive more than a million bytes per second through vision alone, and formulating cause/effect hypotheses about the world without compression would be practically impossible. Right now there is a fork lying on the table to my right, but the tines are hidden by a bag of dried fruit. That sentence comprises a hundred or so bytes of information, but communicates many possible ways to visualize it (decompression)--picking one makes it concrete enough to build a narrative from. This is only possible because of very high data compression.
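To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python. The data rate for vision is the order-of-magnitude figure used above, not a measurement, and the sentence is just the fork example restated.

```python
# Rough, illustrative arithmetic: compare the (assumed) raw data rate of vision
# to the size of a short verbal description of a scene.

bytes_per_second_vision = 1_000_000  # order-of-magnitude figure from the text, not a measurement

description = ("There is a fork lying on the table to my right, "
               "but the tines are hidden by a bag of dried fruit.")
description_bytes = len(description.encode("utf-8"))

# One second of raw visual input versus the ~100-byte sentence summarizing the scene.
ratio = bytes_per_second_vision / description_bytes
print(f"Description size: {description_bytes} bytes")
print(f"Approximate compression ratio: {ratio:,.0f} to 1")
```

The exact numbers don't matter; the point is the several-orders-of-magnitude gap between what the senses deliver and what a narrative retains.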



We also have simplified internal signals. We can be "hungry for red beans and rice," but when our stomach grumbles, it just signals a generic need to be indulged. Pain too certainly comes in flavors, but an itch on the back is pretty similar to an itch on the leg--the most important information (an itch, a burn, a bug crawling up your neck) is signaled efficiently. By contrast, imagine if you were presented with a full account at the cellular level of all the relevant activity and had to sort through it all for meaning.



Perhaps one of the fundamental attributes of being human is the ability to recognize perceptual signals on this meta level (as abstractions, in other words) that can be manipulated. New ones can be created, for example by slipping small magnets under the skin to directly feel electrical/magnetic flux, or developing a taste for Scotch whisky. More familiar is the interdiction of signals, as with pain medication. A more fanciful idea is described in NYmag's "Is It Possible to Create an Anti-Love Drug?".



Two heirs to classical Cynicism, the Stoics and Epicureans, addressed internal signals. For example, ideas about the nature of grief and what to do about it are described in "How (And Maybe Why) To Grieve Like an Ancient Philosopher". The signals-based viewpoint also leads directly to the idea that death is not something to be feared, because it is simply an absence of signals. Contrast this to religions that recommend optimizing actions in life so as to produce the attractive signals in the "afterlife."



We can think of internal signals as "coins of the realm" and proceed to debase them. Drug addiction is one way to do that, but meditation, counseling, and meta-cognition can also subvert our out-of-the-box internal signals. Traditional liberal arts curricula explore this idea from many angles, even if it's not usually packaged that way. For example, the tension between intuition and rational thought (signals of what's real) is a topic in psychology (e.g. see Daniel Kahneman's book Thinking, Fast and Slow). Add social, political, ethical, and biological signals: these are all explored from innumerable angles in the sciences and humanities. These perspectives--if taken seriously--can create in the learner a sophisticated meta-cognition that can be practically applied as an existentialist project. It goes like this: all signals are abstract by definition, which means there is a fundamental arbitrariness to them from the point of view of the receiver of the signal. Given the possibility to prefer some signals over others, we can imagine a project of internal engineering to attenuate or amplify signals according to our most demanding desires.



This is a caustic process, and fully as dangerous as any Cynical enterprise. If one strips away too much, tossing aside all social and moral guides, for example, one could become a sociopath (this resembles Marquis de Sade's Cynical project, as described in The Cynic Enlightenment, starting on page 106). Or strip all the signals away and you get nihilism or suicide. But, more positively, the ongoing process of constructing a personal ontology can produce a freedom of mind that was modeled by Diogenes.



Liberal arts curricula expose internal signals and ways of attacking them, with relativism, post-modern thought, critical theory, and simply the exposure to many ways of thinking, historical decisions, and thought experiments. And so on. As with the academy in general, the approach is mostly theory and exposition rather than active mind-engineering. There is undoubtedly more colleges could do to enable self-subversion, but it would also be dangerous. I think there is some middle ground where we could operate in sandbox mode, so that students could gain some experience, and there are some experiences like this available. For example, an assignment to sleep on the street for a couple of nights, or to practice asceticism in some form. My daughter's high school history teacher runs a weeks-long project that consists of secretly identifying students as 'communist' or 'capitalist' and prohibiting one side from communicating with the other. Students don't know which side they are on, and the teacher has spies everywhere--he shows them photos and social media screenshots of their interactions and deducts points accordingly. This is Cynical in that it undermines normal discourse--designed to loosely model The Terror, I'm sure. The potential benefits to students include reflection on the active management of feelings of unfairness or even fear. Anyone who can't see the applicability to a work environment isn't trying.



Beyond dramatic life-changes, internal freedom to attenuate and amplify signals has the potential to produce better workers too. How many of our new graduates are going to fall into their dream jobs right away? How many workplaces are unfair to employees or have abusive bosses or mean co-workers, or arbitrary rules or demeaning requirements? What, exactly, in "jobs training" is supposed to prepare a young mind for these assaults? Wouldn't it be better if they'd read and internalized The Prince? Wouldn't it be better if they knew about Foucault and the evolution of ontology and power, and how signals are ultimately arbitrary and malleable, and constantly being subverted by those who can do so to further their own ends?



Well, no. That's probably not what the employer wants. Foxconn's replacement of humans with robots apparently involves collaboration with Google to design an appropriate operating system. This is, in effect, an attempt to specify in code what a perfect employee is. You can bet there won't be a subroutine named for Machiavelli or Diogenes. (Update: apparently Google's self-driving cars have never gotten a traffic citation.)



Next time: signals and subversion at work, or "Diogenes as assistant to the regional manager."





10

In contrasting liberal arts education and 'jobs training', I've compared the latter to the construction of robots. This is fair for some kinds of jobs--like assembling parts to make a consumer product--where it's clear that automation is well-advanced, but what about those "high-paying" jobs that college is supposed to prepare people for? According to AAC&U's LEAP initiative, which included surveying employers [source], the liberal arts come out looking pretty good. From the report:

 





We should note, however, that:


The mission of the Association of American Colleges and Universities (AAC&U) is to make liberal education and inclusive excellence the foundation for institutional purpose and educational practice in higher education.


Also, what people say on surveys is not necessarily indicative of how they act. I went looking for contrary opinions, and found "What's a Liberal Arts Education Good For?" at Huffpost.com. This article reinforces the survey with a philosophical argument, but some of the comments that follow are from unhappy liberal arts graduates. Here are some edited samples, emphasis added:



This is the same sort of garbage that got me where I am today, the poorhouse.

A liberal arts education is a hideous waste of time for nearly all those who get one. It prepares the graduate for absolutely nothing. If you emerge from 4 years of college with a degree and no one is recruiting you for a job, you just wasted 4 years of life, a lot of money and a whole lot of effort. --newsreader64


 



Liberal arts do not translate to making any money so that had better not be a factor in the choice. It is for rich people. --escobar


Recent personal events have led me to a rather different conclusion. I have a BA from a small liberal arts college, and an MA in a mushy semi-science (anthropology). [...] Now, without a professional degree, I can't even get an interview for positions which I could do with ease. I suspect this has a lot to do with the sheer volume of job-seekers on the market and the handy shortcut that a professional degree offers the HR person tasked with reading hundreds of resumes. So, despite my fervent belief in liberal arts, I am contemplating a return to school to get a law degree. -- kpod




The comment above has a kernel for the Cynic to chew on, and more fodder is served up by this last one:



As a newly minted grad with my Masters in History, fortunate enough to be teaching at a community college this semester, I am a big booster for Liberal Arts. I spent the first 25 years of my life pursuing a very successful career in a fortune 500 company and always wondered what it was about engineers and MBA's that left me feeling that some aspect of their education was lacking. After returning to school and starting with an associates degree in Liberal Arts the answer is now very clear. On the whole most of them had had the creative skills driven out of them by empirical doctrine and a value system of conformity. Give them a project or a goal and they were fine, immoral to a large degree when it came to people management but perfectly capable of meeting their objectives. --Paulo1


Although these samples are not guaranteed to be representative, it's worth considering these two points:

1. A professional credential acts as a marketable signal that gets applicants past HR screening, and its absence keeps liberal arts graduates from even getting interviews.

2. Study in the humanities leaves graduates with creativity and humanity that narrowly professional training tends to drive out.

The first of these is just signalling, and as such is amenable to cynical or Cynical attack--something liberal arts colleges ought to be good at. The second point is an argument that education in the humanities produces graduates with more humanity, and gets back to the 'employee as robot' metaphor.


 


All of this boils down to the argument that liberal arts education can create valuable outcomes (those in the survey at the top), but that these are not easily marketed to employers. It's as if employers say they really want to eat healthy food, but belly up to the fast food counter in practice. Next time I'll walk deeper into the weeds. I find the more substantive issue more interesting: what place does a Cynic have in a bureaucracy?




11

The line of thought so far is that:



1. Cynicism is powerful and can be beneficial or detrimental to the individual, to society, and to employers,

2. Cynicism is better learned or cultivated at liberal arts colleges than in professional programs, and in particular, through diluted exposure, graduates are more likely to be responsible critics and users of the philosophy,

3. This is not obvious to employers, despite what they say on surveys, and

4. The employability deficit can be overcome by liberal arts colleges themselves.



We start by showing that organizations breed Cynicism. Because I have cast classical Cynicism as a solvent for epistemology, allow me to describe organizations accordingly. The Darwinian understanding of the production and memory of novelty is fundamentally about information and its relationship to the world, which is an ideal tool for us here. In biology, informational signals are transmitted through time by genetic and epi-genetic states that get translated into phenotypes (the bodies of plants and animals and their behavior), which then compete with each other for survival and reproduction. The result is a competitive truth-finding exercise that randomly explores the natural world (including the ecology itself) for relative advantages. Evolution only proceeds non-randomly where these truths are discoverable. For example, it may be in the long-term interests of bacteria to figure out how to travel to other planets, but this ability may not lie within the discoverable landscape of genetic traits.
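For readers who like to see the mechanism, here is a minimal Python sketch of this "truth-finding where truths are discoverable" idea, using a toy bit-string model of my own invention; the landscape names and numbers are illustrative assumptions, not anything from evolutionary biology proper. Selection makes steady progress on a landscape that rewards partial solutions, and none at all on a needle-in-a-haystack landscape, even though the mutation machinery is identical.

```python
import random

def evolve(fitness, genome_len=20, pop=30, generations=100):
    """Minimal mutation-and-selection loop over bit-string 'genomes'."""
    population = [[random.randint(0, 1) for _ in range(genome_len)] for _ in range(pop)]
    for _ in range(generations):
        new_pop = []
        for g in population:
            child = g[:]
            i = random.randrange(genome_len)
            child[i] ^= 1  # flip one random bit
            # keep the variant if it is at least as fit as its parent
            new_pop.append(child if fitness(child) >= fitness(g) else g)
        population = new_pop
    return max(fitness(g) for g in population)

# A "discoverable" landscape: each correct bit adds a little fitness,
# so selection can climb it one mutation at a time.
smooth = lambda g: sum(g)

# A "needle in a haystack": fitness appears only when the genome is perfect,
# so intermediate mutations give selection nothing to work with.
needle = lambda g: 1 if all(g) else 0

random.seed(0)
print("smooth landscape, best fitness:", evolve(smooth))   # typically near 20
print("needle landscape, best fitness:", evolve(needle))   # almost always 0
```

In the essay's terms, the second landscape is a truth the organism has no way to discover, so evolution there is just noise.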



Modern organizations are similar to biological entities. They encode information into processes, procedures, paperwork, job definitions, and so on, which I'll refer to as an ontology. The ontology loosely represents the way the organization "understands" the world. Of course, it's not really intelligent the way people are--it's more like a machine, which is the metaphor I began with. The machine has a certain amount of randomness in its behavior, but it will probably have well-defined ways of perceiving the world and encoding those perceptions into the bureaucratic language of its ontology. For example, a team of accountants that produces an audit report creates an official understanding of the organization's monetary value, cash flow, and so on. This information can be transmuted into reality too; for example, a company with solid financial statements can get a loan to build a new factory. There's nothing in the ontology that requires morality (Google's "don't be evil" aside), which lies within individuals, not the organization per se.



Just like in biological ecologies, most organizations compete for limited resources. This is truth-finding when advantages are discoverable, which implies that they can be perceived by the organization. Those with limited ability to understand the world will be at a disadvantage. You can watch this play out in real time at a basketball game. Motivations are clearly understood through the rules of the game, and the ways of knowing success are deliberately clear--the basket even has a net hanging from it so that it's obvious when points are scored (compare this to rating figure skating). This creates a competition between the two teams for truth-finding, which in this case means finding more effective ways of playing the game (better strategy, tactics, training, players, etc.). This would not be the case if points were scored entirely at random. From this point of view, the fans turn out to see the evolution of team ontologies. These unfolding histories are the subject of counterfactual conjectures ("what the coach should have done was..."), which the fans are unlikely to think of as metaphysics, but it fits the mold.



Cynicism is life and death to organizations. If an organization's conception of reality is sufficiently undermined, say by an enemy general using deception, it may make bad decisions. It can also easily fool itself. I wrote a series of articles about this for the Institute for Ethics & Emerging Technologies, which you can find here, here, and here, so I'll pass over this point, known as "wireheading" in the computer science literature.
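For readers who haven't met the term, here is a minimal sketch of wireheading, with hypothetical names and numbers of my own choosing: an agent scored on a measured signal rather than on the true state of the world will, if it can, improve the measurement instead of the reality.

```python
# Toy "wireheading" sketch: the agent's score is a *measured* signal, not the
# true state of the world. Names and numbers are hypothetical, for illustration.

state = {"true_quality": 10, "sensor_bias": 0}

def measured(s):
    """What the organization's paperwork reports."""
    return s["true_quality"] + s["sensor_bias"]

def do_real_work(s):
    """Improves reality, and the measurement along with it, a little."""
    return {**s, "true_quality": s["true_quality"] + 1}

def tamper_with_sensor(s):
    """Improves only the measurement, and by more."""
    return {**s, "sensor_bias": s["sensor_bias"] + 5}

for step in range(3):
    # A greedy optimizer of the measured score always prefers tampering.
    state = max([do_real_work(state), tamper_with_sensor(state)], key=measured)
    print(f"step {step}: measured = {measured(state)}, true quality = {state['true_quality']}")

# The measured score climbs while true quality never moves: the organization
# has debased its own coin and no longer knows anything about the world.
```

The test-score optimization from Part 7 and the gamed ratings agencies later in this part are human versions of the same loop.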



But Cynical attacks on the reality of others are so effective that they can be a means of keeping an organization alive too. This morning a colleague walked in, and in our conversation volunteered the following story. I have no way to verify it, but it illustrates the point.



"Joe" gets a degree from an online college, and is delighted when an school-arranged internship turns into a real job upon graduation. He is happy at the job, but is fired after six months and a day for unspecified reasons. A new graduate is hired in his place. He discovers that the CEO of the company is also on the board of the college he graduated from, and hypothesizes that  the company is used to inflate gainful employment percentages for the college.


In this tale, the college is debasing what "gainful employment" means to the Department of Education. The next story should also be treated as apocryphal. It illustrates how tangled these signals get, and how Cynicism naturally emerges. The story was told to me by a historian friend who said it originated with someone in the State Department.



As the story goes, the leadership of the cold war USSR needed good information about the size of their economy. But they couldn't trust their underlings because Cynicism was a survival trait: tell the boss what he wants to hear. Instead of accepting the bloated over-optimistic estimates of their own people, they relied on the CIA to tell them the truth. On the other side of the world, the CIA had indeed calculated what they thought the size of the USSR's economy was, but the number was so small that they thought no one in Washington would believe them. So they artificially doubled the number. Therefore the Kremlin used an estimate of their own economy that was about twice as big as it should have been.


These stories, true or not, illustrate the kinds of games that are played within and between organizations. When they have happy endings, sometimes we call them "disruptive technologies" or "competitive advantage." On the other hand, sometimes we call them Enron and Bernie Madoff and mortgage-backed securities.



Bureaucrats are usually thought of as boring, but nothing could be further from the truth. They handle, with their copy-fluid-stained fingers, the neurology of the organization. The reality by which it lives and dies is contained in those forms and procedures for acting on forms, and the relationship between the content and actual reality is constantly being subverted. People (gasp) lie on paperwork to get what they want. A sufficient break with reality leaves the organization in a state like psychosis. Or like the dodo bird--choose your metaphor.



For a vivid development of a psychotic break with reality, read Michael Lewis's The Big Short, where he describes how the ratings agencies were 'gamed' to bless crummy investments with the official stamp of worth. These ratings are almost literally "coins of the realm," since they limit the behavior of institutional investors. A quote from page 98 of Lewis's book:



The big Wall Street firms [...] had the same goal as any manufacturing business: to pay as little as possible for raw material (home loans) and charge as much as possible for their end product (mortgage bonds). The price of the end product was driven by the ratings assigned to it by the models used by Moody's and S&P. The inner workings of these models were, officially, a secret: Moody's and S&P claimed they were impossible to game. But everyone on Wall Street knew that the people who ran the models were ripe for exploitation.


This is epistemological warfare, and you want the most capable Cynics on your side.



Update: The image below is taken from the SEC's 2008 report "Summary Report of Issues Identified in the Commission Staff’s Examinations of Select Credit Rating Agencies", and has been reformatted around this single bullet point.





12

Last time I compared organizations to biological organisms competing against Nature and against the rest of the ecology for survival. The battlefield is physical and virtual. Arm & Hammer's factories producing baking soda with less energy cost is good for the company. Convincing consumers that they need to buy a new box to put in the fridge every month is gold. Members of an organization are valuable to it in ways parallel to these two dimensions. Engineers that can improve efficiencies or design new products are valuable. So are accountants that can make profits tax-free. At the top of the organization, the role is entirely virtual. Generals push around symbols on a map while privates sweat in foxholes. A janitor who shows genius-level proficiency with a mop is not going to become CEO due to that skill. However, a CEO who doesn't understand how the physical world works--insofar as this affects the business--is probably not going to make very good decisions.



Higher education does a fantastic job of teaching students about physical reality (assuming said students want to learn about it). There's no substitute for experts in theory and practice, and the labs and equipment needed to engage physical reality in sophisticated ways. If you want to become an expert in what happens to molecules when you "ring" them with a sudden electromagnetic pulse, you can learn all about Fourier Transforms and whatnot, but you need a Nuclear Magnetic Resonance machine to actually do it.



The Enlightenment victory over shy Nature justifies the role of universities, but I think it also lends itself to the argument that education should be about physical stuff--learning how to stick needles in someone's arm or design a turbine blade--or low-complexity information-shuffling, like learning two-column accounting or how to integrate partial fractions. These are all safely science-y, easy to verify when accomplished, and straightforward to teach.



The victories and failures of The Enlightenment in the informational co-domain do not seem to be of as much interest in the public discourse on higher education. Ideally, this is where liberal arts education provides a benefit, but this message isn't being conveyed, and perhaps the institutions themselves haven't really internalized it.



With this lens in place, let's look at the signal-domain role of education as preparation for life in an organization. The latter might be a business, the military, a government bureau, or it might mean "to be a citizen," which can be restricted to a nationality or not. The Cynics invented "cosmopolitan," and we might agree that the highest calling of any educated person is to be of service to humanity as a whole (like Elon Musk, who brilliantly navigates both the physical and virtual landscape).



These respective roles are sometimes mutually exclusive. A citizen of a country may be at odds with a citizen of the world, and be the same person. Governments do bad things sometimes, and we might agree that the role of the citizen sometimes is to correct that in the name of some more abstract notion of what it means to be a citizen. It's the same with any organization.





Imagine this hypothetical advertisement from a college:



Our business school produces graduates that have the training to meet your most stringent demands in management, accounting, marketing, business law, international relations, and many other areas. In addition, they have been indoctrinated to be completely loyal to your organization, no matter how far you want to bend the law or even human decency--you can count on them to do the right thing!


This imaginary school is trying to guarantee that any cognitive dissonance in a new hire's mind between what the business wants done and any other role (e.g. citizen, human) will be resolved in favor of the business. I don't mean to demonize businesses with this example. In a real organization, including the military, loyalty is probably limited by intent. For example, a soldier swears to uphold the constitution, not to do what generals tell him/her to do, which allows a loophole for higher order goals (like preventing coups). The point is that it's important to an organization's survival to have a "signals" strategy in order to manage the virtual battlefield it competes on. And since most organizations still have humans in them, this means being intentional about the abilities and intentions of members or employees. The later Bush administration's Justice Department hires and fires did this rather crudely, and people noticed. Machiavelli talks about the idea in The Prince. Paraphrasing: when your enemies are physically beaten back is the best time to beat them at the information game too.



Separating signals from motivations is impossible because we only care about signals we care about. So any "coin of the realm" comes as a package including:

 





If this seems complex, it is! Take the US dollar as a simple example of a signal (packaged in different informational forms as currency, bonds, electronic accounting, etc.). It's clear that different nations have different intentional stances toward it, including outright debasement (North Korea).






As a more complex example, consider an anecdote from Nadezhda Mandelstam's Hope Against Hope, where she describes spending the night at a friend's house during The Terror:



They were on the seventh story, so you couldn't hear cars stopping outside, but if ever we heard the elevator coming up at night, we all four of us raced to the door and listened. "Thank God," we would say, "it's downstairs" or "it's gone past." 


In the years of the terror, there was not a home in the country where people did not sit trembling at night, their ears straining to catch the murmur of passing cars or the sound of the elevator.


 



Stalin was sometimes presented with lists of names, beside which he would--or more likely would not--place a check mark to spare the individual. This informational signal filtered its way through the corridors of the NKVD and eventually manifested as a knock on the door at night for the unfortunate people who were identified. The Terror originated in the signal domain (to affect behavior by getting to the source of it), with physical effects (people killed or sent to the gulag). There are many rich complexities, such as the definition of "Kulak," and the show trials (see Darkness at Noon). I have written more about this in "Nominal Reality and Subversion of Intelligence."




Compared to engineering problems, where Nature may be cruel but not fickle, understanding the role of individuals in toxic situations is a hard problem. The Great Terror was not the work of Stalin alone, and it's easy enough to demonize the NKVD agents, but since they were presumably human beings too, it makes more sense to try to understand the signal/motivation balance that made them behave as they did. What were the signals and debasements thereof? What competing realms? What intentional stances toward these?



A liberal arts curriculum is bound to include more of this kind of wrestling with hard problems. I think it's mostly done on paper, as thought exercises, but this is better than nothing. There are ethical limits on what sorts of practice we can engage in (a 'lab experience' in the Great Terror sounds pretty dicey), but there may be a homeopathic "Small Fright" that can be experienced by undergraduates without damaging them, and that would let them try hands-on Cynicism.

 



----


 


The 'Cynic of the day' award goes to NPR's "'Mischievous Responders' Confound Research On Teens," which gives an amusing account of epistemological struggle.



The runner-up is the BBC's "Should we all be a bit psychopathic at work?", which asks how far we should prune back our internal signals for getting along with others.





A Cynical Argument for the Liberal Arts (Parts 0-6)