Akrasia

It’s been over three weeks since my last post on this blog. As a consequence, I would like to return to what has unfortunately become a familiar theme. There’s an old Greek word for this ancient phenomenon: akrasia—ancient because every person who has ever been born has had to grapple with this dilemma. ‘Akrasia’ means ‘weakness of will’ and applies to any instance in which one faces a choice between something one knows is good and something one knows is less good, and chooses the less good thing anyway (a drug, a procrastination break, excessive amounts of cake…). If we always want what we think will be best for us, and always choose what we want, then it makes no sense that we sometimes choose to do things that are not good for us, or choose not to do things that we want and think are important. I want to write a single blog post every week. Why am I sometimes not able to? Akrasia.

I would argue that this issue is at the root of nearly every major political problem, that natural human flakiness may in fact be the root of all evil. And here’s why: to participate in a society is to allow oneself to be dependent on other people, but there are very few people whom one can actually fully depend on. As a result, the people we trust to make decisions for us will often let us down—and we will often let ourselves down, the people we should be able to trust most of all. If someone finds that the advantages of breaking a promise considerably outweigh the disadvantages, then they will probably break it (and then feel bad about it later). A whole school of political thought has been founded on these principles: rational choice theory. Actors will tend to behave so as to maximize whatever seems to be their personal advantage. A person might think that the institution of meat-eating is one of the greatest evils of our age, and still eat meat regularly (because who can turn down a deliciously cooked steak?). Someone might think that everyone has a personal duty to try to prevent global warming, and then keep the heat on in the winter and drive half a mile to the store because they don’t want to walk in the cold. Akrasia.

But the point here is that these ‘rational’ choices are less than rational, because the person will readily admit that if they just had a little more will-power, they would have behaved differently. And that’s what makes this uniquely (and universally) personal problem a uniquely political problem. It seems probable that people in politics will make what they know to be the wrong decision either because they are personally advantaged by it, or because they know that standing up for their beliefs will result in personal harm—very few people want to be martyrs.

And now I’d like to tie this back to a real discussion of contemporary politics. The last time I was on this theme, I compared personal akrasic compromise to the democratic compromises that were occurring in Egypt (with Mohamed Morsi). I asked the (potentially dangerous) question: is there something fundamental about anti-pluralistic religions that necessarily puts them at odds with democracy? If one had complete rule by the people, and the people chose Islam, would that not be an Islamic democracy? If the majority decided that minority viewpoints should be suppressed, does this ‘majority mandate’ confer moral legitimacy upon that government?

Of course not. It confers political legitimacy on the government, and possibly even legal legitimacy, but not moral legitimacy. Majoritarianism guarantees the most ethical policies only if the best decisions are actually those made by a mob. For democracy to work, majority rule must be tempered by a good Constitution—a Constitution that sometimes tells the majority that what they want is not what is best, and that has the authority to stop them. Just as discipline and adherence to one’s commitments are the antidote to akrasia on the small scale, so a well-written Constitution and a means of enforcing its principles and tenets are the antidote to akrasia on the large scale.

A cottage industry has recently emerged amongst comparativist political scientists asking questions about the relationship between Islam and democracy in the Middle East. Since the beginning of the ‘Arab Spring’, observers have had high hopes for democratization in two particular Islamic states: Egypt and Tunisia. In 2011, Tunisia held its first free elections after the overthrow of its longtime autocrat, Zine El Abidine Ben Ali. For their new president, the people of Tunisia elected a human rights activist, Moncef Marzouki. In 2012, the leading Ennahda party announced that it wouldn’t allow sharia to be the main source of legislation in the new Constitution, and that the state would remain secular. The Constitution guarantees rights for women, and women hold more than 20% of all parliamentary seats. While the state has declared Islam its national religion, and while 98% of Tunisians are Muslims, the country has a secular culture that encourages religious freedom. In terms of democratization, Tunisia seems to be doing pretty well. If Tunisia works as a case study, then it would seem that Islam and democracy may in fact be compatible. So why does this seem like more of an open question for Egypt? I would guess that it has something to do with Tunisia’s commitment to its new Constitution, and Egypt’s overhaul and redrafting of its own.

Akrasia on both the small and large scales results from a person’s or group’s inability to fully commit to their goals. The incongruence of the Constitution with the popular majority, or of the individual with his or her personal projects, leads the actors in question to sacrifice their difficult ideals for what is easier and more concrete, more immediately gratifying or less painful. What once seemed like a good idea now seems foolish or unrealistic.

But don’t get me wrong, sometimes there are good reasons to make these kinds of sacrifices. For example, while I feel that this blog is valuable and my promise to regularly write on it is important to me… I have four term papers to do—over 60 pages to write—and just one month to complete them. So this particular commitment will have to be broken, this goal dashed by the pragmatic realities of college life… I hope that other people will take over this blog from me… but I don’t expect it. And for saying to myself that I would write every week, and then failing to do it, I must confess that I feel (just a little bit) ashamed. But let’s end on a high note! If the goal is reasonable and appropriate for the person making it, and the person makes adherence to that goal a top priority, akrasia will never be a problem. And for nations that have the proper conditions for democracy, and that also want it, democratic culture will probably thrive! The only way we’ll get better is by challenging ourselves, and if we’re truly challenging ourselves, then we will sometimes fail. So… let us try spectacularly! And in doing so, we will learn what we are capable of… probably more than we originally thought.

Panel on the State (and Culture) of Academic Writing

A fat year ago, we lost one of our best political science professors to the writing center. There’s no real story there: no fights, no drama; he didn’t quit, and he wasn’t fired. From what I understand, he just didn’t meet all of the expected requirements for the tenure track—one of which was a publishing requirement—designed to ensure that our teachers are experts, and that in addition to teaching, they bring prestige back to our school. At one point, I asked him why he had not just banged out more articles so that he could meet that particular requirement. Every one of his students could tell you that he’s no slacker, and when it came to writing he gave us the most helpful, detailed comments and criticisms on how we could improve our papers. His answer was interesting: the kind of writing required for the kind of political theory he does is not the kind that can be easily done on top of a full-time teaching job without compromising the integrity of one’s teaching. The institutional incentives of contemporary academic life make it nearly impossible for professors to write in this older way and also fulfill their duties as teachers, and as a consequence, these more traditional forms of writing are dying and being replaced. And few people even realize that something great is being lost. The director of the writing center, John Holzwarth, has organized a panel on this evolving trend: “On the Walls of Our Caves: A Panel on Academic Writing.”

This is a topic that concerns all of us. Since the advent of the internet, there have been a multitude of books and articles about how the way we read has changed. From reading the great books of the past, one can see that the way we write has changed as well. Writing today tends to be more succinct, less wordy. Academic writing tries to be narrow and scientific, spending thirty pages to make a very small point, a point that one can often glean from the abstract or introduction. The papers tend to be dry, and the attempt to say something new often supersedes the aim to say something important. Essayists have been replaced by obscurantists. People have become afraid to ask big questions.

There may be important and very good reasons why this is the case; the contemporary academic methodology may have ample justification. But so far, it seems to have been immune to serious challenge. We are taught to write in a certain way without being told that there are alternatives, and we are told that this particular form of writing is superior without being given the reasons for and against it. The way we write informs the way we think, and the way we think determines how we act. To change the norms of writing is to change a culture. To deliberately change one’s writing style is to change an individual. I think this will be a very important and interesting panel (with some very sharp professors on it), and I hope that many students and faculty will choose to attend. Wednesday, March 6th. 7:30pm. Miller 105.

Section Five: Dead or Alive?

Tomorrow the Supreme Court will hear a case called Shelby County v. Holder. The case will examine the validity of Section Five of the Voting Rights Act, and because the Court is currently dominated by conservatives, liberals worry that the VRA might essentially be gutted if/when the Court finds Section Five unconstitutional. Section Five has a long legal history, dating back to the Civil Rights Movement in the ’60s. The purpose of the VRA as a whole was to outlaw discriminatory voting practices that were used to disenfranchise African Americans or dilute their vote. Section Five of the act required that any jurisdiction with a history of racial voting discrimination first obtain approval (“preclearance”) from the Department of Justice before making any changes that affect voting. As a consequence, any voting scheme that has the purpose or effect (measured by the non-retrogression principle) of discriminating against racial minorities can be rejected by the DOJ.

Practically speaking, this very rarely happens. Since the landmark case Allen v. State Board of Elections (in which the Court decided that any structural change to an aspect of elections can be subject to preclearance), the number of Section Five claims brought to court has exploded. Since that case (1969), there have been over 2,300 claims subjected to preclearance. Of those claims, fewer than 2% have been denied. When we discussed this case in our Election Law class, our professor said that this fact can be evaluated in two ways:

1. This is a watchdog with no teeth; the preclearance requirement is essentially pointless.

2. Alternatively, it’s a deterrent. People won’t try to pull the same tricks anymore, because they know that they can’t get away with it.

At worst, Section Five has apparently done no harm. At best, it has helped tremendously in the struggle to give racial minorities adequate representation in the democratic process. So how could the Court rule that Section Five is unconstitutional? By rejecting the premise that it does no harm.

The argument is that Section Five places an unjustifiable burden on the states, and thereby compromises our system of federalism. The preclearance requirement presumes that states are guilty of racist behavior before they’ve had a chance to show that they’re not. The days of literacy tests and other racist disenfranchisement schemes are behind us, and assuming that Southern states will continue such shenanigans is only justified if evidence can be shown that the South is particularly deserving of the extra regulation. If this evidence cannot be shown, then the regulations should apply either to all states or to none. After all, Section Two does most of the work in the VRA, so if Section Five is truly unnecessary, it ought to be disposed of.

But here is the problem for the case. Chief Justice Roberts uses the analogy of an elephant whistle: “I’ve got this whistle that keeps away elephants. How do I know it’s working?” he jokes rhetorically… “See any elephants?” How can you prove that a deterrent is actually addressing a problem? You remove it, and face one of two consequences. Either there’s no problem, or there is.

Is Our Liberal Education Failing Us?

A couple weeks ago, political scientist Peter Berkowitz published an op-ed on RealClearPolitics called “10 Ways Liberal Education Fails Students—and Society”. In the piece, he argues that in order to have a healthy and well-functioning liberal democracy, it is essential that liberal arts colleges do their job properly: “to transmit knowledge and teach students to think for themselves,” and in the process foster a deep intellectual integrity “which involves respecting facts, honoring evidence, vigorously exploring arguments, and cherishing the inevitable and illuminating diversity of opinion in a free society.” According to Berkowitz, this is not happening, and society is suffering as a consequence.

I don’t want to summarize the article, but I would like to comment on some of its main points. For those who are interested, I suggest reading it (it’s short) and asking whether any or all of the charges against the current liberal arts model are true. While I can only speak as a student of Lewis and Clark College, I think that it is still possible to acquire the liberal education whose loss Berkowitz laments, and I think that (at least in some departments at our school) the professors do an admirable job of providing just that.

The most troubling accusation Berkowitz makes is that liberal arts schools produce students with uniform opinions who unquestioningly embrace the left-leaning dogmas of the day. This is not an uncommon criticism from people who lean right politically. It is common knowledge that most students and professors tend to fall left of center. But Berkowitz argues that this should be a concern for liberals in particular—for tolerance and diversity are liberal values, and arguably many liberal colleges do not have them.

Democracy requires a diversity of viewpoints in order to function properly. In science, if scientists fail to explore (and test) competing hypotheses, they fall short of acquiring accurate knowledge. In capitalist economic systems, if there is no competition, we typically get market failure. In short, it seems that many of our institutions mirror and reinforce one another… that science, capitalism, and democracy perhaps all depend on similar norms for their proper functioning. And if we want to foster this kind of diversity in our social institutions, it seems perhaps most important to have it in our educational institutions: diversity of ideas.

How do we get this diversity? Arguably, one of the best ways is by reading from what is called the “Western Canon”: the Greeks, Shakespeare, Dante, Hobbes, Locke, Rousseau, Burke, Marx, Newton, etc. We have inherited much of what we believe today from these texts, and we often forget that their authors disagreed, sometimes radically, with one another, and provided plausible alternatives to questions that are still being asked (or at least should be). The idea is that by taking seriously the ideas of the past, one might come to question the prevailing ideologies of the present. Through this questioning, one might find that one disagrees with much of what one originally took for granted upon entering college. Or, on the contrary, one might find one’s original beliefs confirmed, but have better reasons for holding them. In recent years, there has ironically been a “liberal” backlash against these books that have contributed so fruitfully to the development and maintenance of liberalism. They have been criticized as dated, irrelevant, written by “dead white men” rather than by racial minorities or women. These criticisms miss the point, for it is the ideas themselves that are important, not the identities of the authors who wrote them, nor the trivial fact that people today may disagree. Ideas are not wrong merely because people no longer believe them. And in that sense, I think that Berkowitz may be right. Affirmative action policies can create the illusion of diversity on a campus that in fact has none, at least if what we are talking about is “intellectual” diversity, which should be the most important kind in an environment that is supposed to care most about freeing the individual mind.

I mentioned that we have departments at Lewis and Clark that definitely deliver what I see as the proper kind of “liberal” education. In my experience, our political science and philosophy departments have both been very good, and I’m sure they are not the only ones. Lest there be any confusion, I just want to make clear: when I say “liberal” I do not mean “right-wing”. I am not saying that the political science or philosophy departments provide a conservative counterpoint to our comfortable leftist presumptions. Rather, the liberal education one finds in our philosophy or political science departments can be described as fostering a kind of intellectual humility: the notions that common sense is frequently wrong, that ideas need to be carefully scrutinized before one accepts them, that presumptions are a kind of laziness, that feelings are not a foundation for knowledge, and that unreflective partisanship weakens rather than strengthens democracy.

In sum, for students who want it and seek it out, the traditional liberal arts education is not dead. Because there exist professors who themselves possess the intellectual integrity that a liberal education cultivates, the liberal arts continue (at least in some places) to flourish.

Campaign Finance Conference

Today I had the great opportunity to visit my hometown: Salem, OR! I went with three other political science majors and Professor Lochner to Willamette University to hear eight hours of lectures on one of the world’s most surprisingly interesting topics: campaign finance. We listened to first-rate experts talk about Super PACs, the 2012 presidential election, “dark money”, judicial elections, and the general costs and benefits of campaign contributions. By the time the conference had ended, I found that I had taken roughly 25 pages of notes.

So, after all that, what can I tell you I took away from the conference? It’s a serious question. What can you tell a person after you’ve spent an entire day absorbing information about a subject you know nearly nothing about? You leave with your head spinning: drowsy from waking up in the fives (before 6:00am), jittery from the five-plus cups of coffee drunk throughout the day, mind clouded from trying to attend to specialists talking rapidly about an interesting but alien subject. The one thing that we four undergraduates who attended the conference had in common: we’re all taking a class at the law school on election law as our senior capstone for political science. The other thing we all had in common: this subject was entirely new to us.

But allow me to indulge myself. Despite the minimal background knowledge I possess on the topic, and despite the fact that it’s still the same day (sort of), I would like to try to reconstruct some of the more interesting points that I can remember being made in the course of this full day of intellectual stimulation.

The opening lecture was given by the chair of the Federal Election Commission (FEC), Ellen Weintraub, who tried to answer the following question: given that the US spends more money on elections than any other nation, is that influence deleterious, and if so, what should be done? Her answers were ‘no’ and ‘nothing’. We want an informed electorate, and spending on ads helps give voters the information they need to make informed voting decisions. The fact that we spend so much money shows the world that we care a lot about democracy. And those who say that we spend too much might be wrong: we spend more money on Halloween decorations than we do on voting, which maybe indicates that even as we outspend every other nation on elections, we still do not care about the way our nation is governed as much as perhaps we should.

The problem, of course, with campaign contributions is that they lead to the appearance of corruption. Weintraub didn’t downplay this point; in fact, she emphasized it. If contributions go to organizations devoted solely to electing a candidate, there seems to be something perverse about such influences. The FEC is supposed to regulate this process to prevent our politics from appearing corrupt, but because it is always responding to the last thing that happened, and because people are always finding new ways to influence politics in their favor, the commission cannot keep up. This is a problem for the legitimacy of US elections. On the one hand, spending is important for democracies because it’s the only way that average voters can learn about their candidates. On the other hand, such spending can be harmful because it leads to the appearance (and maybe even the actuality) of corruption. This is one of the major dilemmas of regulating campaign finance.

The next section of the conference featured a panel with such distinguished speakers as Michael Beckel, Rob Kelner, and Bradley Smith. They talked about the contemporary prevalence of Super PACs and the effect that such politically active organizations are having on politics. The first speaker (Beckel) emphasized that only 0.5% of American adults account for two-thirds of political campaign spending. This number should shock us, because if true, it means that elections are a game played and won by America’s wealthiest elites. The next two speakers told a different story. Kelner claimed that Super PACs are in fact much ado about nothing. Super PACs are good, he argued, because they force corporations (which would be donating anyway) to disclose their donations, leading to more transparent politics. He claimed that the Super PAC controversy merely distracts from a more important and interesting change in politics, namely that get-out-the-vote (GOTV) programs can no longer be adequately funded by political parties. For this reason, he claimed, Romney lost the election (the RNC couldn’t adequately fund GOTV efforts). He argued that in future elections Democrats will have the same problem, and that parties will have less and less influence in politics, meaning they will play less of a moderating role. As a consequence, parties will radicalize ideologically. Time will tell if his prediction comes true. Lastly, Bradley Smith argued that campaign finance is regulated much more heavily than it should be: the reason we have limits, he argued, is to prevent quid pro quo arrangements between candidates and donors (corruption), but the limits don’t prevent this any better than existing bribery laws would. They fail to achieve their intended purpose and have all kinds of unintended consequences.

I’ll do one more panel summary, just so you get the idea. In a panel titled “The Future of Public Financing,” we heard Richard Briffault and Paul Diller pontificate on where the trends are pointing. Professor Briffault spoke quickly in an attempt to say as much as possible in the short amount of time he had been allotted. If it weren’t for my quick typing skills, I would have lost most of what he said. His main idea was that public funding programs tend to reduce levels of private funding in elections. As a consequence, there are fewer deleterious influences in the political process, but candidates tend to be stuck with whatever funding limit they’ve been assigned, which may or may not be enough to win the election. Of course, just as this seems to resolve some problems, it causes many more. And for this reason, Briffault concluded that we should rethink public funding: not as a solution that replaces private money, but as a supplement to and complement of a system that will inevitably remain largely private.

Finally, Diller examined some reasons why “voter owned elections” failed in the one place they should have succeeded: ultra-liberal Portland, OR. While a publicly funded election scheme should have reduced the undue influence of wealthy donors and interest groups, broadened opportunities for citizen involvement, and decreased our disillusionment with local government, the public financing program did none of these things. The main reason for this had to do with scandals involving the very publicly financed politicians who were supposed to bring new legitimacy to Oregon’s democratic process. Emily Boyles was revealed to have improperly amassed contributions, and Vladimir Golovan had apparently forged signatures and engaged in identity theft to try to help candidates like Boyles. Additionally, other publicly financed candidates had very poor showings (receiving 10% or less of the vote), and certain contingencies had not been well thought through in advance (like what to do when a commissioner resigns in the middle of his term and it’s unclear how much public money should now apply). In short, when the system of public funding was tried, it did not work out well, and while this may not have been an intrinsic failure of the new system, it does not lend support to people who believe that the old system needs to be reformed. Insofar as it can and should be reformed, the reforms need to be much better thought through. And this seems to be the major problem with campaign finance: the issue is wickedly complicated, and good ideas ought to be well thought through before they are implemented if one hopes to avoid replacing one problem with several more.

And that was just half of the campaign finance conference we attended. There is much more to reflect on (the panel on judicial elections, the panel on campaign finance in Oregon specifically, and the keynote address by Lawrence Norton). We learned a lot, had a blast, and spent a day in Salem. I could say more, but after a long day, much thought, and many beers, there’s only one thing left to say: all is fair in love and war, so why are we so concerned about elections?

The Trouble With Compromise

A fat month ago, I typed up a blog post about Mohamed Morsi’s power grab, and how the ‘temporary suspension’ of democracy could be enough of a wrench to lead to its permanent destruction. The circumstances were similar on my end as well. I had taken more than a week off blogging, assured myself that the problem was under control, and promised to return to a regular blog schedule posthaste. And here we are. It’s been nearly two weeks since my last blog post, and the situation in Egypt has worsened dramatically. Thousands of protesters have been marching in the streets, violence has erupted between protesters and police, and dozens have died in the conflict. What looked at first like a resolvable democratic compromise is appearing more and more to be leading up to an irreconcilable standoff between supporters of Egypt’s Muslim Brotherhood and the secularist minority. It seems that this can only end in two ways: (1) Morsi transforming his presidential office into a theocratic dictatorship; or (2) the secularists ousting the leader and starting afresh, as they did with Hosni Mubarak two years ago.

There’s one awkward catch, however: technically, Morsi does seem to have a mandate from the people of Egypt. The current upset, which arose as a consequence of constitutional changes meant to bring the Egyptian Constitution more into accordance with Sharia law, was not unilaterally imposed upon the people by the president. President Morsi was fairly elected by the people of Egypt. And he fairly amended the Constitution through a direct referendum approved by a majority of Egyptian voters. From what I understand, he has more or less operated within the confines of the democratic electoral institutions that brought him to power. The problem here is that the secular minority doesn’t like the outcome of the natural democratic process.

So, who’s to blame here? Is Morsi a villain for making the government more Islamic, when he comes from an Islamic party with an explicitly Islamic partisan agenda? Or are the secularists to blame for causing instability through rioting, when really they ought to wait their turn until the next election, when they’ll have a chance to swing the pendulum back their way?

Let’s think of the problem another way. Suppose that the Democratic party managed to win a monopoly on the presidency, the judiciary, and the legislature as a consequence of more Democrats turning out to vote. Let’s say they use their majority support to call for constitutional amendments to ban guns, legalize marijuana everywhere, write in a constitutional right to have an abortion, another right to marry whomever one pleases, and a federal requirement that all public schools must now teach classes on atheism. If a majority of Americans were in favor of these sweeping changes, would they be legitimate? I think that most democratic citizens, Republicans and Democrats alike, should say no. For constitutional democracy is not just about majority rule; it is also about protecting minority rights (from majority tyrannies). This is one reason why ‘separation of church and state’ and ‘freedom of religion’ are often seen as such fundamental protections in the American system. But it seems that we may have an internal contradiction. For arguably, the separation of church and state, and the tolerance it fosters, lead to a more secular democratic culture, like the one we see so pervasively in the United States. It seems that religions like Islam require a degree of intolerance in order to thrive. Therefore, religious freedom protections of the kind advocated by the Egyptian secularists appear to be directly at odds with the survival and propagation of Islamic culture. It seems that more of one may necessarily entail less of the other. Does this mean that Islamist democracies are doomed to fail? Maybe not… but if they’re going to survive, it seems that they have to be more willing to make political compromises, even at the risk of watering down the culture they value so strongly. It makes me wonder: can there be such a thing as a non-pluralistic democracy?

GPS and Technophobia

In this past week’s Economist, there was an article about using GPS to track the whereabouts of children. On the one hand, such devices may make parents more comfortable allowing their children the freedom to wander; on the other hand, the prevalence of such technology might mean that tracking becomes more commonplace for everyone.

This may not be a bad thing. Just as knowing the location of one’s child could at the very least help parents worry less, and even potentially help prevent kidnappings (and other crimes against children), so a greater degree of surveillance would make it easier for governments across the world to police their nations and thereby reduce crime. Studies have already shown that merely knowing one is being watched reduces the likelihood that one will cheat on a test, even if one knows one could get away with it.

And yet, while there is nothing particularly unsettling about using tracking devices to monitor children, people may feel more uneasy about using the same devices to monitor adults. What’s the difference? Aside from dystopian fears (that are perhaps irrational) about the government using surveillance technology to institute a totalitarian state in which people rigidly comply with the state’s commands (“big brother is watching you”), could it also be the case that to monitor an adult is in a fundamental sense to treat them as a child?

One is reminded of Bentham’s panopticon: a giant circular prison with an observation tower in the center, whose privacy glass ensures that the prisoners can never see inside it or know when they are being watched. In such a system, one is not innocent until proven guilty. The presence of security cameras carries with it the presumption that without those cameras, the place would be less secure. What is lost when this privacy has been stripped away? One loses the ability to lie about one’s whereabouts. And one also loses the ability to choose not to lie. The watchers gain information about the watched, but without their consent. If the watchees knew that another set of eyes could see their every move, they might act differently, but is this not a subtle form of coercion? Maybe even… oppression? Imagine if every action you’ve ever been ashamed of were recorded and stored, never to be forgotten, and potentially accessible to the watchers on the other side of the panopticon. To the watchers, one becomes less like a person, more like an animal. But aren’t humans different from animals, such that they ought to be treated as having a higher status? Could our political institutions (in this case, surveillance for security) cause us to view one another, and thereby treat one another, less like people? Quite frankly, the knowledge that I’m being surveilled (when I am) no longer really bothers me. But it bothers me that it doesn’t bother me.