
Posts

Showing posts from 2014

What do you do with a degree in philosophy?

Anyone who majors in the humanities has had to endure a version of that question more than once. As I went through graduate school, people asked the question less and less. By the time I was teaching classes, I had a pretty ready answer (teaching is paying work, you know?). As a professor, the question answers itself. Of course, being a philosophy professor is not for everybody. The crowded academic job market alone is enough to dissuade the faint of heart. The work is demanding, involving wearing the hats of instructor, researcher, and administrator. To succeed, one has to be flexible and creative, think on one's feet, and be ready to ask hard questions of oneself and of others. As academic institutions rely on more part-time and temporary staff, success often translates into more work without longer-term commitment from the organization. One can invest a whole lot of time and energy without knowing whether that organization will continue to provide support. Living the life of a…

But we've always had X...

In teaching ethics, and in paying too much attention to politics, I encounter the sentiment that "We've always had [insert great misfortune], so we'll never be without it" over and over again. The sentiment is offered as a reason not to work toward alleviating poverty, warfare, disease, and all manner of problems that affect the whole globe and look too big to overcome. Still, I think this is a problematic line of reasoning, and one that we should stamp out as if it were a logical fallacy (and it might trade on one; more on that below). OK, so why is it a problem? For one, it's simply conversation-stopping in any ethical debate. Should we devote resources to researching Sudden Infant Death Syndrome? Well, babies have always died for no reason, so we'll never prevent that... There is simply nothing to do but throw one's hands in the air and give up. Now, in ethics, there is some reason to take this argument seriously. There is a very general principle…

History and Identity

Yesterday the European Court of Justice issued an important ruling that has the tech policy world buzzing about privacy, search engines, and personal history. In short, the court ruled that the EU Data Protection Directive gives a person the right to demand that old information be purged from search results. The particular case involves an attorney seeking removal of links to announcements about a real-estate auction connected with a debt settlement in 1998. While the ECJ made a number of interesting moves in the case (including a welcome argument that the distinction between data processors and data controllers does not make as much sense today as it did in 1995 when the Directive went into effect), the big consequence everyone is talking about is the right to be forgotten. The long memory of the Internet is a feature it's hard not to love and fear at the same time. Whether you have something to hide or not, if it's on the Internet, it stays on the Internet (most of the time, at least)…

Autocorrect: A Sailor's Perspective

For the most part, autocorrect is a useful tool for avoiding spelling mistakes. Sometimes, it feels more like a very subtle tool for censorship, and that really passes me off. Swearing is not always the last resort of the unskilled communicator. In the right hands, it can be a dam good way to express frustration or even righteous indignation. If I want to communicate my emotional state more than any semantic content, a good round of cursing just does the ducking trick. Unfortunately, autocorrect developers must keep in mind that parents will get upset if their computer teaches their kids to swear, so I understand the rationale. Still, I would appreciate being treated like an adult and having an effective "suggest offensive words" option. I've seen such options, but they work like add, and when I'm trying to send a quick message that contains a swear, I don't want to type out the entire word or phrase like an assume. In short, autocorrect, I don't want to live…

Correctly Valuing the Writing Process

There are no good writing days or bad writing days. There are only days where there is writing and days where there is no writing. Recently, my main professional ambition has been to minimize the latter, preferably limiting them to weekends and the occasional holiday. The imperative originated in a concern to pick up the pace on my research and to meet a submission deadline on a promising call for papers. I'm glad to say that I made the deadline and decided to use the momentum to send out some projects that have been lying fallow for a couple of months. I went from having nothing significant in submission to having three articles in submission in the course of four days. Three submissions, four days. Beyond those submissions, I started on another three projects, some now in draft, some still in extended abstract. Now that I have the back-burner projects out of the way, I can start some revision and further research on the current projects, and hopefully get those off sometime soon as well…

The Death of Socrates

Today, I told my students that while Socrates was not the first philosopher, he is the one who really set what would come to be called Western Philosophy in motion. I don't know exactly how accurate that view is, since there were a number of odd mystery cults circulating in the Mediterranean, Pythagoras and his crew, for instance. Nevertheless, there is something about the drama of the trial and death of Socrates that seemed to energize the philosophical project, such as it was at the time. Even if created in retrospect, the narrative of a person dying for asking questions sends a powerful signal that there is something important about what he was doing. Remember that it's not quite right to say that Socrates died for his ideas. The early dialogues offer little in the way of a positive project, and what is there is usually attributed to Plato working out the early stages of his project. In the end, Socrates is executed because asking questions is dangerous. It undermines the str…

Living Philosophy

Over the last year, my professional life has undergone a number of major changes. Obviously, moving to the Netherlands is on the list, but I have in mind more the differences in how I view myself and my work. While finishing my dissertation gave me a sense of completion, it took a while to find a well-developed sense of myself as a philosopher. In particular, I have a very different relationship to my research today than I had when I defended my dissertation. The dissertation stage is filled with uncertainty and fear along with the other challenges of actually writing the thing. For one thing, I had never written anything that long or unified. I had to design and execute a book-length argument on one topic, and I had to say something relatively novel. Thankfully, my supervisor Bruce Brower was an excellent mentor. He helped me identify the topic very early in my doctoral studies, so I spent two years or so thinking about it before I began the principal writing. We worked the topic…

Pedagogy of Prestidigitation

I put what might be too much thought into presentation when I teach. I say it's too much because I don't know how much of it comes across to my students, but insofar as a teacher must entertain, it seems appropriate to work on one's showmanship. Over time, I've developed some particular aesthetics of teaching that both keep me motivated and focused on the task, and hopefully contribute something unique to my students' experience. My basic model is jazz improvisation, for reasons perhaps best understood by fellow initiates of Robert Anton Wilson. The presentation slides give me an overall structure and contain the essential information. For the most part, the slides are supposed to be springboards for verbal improvisation. I like the idea of running discussion sessions, and when it happens I enjoy it, but I find it hard to get the students going. In introductory ethics courses, when I include assignments that require them to read before coming to class, it's ea…

Flipped Off Pedagogy

Everyone who works in education is trying to figure out what to do with the new capabilities afforded by IT. The most prominent example is the move toward MOOCs, the massive open online courses made visible by the efforts of EdX, Coursera, and associated institutional partners. For those of us in the trenches, MOOCs represent the least imaginative application of information tech to the classical challenge of enlightening young minds. Think about it this way: you have any and all documented facts at your fingertips, and the ability to connect with experts anywhere in the world, and you use it to turn university lectures into a Netflix product? Michael Sandel is a talented lecturer, but I don't see philosophers bingeing on his Justice course the way we all do with Orange Is the New Black. So, if MOOCs aren't the big challenge, what is? As far as I can tell, educators (self included) have the most trouble coping with the "flipped classroom." A "flipped classroom…

Ambivalence on Ethically Challenging Research

I'm in the middle of one of those research projects I feel obligated to do, but at the same time can't bring myself to feel entirely passionate about. There really is nothing that brings out ambivalence in me like ethics and cyber-warfare. First and foremost, I am no big fan of war, warfare, or the military broadly construed. For that reason alone, the ethics of war should be a topic of great interest. If it's the case that the person most fit for office is the one who wants it least, then the best war ethicist should be an absolute pacifist. Think about it this way: what would war ethics look like according to Genghis Khan or Napoleon? I think Atlanta still wakes up in hot sweats over Sherman's ideas about conducting a just war. Of course, when you actually have to think about the ethics of just war, you have to confront the realist/idealist problem. War is awful and nothing good comes of it (anyone who says otherwise has way too much invested to be unbiased), so the most just…

Surveillance and Servitude

A response to Kevin Kelly’s “Why You Should Embrace Surveillance, Not Fight It” in Wired. In “Why You Should Embrace Surveillance, Not Fight It,” Kevin Kelly offers some possibilities for a positive view of ubiquitous surveillance. The solution to our concerns about privacy, according to Kelly, is more, rather than less, surveillance. By embracing “coveillance,” collective monitoring of one another, we can recapture some of the influence and transparency currently lost to surveillance, top-down monitoring of citizens by an authority. While Kelly is right that coveillance gives us transparency, he may be wrong about freedom. Let’s begin with the idea that Big Data firms will pay coveillers for self-monitoring and reporting. The idea that we could make our data more valuable by invoking a sense of entitlement and demanding direct compensation misunderstands the “big” in “Big Data.” The personal data of one citizen is really not all that valuable to data analysis. You…