Category Archives: society and culture

2,000 Hours

Through one of the many blogs I read – The Edublogger – I heard about this intriguing new project: 2,000 Hours. A fellow English teacher, Charles Ripley, is going to document his teaching-related hours for the next year, starting with the summer.

This could be a fascinating way to approach issues like teacher pay, and is a creative way to use the blogging platform – I can already picture students documenting their learning throughout the year with a blog…

Of course, after reading Mr. Ripley’s initial post, I cannot help but recall some of the great clips from the Daily Show a couple of months ago on similar issues.

In any case, I’m sure 2,000 Hours will be a fascinating site to follow over the next year.

Wikipedia and the Wisdom of the Masses

Not too long ago, I wrote about Jake Locker and the Wisdom of the Masses, which discussed the way in which public perception or “common sense” is based primarily on what so-called “experts” and the media say on a particular subject. This post will look at what may be the antithesis of that piece – Wikipedia.

I have used Wikipedia for a long time now, primarily when I want to get a quick overview about a topic or find an answer to a random trivia question (like “How many home runs did Sadaharu Oh hit during his career?”). Occasionally (as I did recently), I’ll consult the References and “Further Reading” sections of an article to look for books to read on a topic.

But as a teacher, I often hear from students that they have been told never to use Wikipedia because it is unreliable. Other teachers have told them that, because anyone can edit Wikipedia, it is completely unreliable. Essentially, these teachers have told students to listen only to people who are vetted “experts” on a subject.

Of course, in the Jake Locker article, I argued that the so-called “experts” were just as ignorant as many fans. In addition, their “expert” opinions influenced the masses to believe something that may or may not be true, all without using complete, factual information to support their positions. Interestingly, Wikipedia seems to be the opposite of this phenomenon, as the “wisdom of the masses” turns out to be roughly on par with the wisdom of the experts.

Take this oft-cited report by Nature magazine, which found that Wikipedia is nearly as reliable as the Encyclopedia Britannica – with approximately one more error per article than the published encyclopedia. (The main article is behind a paywall – here’s a summary from CNet News – but Nature’s responses to Britannica’s objections are worth looking at.) Even when there are errors in a Wikipedia entry, they are more often than not fixed within a matter of hours.

Another study, published by the online journal First Monday, revealed that experts who read Wikipedia articles in their areas of expertise found those articles to be more credible than non-experts did. In layman’s terms, if you or I read an article on nuclear fission on Wikipedia, we might treat it with a bit of skepticism (“take it with a grain of salt”). An expert in the field of nuclear physics, however, would find it to be a fairly reliable and accurate article.

Of course, Wikipedia is not, nor ever will be, perfect. One of the drawbacks of having an encyclopedia that anyone can edit is that some will add misinformation (whether intentional or not). PBS’ “Learning.Now” blog posted a clear, concise summary of both sides of the Wikipedia debate. It illustrates the potential problems with Wikipedia using the story of John Seigenthaler Sr., whose erroneous Wikipedia article tied him to the assassinations of John and Bobby Kennedy. But it also highlights the story of some high school journalists who used Wikipedia (and its editing history) to out a convicted sex offender posing as British royalty.

So what should educators do about Wikipedia? Based on what my students have shared with me, many teachers are simply telling students not to use Wikipedia. I will never say this. There is far more quality information on Wikipedia than there is bad information. While Britannica is updated annually, Wikipedia is being edited every second of every day, and thus has information on events like the death of Osama bin Laden, which won’t appear in Britannica for several months. Furthermore, students can access Wikipedia for free from anywhere with an internet connection, while it is much more difficult to get hold of a print encyclopedia. Then there are additional tools like Simple English Wikipedia, which offers similar content in simple language (rather than intellectual vocabulary). This kind of tool is invaluable for students, particularly those just beginning to learn how to do research. With all of these facts (plus the demonstrable accuracy of its articles), I will never tell students not to use the site.

However, I will also never tell them that they should cite Wikipedia as a source in scholarly writing. Yes, the site is usually accurate. Yes, it has good information more often than not. However, it is not perfect, and it is still a secondary source. And I try to get my students to avoid citing secondary sources, instead helping them search for the primary source of the information. To me, this is one of the great advantages of using Wikipedia – its bountiful citations and connected links. If the information is good, I can typically consult the original source and use that, thus maintaining accuracy and academic integrity.

So here’s what I tell my students – Wikipedia is a great starting point. If you just want quick access to basic information, use Wikipedia. This is why I cite Wikipedia articles in my blog – the articles provide good introductions for people who don’t know about a particular subject. If you are doing research, it is a great way to get a mostly accurate overview of a topic, and an even better tool for finding other sources to aid in research (using the References). But Wikipedia should not be the core of your research, just as the Encyclopedia Britannica should not comprise all of your research – you should seek out primary sources of information to cite in your work. This is what I try to teach my students.

But that’s just what I do. How do you handle the Wikipedia dilemma?

The Dangers of a Personalized Internet

This TED talk is brilliant and thought-provoking, though haunting at the same time.

I’m really not sure what else I can say in response to Mr. Pariser. I’ve shared similar thoughts for a few years now, not just about the internet, but about the books we read and the people we associate with. Variety is the spice of life, and those who live in a filter bubble like the one Mr. Pariser describes tend to lose perspective on the world around them.

To expand on his metaphor, when we indulge in that filtered “junk food” that isn’t building us and edifying us and challenging us to think outside of our own sheltered existence, our minds end up like Morgan Spurlock in Super Size Me – out of shape and on the verge of death.

On E-Books, Reading, and the Course of Human Events


A Little Background

A couple of months ago, my amazing wife got me a great birthday present – a Barnes & Noble Nook. I had been conflicted about getting an e-book reader, but was ready to give digital books a shot. Since I’m both a technophile and a bibliophile, the gift made perfect sense. And given the amount of time I spend in front of LCD screens (laptop, iPhone, TV, etc.), I was grateful not to receive a Nook Color or an iPad, which would just add to the eye strain that I already experience. And the E-Ink screen has lived up to expectations – it really does mimic the experience of reading on a page quite well.

Of course, being something of a classicist, I’ve found reading digital books a weird mental adjustment. I’d always been the type who loves to smell the pages of old books, and I really appreciate a nice leather-bound edition of Poe’s Complete Works. That’s obviously not an option anymore, so I went with the next best thing – a beautiful handmade leather cover from Oberon, which at least gives the Nook that nice leather smell, so I can feel a little more like I’m reading a real book (side note: I’m not affiliated with Oberon in any way, but I’m really happy with my cover). Even so, there are still books that I will insist on keeping in hard copy – classics, favorites, and so on.

There are a number of features of the e-reader that I really enjoy and that I know I would never get with a book. One of these is obviously having one device that I am comfortable with, rather than learning the feel of a new book every few weeks. Oddly enough, another feature I love is the ability to quickly go from one book to the next, even download a book on the spot and start reading. Even if I’m just thinking about buying a book, I can preview it on the Nook first and decide whether to purchase it. In the same vein, the Nook allows me to go to any Barnes & Noble and read a book for free while I’m connected to their wi-fi. All are very nice features.

The feature I most appreciate, though, is the Nook’s ability to lend and borrow books, particularly borrowing from a library. This is the sole reason I preferred the Nook over the Kindle. I think the ability to borrow books at will is an incredible feature. I can honestly say that I’ve checked out more books so far this year than I checked out in the last 4 years combined, all because they were quickly sideloaded onto my Nook. Thankfully, our public library (King County Library System) has a great selection of e-books available for checkout, so it’s been a pleasant experience. I simply download the book, load it onto the Nook, and then “return” it when I’m finished. I can keep the book for up to 21 days or return it early, just like a traditional book.

Certainly there are issues that bug me – highlighting is a nuisance, and I wish I could view only highlighted passages (it would be great for note-taking). The touchscreen often lags or is unresponsive. And E-Ink technology still has plenty of room for improvement. However, reading books on the Nook has been a mostly positive experience.


As I mentioned, it was the last feature – lending and borrowing – that really sold me on the Nook. As I see it, the fact that the Kindle did not (until recently) allow lending or borrowing was a shame. It seems like a good piece of hardware, but we need the ability to share. Whether it’s status updates on Facebook, links through Delicious, or books, sharing things we enjoy connects us to others. Even if it’s just the library, there is now a connection between borrower and lender. Consequently, when the lender (in this case, the library) needs something, I am indebted and likely to oblige (by voting in favor of the library system, in this case). Similarly, when a friend comments on something I share on Facebook, I am much more likely to return that comment. In doing so, the relationship is strengthened by a common bond or interest.

Thus, my preference for the Nook over the Kindle was not so much about which was the best device, but about which one would provide me with the better opportunities. And isn’t this what reading is really all about – opportunities? Opportunities to experience something you normally wouldn’t be able to, opportunities to learn from the wisest mentors in history, and opportunities to think about one’s own existence from a different perspective. As literary critic Harold Bloom shares,

“We read deeply for varied reasons, most of them familiar: that we cannot know enough people profoundly enough; that we need to know ourselves better; that we require knowledge, not just of self and others, but of the way things are. Yet the strongest, most authentic motive for deep reading…is the search for a difficult pleasure.”

This is why we read – we read to better ourselves, to expand our minds, to understand the world around us, and to find a challenging joy – a difficult pleasure. Thankfully, my Nook has allowed me this difficult pleasure. While I know a Kindle would have met that need just as well, I guess in the end, I just couldn’t imagine choosing a device that limits opportunities for reading.

The other thought that runs through my mind (especially when I read articles like this one from CNN, reporting that Amazon now sells more digital than physical books) is what future generations will think of us as a result of this technology. I just recently started reading 1776, and as I read about the history of the American Revolution, I am struck by how most of the information in McCullough’s book is gleaned from letters and other hand-written documents. Because historical figures such as George Washington and William Emerson, Sr. engaged in such hand-written correspondence, we have a record of not only the events that occurred, but also the thoughts and emotions of the people involved. Will future generations be able to look back on us and say the same? Will emails, chats, phone calls, and digital books stand the test of time in the same way that ancient manuscripts have? Or will they disappear from the human consciousness, much like the original internet websites have slipped from our minds? I don’t have an answer for any of these questions, but I can’t help but wonder what the consequences will be of this rapid advance in technology. As Isaac Asimov pointed out,

“It is change, continuing change, inevitable change, that is the dominant factor in society today. No sensible decision can be made any longer without taking into account not only the world as it is, but the world as it will be.”

Are we thinking forward about the consequences of this sort of change? Have we considered what the ramifications are – how the world will be? I just don’t have an answer.

Is College Worth It?

A recent survey by the Pew Research Center found that Americans both with and without college degrees accurately estimate the difference in average yearly earnings at $20,000. The official number, according to the report, is just a hair under that, at $19,550.

Of course, a closer look at their research shows that even that number varies greatly depending on field of study. For example, as you can see in chapter 5 of their study, liberal arts and education degrees are worth significantly less than an engineering degree. In fact, a degree in education is worth about half a million dollars less than the average Bachelor’s degree over the course of a working life. But I digress.
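For what it’s worth, the yearly figure and the lifetime figure line up on a quick back-of-envelope check – a sketch that assumes a 40-year working life, which is my assumption rather than Pew’s:

```python
# Back-of-envelope arithmetic on the Pew earnings gap.
# The 40-year career length is an assumption for illustration only.
annual_gap = 19_550   # average yearly earnings difference (Pew report)
career_years = 40     # assumed length of a working life

lifetime_gap = annual_gap * career_years
print(f"Approximate lifetime earnings gap: ${lifetime_gap:,}")
```

That works out to roughly $780,000 over a career, which puts the half-million-dollar shortfall for an education degree in perspective.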

The real question here is the title of the study: is college worth it? It’s a question that does not often come up in discussions about K-12 education, but one that really should. Often (as is the case in my district) the assumption is that college is not only worth it, but almost required. The majority of our students graduate and go on to college. However, “the majority” is certainly not “all,” so the question becomes much more immediate. We are in the business of preparing students for success beyond high school, and if “success” does not necessarily mean going to college, we should be preparing students for whatever “success” might look like.

I have slowly come to believe that perhaps college (particularly 4-year liberal arts study) is really not ideal or necessary for many of the students we work with every day. Let’s set aside, for the moment, the rapidly increasing cost of a college education. I know a number of students who, as 9th graders, are excited about the prospect of doing some sort of skilled labor. One 9th grader, in particular, is already doing an apprenticeship as a blacksmith and is incredibly excited about that opportunity. Moreover, the Pew study shows that it’s very possible for him to make a better living doing this kind of skilled labor. It raises the question: is it worth it for this student to continue with a school and curriculum that is focused on preparing him for college?

Mike Rowe, of Dirty Jobs fame, talked to the Commerce, Science, and Transportation Committee about this topic and espoused the desperate and immediate need for skilled labor across the country (read the text of his speech here). In the talk, Rowe shares a valuable insight and some interesting numbers. Most notably, he says

Right now, American manufacturing is struggling to fill 200,000 vacant positions. There are 450,000 openings in trades, transportation and utilities. The Skills Gap is real, and it’s getting wider. In Alabama, a third of all skilled tradesmen are over 55. They’re retiring fast, and no one is there to replace them. [Emphasis Added]

If we are preparing students to be successful in the real world, it would seem that helping them develop valuable skills in these trades is one means to that end. These skills, as Rowe points out, are lifelong skills that don’t go away. In addition, as my own father likes to point out, skilled labor simply cannot be outsourced. Having this sort of skill is job security, and for many in those fields, it pays very well (in all likelihood, much better than teaching does).

In addition, research such as that in this New York Times article suggests that maybe a college education isn’t impacting students anyway. In fact,

a large number of the students showed no significant progress on tests of critical thinking, complex reasoning and writing that were administered when they began college and then again at the ends of their sophomore and senior years.

If a student goes to a 4-year college and demonstrates absolutely no gains in thinking skills (as 36% of the subjects did), there seems to be a serious problem with the quality of the “education” these students are receiving. We in K-12 education are trying to educate students and prepare them to be successful after high school; if the colleges they attend are not helping them become even more successful, what is the point (apart from, of course, that magical degree)?

So the question posed by the title of the Pew survey remains – is college worth it? As is nearly always the case, the answer is much more complex than “yes” or “no.” However, I think it might be safe for us to say that college is likely not the best option for every student. In fact, for many, there are probably better options that will allow them to be more successful in every regard than a college education would.

Which One is “Real” Education?

I have to share a quick story from a colleague – one that frustrates me. [A quick note – this is my frustration, not his. He was telling me about this class and I asked him what this past week was like, thinking it must have been an amazing teaching opportunity. He was, and is, quite professional about the whole thing and is wise enough to recognize that he has no choice but to move on. I, on the other hand, remain perturbed on his behalf.]

There is apparently a class at the high school called “20th Century War and Terror.” It sounds like a fascinating class, covering everything from the Armenian genocide and World War I to the Desert Storm conflict. If this class had been an option at my high school, I would have signed up in a heartbeat; looking at history through the lens of warfare fascinates me.

Needless to say, the last week could have been an amazing opportunity for students to study the subject matter in real-time as the Osama bin Laden story unfolded in front of their eyes. Talk about relevant learning – it would be like teaching a class on theatrical tradition when Shakespeare or Christopher Marlowe died, or studying music history when John Lennon was killed. I can only begin to imagine the possibilities.

I say “could have been,” though, because the class was not allowed to use computers this past week. 8th graders in our district were taking the online version of the state’s standardized test all week (and into next week), so teachers and students across the district (including those in this class) were asked to minimize internet use as much as possible.

So rather than analyzing the reactions from different parts of the world, discussing the ramifications on international relations, or researching similarities to other historical deaths, students were left to quickly gloss over the topic and then continue on with their regularly scheduled programming.

So I’m left with the nagging question – which one is real education? The state (and federally) mandated testing or the clearly relevant current event intricately connected to the course content?


I’ve been doing some research and thinking about Ubuntu. No, not the Linux operating system, but its namesake. I started looking into it a couple weeks ago. While watching the World Cup, I noticed a commercial that included this word and I wanted to know what it meant. As I read more about it, I became intrigued and had to wrap my head around it. This post is the result of that thinking.

Before getting into the concept itself, I’ll share my classroom application. I have struggled over the last couple of years to find a balance between enforcing the rules that I want enforced and giving students input into the class rules. I’ve set my own rules and tried to create a class constitution, but haven’t found anything really effective. This year, I’ll be trying something new – a class covenant. I’ll post more about this concept later; the applicable part for now is that I will provide guiding principles and students will identify the outcomes of those principles in different contexts. After learning about Ubuntu, I have no doubt that it will be one of the guiding principles in my classroom this year.


What is Ubuntu?

“Ubuntu” is what it means to be human. A Zulu maxim provides perhaps the simplest definition of Ubuntu: “umuntu ngumuntu ngabantu,” which means “a person is a person through other persons.” It is a philosophical belief that being human means recognizing and respecting the humanity of others. As Archbishop Desmond Tutu said, “It is the essence of being human. It speaks of the fact that my humanity is caught up and is inextricably bound up in yours. I am human because I belong.” Tutu goes on to say, “Ubuntu speaks particularly about the fact that you can’t exist as a human being in isolation. It speaks about our interconnectedness. You can’t be human all by yourself.” It is, he says, the relationships between us that make us truly human.

It is not an uncommon philosophy. English poet John Donne, in “Meditation XVII,” opined that “No man is an island, entire of itself; every man is a piece of the continent, a part of the main,” noting, as Tutu does, that being human means you are part of a greater whole. Immanuel Kant, in Groundwork on the Metaphysic of Morals, writes that all persons should “Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end.” By treating others not as objects, but as people, we not only respect them, but respect and affirm our own humanity and the ways in which we are bound to one another.

What does Ubuntu look like?

While Ubuntu is a worldview, there are certain outward characteristics that reflect the internal belief that we are all connected, most notably the traits of compassion and justice.

It speaks about wholeness, it speaks about compassion. A person with Ubuntu is welcoming, hospitable, warm and generous, willing to share. Such people are open and available to others, willing to be vulnerable, affirming of others, do not feel threatened that others are able and good, for they have a proper self-assurance that comes from knowing that they belong in a greater whole. They know that they are diminished when others are humiliated, diminished when others are oppressed, diminished when others are treated as if they were less than who they are. The quality of Ubuntu gives people resilience, enabling them to survive and emerge still human despite all efforts to dehumanize them. (Archbishop Desmond Tutu, God Has a Dream)

Tutu points out that respecting the humanity of others and having empathy towards them leads us to be positive and welcoming to other people. When we truly believe in the concept of Ubuntu, we realize that when someone else is degraded, we ourselves are degraded. This leads us to a point where we are enacting justice on behalf of others.

In addition to seeking justice, Ubuntu impels us to be compassionate and hospitable. “A traveler through a country would stop at a village and he didn’t have to ask for food or for water. Once he stops, the people give him food, entertain him. That is one aspect of Ubuntu but it will have various aspects. Ubuntu does not mean that people should not enrich themselves. The question therefore is: Are you going to do so in order to enable the community around you to be able to improve?” (Nelson Mandela). Ubuntu, according to Mandela, puts our own personal gains in a larger context – the context of something greater than ourselves, some transcendent cause. Our own personal gains, whether mental gains (such as education) or physical gains (such as money), inevitably benefit the greater community and make it a better place for everyone. Thus, those with Ubuntu are more likely to share their gains of wisdom or wealth with their neighbors.

Of course, this is no different from the Christian ethic, which values respect and justice as the highest human good. In Leviticus 19:18, the scripture says, “You shall love your neighbor as yourself,” commanding us to recognize the inherent humanity in each other. Again in the Gospels, Jesus reminds his followers not only to “love the Lord your God with all your heart, and with all your soul, and with all your mind, and with all your strength,” but also that the next greatest commandment is that “you shall love your neighbor as yourself. There is no other commandment greater than these” (Mark 12:30-31). In the scripture, loving God with everything we have leads to what Mandela and Tutu call “Ubuntu” – the respect and compassion that we have for each other within a community. The end result remains the same – people with Ubuntu are welcoming, compassionate, and affirming towards all people.

How to apply Ubuntu

With a basic understanding of the Ubuntu philosophy, the application of these beliefs should be somewhat obvious. Building meaningful relationships, treating people with respect, affirming people across cultural divides, and enacting justice on behalf of others are clear outcomes of Ubuntu.

Even so, there are some guiding principles that can help us become more adept at applying Ubuntu in a practical way. Stanlake J.W.T. Samkange emphasizes three maxims that give a sort of practicality to Ubuntu, much as Kant did in his Metaphysic of Morals. First, he said, “To be human is to affirm one’s humanity by recognizing the humanity of others and, on that basis, establish respectful human relations with them.” First and foremost, the application of Ubuntu requires us to appreciate not only with our words, but with our actions, the inherent humanity in each other. As a result of this appreciation, we are able to treat each other with dignity and respect.

Samkange’s second principle of Ubuntu is a practical application of the previous maxim: “If and when one is faced with a decisive choice between wealth and the preservation of the life of another human being, then one should opt for the preservation of life.” This particular statement emphasizes that respecting our humanity and the humanity of others should always be the primary motive for any action. Wealth, while useful for advancing the good of the community, should never be valued above preserving the humanity of another person. For example, even something as simple as an insult degrades the humanity of someone else, and given the opportunity to make money by insulting someone, we should always say no to the money, because that person’s dignity is more valuable to us.

Finally, Samkange provides a third principle: “The king owes his status, including all the powers associated with it, to the will of the people under him.” Those who are in power, he says, are only in power because the people have allowed them to be. This democratic ideal, Samkange says, was a “principle deeply embedded in traditional African political philosophy.” Consequently, the practical application is one directed at those who are in leadership: lead courageously. Those in power must continue to recognize the humanity in the people they lead, and must continually affirm the dignity of others. It is particularly imperative for those in leadership roles to develop Ubuntu because they have an impact not only on the people they lead, but on whole communities. Thus, recognizing that they are “a person only through other persons” allows them to work for a transcendent cause, lead courageously, and enact justice on behalf of others.


Ubuntu is not a religious belief. Truly, it is not even an all-encompassing worldview. At its most powerful, Ubuntu is scarcely a moral imperative. Rather, Ubuntu is an underlying belief – it provides the “why” for actions that we all know to be good and just. By recognizing that we are all persons only through other persons, that I am human only because of the humanity of others, and that humans are intrinsically interconnected, I have a reason to treat others with respect. I have a reason to be affirming of others. I have a reason to be kind and welcoming and generous. It is this concept of Ubuntu that gives us the motivation to become real human beings and to treat others as such. Ubuntu gives us validation in our mission to accept responsibility, lead courageously, have empathy, enact justice on behalf of others, and work for a transcendent cause. Ubuntu is knowing what it means to be truly human.