Monday, June 22, 2009

Iran, Citizen Journalism, and the Future of Education

Those of you who know me understand that I am a news junkie, especially concerning American and world politics. I've been following the crisis in Iran rather closely over the last week and a half or so, and have been struck both by the revolutionary nature of these events and by the manner in which we have been able to experience them. Those of us who have been around for a few years remember some key revolutionary events in the world of international affairs, and not all have been successful: the fall of the Berlin Wall, the massacre in China's Tiananmen Square, the standoff in Moscow that led to the collapse of the Soviet Union, and the 1979 revolution that brought the current government to power in Iran in the first place. In each case, there were official representatives of news media whose video and still image documentation established iconic memories for those of us familiar with those events.

Recently, I was listening to the podcast of the Slate Political Gabfest, with Emily Bazelon, John Dickerson, and David Plotz, discussing the crisis in Iran following its disputed election results. It was a winding discussion (indeed, the episode was titled "The Kitchen B***h," perhaps alluding to the wide range of topics discussed) that spent a few minutes on the way we've been getting our information regarding what is happening in Iran. Because the mainstream media have been either kicked out of the country or detained, and the official state media are muzzled by the Guardian Council, much of the information regarding the current state of affairs in Iran has been obtained through monitoring Iranian Twitter posts and uploads to YouTube. The discussion touched on the crisis within journalism today, referencing falling newspaper subscriptions and the numbers of people who are abandoning official news media for online blogs that cover the news of the day. The concern is, of course, that without an unbiased, independent news media (I'm aware of the controversy surrounding the word "unbiased" and I'm not addressing that now), there is no way for the public to get accurate information. Indeed, the Gabfest addressed this point directly, noting that there is no clear way to distinguish between Iran-related posts on Twitter that are legitimate communications from the streets of Tehran and those that spread mere rumor.

One of the Gabfesters, Emily I believe, commented that this perhaps is the new role that journalism must take if it is to survive in the 21st century world of Web 2.0: the role of "aggregator." She noted that traditional journalism finds sources for stories and then reports on them, and then postulated that the "new" journalism might consist of analyzing 200 posts on Twitter, cross-referencing them with the latest uploads to YouTube and discussions on Facebook, and then determining what is the "truth," with an eye toward reporting that for us, in order to help make sense of it all for the average consumer of information.

This was the point that got my head spinning, because I couldn't stop thinking of what it might imply for the future of education.

One of the biggest criticisms of education and the pedagogical study of it is that it is an "ivory tower" business; that some of its theories and practices are interesting, but of no practical use in today's classroom, with teachers overwhelmed by class sizes, misbehavior, helicopter parents, and what has been characterized as a national obsession over test results. Yet, everyone agrees that the era of fads in education must end if we are to move forward with effective, data-driven strategies that produce results; such a pursuit of strategies must by necessity rely upon university studies and peer review to be seen as legitimate. The proper application of classroom technology is perhaps a way to bridge this "educational divide" between reality and expectations.

Some readers will scoff at this notion, and rightly so. Like many of you, I work in a school district where there are many computers. Many of them work, many don't, and the rest work very slowly. Regardless of their working condition, it may seem as though there simply are not enough of them to contribute effectively to our students' education. As Steve Hargadon writes,
Hundreds of millions of dollars, if not more, have been spent on outfitting schools with computers, and most of us would appropriately claim that the impact on student achievement has been little to none. But I would submit that, as happened in our business culture 20 years ago, a set of technologies that actually transform our traditional methods will become the driving catalyst for ubiquitous access to computers at school. What we currently have are computers purchased and maintained largely by school business offices, relatively divorced from teaching methodologies, and either not in a quantity or in a condition to allow overworked teachers to change their teaching methods. Driven not by technology vendors or unproven theories, Web 2.0 instead seems likely to change education precisely because it is a disruptive external change.
The key "disruptive innovation" then, is not the actual possession of computers, but is rather what we can learn from the so-called Web 2.0 that will transform education in the digital age.

Lately, there has been much discussion of the "prosumer." The idea is that, along with our attempts in education to produce citizens who can think critically about the world around them, the consumer has evolved into a being who is not merely "consuming" what is fed to them by the media and the marketplace, but is actually participating in the process. Matt Federoff, of the Vail School District, discusses this concept as a twist on the creation vs. consumption dynamic: that the new citizen is one who consumes multiple sources of information, synthesizes them with a personal flavor, and then re-publishes the result as his or her own creation, producing a new source of information for others to consume and digest.

Sorry, kids. Life is a research paper. :-)

This is not a new concept, but rather a new twist on an old idea. John Seely Brown and Richard P. Adler, in their article "Minds on Fire: Open Education, the Long Tail, and Learning 2.0," argue that
instead of starting from the Cartesian premise of “I think, therefore I am,” and from the assumption that knowledge is something that is transferred to the student via various pedagogical strategies, the social view of learning says, “We participate, therefore we are.”
Think about how many times you have attended a lecture, viewed a television show, or listened to the radio, but didn't really get it until you had the opportunity to construct your own learning through discussion with others. If this is such a powerful tool in education, why are we so focused on clinging to the old methods? Just a few weeks ago, I sat through a day of lectures on how ineffective the lecture model has become in 21st century education. It was an irony that did not escape me as I slowly felt my brain sliding into disuse as the day progressed - indeed, that is one of the few things I remember from that day.

What can be more effective, then, than teachers coming together to share and debate best practices amongst themselves, in their own method of "peer review"? Like Bazelon's "new journalism" that seeks to aggregate hundreds of Web 2.0 primary sources in order to help sort the wheat from the chaff and create a vision of how things truly are, perhaps it is time for educators to seize the opportunity that online collaboration and social networking present and use it to develop a grassroots method of professional development. Steve Hargadon writes:
I see an incredible educational renaissance coming, where the excitement around collaborating with other educators that was largely accomplished by meeting at annual conferences starts to take place every day. Where a teacher can find other teachers with the same interests and passions, meet "live" in Elluminate, start sharing lesson plans, and even bring their classes into collaboration. I see a day (soon) when the individual educator, pursuing a niche topic of interest that he or she loves and that excites him or her as a learner, can be brought in as a guest speaker to a class on the other side of the world. Where anyone who cares about something in education can start a weekly or monthly meeting in Elluminate to share and brainstorm that topic.
The really interesting part, of course, comes when we involve the student. And why wouldn't we? If we are training our students to be collaborative, critically thinking workers in a 21st century democracy, isn't the use of Web 2.0 an essential part of their education? We have labored for years to step back and "facilitate" instead of focusing on direct instruction, to give up our place as the sage on the stage, and yet we have either clung to that model or struggled to incorporate facilitation into our 19th century style of one-size-fits-all monolithic education. Many modern conferences on educational technology provide a back-channel chat room in which participants or observers can "discuss the discussion," learning both from listening to the panel at the front of the room and from discussing it with other participants online in real time.

As a former classroom teacher, I recognize the dangers in attempting this in class with teenagers or other age groups. They'll talk about other things! They'll make fun of me/each other! They'll be sexting in class! The thing is, don't we adults do the same thing in those back channels and still come away with a deeper understanding of the topic? (Well, minus the "sexting," of course!)

Perhaps Chris Yeh sums it up best when he asks
How effective will lectures be when students learn by grazing on tens or hundreds of information feeds each day? How will they react to printed textbooks, when they believe that every document should be editable, commentable, and infinitely shareable? What is the meaning of the word "classroom" when video and mobile devices transmit the majority of knowledge?