→ "There's a Cheaper, More Effective Way to Train Teachers"

Virtually all beginner teachers, in our experience, agree that what they need more than abstract social and pedagogical lectures are tangible techniques and granular-level coaching. They need Band-Aids, not meditations on hematology.

What made my Master's program so good was that we really did have classes focusing on how to teach high school Latin. Those courses, while not perfect, are the ones I still refer back to daily, not the generic Education and Educational Psychology courses.

There are some really good ideas here about a less-centralized, apprenticeship model of teacher training. One example:

Fortunately for Will, he teaches at a charter school that does something innovative and different. At Will’s school, the top master teachers are given an additional free period to observe and train new teachers—not in pedagogical theory, but in tools such as how to support individual students (“Elijah’s parents are responsive”); content-specific tricks (“here’s a way to explain how to derive the distance formula from the Pythagorean theorem”); or school-specific techniques (“this is how our school manages half-days”).

My school currently does some of this, but there is probably room for more. There are a lot of master teachers in the building, and we have to use those resources well.

→ "To Foster Your Creativity, Don't Learn To Code; Learn To Paint"

Being creative in any field, be it scientific, technical, or business, in the 21st century definitely requires a certain comfort level with technology. But the best way to harness the power of computers doesn’t reside in coding – it resides in letting computers do the grunt computational work that humans are bad at, so that humans can focus on the creative, problem-solving work that computers are bad at.

And if you want to foster those creative, problem-solving skills, the solution isn’t learning to code – it’s learning to paint. Or play an instrument. Or write poetry. Or sculpt. The field doesn’t matter: the key thing is that if you want to foster your own innovative creativity, the best way to do it is to seriously pursue an artistic endeavor.

[...] History seems to agree with him. Many of the world’s greatest scientists, in eras both ancient and modern, were also artists. Da Vinci, of course, is famous for his talents both artistic and scientific. Robert Fulton, the inventor of the modern steam engine, was a painter. The actress Hedy Lamarr was the co-inventor of the patent that underlies cell phones, wi-fi and GPS. Her partner in that invention? George Antheil, a composer and musician.

As good an argument as any for studying Latin: not because it's useful, but precisely because it's "useless." (via The Dish)

October 2013 Comic Writer Power Rankings

And now for something completely different...

Each month, my friend Jeff and I will collaborate on our own Comic Writer Power Rankings for the previous month. Because as much as we'd like to pretend we are very serious people who only ever talk about very serious topics, that's not true. The rankings are entirely subjective, though we do take into account sales data. We of course can't read everything, so the titles that at least one of us read last month are given next to the writer. Below are the rankings for October 2013. (To see last month's rankings, click here.)

Now on to this month's rankings.


Education and the "Self-Checkout" Mentality

My spiritual colleague in Jesuit education, Matt Emerson, wrote a very nice post over at America titled "Avoiding Education as Self-Checkout Line." It's a great read, and he does an excellent job of getting to the heart of one of the key challenges as more and more technology is introduced into classrooms: the potential to dehumanize the classroom and the interactions between students and teachers. I'll quote a few things here, but please do go read the entire article.

He summarizes his thesis pretty concisely:

But the more we embrace an “app-for-everything” mentality, the more we marginalize the human role.

I think he's spot-on with that danger, but I do think he glosses over some of the ways in which using apps, even some of the specific apps he calls out, can actually increase the engagement that is so important.

For example, he mentions a "Pick a Student" app that helps teachers call on students at random.

[The app] encourages a corresponding disengagement from those same students. In the few seconds it takes to walk around and scan the room, deciding whom to call on, teachers can learn valuable information from faces, posture, or scribbles on a notebook.

Again, he's not wrong, and an app like that (which I have used on occasion) could certainly discourage teachers from being mindful about what's going on in their classroom. But is this any different from my high school English teacher, who relied on "The Fickle Finger of Fate"? He would assign every student a card from a standard deck and, when appropriate, draw a card at random to call on someone. Was he unaware of what was going on in the classroom? Not at all. I can't speak for him, but when I have used a similar process (or app), it's because I want assistance being random. Rather than get distracted by the student who is making (or avoiding) eye contact, sometimes I just want to call on somebody without having to think about it. Using an app for that frees me up to be more observant about what's happening.
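The logic behind a tool like that is tiny, which is part of the point. Here's a minimal sketch, in Python, of the same "deck of cards" idea: names come out in random order, and nobody gets called twice until everyone has been called once. (This is just an illustration with a made-up roster, not the actual "Pick a Student" app.)

```python
import random

class StudentPicker:
    """Draw names at random, like a shuffled deck: no repeats until everyone has gone."""

    def __init__(self, roster):
        self.roster = list(roster)
        self._deck = []

    def pick(self):
        # Once the current "deck" of names is used up, reshuffle a fresh one.
        if not self._deck:
            self._deck = self.roster[:]
            random.shuffle(self._deck)
        return self._deck.pop()

# Hypothetical roster, purely for illustration.
picker = StudentPicker(["Ana", "Ben", "Chloe", "David"])
for _ in range(6):
    print(picker.pick())
```

Whether the shuffling happens in a deck of cards, an app, or a dozen lines of code, the value is the same: the randomness is handled for me, and my attention stays on the room.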

Now I don't mean to focus on just that one app. Obviously it's just one example that he uses. But I do think more attention should be paid to the ways in which these apps can free us teachers for the more important tasks.

I'm also, perhaps naïvely, less worried than Matt is about losing those specific content skills because of apps.

After all, do we want math teachers who cannot generate equations? Do we want English teachers who, having depended so long on software, can no longer explain semicolons, who can no longer create sophisticated sentences that showcase various usage rules? I hope the answer is a unanimous "no."

My answer is of course "no", but is that a straw man? If I had an app that generated, say, Latin grammar practice (I haven't found one, but it probably exists), that wouldn't prevent me from writing my own exercises. Maybe that's just me, and maybe it's naïve. But if I had a way to give my students more practice at the mechanics of Latin grammar, while freeing up my time to work on entirely new kinds of projects and assessments, or to spend more time connecting with the students one-on-one, that would be a net positive. Of course I should still write my own exercises, my own test questions. But I see these new technologies as freeing me up to focus on the higher level work of a teacher, the tasks that technology cannot readily replace.
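To make that concrete, here is a rough sketch of what such a generator might look like. It is purely hypothetical, not an existing app; the tiny vocabulary list and the first-declension ending table are illustrative only.

```python
import random

# First-declension endings, keyed by (case, number). Macrons omitted for simplicity.
ENDINGS = {
    ("nominative", "singular"): "a",   ("nominative", "plural"): "ae",
    ("genitive",   "singular"): "ae",  ("genitive",   "plural"): "arum",
    ("dative",     "singular"): "ae",  ("dative",     "plural"): "is",
    ("accusative", "singular"): "am",  ("accusative", "plural"): "as",
    ("ablative",   "singular"): "a",   ("ablative",   "plural"): "is",
}

# A tiny made-up vocabulary list: stem -> English gloss.
STEMS = {"puell": "girl", "port": "gate", "agricol": "farmer"}

def make_drill():
    """Pick a random stem and a random case/number; return the prompt and its answer."""
    stem, gloss = random.choice(list(STEMS.items()))
    (case, number), ending = random.choice(list(ENDINGS.items()))
    prompt = f"Give the {case} {number} of {stem}a ('{gloss}')."
    return prompt, stem + ending

for _ in range(5):
    prompt, answer = make_drill()
    print(f"{prompt}  ->  {answer}")
```

A real version would need more declensions, macrons, and some way of tracking which forms a student keeps missing. But even this much could crank out unlimited mechanical practice, which is exactly the kind of grunt work I'd be happy to hand off while I spend my time on the pieces that actually require a teacher.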

All the dangers that Matt Emerson lists are very real. But so are the benefits of this technology. Perhaps it's a bit like the "cognitive surplus" that Clay Shirky writes about. When I look at all this new technology, I see the dangers, but I cannot help but see the potential. I'm not looking to offload any of those personal, humanizing aspects of teaching, and I try to find apps that help me focus on precisely those things.

For example, the "Speed Grader" app that works with Canvas can automate some of the process of grading, enter the scores into the grade book directly, etc. That doesn't mean I'm going to be less mindful of my students' performance. Instead, if I can cut a few minutes off the logistics of keeping track of grades, that gives me more time to ask questions like, "How did they perform on this, as a class?" or "What can I do next to really see if they understand the material?"

I couldn't agree more with Matt's conclusion:

That kind of learning requires wise and prudent guides. It requires men and women who evoke a love of inquiry. It requires teachers who know when to challenge and when to console and who offer advice more ennobling than what students see on social media.

All of this is not tangential to the curriculum; it is intrinsic to the curriculum. But none of it can be outsourced. None of it can be downloaded. It can only be lived, every day, by teachers confident in who they are and who care deeply about what they do.

He is exactly right. The most important things we do can't be replaced by technology. But the parts that can be outsourced? That's where we need to ask ourselves the right questions: What am I gaining from this technology, and what am I giving up? And once we've answered those, we have to discern what is best for our students–and for us.

Bring Your Own Context

The gears of change are turning at my school, as we continue moving toward the 2014–2015 school year when all students will be required to have iPads.[1] At a conference this summer with around 500 other teachers at Jesuit schools, seemingly everybody I talked to was at some point on this same path: either their students already had a device or they were moving in that direction. The big question I heard over and over: iPads or BYOD/BYOT (Bring Your Own Device/Tech)?[2]

In those discussions, the individuals pushing for a BYOD deployment had various practical reasons (e.g., projected cost savings by having students bring their own devices), but the pedagogical defenses of BYOD almost always boiled down to this: the device doesn’t matter, it’s what you do with it. Or as the staff at one school likes to put it: “It’s about the verbs, not the nouns.” As much as that argument fits how I generally think about education, I found myself unable to really buy into BYOD as a solution.

For a while, I wasn’t able to articulate exactly why that argument didn’t sit well with me. It is about the verbs, the new styles of activities that this technology can enable, more than it is about the technology itself, right? Yet it occurs to me now that it’s precisely because I agree it’s about the verbs, not the nouns, that I can’t (yet) advocate for a BYOD deployment.

What do I mean? Well, it’s pretty obvious to any teacher who has started to think about these new technologies that once all students have powerful, network-connected devices in their hands at all times, the classroom is going to change. It has to. At many schools, those changes have been underway for a while now–it’s well-documented. For many teachers, that’s a scary prospect. Sure, some are excited by the changes, others are more hesitant, but I think most would admit to at least some fear. We teachers didn’t learn this way, so we have to chart this new course on our own. That’s not a simple task.

Changing our curriculum and pedagogy is a long-term, labor-intensive task. It’s not going to happen overnight, or even over one school year. It will take a lot of time and energy. I’ve only scratched the surface of this, but I’ve talked to enough teachers who are further down this road to know that it’s not a small job. For all those reasons, the actual technology, the hardware, needs to disappear as much as possible, so that teachers and students can focus on those other pieces.

A BYOD program solves this problem by letting everybody, teachers and students, use whichever device they are comfortable with. In theory, everybody can focus on the learning (the verbs), because they already know how to use the device in front of them. That only works, of course, if most of the faculty and students are already comfortable with some appropriate piece of technology. In my context[3], the school I’m at now, that’s really not the case, either for the students or the faculty.

So, for my school, choosing one device for everybody will, I think, help us focus on the verbs. I happen to think the iPad, for all its flaws (and I’m able to enumerate its flaws better than most), is well-suited for letting the technology get out of the way. But even if the iPad weren’t an option for some reason, I would really push (again, talking only about my context here) for all faculty and students to be on the same device, whatever that may be. And then, in two years or ten, once the faculty and students have all become accustomed to a new way of learning, perhaps the device, the noun, can truly become unimportant.

There are multiple paths to success here. The key is understanding the endpoint–a classroom truly transformed by the new opportunities technology presents. In some contexts, BYOD may be the clearest path to that goal; in others, choosing the right device for everyone is. The point is to expend as little energy as possible on the technology in order to focus on the real challenges.


  1. I’m trying to avoid saying “We’re going 1:1 with iPads.” How long before the term “1:1” feels out-of-date? When the printing press was invented, how long did Cambridge brag about going 1:1 with books?  ↩

  2. Interestingly, those were really the only two options I heard. A handful of schools had been using laptops for a while and were sticking with them, and one school was going with the Microsoft Surface (to the dismay of some of the teachers), but the vast majority saw the choice as iPad or BYOD.  ↩

  3. The other piece that I have come to grips with is as obvious as it is important: my opinion, if it’s valid at all, is based entirely on my own experience and context. There is certainly no one-size-fits-all solution for taking this leap.  ↩

Backward Design, Ctd

I hate to be "Breaking Bad guy", but the critical consensus is absolutely correct. This is a show that has only gotten better since it began five and a half years ago. The writers, directors, and actors are all at the top of their craft, and, with two episodes to go, are sticking the landing.

There's a tendency to ascribe at least some of that success to the writers, especially creator Vince Gilligan, having known how the show would end since it began. I have even made essentially that same point myself. And while there's definitely some truth there, it's been interesting to listen to Vince Gilligan on the Breaking Bad Insider Podcast debunk some of that notion.

Here is one example: in episode 502 ("Madrigal"), Walter White hides a vial of ricin in his house. Then in episode 509 ("Blood Money"), the opening teaser is a flash-forward, showing an older, cancer-stricken Walter White returning to his abandoned home in New Mexico to fetch that same vial. If you watch the series, you know that Chekhov's Ricin will somehow have an impact in the final episodes. It's logical to assume that this entire endgame has been planned out, at least since episode 502, if not earlier. But on the podcast, Vince Gilligan says that's not entirely correct. When the writers had Walter White hide the ricin, they had a vague idea of how it might be used later, but nothing concrete. Even in episode 509, they still weren't sure precisely how it would all shake out. For as much as they are trying to think ahead, they are always conscious of the moment as well. For those particular episodes, those little bits felt right, so they went with them. They realized they were writing themselves into a corner, so to speak, but they trusted themselves to work through it when the time came. (Again, there are still two episodes left, but I'm confident I won't need to amend all this later and say, "They botched it!")

Vince and Bryan

I actually like to hear this, and it matches up well with how I think about my own teaching. I have a few colleagues who in August have already planned out their daily schedule for the entire year. I admire that approach, and it works well for them, but I simply cannot fathom working that way. I know what I need to cover for the year, and I sketch out roughly where I should be each week. But when it comes time to plan the weekly and daily lessons, I am focused more on what's best for that class. I still have an eye on my overall plan, of course. But if I come up with an idea for a group project and I'm not certain how it will turn out, I often err on the side of trying it and then fixing it later if necessary. Just like the writers on Breaking Bad sometimes have to get themselves out of a corner, I may have to adjust my pace or priorities later, but I trust myself enough to do that when necessary. The trick is knowing when to change that focus, when to shift from thinking four moves ahead to doing what is best at this moment.

When it works well, the class appears to be as perfectly crafted as an episode of Breaking Bad (though generally with less meth, and fewer tears). When it doesn't work? It's more like Lost.

Gritty

Seth Godin just wrote a great post called "Red Lantern" in which he argues that we should reward kids who persevere at least as much as we reward those who have more natural talent. Winning the gene pool is one thing, but persevering is a skill that can and should be encouraged.

The whole article is brief and worth reading, but here's the anecdote that begat the title:

At the grueling Iditarod, there's a prize for the musher who finishes last: The Red Lantern.

Failing to finish earns you nothing, of course. But for the one who sticks it out, who arrives hours late, there's the respect that comes from finding the strength to make it, even when all seems helpless.

The solution here isn't simply to give out more "participation" ribbons. This piece in New York magazine sets out the dangers of giving out unwarranted praise. The key is to somehow recognize, foster, and reward grit, that trendy personality trait getting so much attention lately. It's not just about effort, but about persevering even after failing the first time, or the first three times.

I think my school generally does a decent job of rewarding a variety of students, not just the ones with the most natural talents. But what can I do in the classroom to foster grit–a trait I would not say I personally have in abundance? It's the same challenge faced by video game designers: it's easy to make an impossibly hard or incredibly easy game. The challenge is getting that balance–difficult enough to be worthwhile, but with opportunities for growth and success to prevent frustration.

If I go too far towards the "Challenging" side of the spectrum, I will lose a lot of students who simply get frustrated and do not see any payoff (especially since I teach an elective and they can simply quit). But if there are too many rewards just for effort, there won't be any meaningful growth.

There are a lot of complicating factors, not the least of which is a culture that often expects an A for effort. I'm not sure yet how to navigate all this. Maybe new technology can make students' effort/progress more visible to them, in a way more meaningful than a spreadsheet of grades? I'm not sure, but I do want to help form gritty students.