Preview of iPad Classroom Management (or "How I Learned to Stop Worrying and Love Candy Crush")

There are about 75 school days left this semester. That gives me roughly 180 class periods to prepare myself for next year, when all the students will have iPads. While they aren't required to have an iPad until next year, they nearly all have something this year: smartphone, tablet, or laptop. That gives me plenty of opportunities to practice one aspect of this transition for next year: classroom management.

When I see a student on their laptop when they clearly aren't taking notes, playing Candy Crush on a tablet, or Snapchatting their friends, it's tempting to just take their device away from them or otherwise "punish" them. And I've taken more than a few cellphones and kept them for the duration of the class. But that "solution" has never sat well with me, especially if I'm thinking about next year. When all the students have purchased an iPad specifically for school, spending time collecting them during class feels counter-productive. And how does that prepare them for the real world? I have my iPad at every faculty meeting. When I get bored and start looking at Twitter, is my principal going to come take it away from me, to remove the distraction? Probably not.

Tuesday was a typical example of this in a class of juniors. There was a student clearly distracted by her cellphone, so I simply walked up, held out my hand, and she gave me her phone. Her neighbor, who is often distracted as well, though not at that moment, had her cellphone sitting on her desk. I held out my hand and she gave me her phone too. Student #1, now sans-cellphone, started playing Candy Crush on her tablet. Did I take her tablet as well? Nope. Did I call her out several times during class, highlighting the fact that she was trying (and failing) to multitask? You bet. Later in that same class, the students were working in groups, and Student #2 came up to my podium and asked for her phone.

Student: Can I have my phone back?

Me: Sure. You could've had it back whenever you wanted.

Student: Really?

Me: Yep. I was just trying to help you avoid distractions. But if you can handle that on your own, go right ahead. It's yours.

Student: Ugh, I hate when you do this. (She walks back, empty-handed.)

To be clear, I'm not claiming I handled this perfectly–or even particularly well. I'm still trying to figure out how I will deal with these situations next year. This is just one typical day of interactions.

When I notice somebody playing Candy Crush in class, I don't get angry. But I will, with a smile on my face, remind them of the choices they're making. Are they going to "get away with it"? Probably. But I remind them that the consequences are entirely theirs: they are the ones who are choosing to use their time that way, and they will deal with that when attempting the homework, taking the next test, etc.

And while I'm not above keeping a pile of devices on my podium, I rarely say, "Give me your phone." Since the beginning of the year, I've tried to ask them, "Is your phone helping you right now? Want me to keep it so you aren't distracted?" Now by this point in the year, they'll just hand it over when I walk by and gesture for it. I really don't want it to be punitive. (Yes, obviously there are times somebody is using their technology in completely inappropriate ways–like cheating–and there can be more severe consequences, but those times are rare.) Rather than have them think I'm punishing them, I want them to trust that I'm trying to help them do their best. That's usually their goal too, to succeed, and they respond well when they know we're on the same team.

I know several teachers who often ask the students to put all their devices in a box at the beginning of class. There are days I'm tempted to do the same, but I'm trying to resist that urge. There are certainly times and places when I'll ask for all devices to be put away, but I'd rather assume they'll do that when I ask, rather than require me to take them. I've also read about teachers who have a "stoplight" in their classroom: green light means it's OK to use the devices, red light means put everything away. Again, I understand the impulse, but I really want the kids to be engaged throughout class, and for some that will mean more technology, for others less. They're better off learning how to make those decisions themselves.

I've been very open with my students regarding my thought process on all this. I tell them honestly that I struggle with wanting to be strict and "make" them pay attention to me, while at the same time trying to give them freedom and hoping they'll learn to monitor themselves. Next year will have a learning curve for sure, both for me and, more importantly, for my students. But the technology isn't going anywhere, so if I don't help them think about how to use it effectively, who will? I didn't get to have these lessons in high school, which is why you can find me in the back of the faculty meeting checking Twitter.

O, the Humanities, ctd.

America recently published commentary on the same report on the humanities that I wrote about this summer. The author, Raymond A. Schroth, SJ, is a long-time teacher and journalist, and he gives a spirited defense of a traditional, literary humanities education.

My experience, on the other hand—after teaching literature for over 40 years and, as editor of the Jesuit higher education magazine Conversations, talking with students and faculty in all 28 Jesuit colleges and universities—is that students are starving intellectually, whether they acknowledge it or not. Few can talk easily about a great book they have read. Mark Edmundson writes in The Chronicle (8/2) that literature is “character forming” and “soul making”—a way of life. Catholic and Jesuit universities, it seems to me, have always taught this. How well we have succeeded is another question.

His examples are mostly literary in nature, and I certainly can't argue with any of them. Obviously I think it's important to read the Aeneid (preferably in Latin), or I wouldn't be teaching it. But as I continue to think about this topic, I wonder if we also need to expand our definition of what constitutes a humanities education.

Many people define the humanities by giving examples: The Mona Lisa, James Joyce's Ulysses, the history of the French Revolution. But at the broadest level, the humanities are a study of what makes us human, the study of human culture. Art, literature, and history are certainly some of the best, most obvious examples of this, which is why most defenses of the humanities, including my own, focus on those topics. But if we stick with that broad definition, should there not be other lenses through which to better understand the human condition?

For example, I have read few defenses of the humanities that mention film. But for the last century, movies have been an important part of our culture. The same can be said about music, dance, or photography. When I think about human culture today, technology plays a tremendous role. The connections that people are making were impossible just twenty years ago, but are an important part of our world. The technologies themselves, the hardware and the software, that enable these connections play an important role too. Shouldn't a humanities education include these aspects of human culture as well? Even classes that focus on the mechanics of app design or filmmaking have room for humanities-inspired lessons on the impact of those disciplines, and the examples of greatness that have come before.

I suppose my definition of the humanities is based on what the humanities are not. The humanities are not "useful" in the traditional sense of the word. They are not job training. They are simply a study of what makes us human. Fr. Schroth is right–Jesuit schools have a history of valuing this kind of education, and I certainly hope there are always students who value reading Vergil. The modern focus on "utility" carries for us the risk of losing touch with an important part of our heritage. But at the same time, we must not see the humanities themselves as static and unchanging. The focus on our shared humanity is what's important; the Aeneid or the Mona Lisa or Citizen Kane are simply vehicles for that reflection.

→ "There's a Cheaper, More Effective Way to Train Teachers"

Virtually all beginner teachers, in our experience, meanwhile, agree that what they need more than abstract social and pedagogical lectures are tangible techniques and granular-level coaching. They need Band-Aids, not meditations on hematology.

What made my master's program so good was that we really did have classes focusing on how to teach high school Latin. Those courses, while still not perfect, are the ones I still refer back to daily, not the generic Education and Educational Psychology courses.

There are some really good ideas here about a less-centralized, apprenticeship model of teacher training. One example:

Fortunately for Will, he teaches at a charter school that does something innovative and different. At Will’s school, the top master teachers are given an additional free period to observe and train new teachers—not in pedagogical theory, but in tools such as how to support individual students (“Elijah’s parents are responsive”); content-specific tricks (“here’s a way to explain how to derive the distance formula from the Pythagorean theorem”); or school-specific techniques (“this is how our school manages half-days”).

My school currently does some of this, but there is probably room for more. There are a lot of master teachers in the building, and we have to use those resources well.

Education and the "Self-Checkout" Mentality

My spiritual colleague in Jesuit education, Matt Emerson, wrote a very nice post over at America titled "Avoiding Education as Self-Checkout Line." It's a great read, and he does an excellent job of getting to the heart of one of the key challenges as more and more technology is introduced into classrooms, namely the potential for dehumanizing the classroom and the interactions between students and teachers. I'll quote a few things here, but please do go read the entire article.

He summarizes his thesis pretty concisely:

But the more we embrace an “app-for-everything” mentality, the more we marginalize the human role.

I think he's spot-on with that danger, but I do think he glosses over some of the ways in which using apps, even some of the specific apps he calls out, can actually increase the engagement that is so important.

For example, he mentions a "Pick a Student" app that helps teachers call on students at random.

[The app] encourages a corresponding disengagement from those same students. In the few seconds it takes to walk around and scan the room, deciding whom to call on, teachers can learn valuable information from faces, posture, or scribbles on a notebook.

Again, he's not wrong, and an app like that (which I have used on occasion) could certainly discourage teachers from being mindful about what's going on in their classroom. But is this any different from the English teacher when I was in high school who relied on "The Fickle Finger of Fate"? He would assign every student a card from a standard deck, and, when appropriate, draw a card randomly to call on students. Was he unaware of what was going on in the classroom? Not at all. I can't speak for him, but when I have used a similar process (or app), it's because I want assistance being random. Rather than get distracted by the student who is making (or avoiding) eye contact, sometimes I just want to call on somebody without having to think about it. Using an app for that process frees me up to actually be more observant about what's happening.
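The "Fickle Finger of Fate" is, in effect, sampling without replacement: every student gets a card, the deck is shuffled, and a fresh deck is built only once everyone has been drawn. Here's a minimal Python sketch of that process (the class name and student names are my own hypothetical illustration, not any actual app):

```python
import random

class RandomCaller:
    """Call on students like drawing cards from a shuffled deck:
    no repeats until every student has been called once."""

    def __init__(self, students):
        self.students = list(students)
        self.deck = []

    def draw(self):
        # Reshuffle a fresh "deck" once every student has been called.
        if not self.deck:
            self.deck = self.students.copy()
            random.shuffle(self.deck)
        return self.deck.pop()

roster = ["Ava", "Ben", "Clara", "Diego"]
caller = RandomCaller(roster)
first_cycle = [caller.draw() for _ in roster]  # each student exactly once
```

The point of drawing without replacement, rather than picking independently each time, is fairness: nobody can relax after being called, because a new cycle eventually starts, but nobody gets called twice while a classmate has yet to be called at all.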

Now I don't mean to focus on just that one app. Obviously it's just one example that he uses. But I do think more attention should be paid to the ways in which these apps can free us teachers for the more important tasks.

I'm also, perhaps naïvely, less worried than Matt is about losing those specific content skills because of apps.

After all, do we want math teachers who cannot generate equations? Do we want English teachers who, having depended so long on software, can no longer explain semicolons, who can no longer create sophisticated sentences that showcase various usage rules? I hope the answer is a unanimous "no."

My answer is of course "no", but is that a straw man? If I had an app that generated, say, Latin grammar practice (I haven't found one, but it probably exists), that wouldn't prevent me from writing my own exercises. Maybe that's just me, and maybe it's naïve. But if I had a way to give my students more practice at the mechanics of Latin grammar, while freeing up my time to work on entirely new kinds of projects and assessments, or to spend more time connecting with the students one-on-one, that would be a net positive. Of course I should still write my own exercises, my own test questions. But I see these new technologies as freeing me up to focus on the higher level work of a teacher, the tasks that technology cannot readily replace.

All the dangers that Matt Emerson lists are very real. But so are the benefits of this technology. Perhaps it's a bit like the "cognitive surplus" that Clay Shirky writes about. When I look at all this new technology, I see the dangers, but I cannot help but see the potential. I'm not looking to offload any of those personal, humanizing aspects of teaching, and I try to find apps that help me focus on precisely those things.

For example, the "SpeedGrader" app that works with Canvas can automate some of the process of grading, enter the scores into the grade book directly, etc. That doesn't mean I'm going to be less mindful of my students' performance. Instead, if I can cut a few minutes off the logistics of keeping track of grades, that gives me more time to ask questions like, "How did they perform on this, as a class?" or "What can I do next to really see if they understand the material?"

I couldn't agree more with Matt's conclusion:

That kind of learning requires wise and prudent guides. It requires men and women who evoke a love of inquiry. It requires teachers who know when to challenge and when to console and who offer advice more ennobling than what students see on social media.

All of this is not tangential to the curriculum; it is intrinsic to the curriculum. But none of it can be outsourced. None of it can be downloaded. It can only be lived, every day, by teachers confident in who they are and who care deeply about what they do.

He is exactly right. The most important things we do can't be replaced by technology. But the parts that can be outsourced? That's where we need to ask ourselves the right questions: What am I gaining from this technology and what am I giving up? And once answered, we have to discern what is best for our students–and for us.

Bring Your Own Context

The gears of change are turning at my school, as we continue moving toward the 2014–2015 school year when all students will be required to have iPads.[1] At a conference this summer with around 500 other teachers at Jesuit schools, seemingly everybody I talked to was at some point on this same path, either their students already had a device or they were moving in that direction. The big question I heard over and over: iPads or BYOD/BYOT (Bring Your Own Device/Tech)?[2]

In those discussions, the individuals pushing for a BYOD deployment had various practical reasons (e.g., projected cost savings by having students bring their own device), but the pedagogical defenses of BYOD almost always boiled down to this: the device doesn’t matter, it’s what you do with it. Or as the staff at one school likes to put it: “It’s about the verbs, not the nouns.” As much as that argument fit how I generally think about education, I found myself unable to really buy into BYOD as a solution.

For a while, I wasn’t able to articulate exactly why that argument didn’t sit well with me. It is about the verbs, the new styles of activities that this technology can enable, more than it is about the technology itself, right? Yet it occurs to me now that it’s precisely because I agree it’s about the verbs not the nouns that I can’t (yet) advocate for a BYOD deployment.

What do I mean? Well, it’s pretty obvious to any teacher who has started to think about these new technologies that once all students have powerful, network-connected devices in their hands at all times, the classroom is going to change. It has to. At many schools, those changes have been underway for a while now–it’s well-documented. For many teachers, that’s a scary prospect. Sure, some are excited by the changes, others are more hesitant, but I think most would admit to at least some fear. We teachers didn’t learn this way, so we have to chart this new course on our own. That’s not a simple task.

Changing our curriculum and pedagogy is a long-term, labor-intensive task. It’s not going to happen overnight, or even over one school year. It will take a lot of time and energy. I’ve only scratched the surface of this, but I’ve talked to enough teachers who are further down this road to know that it’s not a small job. For all those reasons, the actual technology, the hardware, needs to disappear as much as possible, so that teachers and students can focus on those other pieces.

A BYOD program solves this problem by letting everybody, teachers and students, use whichever device they are comfortable with. In theory, everybody can focus on the learning (the verbs), because they already know how to use the device in front of them. That only works, of course, if most of the faculty and students are already comfortable with some appropriate piece of technology. In my context[3], the school I’m at now, that’s really not the case, either for the students or the faculty.

So, for my school, choosing one device for everybody will, I think, help us focus on the verbs. I happen to think the iPad, for all its flaws (and I’m able to enumerate its flaws better than most), is well-suited for letting the technology get out of the way. But even if the iPad weren’t an option for some reason, I would really push (again, talking only about my context here) for all faculty and students to be on the same device, whatever that may be. And then, in two years or ten, once the faculty and students have all become accustomed to a new way of learning, perhaps the device, the noun, can truly become unimportant.

There are multiple paths to success here. The key is understanding the endpoint–a classroom truly transformed by the new opportunities technology presents. In some contexts, BYOD may be the clearest path to that goal, in others it’s important to choose the correct device. The point is to expend as little energy as possible on the technology in order to focus on the real challenges.


  1. I’m trying to avoid saying “We’re going 1:1 with iPads.” How long before the term “1:1” feels out-of-date? When the printing press was invented, how long did Cambridge brag about going 1:1 with books?  ↩

  2. Interestingly, those were really the only two options I heard. A handful of schools had been using laptops for a while, and were sticking with them, and one school was going with the Microsoft Surface (to the dismay of some of the teachers), but the vast majority saw the choice as iPad or BYOD.  ↩

  3. The other piece that I have come to grips with is as obvious as it is important: my opinion, if it’s valid at all, is based entirely on my own experience and context. There is certainly no one-size-fits-all solution for taking this leap.  ↩

Backward Design, Ctd

I hate to be "Breaking Bad guy", but the critical consensus is absolutely correct. This is a show that has only gotten better since it began five and a half years ago. The writers, directors, and actors are all at the top of their craft, and, with two episodes to go, are sticking the landing.

There's a tendency to ascribe at least some of that success to the writers, especially creator Vince Gilligan, having had the idea of how the show would end since it began. I have even made essentially that same point myself. And while there's definitely some truth there, it's been interesting to listen to Vince Gilligan on the Breaking Bad Insider Podcast debunk some of that notion.

Here is one example: in episode 502 ("Madrigal"), Walter White hides a vial of ricin in his house. Then in episode 509 ("Blood Money"), the opening teaser is a flash-forward, showing an older, cancer-stricken Walter White returning to his abandoned home in New Mexico to fetch that same vial. If you watch the series, you know that Chekhov's ricin will somehow have an impact in the final episodes. It's logical to assume that this entire endgame has been planned out, at least since episode 502, if not earlier. But on the podcast, Vince Gilligan says that's not entirely correct. When the writers had Walter White hide the ricin, they had a vague idea of how it might be used later, but nothing concrete. Even in episode 509, they still weren't sure precisely how it would all shake out. For as much as they are trying to think ahead, they are always conscious of the moment as well. For those particular episodes, those little bits felt right, so they went with it. They realized they were writing themselves into a corner, so to speak, but they trusted themselves to work through it when the time came. (Again, there are still two episodes left, but I'm confident I won't need to amend all this later and say, "They botched it!")

Vince and Bryan

I actually like to hear this, and it matches up well with how I think about my own teaching. I have a few colleagues who in August have already planned out their daily schedule for the entire year. I admire that approach, and it works well for them, but I simply cannot fathom working that way. I know what I need to cover for the year, and I sketch out roughly where I should be each week. But when it comes time to plan the weekly and daily lessons, I am focused more on what's best for that class. I still have an eye on my overall plan, of course. But if I come up with an idea for a group project and I'm not certain how it will turn out, I often err on the side of trying it and then fixing it later if necessary. Just like the writers on Breaking Bad sometimes have to get themselves out of a corner, I may have to adjust my pace or priorities later, but I trust myself enough to do that when necessary. The trick is knowing when to change that focus, when to shift from thinking four moves ahead to doing what is best at this moment.

When it works well, the class appears to be as perfectly crafted as an episode of Breaking Bad (though generally with less meth, and fewer tears). When it doesn't work? It's more like Lost.

Gritty

Seth Godin just wrote a great post called "Red Lantern" in which he argues that we should reward kids who persevere at least as much as we reward those who have more natural talent. Winning the gene pool is one thing, but persevering is a skill that can and should be encouraged.

The whole article is brief and worth reading, but here's the anecdote that begat the title:

At the grueling Iditarod, there's a prize for the musher who finishes last: The Red Lantern.

Failing to finish earns you nothing, of course. But for the one who sticks it out, who arrives hours late, there's the respect that comes from finding the strength to make it, even when all seems helpless.

The solution here isn't simply to give out more "participation" ribbons. This piece in New York sets out the dangers of giving out unwarranted praise. The key is to somehow recognize, foster, and reward grit, that trendy personality trait getting so much attention lately. It's not just about effort, but perseverance even when they fail the first time or the first three times.

I think my school generally does a decent job of rewarding a variety of students, not just the ones with the most natural talents. But what can I do in the classroom to foster grit–a trait I would not say I personally have in abundance? It's the same challenge faced by video game designers: it's easy to make an impossibly hard or incredibly easy game. The challenge is getting that balance–difficult enough to be worthwhile, but with opportunities for growth and success to prevent frustration.

If I go too far towards the "Challenging" side of the spectrum, I will lose a lot of students who simply get frustrated and do not see any payoff (especially since I teach an elective and they can simply quit). But if there are too many rewards just for effort, there won't be any meaningful growth.

There are a lot of complicating factors, not the least of which is a culture that often expects an A for effort. I'm not sure yet how to navigate all this. Maybe new technology can make students' effort/progress more visible to them, in a way more meaningful than a spreadsheet of grades? I'm not sure, but I do want to help form gritty students.