Tuesday, March 23, 2010

Why I’m at Ypsi every week – and how I spend my time

This is a question I’ve been asked by several people: my advisor, my fiancée (both driven by a shared desire for me to finish up and get a job already), my parents and siblings, even my students on occasion.

I feel like everyone who takes a TF position does it for different reasons – I did it because:

  • I felt like I had a lot of teachers growing up who were kind of crap, and thought I could do a better job
  • I felt ambivalent about engineering going into college, basically only doing it because I wanted a job when I graduated and because my sister was an engineer.  Only later did I discover that some of the stuff engineers do is pretty cool, and began to think that it’s a shame that most high school students don’t understand that.
  • I was curious what the dynamic looks like in a failing public school, because I’m a wannabe public policy wonk and because it’s such a waste to have so much of our population educated so poorly

As a result, I spend 6 hours each week: 2 in prep and 4 actually in the high school, hands-on.  Now, I do graduate research in plasma physics, and I could show a lot of really cool examples in class of stuff you can do as an engineer, but unfortunately I don’t get to do that stuff all the time.  In addition to the points above about why I’m at Ypsi, I also want to address what I do at Ypsi – to quantify exactly how much time I spend impacting the class, in what manner, and what that means in this context.

On average, of my 4 hours in class each week, perhaps 50% of my time is spent impacting the classroom in some way, i.e., helping a student individually with a problem or talking to the class as a whole about something.  The other 50% is listening to the daily lecture, since ultimately I am supplementing a high school math class and they need to cover some sort of daily lesson from the textbook.

Of the 2 hours each week that I spend interacting with all or part of the class, most of the impact is in teaching individual students or small groups of them how to think about different problems.  Notice that I don’t really like to think of it as teaching how to “do” problems, because that makes it feel like I am a server and the student is just a terminal downloading an algorithm from me.  My teaching style is usually to ask probing questions that force students to clarify their own thinking and understand a problem.  This is easily the most rewarding part of what I do as a TF – I really enjoy showing someone else how to think about math, how it makes sense and fits together, developing their intuition rather than just their rote memorization skills.  I spend 90% of those 2 hours doing this sort of thing.

The remaining ~10 minutes each week go to something I really don’t think anyone but a STEM grad student (or someone even more technically proficient) could do: giving presentations on neat technical topics, like computer graphics or rocket science, or coming up with really neat applications to illustrate a particular concept.  For example, today I talked with one of my calculus students about how fast a personal jetpack, if one existed, might be likely to travel.  So, in a given week, I spend

  • 2 hours listening to the daily lectures
  • 1 hr 50 min teaching problem-solving
  • 10 min doing unique stuff

Note that this is all in averages, so really that 10 min/week is more like a 30-min presentation once a month.  Incidentally, that occasional presentation is actually a fair bit of work, and is where much of the additional 2 hours of prep goes, again on average and in aggregate.  Other outside time goes to things like college guidance, writing the blog before you, etc.

I’ve made these two comparisons, the 50-50 ratio of impact/non-impact and the 90-10 ratio of tutoring / neat engineering, because there are two points of view where these comparisons get really important.

From my point of view, the 50-50 breakdown is frustrating because I can see students who aren’t getting what’s being “uploaded” to them during the lecture, but I’m powerless to really interact with them in the upload-download environment during the lecture except through whispers.  Even this is sometimes distracting to the rest of the class, especially with the many students who are apparently physically unable to whisper.  The effect snowballs, where students who don’t get it don’t pay attention, and then they continue to fall behind, and you reach a point where all but 6 of your students have F’s and you can’t help them catch up in the 50% of time you have left.

From the point of view of someone paying me to be there, the 90-10 balance is more troubling.  As far as tutoring goes, I am a gold-plated tutor – I’m a PhD physicist and engineer tutoring algebra I.  That’s like paying a Formula 1 racing mechanic to tune up a Honda sedan.  If 90% of my interaction with students could just as well be accomplished by an undergrad for less money, why send me?  Rather than one grad student in the class 4 hours a week, with 2 hours prep, you could pay a sophomore 1/3 the money as work-study or a scholarship, ditch the prep for the engineering presentations, and get 18 hours of total face time per week.
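That face-time arithmetic can be sketched in a few lines; note that every number here is a rough estimate from this post (the 1/3 pay ratio especially), not an official figure:

```python
# Back-of-envelope comparison of weekly classroom face time for one
# grad-student TF vs. hypothetical work-study undergrads paid 1/3 as much.
# All inputs are rough estimates, not official program numbers.

grad_cost = 1.0        # normalize the grad student's weekly cost to 1
grad_class_hours = 4   # hours per week actually in the classroom
grad_prep_hours = 2    # weekly prep (mostly for the engineering presentations)

undergrad_cost = grad_cost / 3                   # pay a sophomore ~1/3 the money
n_undergrads = round(grad_cost / undergrad_cost) # 3 undergrads, same budget

# With no presentation prep, each undergrad's full 6 hours become face time
undergrad_face_hours = n_undergrads * (grad_class_hours + grad_prep_hours)

print(f"Grad TF:    {grad_class_hours} face-hours/week")
print(f"Undergrads: {undergrad_face_hours} face-hours/week")
```

Same budget, 18 face-hours instead of 4 – which is the whole point of the 90-10 worry.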

On that note, John Scalzi is a sci-fi author whose work I enjoy.  Recently he responded to a question on his blog about why he doesn’t publish his own books and instead goes through a publishing house (Macmillan):

“What I genuinely have a hard time understanding is why people don’t seem to grasp that becoming my own publisher is an inefficient use of my time. It’s like telling a surgeon how much better his life will be if he’d just lathe his own surgical tools and cook the meals for the patients in the recovery room.”

To paraphrase Scalzi, I suspect that the TF balance right now is not an efficient use of either my time or UM’s money.  An endeavor with peak efficiency <50% probably needs some more design work.

In another recent post, I mentioned how we need to draw lessons learned from the TF program to improve it in the future.  That means looking not only at the Debbie-downer stuff above – what the TF program isn’t doing well – but also at what it has done really well.  On that side of the balance, there are serious pluses to having the gold-plated tutor, because I am also a highly trained spy, making (semi-)regular reports back on my findings behind enemy lines in the strange land of high school.

This blog and this feedback are something that I can do that I doubt an undergrad could.  I’ve had real jobs doing engineering, I’ve done research and actually used some of that stuff you see in calculus books for real problems, and I have gone through enough education to have an idea of what good teaching and good students look like.  As the spy sent into enemy territory to figure out what the heck is going on in there, I’m in a good spot to think about how you fix it, or at least how to make as big a difference as possible with limited resources.  While it’s easy to say with 20/20 hindsight that the current TF program is not as efficient as it could be, that’s true of most things on version 1.0.  It’s how we adapt to that information that is really important – that’s what this redesign process is all about.  Sometimes (a lot of times), inefficient data collection followed by careful sifting is the only way to get from 1.0 to 2.0.

Return on Investment

The TF program for the last two years has funded about 10 TFs at $4k/year, so around $40,000 a year.  Of necessity, that’s $40,000 that wasn’t spent doing something else, so it’s reasonable to ask whether we are getting maximum bang for those bucks.  That’s what a few recent posts, like the “Here’s $40k, design a K-12 intervention” series, are about.

In this post, though, I want to ask a slightly different question: what constitutes a good bang for the buck?  I know every TF takes the position for different reasons (and in another post I’ll talk about my motivations), but I want to focus on the other side of the equation: is this a good use of UM’s money?  Trying to answer this question, I came to a rather startling realization: I’m not certain what UM’s goals are with this program.  So I’ll make a reasonable estimate of what I think they are:

  1. Increase the number of Ypsi students enrolled at UM, esp. in CoE.
    • Currently this number is ~5 across all years.  For contrast, my high school in Plymouth sent around 50 students to UM my year, maybe 20-30 to CoE that year alone.
  2. Be able to point to the TF outreach program in good faith when writing the ‘broader impacts’ section of NSF grant proposals
  3. Make a connection to a K-12 district, and figure out what the best way is for universities to integrate into the K-12 community

The first two are the clearly numeric parts, where you can probably draw out some dollars and cents analysis.  The third may feel like vague bureaucratese, but there is a pretty big disconnect between the public K-12 school districts and the state universities.  I remember being shocked at how hard you had to work in undergrad to succeed, and I came from a pretty good high school.  I sure didn’t know what I was going to major in, and if not for older siblings who’d already been to college I might not have known which classes to take in high school to prepare myself.

So, are these good goals?  Let’s start with increasing YHS enrollment in CoE.  If UM spends $40k a year on TFs, and as an optimistic result let’s say they get an average of 4 YHS students a year into CoE, that’s $10k per student, not counting any scholarships in the mix. 

On NSF funding, numbers pulled from NSF’s website show $373 million in currently active awards to University of Michigan PIs (nicely done Ken Wise with $34M to date!).  If, in bulk, this $40,000 effort translates into even a 1% increase in that overall funding level due to increased acceptance rates for CoE proposals, the $10k number above pales in comparison, and we would call this a very good bang for the buck.
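As a sanity check on those two numbers, here is a minimal sketch; the $40k budget and $373M total come from this post, while the 4-students-per-year figure and the 1% bump are my own hypothetical inputs:

```python
# Rough ROI estimates for the TF program; inputs are guesses from the text.

tf_budget = 40_000          # annual program cost: ~10 TFs at $4k each
students_per_year = 4       # optimistic YHS -> CoE bump (hypothetical)
cost_per_student = tf_budget / students_per_year

nsf_active_awards = 373e6   # active NSF awards to UM PIs (figure cited above)
funding_bump = 0.01 * nsf_active_awards  # a hypothetical 1% increase

print(f"Cost per enrolled student: ${cost_per_student:,.0f}")
print(f"Hypothetical 1% NSF bump:  ${funding_bump:,.0f}")
print(f"Bump / program cost:       {funding_bump / tf_budget:.1f}x")
```

Even under these made-up assumptions, a 1% funding bump would repay the program cost roughly 90 times over – which is why goal #2 dominates the dollars-and-cents view.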

Of course, NSF likes to see quantifiable metrics of success in the broader impact area, and they are sadly unlikely to consider their own previous funding as evidence of success to increase funding further.  Alas, circular funding logic fails – you need concrete results to sustain the momentum.  This is where I should note that I think goal #1 springs from goal #2 as sort of a proxy metric, lacking anything better, but it’s kind of cheating when you consider the scholarships I mentioned (and then promptly neglected) before.  See, the last few Ypsi High students at UM have basically been on full rides, which literally means we’re paying them to come here.  So does that mean we’ve made an impact and broadened the pool of college-bound Ypsi students, or just convinced proportionally more of the same size pool to come here?  No good answer here, I don’t have enough data. 

So, goal 3, the K-12 connection.  UM has indeed built a connection to YHS.  What passes over that bridge?  YPSD is currently in its last year of NCLB probation, and at the end of this year, barring a serious course change, will enter the total reboot phase where they break the teachers’ union, fire just about everybody, rehire some, and start fresh.  My father is a teacher and president of his AFT union local, so understand that I am more than a bit perplexed by the heavy-handedness of that action.  And yet, having spent two years up close and personal at Ypsi High, I am also deeply skeptical that the status quo there will produce meaningful results lacking this sort of massive boot-meets-rear action.

Anyway, the point is that if these are the metrics of success, probably goal 1 is coming along pretty well, since I know of at least 3 students who are entering UM this year from Ypsi High.  Goal 2 is going great if my totally out-of-nowhere numbers are correct to within a few orders of magnitude, but is not sustainable without much clearer observed performance improvements, since NSF likes return on their investment (hence the title of the post).  Goal 3 is an area where I think we’ve made good progress, not necessarily in finding the solution but in identifying the relative effectiveness of a couple of ideas like the TF program, which will be undergoing substantial refinement between now and next year.

In closing, we’ve kind of skirted the issue of whether this is the “best bang for the buck.”  I think that’s fair, because a) I lack good numbers, and b) this is an open question without a good answer anywhere yet, despite lots of really smart people working on it across the country, so give me a break already.  But in terms of using our money wisely and in line with the goals above, esp. #2, I think the course of action is

  1. Draw out lessons learned from the TF program and present a plan to improve it next year
  2. Clarify what the metrics of success are that are driving that refinement and how we’re going to measure them
  3. Do some legwork and make sure the brainstormed metrics we come up with are good ones according to the research that’s out there right now

Now, apparently somebody somewhere agrees with me, because #1 and #2 are what tomorrow’s TF meeting is directed toward.  Of course, #3 is probably above my pay grade, falling squarely in the (OE)^2 office and the CoE leadership.  Stay tuned. 

Gates Foundation Report

This post draws heavily from a report issued by the Gates Foundation last month.  That report is a distillation of 10 proposals from some very high-flying names in the consulting industry – McKinsey and BCG among them – whom the Gates Foundation hired to analyze 10 different school districts (sites) across the nation and suggest methods for improvement.

Not surprisingly, a heavy component of what these firms do, and what they suggested the school districts do, is to crunch numbers.  Schools are ready-made for number crunching, because they generate so much easily quantifiable data – grades, attendance records, standardized test scores, it’s a quant’s dream – but they don’t do anything with it most of the time.  Some excerpts from the report:

“Our site had never used data for anything other than compliance purposes prior to this teacher effectiveness planning process. Suffice to say, the way in which we’ve used data to understand our site over the last few months has fundamentally changed how we operate as an organization.”

“Data without analysis are nothing more than a collection of numbers.”

  • “One site cited its limited strategic use of data as both an advantage and disadvantage. On the one hand, the site has a clean slate to build a new mindset around data and their use. On the other, the challenge to transform the site’s historically compliance driven data culture requires a significant departure from past practices and remains the focus of an intense change process.”
  • “Another site noted the advantage of having a ‘culture of data orientation that pervades all levels of our site and our schools.’ While it took a long time to achieve this culture, the site then had a huge head start in pursuing more sophisticated uses of data to improve teacher effectiveness.”
  • “One site is investing in systems improvements to access new types of human resources data that will enhance its pre-existing value-added measures.”
  • “One site’s teachers not only receive regular student data reports, but they also are trained in how to use such data to make adjustments to instructional strategies.”
  • “One site credited the formal and informal use of data in conversations with principals and school leaders for its success in defining and evaluating accountability targets linked to school performance bonuses.”

Data is a terrible thing to waste, and that’s a big chunk of what the report is all about.  In particular, I like this quote:

“Clearly, the ability to link student and teacher data is a necessary prerequisite—if not the linchpin—to define and measure teacher effectiveness.”

That’s the ultimate goal of the data analysis – identify which inputs produce measurable changes in your outputs (and in the right direction!), then figure out which ones do it for the least money, then do as many as you can with your limited budget.  Go figure, eh?

A note on x’s, y’s and pedagogy

Sometimes, variable names just make fall-over good sense: v for velocity, d for distance, h for height, t for time – these are all fantastic.  Other times, variable names just get grandfathered in for no good reason.  Like, say, s for displacement (which is what, again, students ask?  oh, how far it went, ok) or m for slope, which we all use because that’s how we learned it and byGodifitwasgoodenoughformeit’sgoodenoughforyouwhippersnappers bah!  Get off my lawn!

But top of the list on bad variable names: x and y.  Oh yes, my venerable variables, for teaching you are atrocious.  Know why?  Because nothing starts with x or y! X-ray machines?  Xylophones?  Xtra clean socks?  Yellow submarines?  Yearly physicals?  Youtube videos?  None of these are any good at all for trying to explain to someone why 3x+2y is neither 5x, 5y, nor simply 5, or why 3-x is not 2, or 2x, or any combination in between. 

I know x and y have firmly entrenched themselves in the math psyche, and that’s likely to change about the same time you see a snowball fight in hell.  But for the love, couldn’t the first introduction to letters as variables be something simple, like a and b?  How much easier would it be to talk about apples and bananas than xylophones and yams, xeroxes and yachts, xenophobes and yurts,  xenon atoms and yeomen?