September 04, 2012

My job depends on the satisfaction ratings of my students

My last post was my 100th. Yay me. Someone called me prolific, but blogging twice a week isn't exactly a world speed record. Still, I guess it's a sign of something positive that I'm still doing it, me the prodigal quitter.

I just spent three hours grading the tests that my Access class sweated through today. With every new test I graded, my hopes rose: maybe this one will be perfect, maybe this one will demonstrate intelligence and not just hazy knowledge. Three times out of twelve I was pleasantly surprised. We don't grade on a curve at the career college; everything is based on a point system. Three points for this skill, two points for that skill, amass enough points and you pass the test. Pile up enough points and you pass the class. Everyone can get an A if they want. It's nice to know at least three people will probably be getting As in my Access class. As long as they turn in all of the homework, of course.

This Access test covers the basics: import tables from Excel, revise the design, add some data, create a new table with a lookup field, set some relationships, create some queries with various criteria, print some Database Documenter reports. I'm no Access wizard, I can assure you. This is super basic stuff. And mostly, I think they are getting it. No one scored less than 84. I think it is a testament to my thorough test reviews. What the hell. I'm all about teaching to the test. This isn't academia, for god's sake. If you want them to learn job skills, teach 'em and test 'em. Then shove 'em out the door. Yeee-hawwwww, get along little dogie!

I arrived early today after the holiday weekend at the request of one Access student who wanted some extra help. I was fully expecting her not to show, but she did. I'm proud to say I didn't feel one twinge of regret, even though, if she hadn't, I could have gotten a lot of mileage out of some righteous indignation: She didn't show! Why, I oughta... But she did show, and I was glad to work through the test review with her, and to see her earn a 97% on the test. Chalk one up for me.

After the Access test, which took the full two hours, I had the Excel class. This is a whole other critter. New students are often signed up for Excel, along with Word and possibly Windows, in their first term. This is the sink-or-swim method of college learning. Some of these folks have very little computer experience. Selecting a range of cells is a major accomplishment. Today we were scheduled to go through the review for the next test, which is set for Thursday. Sadly (for them), a good three-fourths of the class haven't come close to finishing the lessons they need to be prepared for the test. We're beyond basic formulas now, getting into mixed and absolute cell references and nested functions. Stress levels were on the rise. One student keeps threatening to bring his shotgun. For the computer he hates so much, I presume.

So, it was a sad day for all, including me, because today was the day the administration wanted to conduct course/teacher evaluations.

When the academic coordinator came to the room, all perky and smiling, my heart rate started to rise. Not because my evaluations from this class are going to totally suck hind tit, but because I knew I would be losing 20 minutes of review time while I twiddled my thumbs in the hallway, waiting for them to finish the evaluations. In the Access class I was able to postpone the evaluation to Thursday because of today's test. (And now the ethical question: do I give them their tests back before or after they take the survey? Don't worry. I always give them their tests back first. Some terms I'm toast as far as my evaluations go. So be it.)

The career college has used a variety of methods to evaluate its instructors and courses. When I first started working for the college in 2003, they used a pencil-and-paper system. I was dumbfounded when I saw a copy. You've heard of a double-barrelled question, in which a question has two parts that can be answered differently, making it very difficult for the respondent to discern which part of the question to answer? Well, in this survey, questions were triple-barrelled. For example: indicate the extent to which you agree with the following statement on a scale of 1 to 5, where 1 means disagree strongly and 5 means agree strongly: “My instructor was entertaining, fair, and super nice.”

“How can you possibly get actionable data from these questions?” I remember asking whatever poor soul was the academic coordinator. She gave me a perplexed look. I had to laugh. What else could I do?

A few years later, someone asked me to write a questionnaire for them, and I did, but like so many of my priceless and essential suggestions, nothing ever came of it. After a few more years, probably right after the next accreditation site visit, we switched to an electronic system. The questions, however, remained the same. I pointed out the ongoing problem. “How can I know what to work on improving if each question I'm being rated on bundles three distinct and conflicting behaviors?” No one had an answer, even when I pointed out that a teacher's job performance rating was based on these evaluations. In fact, faculty were losing their jobs over them. (Yes, I'm a pot-stirrer, I'll admit it. The chronic malcontent strikes again.)

A few terms ago, someone introduced a Survey Monkey survey, which we are still using. The questions seemed more reasonable the last time I checked, but I can't be positive, because I can't get very far into the survey without entering data. I'm far too respectful of the research process to attempt to alter the results by entering fake data, so I just back out quietly and hope for the best. Apparently, in recent terms, many students have simply been opting out of the evaluation process. Because these surveys are required for the college to maintain its accreditation, the administration is now sending the academic coordinators at each site to the labs to proctor the survey. Teachers must exit, stage right.

And that is how I ended up in the hallway, fuming and twiddling, and thinking of the traffic I would encounter should I stay a bit late to demonstrate nested functions for the few diehards who wanted to stay after class. God bless 'em. I stayed and showed the amazing nested function process to a few folks, who were properly grateful. I had the impression they gave me good reviews. I suspect the students who left early without looking at me probably trashed me. I can imagine the comments: Carol is a snarky, snippy teacher. Carol ignores me and spends all her time with the slow students. Carol talks too much. Carol doesn't know how to teach. Carol doesn't grade fairly. And more, in lousy grammar, sprinkled liberally with misspelled words.

Ho hum. (Have I mentioned I'm burned out on teaching?) Actually, I've gained a tremendous amount of patience through being a teacher. I am continually reminded that I cannot poke or prod someone into learning faster than he is capable of learning. I cannot convince someone of the value of learning the material unless she is willing to listen with an open mind. Being snarky or snippy certainly doesn't endear me to anyone, nor does it enhance the learning process. I've learned. I'm still working on applying what I've learned, but then, aren't we all?