Brick and Click 2010: Session II, Catherine Pellegrino

Catherine Pellegrino's "But what did they learn? What classroom assessment can tell you about student learning"

The room is packed! Catherine Pellegrino is a reference librarian and instruction coordinator at Saint Mary's College in Notre Dame, Indiana.

Difference between traditional course evaluation tools and assessment of student learning, reasons for choosing one over the other, and a look at the Minute Paper. Definitions. Course evaluations ask: how well did the course reach its intended goals, degree of satisfaction, did the session meet your needs, what new thing did you learn. Sample evaluation with a five-point Likert scale - prepared, organized, helpful. A substantial body of research states that such tools are neither valid nor reliable. The biggest problems: students who just mark one column all the way down, and free-response questions that sit at the end of the form, which most people leave blank. See her bibliography for the literature on this.

If they're not valid or reliable, why do we use them? We love numbers, they're easy, the faculty use them and we want to assume the trappings of faculty. We've always done it this way.

Outcomes assessment - did the students actually learn something? Pre- and post-tests: decide in advance what they should learn, then check to see if they've learned it. Traditionally we've measured in terms of inputs: sessions, students, books bought, questions answered, gate counts, etc. But inputs don't equal outputs or outcomes. Not "what did we teach" but "what did they learn." We don't care from whom or how they learned, just that they did. Outcomes assessment overlaps with grading, using similar tools: portfolios, exams, grades over time. But there are also informal options for assessing student learning. For one-shot sessions, there's no luxury of time for elaborate portfolios. See Angelo and Cross, "Classroom Assessment Techniques."

A common tool is the "muddiest point" or minute paper. Folks in the audience talking about minute papers: "most important thing you learned, least important, what we shouldn't do again." Ask students to write down two things: one useful thing you learned, and one thing you're still confused about.

What you can learn from evaluations: how to make sessions more interactive. But they tell you nothing about what students are learning. Evaluation measures satisfaction - whether students were happy. Assessment measures actual learning. Happiness is important in certain contexts, but she's much more interested in whether they are learning than in whether they are happy.

You already know your own preparedness if you're a reasonably reflective professional. Traditional evals highlight areas for improvement you already know about. "Included time to practice": you know this already - you were there - and there may be reasons you weren't able to give them time to practice.

"Explaining and demoing search strategies relevant to research needs": the scale doesn't give you anything to work from; it just says you did poorly, not how you can improve. The handout wasn't helpful - but WHY? And "rate the overall value of the instruction session" doesn't actually fit a Likert scale.

Classroom assessment with the minute paper. Showing actual excerpts. Comments are transcribed into an electronic document - wonderfully revealing in the aggregate. Also transcribe spelling and grammar exactly: it reflects student understanding. Example: students don't know how to locate articles in print. Journal title, volume, issue, etc. is not intuitive for them. So change how you handle print articles, instead of a brief mention. Example: a flowchart of the link resolver to full text was useful. (If you need a flowchart, the system may be broken.) But no one else said the flowchart was useful, while a screencast was heavily commented on as beneficial. It's important to transcribe quickly, because sometimes you need to translate the comments.

You need to let go of any sense of absolute statistical accuracy. It's okay not to achieve that with non-quantitative techniques. One useful trick: feed the transcribed comments into a word cloud generator for an interesting view of what pops out to students.
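(A quick aside: the same "what pops out" view can be had without a web tool. Here's a minimal sketch of counting word frequencies across transcribed comments - the sample comments and stopword list are my own illustrative assumptions, not from the session - which is essentially what a word cloud generator does before sizing the words.)

```python
from collections import Counter
import re

# Hypothetical transcribed minute-paper comments (illustrative only).
comments = [
    "I learned how to find articles in the database",
    "still confused about how to find print articles",
    "the flowchart for full text was useful",
]

# A tiny, assumed stopword list; a real one would be longer.
STOPWORDS = {"i", "the", "to", "a", "in", "for", "was", "about", "how"}

def word_frequencies(texts):
    """Count word frequencies across all comments, skipping stopwords."""
    words = re.findall(r"[a-z']+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOPWORDS)

freqs = word_frequencies(comments)
print(freqs.most_common(3))  # the terms a word cloud would render largest
```

The frequency table (here, "find" and "articles" top the list) is the raw material a word cloud visualizes - useful for spotting themes without worrying about statistical rigor.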

"Something I'm still confused about."

You can't please everyone. You will get contradictory responses, or head-scratchers: students assume you are looking for satisfaction measures, like "everything is great." As long as you're transcribing, share the results with the faculty member whose course you are working with. They give clear evidence of what students are confused about regarding faculty assignments, and evidence of a lack of research experience that faculty simply don't identify with. It's a good opening to follow up with the faculty member and the class, and to build collaboration and relationships. See this website for documents and additional information, including an article on "I already know that" syndrome.

