[Saltassessmentworkinggroup] more details about Andi's study
acurcio at gsu.edu
Mon Aug 2 10:01:11 CDT 2010
Thanks for the questions. In a nutshell, here's the study design: Evidence students in Fall 2009 did not get formative assessments; Evidence students in 2010 did. The two groups were roughly equivalent in terms of LGPA, LSAT scores, and UGPAs, and were taught with the same materials except for the formative assessments. Eleven [out of 20] final exam short answer/short essay questions from 2009 were used again in 2010. The questions covered a wide range of evidence topics.
The 2010 questions were graded using the same rubric as in 2009. I also blind-graded some 2009 questions to make sure I was applying the rubric the same way both years. The raw score points on the 11 questions were then compared between the 2009 and 2010 students. The study looks only at the difference in raw score points - not at the difference in letter grades, because at GSU we have to grade on a curve. In other words, I was looking to see whether the assessments improved raw score points, not grades.
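For readers curious how a raw-score comparison between two cohorts might look in practice, here is a minimal sketch using Welch's t-statistic (which does not assume equal variances across cohorts). The scores below are invented for illustration only; they are not the study's data, and the actual article may use a different test.

```python
from statistics import mean, variance
from math import sqrt

def welch_t(group_a, group_b):
    """Welch's t-statistic comparing the mean raw scores of two
    independent cohorts with possibly unequal variances."""
    na, nb = len(group_a), len(group_b)
    va, vb = variance(group_a), variance(group_b)   # sample variances
    se = sqrt(va / na + vb / nb)                    # standard error of the difference
    return (mean(group_b) - mean(group_a)) / se

# Hypothetical raw-score totals on the 11 repeated questions.
scores_2009 = [52, 48, 61, 55, 47, 58, 50, 53]   # cohort without formative assessments
scores_2010 = [60, 57, 66, 59, 54, 63, 58, 61]   # cohort with formative assessments

diff = mean(scores_2010) - mean(scores_2009)     # mean raw-score gain
t = welch_t(scores_2009, scores_2010)            # positive t favors the 2010 cohort
```

Comparing raw points rather than curved letter grades, as the study does, is what makes this kind of direct two-cohort test meaningful: the curve would otherwise force the grade distributions to look alike.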
Note that I do not release my exam/exam answers after an exam - students who wish to review their exam do so in my office. This reduces [but obviously does not entirely eliminate] the possibility that the results were due to students telling others about the exam questions.
For those who want a copy of the article, please let me know and I'll send it to you with the caveat that it has not yet been edited by the Journal and thus I'd appreciate it if you didn't circulate it yet.
>>> "Deborah Post" <DEBORAHP at tourolaw.edu> 08/02/10 10:43 AM >>>
Andi, did a grade curve apply in this case? Was there a median or mean that
meant raw scores had to be sorted and placed in particular letter grades?
And was there a control group?
Professor Deborah Waire Post
Touro Law Center
225 Eastview Drive
Central Islip, New York 11722
(631) 761-7137 (office)
deborahp at tourolaw.edu
From: saltassessmentworkinggroup-bounces at lists.washlaw.edu on behalf of Andi
Sent: Mon 02-Aug-10 7:31 AM
To: saltassessmentworkinggroup at lists.washlaw.edu; Hazel Weiser
Cc: kimberly.pray at gmail.com
Subject: Re: [Saltassessmentworkinggroup] Status on Scholarship
Projects involving student learning assessments
Along with a colleague from the business school I have completed a study on
the impact of formative assessments on Evidence students' [Fall semester 2Ls]
final exam scores. Our article about the study and its results has been
accepted by the Journal of Legal Ed. Below is the abstract we sent along
with the article.
Much has been written about the need to move from a single end-of-semester
law school exam to a formative assessment system that gives students ongoing
feedback. However, few have examined whether this kind of feedback has a
verifiable advantage for law students. This study demonstrates that formative
assessments can be implemented with minimal professorial time and effort, and
that the ongoing feedback these assessments provide produces a quantifiable
difference in law students' final exam performance. In a large-section
required second-year Evidence course with a short essay/short answer exam
format, a series of ungraded quizzes and a graded mid-term, all accompanied
by model answers, grading rubrics, and self-reflective exercises, resulted
in a significant increase in final exam scores for 70% of the students
receiving the feedback interventions.
Those on this list might be interested in the fact that our results are
similar to what colleagues and I found in an earlier study with first year
students and an essay exam format - LSAT score & UGPA factored into whether
students derived a benefit from the formative assessments. In the earlier
study, the benefit only inured to those with above-the-median LSAT scores or
UGPAs. In this study, the benefit reached approximately 70% of the students
- again those with the top 70% of LSAT scores & UGPAs. However, this study
involved second year students and we also had LGPA [law school grade point
average]. Interestingly, because LSAT score and UGPA did not correlate with
LGPA, we found that some students with below the median LGPA also benefited
from the practice exams & mid-term & accompanying feedback.
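The observation that LSAT scores and UGPAs did not track LGPA is the kind of claim a simple Pearson correlation check makes concrete. A minimal sketch with invented numbers (the study's actual data are not reproduced here, and the article may use a different measure):

```python
from statistics import mean, stdev

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two paired lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Hypothetical paired values: entrance credentials vs. law school GPA.
lsat = [160, 155, 165, 158, 162, 150, 168, 157]
lgpa = [3.1, 3.0, 3.3, 3.4, 2.9, 3.2, 3.0, 3.3]

r = pearson_r(lsat, lgpa)   # a value near zero indicates little linear relationship
```

When r is near zero, ranking students by LSAT/UGPA and ranking them by LGPA produce different orderings, which is consistent with some below-median-LGPA students landing in the group that benefited.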
BTW, my collaborator, Carol Sargent, is one of the potential social science
collaborators on the SALT list. She's a crackerjack and if others need a
social science collaborator, I'd highly recommend getting in touch with her.
Thanks to Hazel for getting the discussion going. If others have done or are
considering doing studies about various assessments, please use this list as
a sounding board.
Hope all have had a great summer.
>>> "Hazel Weiser" <hweiser at saltlaw.org> 07/30/10 11:14 AM >>>
We hope that everyone's summer is that perfect combination of relaxing and
productive! SALT is interested in finding out what projects are moving
forward involving student learning assessments. Who is working on these
issues? How far have you gotten? Has anyone drafted a piece for peer
review? Has anyone published? Please let me know if there has been some
progress. The Standards Review Committee, as you might know, has been
dealing with these issues.
Hazel Weiser, Executive Director
Society of American Law Teachers -- SALT
Public Advocacy Center, Room 223
Touro Law Center
225 Eastview Drive
Central Islip, NY 11722
hweiser at saltlaw.org
Explore our website: www.saltlaw.org
Check out our new blog: http://www.saltlaw.org/blog
SALT Teaching Conference: December 10-11, 2010 in Hawai'i
a community of progressive law teachers working for justice, diversity, and
SALTassessmentworkinggroup mailing list
SALTassessmentworkinggroup at lists.washlaw.edu