20 Jan 2008 ............... Length about 700 words (6,000 bytes).
(Document started on 9 Apr 2007.)
This is a WWW document maintained by
Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/best/rowntree.html.
You may copy it.
Rowntree's 17 proposals for better assessment
Department of Psychology,
University of Glasgow.
(Here are other
lists of assessment principles.)
Derek Rowntree (1977) Assessing students: How shall we know them? (Kogan Page: London)
In 1977, Derek Rowntree published a book that ended with 17 proposals for
improving how assessment is typically done in Higher Education.
In 2007 few have been adopted, and those that have, have been forced on
academics directly or indirectly by legislation. The shameful conclusion
seems to be that politicians know better than universities what is good
for higher education.
Here are my paraphrases of those proposals.
The items that have been implemented are marked √.
(Of the 17, numbers 10 and 14 have been essentially implemented; 9 and 12 are
in progress, pushed from outside the universities, although we'll have to see
whether they become effectively implemented.
To raise quality further, my own priorities would be: 1, 5, 7b, 17.)
- Articulate the assessment criteria, including trying to express our
tacit standards. [N.B. this is urged by Sadler, and is an important focus in
the assessment literature.]
- Use more varied assessment methods. Make them educationally relevant.
- Give credit for what learners learned, as well as for whether they learned
what we intended.
[N.B. this proposal is constructivist, conflicts with Biggs' "alignment"
principle, and with the instructivist assumption that teachers know in
advance everything that a learner should learn.]
- Assess "naturalistically" i.e. use assessment processes and products that
are themselves educationally valuable. [Projects have always been like this.
Even exam essays are related where the essay form is central to the discipline.]
- Give learners maximum feedback
(not just a grade or rank, but a summary of their traits/qualities).
- When criteria are judgmental, say (to learners) whether their performance
is being compared to norms, criteria, our expectations, or the learner's own
previous performance.
- Colleagues may have quite different perceptions. Accept this, don't
converge unnaturally; report divergence.
- Give back exam scripts.
- Resist drifting to criteria that attract consensus marks:
stay with the educationally relevant ones.
- (√) Support portfolios: including both products and assessments
from many sources, including peers and self. [The growing requirement to support
"personal development portfolios" is beginning to address this.]
- √ Report results only to learners (i.e. not made public).
[Data protection act.]
- Focus on eventual, not average or early, state (unless describing
progress is the aim).
[I.e. simply averaging over continuous assessment undermines the value of
assessment as a measure of a learner when they graduate (let alone later).]
- Emphasise learners' strengths, but mention weaknesses.
- (√?) Don't conflate, i.e. no portmanteau grades. Prepare a
multi-dimensional profile, with considerable narrative content.
[There is currently a move to transcripts, not simply a single overall degree
classification.]
- No pass/fail except for professional competence certification. (The
reader of the report should make the judgement of how good is good enough.)
- √ No comments in confidential references that you wouldn't have
learners read. [Freedom of information act, at least in Scotland.]
- Be explicit in references that the assessment is about specific things,
not about permanent qualities;
require some understanding of how the reader will use the
report; get the relevant qualities from the reference-requester.
- If we predict learners' future qualities, follow up and see how right we
were.
- Give health warnings on certificates (transcripts), i.e. about the limits
on how much weight to give accreditations as a measure of the person.
E.g. "Relying too heavily on other people's opinions can damage your sense of
self."