Last changed 20 Jan 2008 ............... Length about 700 words (6,000 bytes).
(Document started on 9 Apr 2007.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/best/rowntree.html. You may copy it. How to refer to it.

Web site logical path: [www.psy.gla.ac.uk] [~steve] [best] [this page]

Rowntree's 17 proposals for better assessment

By Steve Draper,   Department of Psychology,   University of Glasgow.

(Here are other lists of assessment principles.)

Rowntree, D. (1977) Assessing students: How shall we know them? (Kogan Page: London)

In 1977, Derek Rowntree published a book that ended with 17 proposals for improving how assessment is typically done in Higher Education. By 2007 few had been adopted, and those that have been were forced on academics, directly or indirectly, by legislation. The shameful conclusion seems to be that politicians know better than universities what is good educational practice.

Here are my paraphrases of those proposals. The items that have been implemented are marked √.

(Of the 17, numbers 10 and 14 have been essentially implemented; 9 and 12 are in progress, pushed from outside the universities, although we'll have to see whether they do become effectively implemented. To raise quality further, my own priorities would be: 1, 5, 7b, 17.)

  1. Articulate the assessment criteria; including trying to express our implicit assessment-constructs. [N.B. this is urged by Sadler, and is an important focus in the REAP project.]

  2. Use more varied assessment methods. Make them educationally relevant.

  3. Give credit for what learners learned, as well as whether they learned what we intended. [N.B. this proposal is constructivist, conflicts with Biggs' "alignment" principle, and with the instructivist assumption that teachers know in advance everything that a learner should learn.]

  4. Assess "naturalistically" i.e. use assessment processes and products that are themselves educationally valuable. [Projects have always been like this. Even exam essays are relevant where the essay form is central to the discipline.]

  5. Give learners maximum feedback (not just a grade or rank, but a summary of their traits/qualities).

  6. When criteria are judgmental, say (to learners) whether their performance is being compared to norms, criteria, our expectations, or the learner's own previous performance.

  7. Colleagues may have quite different perceptions.

  8. Resist drifting to criteria that attract consensus marks: stay with the educationally relevant ones.

  9. (√) Support portfolios: including both products and assessments from many including peers and self. [The growing requirement to support "personal development portfolios" is beginning to address this.]

  10. √ Report results only to learners (i.e. not made public). [Data Protection Act.]

  11. (√?) Don't conflate i.e. no portmanteau grades. Prepare a multi-dimensional profile: with considerable narrative content. [There is currently a move to transcripts, not simply a single overall degree category.]

  12. No pass/fail except for professional competence certification. (The reader of the report should make the judgement of how good is good enough.)

  13. √ No comments in confidential references that you wouldn't have learners read. [Freedom of Information Act, at least in Scotland.]

  14. Be explicit in references that the assessment concerns specific performances, not permanent qualities; require some understanding of how the reader will use the report; and get the relevant qualities to be reported on from the reference-requester.

  15. If we predict learners' future qualities, follow up and see how right we were(n't).

  16. Give health warnings on certificates (transcripts) i.e. about the limits on how much weight to give accreditations as a measure of the person. E.g. "Relying too heavily on other people's opinions can damage your sense of reality."
