Compilation (for printing) of pages on EVS and interactive lectures

This compilation was assembled on 28 March 2024.

Last changed 16 May 2009 ............... Length about 400 words (6,000 bytes).
This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/lobby.html. You may copy it.

Electronic Voting Systems and interactive lectures: entrance lobby

(written by Steve Draper)   Short URL for this page: http://evs.psy.gla.ac.uk/


Main EVS website index page


This is the entrance point for my web pages on Electronic Voting Systems (EVS) for use in lectures; or more generally for interactive lectures (ILIG = Interactive Lecture Interest Group); or more specifically for the PRS equipment which we mainly use, and for local Glasgow University arrangements.

If you want a quick look at what it's all about, to see if it might interest you, then try

To see all the things available on this site, read over the main website index page; or print off all the pages to study. They are available as single web pages ready for printing: a compilation on designing lectures (that use an EVS), and a comprehensive compilation.

Some of the most popular parts are:


Last changed 3 Oct 2017 ............... Length about 4,000 words (43,000 bytes).
This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/main.html. You may copy it.

Interactive lectures interest group (ILIG): main website index page

(written by Steve Draper)   Short URL for this page: http://evs.psy.gla.ac.uk/

This is the home page for some web pages, now largely out of date, about interactive lecture methods in general, and using classroom electronic voting systems (EVS) in particular. (EVS are also sometimes referred to as "clickers", PRS, GRS, CCS: for a discussion, see this list of terms used.)

(If you find this site useful, other major sets of pages on a similar topic are at: Peer instruction for computer science; Stanford / tomorrow's professor; Vanderbilt; Amherst; this Simpson & Oliver 2002 report (33 pages); pages on EVS use in maths from Loughborough and from Surrey; or a page at Colorado.)

If you are located in the UK, particularly, then you might want to join the special interest group on EVS "ESTICT": "Engaging Students Through In-Class Technology". (Other, desperate enquiries about it to Sian Cox [sian.cox.1 AT city.ac.uk].) You might also want to join this email list: http://www.jiscmail.ac.uk/lists/ELECTRONIC-VOTING-SYSTEMS.html (electronic-voting-systems AT jiscmail.ac.uk).

You can access the pages on this website in alternative ways:

Contents of this page


There are basically two ways for a newcomer to tackle these web pages and the subject of interactive teaching with EVS. If you just want to know what EVS are, or if you have already decided to give them a try (perhaps because you have an idea where and how they would fit into your own work), then you want the "bottom up" approach: go to the section below on "How-to advice", and it will take you from low level practical details, up through designing a question, then the presentation issues (such as explanations) around a single question, then on to designing sets of related questions, and on "up" to wider scopes.

On the other hand, if you aren't particularly committed to technology but are interested in systematically changing teaching to be more effective by being more "interactive", then you want the "top down" approach, and should begin with the first section "Interactive Lectures". You are more likely to be interested in this approach if you are a head of department or at least a course team leader, and can consider substantial changes to the demands made of your students and the timetable.

Interactive Lectures

  • Interactive Lectures: overall points.
          The EVS technique. The one minute paper technique.
  • Short (2 pages and a table) overview of our past work with electronic voting systems (PDF file).
  • EVS: a catalyst for lecture reform by Alistair Bruce.
  • Transforming lectures to improve learning

    Using EVS at the University of Glasgow

    Current (2016) system: YACRS

    Niall Barr has created a new system "YACRS" (Yet Another Classroom Response System). It is a classroom interaction system that allows students to use their own devices (mobile phones, tablets or laptops) to respond to questions during class. The motivation behind developing it was that the EVS systems we use at the University of Glasgow were becoming increasingly problematic: making sure batteries were OK, identifying broken handsets, and getting them to the right lecture theatre all posed problems. Since almost all students carry a smartphone or other device, it seemed logical to replace the clicker system with a web-based system (a toy sketch of the general idea follows the server list below). A few features:

    The websites for versions of the YACRS server are:
    https://classresponse.gla.ac.uk (most stable)
    https://learn.gla.ac.uk/yacrs/ (more advanced)
    https://learn.gla.ac.uk/yalis/ (being developed further)
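
    To illustrate only the general shape of such a web-based response system, here is a minimal, hypothetical sketch in Python using Flask: students' devices submit a vote over the web, and the lecturer's display fetches the aggregated counts. This is not YACRS's actual code or API; the endpoint names and the in-memory storage are invented for illustration.

        # A toy web-based classroom response server (NOT YACRS itself:
        # endpoint names and storage are invented for illustration).
        from collections import Counter
        from flask import Flask, request, jsonify

        app = Flask(__name__)
        votes = {}   # session id -> Counter mapping option -> count

        @app.route("/vote/<session_id>", methods=["POST"])
        def vote(session_id):
            option = request.form["option"]   # e.g. "A" .. "E" from the student's device
            votes.setdefault(session_id, Counter())[option] += 1
            return "", 204                    # no content: the vote is simply recorded

        @app.route("/results/<session_id>")
        def results(session_id):
            # The lecturer's display polls this to draw the barchart.
            return jsonify(votes.get(session_id, {}))

        if __name__ == "__main__":
            app.run()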

    Basic teacher guide
    A talk about it was given at the 2015 internal learning and teaching conference. The abstract is at page 12 of the proceedings.

    Wifi coverage: the expert seems to be Drew McConnell; the online web pages about coverage are not accurate (they seem to under-report its extent). I have the impression that there are just a very few rooms with no usable coverage. Newer / larger lecture theatres may have multiple access points. A test of YACRS in the Joseph Black LT with 190 students had no trouble with mass student access to wifi, even though there was only one access point in the room itself.

    Passport photos: a server is being commissioned for this. Currently, ask Niall Barr to set permissions for you to use it, and for more information.


    Newsletter items about YACRS: 1, 2
    Other systems currently (October 2017) being used by some:

    Past advice on the older RF (radio) handsets

    Information on using the university's EVS equipment is accessed from a single web page (and if necessary this phone number: x3286). The university now owns quantities of PRS RF equipment, which is centrally managed. (There is also older IR equipment available, and computing science owns some new WordWall equipment which allows freeform texting input from the audience.)

    You may want to consider these elements (you can find links for these from the link above):

    Past work and advice on the older IR handsets

  • The EVS technique (short introduction with pictures)
  • More practical details (longer introduction)
  • Physics has its own equipment
  • Past workshops for prospective users
  • Overview evaluation paper about uses Oct 2001 - Dec 2003.

    How-to advice on using EVS anywhere

    This section is essentially a "bottom up" tour, beginning with practical technical details, and gradually leading to wider questions of how to string questions together or redesign whole sessions.

    What's it all about?

    If you want a quick look at what it's all about, to see if it might interest you, then try

    Getting started quickly

    The majority of the lecturers and presenters who have approached us to try using the EVS have already had an idea about how they might be used, and wanted practical tips on putting this into practice. Here are some introductory how-to topics for your first few uses.

    More detailed issues in designing and conducting sessions with EVS questions

    The set of different benefits and pedagogical approaches

    What are the pedagogical benefits / aims? Short answer; Best summary (an alternative expression); Long answer (a whole paper).
  • Summary list of pedagogic purposes for EVS

    Technologies

    Technologies and alternatives are given on this page, which also includes contact information on equipment purchase (and a few bits of history of earlier efforts).

    FAQs

    Some other common questions not answered elsewhere on these pages are here.

    Evaluating evidence on EVS effectiveness

    There are basically three classes of evidence to consider, as discussed on this page.

    Publications

    Ad hoc bibliography.

    In that bibliography, a very few outstanding things are starred: if you want to do some reading, you could do much worse than start with them. See also below.

    Written at Glasgow University

    Written elsewhere in the UK

    Mentions in newspapers

  • Mentions in THES (Times Higher Education Supplement) include:
  • Guardian Education 21 May 2003 pp.14-15
  • Andy Sharman "Lessons at the click of a finger" The Independent, Education section 14 Feb 2008 p.6

    Written elsewhere in the world

    A large, if rather random, collection of articles related to EVS written elsewhere in the world are in the ad hoc bibliography. Here are a few notable ones.

    Other local Web documents

  • Newsletter ad for users at University of Glasgow
  • Second Newsletter ad for users at University of Glasgow
  • Third Newsletter ad for users at University of Glasgow
  • Newsletter article on use in English Literature
  • A letter to THES
  • Alternatives, TECHnologies, and VENDORS.
  • Unnecessary technical details of PRS [not finished yet]
  • Hake and what matters (To be written)
  • Undigested notes and URLs
  • Some pictures of PRS: at the end of this page and also here
  • Some UK sites and PEOPLE who use EVS

    Some other websites

    Here are just a very few websites related to EVS that you might like to try: Vanderbilt; Amherst; this Simpson & Oliver 2002 report (33 pages); pages on EVS use in maths from Loughborough and from Surrey; or a page at Colorado.

    Handset use in the UK

    Some other UK sites and people who use EVS are listed here.

    Some human contacts at Glasgow

    If you want to actually talk to someone, you could try:

    Other important Glasgow University contacts:




    Last changed 7 Dec 2015 ............... Length about 1300 words (10,000 bytes).
    (Document started on 29 Jan 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/terms.html. You may copy it.

    Terms for EVS

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    There are various terms or names used to refer to the equipment in question:

    I don't like them. On the other hand, some others do. I've debated this most with Michael McCabe. Here are my views (with which he disagrees quite strongly) on what the names should say, and what is wrong with the ones being used.

    Kay reports finding (in 2008) 26 terms in the literature.

    The main points

    [Electronic, digital vs. group, audience, classroom, ...]
    A problem with the terms ARS, PRS, GRS etc. is that they fail to express the main meaning. Putting your hand up in class, shouting, or a mob lynching someone are all "group responses", but that isn't what people who use these phrases mean. They are almost entirely interested in new electronic systems, so the phrase fails to express what is, to the speaker and to their intended audience, the key defining feature. So the name should say "electronic" or "digital" to mark this, except for those who really are discussing the use of raised hands etc., and not just new technology.

    [System vs. equipment, technology]
    Saying "system" to refer to a small bit of equipment which is not a system that stands alone (without human operators it does nothing), but a wholly dependent adjunct on the real system of, say, teacher, students and discussion is inaccurate and self-inflating: "equipment" might be more exact. The real "system" using EVS in, say, education is something like the plan for the whole lecture or session. There are a number of quite different alternatives that do use EVS (e.g. Mazur's "Peer instruction", or contingent teaching); and also still others (e.g. MacManaway's use of "lecture scripts") that do not, but are equally revolutionary and promising.

    [voting, polling vs. texting vs. other shared data types]
    The equipment I'm usually referring to is for giving one of a small number of pre-determined alternative choices i.e. responding only to MCQs (multiple choice questions): hence the direct term would be "voting" or "polling". This also contrasts it to some other technologies that support free-text open-ended input from the audience (like mobile phone SMS texting). Note, however, that although this too certainly could be useful in some ways, many types of meeting cannot handle this: imagine a hundred people all sending in text responses: no-one (neither audience nor presenter) can scan a hundred different text messages and summarise or cluster them usefully. A feature of voting (i.e. of MCQs) is that summarising is easy: just count the votes for each alternative and present these five or so numbers. This is a fundamental advantage for large groups of more than about six people (say). So voting is a feature not a limitation for such groups. Of course other kinds of interaction are organised round free-text: email, blogs, discussion fora, etc. So we need a term for these that contrasts with voting, but covers all the free-text group electronic communication systems -- perhaps "texting". A third alternative is passing around other material e.g. software, as in a classroom or lab with networked computers.
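
    To make the counting point concrete, here is a minimal sketch in Python (illustrative only: the option labels and the votes themselves are invented). However large the audience, the summary is one count per option, which is exactly what makes the barchart display practical; there is no comparable reduction for a hundred free-text messages.

        # Tally a room's worth of MCQ votes into a handful of numbers.
        from collections import Counter

        votes = ["A", "C", "B", "A", "A", "D", "B", "A", "C", "A"]   # one entry per handset
        tally = Counter(votes)

        for option in "ABCDE":
            count = tally[option]                         # a Counter returns 0 for missing keys
            print(f"{option}: {count:3d} {'#' * count}")  # crude text barchart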

    Further points

    [Synchronous vs. asynchronous]
    Part of what I usually mean is the use as part of a synchronous meeting, whether face to face or online; as opposed to asynchronous like email, or phone (text message) polls done for TV over a day or a week. And in fact response time really does matter here. A class often wants to move on quickly from a vote to another topic or to explanations and discussion of the disagreement, and a response time of minutes, and preferably of seconds, is needed. In contrast even in a small area like the UK, parliamentary elections take more than a day to decide and broadcast the results, and SMS texting may take hours depending on the network state. Remember "synchronous" doesn't mean instantaneous but it does mean the recipient is sitting waiting for the result before they can do anything else.

    [1 vs. 2 way]
    To technologists, a huge difference is equipment that offers 1-way vs. 2-way communication (e.g. feedback lights or a little screen on each handset). However to users, this is about as unimportant as whether the person you are talking to says "yes" (2-way) or nods (1-way for a sonic technologist, but 2-way in terms of human communication). All the equipment relies on fast feedback, but some do this by projecting information on a big screen for all to read together.

    [Decision support vs. establishing mutual knowledge of the spread of opinions]
    Furthermore the applications are less about making group decisions (at least with the voting technology) and more about coordinating group thinking and understanding, by giving everyone an overview of what the consensus or disagreement is and how strong it is. These features distinguish it from formal voting for political candidates or in shareholder meetings: more synchronous than asynchronous; more about establishing mutual knowledge of the varieties of opinion than reaching a final decision.

    [personal vs. subgroup voting]
    Another issue is whether every audience member has their own handset and vote, or whether they agree a group vote i.e. one vote per small group.

    [Face to face vs. online, "virtual"]
    The main application I'm interested in is face to face, but actually it could perfectly well be done online (but synchronously) (though the equipment might be different). And one of the areas we are exploring at Glasgow is moving MCQs and associated discussion between the web out of class, and EVS in class as seamlessly as possible.

    [Education vs. other applications]
    The applications I am interested in are educational, but many sets of the same technology are sold to business for meetings for planning, brain-storming etc. That's what is wrong for some audiences in saying "classroom EVS". "Group decision support system" is a term sometimes used for the business, not educational, applications.

    Technological distinctions that can matter are:

    Summary

    What is generally meant here is an electric or electronic technology, used for polling in groups of size 10-1000 (not millions, as in serious national electronic voting), as part of a synchronous interaction (could be face to face or online), usually to share thinking and disagreements more than to come to decisions. What is most important really is that the human interaction supported by the equipment is real time (i.e. synchronous), and always interactive (even if one direction is optical and only one is electronic).

    I've started to standardise on the term "EVS", although perhaps "synchronous electronic polling equipment (SEPE)" would really be even more exact.


    Last changed 15 Feb 2005 ............... Length about 800 words (7,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/il.html.

    Interactive Lectures

    (written by Steve Draper,   as part of the Interactive Lectures website)

    A summary or introductory page on interactive lectures.

    Contents

    Why make lectures interactive?

    To improve the learning outcomes. [The positive way of putting it.]

    Because there is no point in having lectures or class meetings UNLESS they are interactive. Lectures may have originated before printing, when reading a book to a class addressed what was then the bottleneck in learning and teaching: the number of available books. Nowadays, if one-way monologue transmission is what's needed, then books, emails, tapes will do that, and do it better because they are self-paced for the learner. [The negative way of putting it.]

    What are interactive lectures?

    Whenever it makes a difference that the learners are co-present with the teacher and each other. This might be because the learners act differently, or think differently; or because the teacher behaves differently.

    In fact it is not enough to be different: it should be better than the alternatives. Learners are routinely much more interactive with the material when using books (or handouts) than they can be with lectures: they read at their own pace, re-read anything they can't understand, can see the spelling of peculiar names and terms, ask other students what a piece means, and carry on until they understand it rather than until a fixed time has passed. All of these ordinary interactive and active learning actions are impossible or strongly discouraged in lectures.

    So for a lecture to be interactive in a worthwhile sense, what occurs must depend on the actions of the participants (not merely on a fixed agenda), and benefit learning in ways not achieved by, say, reading a comparable textbook.

    Alternative techniques

    One method is the one minute paper: have students write out the answer to a question for just one minute, and collect the answers for response by the teacher next time.

    Another method is to use a voting system: put up a multiple choice question, have all the audience give an anonymous answer, and immediately display the aggregated results.

    Another method is "Just in time teaching", where students are required both to read the material and to submit questions on it in advance, thus allowing the contact time to be spent on what they cannot learn for themselves.

    In fact there are many methods.

    Pedagogical rationale / benefits

    In brief, there are three distinct classes of benefit that may be obtained by interactive techniques:

    The general benefits, and specific pedagogic issues, are very similar regardless of the technique used. I have written about them in a number of different places including:


    The key underlying issues, roughly glossed by the broad term "interactivity", probably are:


    Last changed 20 Feb 2005 ............... Length about 300 words (3,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/why.html.

    Why use EVS? the short answer

    (written by Steve Draper,   as part of the Interactive Lectures website)

    What are the pedagogical benefits / aims?
    To "engage" the students i.e. not only to wake them up and cheer them up, but to get their minds working on the subject matter, and so to prompt learning.

    How specifically?:

    1. Simple questions to check understanding: "SAQs" (self-assessment questions) to give "formative feedback" to both students and presenter.
    2. Using responses (e.g. proportion who got it right) to switch what you do next: "contingent teaching" that is adapted on the spot to the group.
    3. Brain teasers to initiate discussion (because generating arguments (for and against alternative answers) is a powerful promoter of learning).
  • A short argument on why be interactive
  • A short introduction to EVS
  • EVS: a catalyst for lecture reform by Alistair Bruce.
  • Long answer (a whole paper) on pedagogic potential

    But above all, realise from the start that there are powerful benefits not just for learners but also for teachers. Both need feedback, and both do much better if that feedback is fast and frequent -- every few minutes rather than once a year. So the other great benefit of using EVS is the feedback it gives to the lecturer, whether you think of that as like course feedback, or as allowing "contingent teaching" i.e. adapting how the time is spent rather than sticking to a rigid plan that pays no attention to how this particular audience is responding.


    Last changed 31 Jan 2005 ............... Length about 500 words (5,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/handsetintro.html.

    Using EVS for interactive lectures

    (written by Steve Draper,   as part of the Interactive Lectures website)

    This is a brief introduction to the technique of using EVS (electronic voting systems) for interaction in lectures. (A complementary technique is the one minute paper which uses open-ended audience input. An introduction to interactive lectures and why attempt them is here.)

    The technique is much as in the "Ask the audience" lifeline in the TV show "Who wants to be a millionaire?". A multiple choice question (MCQ) is displayed with up to 10 alternative response options; the handsets (using infrared, like domestic TV remote controls), distributed to each audience member as they arrive, allow everyone to contribute their opinion anonymously; and after the specified time (e.g. 60 seconds) elapses, the aggregated results are displayed as a barchart. Thus everybody sees the consensus or spread of opinion, knows how their own relates to that, and contributes while remaining anonymous. It is thus like a show of hands, but with privacy for individuals, more accurate and automatic counting, and more convenience for multiple-choice rather than just yes/no questions.

    It can be used for any purpose that MCQs can serve, including:


    At Glasgow University we currently use the PRS equipment: small handheld transmitters for each audience member, some receivers connected to a laptop up front, itself connected to a data projector and running the PRS software. This equipment is portable, and there is enough for our largest lecture theatres (300 seats). Given advance organisation, setting up and packing up can be quick. We can accommodate those who normally use OHPs, powerpoint, ad hoc oral questions, or a mixture.

    More practical details are offered here, and more details of how to design and use the questions are available through the main page, e.g. here.

    Fig.1 Infrared handset transmitter

    Fig.2 A receiver

    Fig.3 The projected feedback during collection, showing handset ID numbers

    Fig.4 Display of aggregated responses


    Last changed 9 Sept 2007; 2 Sept 2023 ............... Length about 2000 words (17,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/resources/minute.html.

    One minute papers

    By Stephen W. Draper,   Department of Psychology,   University of Glasgow.

    The basic idea is that at the end of your session (e.g. lecture) you ask students to spend one minute (60 seconds) writing down their personal (and anonymous) answer to one or two questions, such as "What was least clear in this lecture?". Then you collect the scraps of paper and brood over them afterwards, possibly responding in the next session. It's wonderful because it takes only a minute of the students' time (each), requires no technology or preparation, but gives you immediate insight into how your class is doing. There are probably other benefits too.

    That is the short version, which is all you really need to give it a try. Trying it out is probably, if it is at all possible, the best second step in understanding the technique. However, when you want more information, theory, and examples, the rest of this document offers some.


    The longer version

    This is a note on the simple but excellent technique summarised above to use in teaching, particularly lectures. These particular notes are mainly adapted (stolen) from David Nicol, although the ideas also appear in the literature if you look for them. [Angelo,T.A & Cross,K.P. (1993) Classroom assessment techniques: a handbook for college teachers (San Francisco : Jossey-Bass Publishers) p.148. Stead,D.R. (2005) "A review of the one-minute paper" Active Learning in Higher Education vol.6 pp.118-131.] For more, you should go on his workshop (as part of a course for new lecturers, or see here), or bother him personally.

    Credit might go to: the "minute paper" has long been ascribed to Wilson, as he was apparently the first to describe it in the literature: R.C. Wilson "Improving faculty teaching: Effective use of student evaluations and consultants" J. Higher Educ. vol.57 pp.192-211 (1986). More recently, it has been acknowledged that the original source of the idea was the Berkeley physicist C. Schwartz. See Barbara Gross Davis, Lynn Wood, and Robert C. Wilson, A Berkeley Compendium of Suggestions for Teaching with Excellence (University of California, Berkeley) (1983), available at http://teaching.berkeley.edu/compendium/suggestions/file95.html; see also http://www-writing.berkeley.edu:16080/wab/2-2-gone.htm

    I am addressing this note to teachers like myself: what they might do, and why. However a student could usefully read this, and carry it out privately. They could then use what they write for these one minute papers a) as a useful study habit; b) as a procedure for generating a question to ask as part of their good practice in being a student.

    Although your first uses are likely to be generic, if you use it regularly you can focus it to your particular concerns that day for that class, by designing questions with respect to the learning objectives, or important disciplinary skills, or the sequence of development important for that course.

    Remaining Contents

    How to do it

    Most common questions to set

  • "What question do you most wish to have answered at this moment?"
    [I.e. tells you what you failed to get across, what you should fix at the start of next time.]

  • "What was the main point of today's lecture?"
    [Often a lot of what you said went across, but the overall point is not apparent to them, or it is not apparent that it WAS the chief point.]

  • "What are the most important questions remaining unanswered?"

  • "What was the muddiest point?"

    More questions

    Asking questions

    Many of these questions could be asked either at the end, or in the middle, or at the start.

    Many are best announced at the start but written at the end i.e. "At the end I am going to ask you to write for a minute on ....". This should promote more thinking during the class.

    In asking each question, don't forget to specify the "rubric" i.e. state what kind of response is required e.g.


    Classifying questions

    Questions could be classified in various ways e.g.

    Many questions can be fitted under both of two contrasting types e.g. asked either as MCQs or as one-minute open ended papers; or be both reflective and about testing content retention.

    Feedback

    Content

    Reflective

    Rationale: theoretical articulations of why this is good

    Most of the reasons for using this technique apply more generally to interactive lectures but can be spelled out as follows.

    Course feedback; feedback from learner to teacher

    The first kind of benefit from this technique is to get good feedback from learners to teacher on how the learning and teaching process is going. Standard course feedback is largely ineffective in improving things. The standard method of one feedback questionnaire per course has two massive drawbacks, each alone sufficient to render it ineffective:

    You can get, if you wish, still more precise information by focussing the question you ask e.g. on a learning objective from the course, on a specific skill you think important to the discipline, etc. In other words, as an evaluation technique, it can be sensitive to context, to the discipline, to the course, to a particular (perhaps unusual) session. But also, it can be completely open-ended, and detect the surprises the teacher would never have thought to ask about (e.g. "I had no idea my graphs were not self-explanatory").

    Direct benefits to the learners

    Even if your teaching is too perfect to need improvement, or you are too wimpish to take negative feedback, or simply in addition to the course feedback function, there are arguably direct benefits to the learners even if the teacher never reads the collected bits of paper.

    Above all, they can be used to get learners to:

    Fostering interaction / dialogue between teacher and learners

    Independently of private benefits to the teacher and of private benefits to the learners, there are the benefits of establishing real "dialogue": that is, an iterative (to and fro) process in which a common understanding is progressively established rather than communications each succeeding or failing as one-off acts. This is both immediately valuable, and makes it progressively easier for little interactions such as clarification questions to be made and dealt with easily, and quickly.

    Aspects of this, and of how this technique contributes and can succeed at this, are:

    And as a complement to handsets

    And finally: this technique may also be very valuable as a complement to using handsets in lectures. Handsets are excellent in many ways, above all in promoting dialogue. But they are essentially a technique revolving around Multiple Choice Questions (MCQs) which have fixed response sets. One minute papers use open-ended responses, and so collect the unexpected and the unprompted. MCQs invite guessing; one minute papers do not.

    The handsets give an immediate shared group response, and so can move the dialogue forward faster (every 5 minutes rather than once per session). However one-minute papers are better at uncovering complete surprises (students saying things it didn't occur to the teacher to put as an optional response in an MCQ); and at giving you a chance to think about each answer even if it does take you by surprise.


    Last changed 24 Feb 2005 ............... Length about 4,000 words (29,000 bytes).
    (Document started on 15 Feb 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/td.html. You may copy it.

    Transforming lectures to improve learning

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    Contents

    Introduction

    Some of the most successful uses of EVS (Electronic Voting Systems) have been associated with a major transformation of how "lectures" have been used within a HE (Higher Education) course. Here we adopt the approach of asking how in general we might make teaching in HE more effective, keeping an open mind about whether and how ICT (Information and Communication Technology) could play a role in this. The aim then is to improve learning outcomes (in quantity and quality) while investing about the same, or even fewer, teaching resources. More specifically, can we do this by transforming how lectures are used?

    Replacing exposition

    The explicit function of lectures is exposition: communicating new concepts and facts to learners. In fact lectures usually perform some additional functions, as their defenders are quick to point out and as we shall discuss below, but nevertheless in general most of the time is spent on exposition and conversely most exposition (in courses based on lectures) is performed by lectures. Clearly this could be done in other ways, such as requiring learners to read a textbook. On the face of it, this must be not only possible, but better. Remember, the best a speaker, whether face to face or on video, can possibly do in the light of individual differences between learners is to speak too fast for half the audience and too slowly for the other half. Reading is self-paced, and is therefore the right speed for the whole audience. Furthermore reading is in an important sense more interactive than listening: the reader can pause when they like, re-read whatever and whenever they like; pause to think and take notes at their own pace, before going on to try to understand what is said next -- which is likely to assume the audience has already understood what went before. So using another medium for the function of exposition should be better. Can this be made to work in actual undergraduate courses?

    Yes. Here are several methods of replacing exposition and using the face to face large group "lecture" periods for something else.

    It seems clear that lectures are not needed for exposition: the Open University (OU) has made this work for decades on a very big scale. Another recurring theme is the use of questions designed not for accurate scores (summative assessment), but to allow students to self-diagnose their understanding, and even more, to get them thinking. A further theme is to channel that thinking into discussion (whether with peers or teachers). This requires "interactivity" from staff: that is, being ready to produce discussion not to some plan, but at short notice in response to students' previous responses.

    Should we expect to believe the reports of success with these methods, and should we expect them to generalise to many subjects and contexts? Again the answer is yes, which I'll arrive at by considering various types of theoretical analysis in turn.

    The basic 3 reasons for any learning improvements

    Many claims of novel learning success can be understood in terms of three very simple factors.

    1. The time spent by the learner actually learning: often called "time on task" by Americans. The effect of MacManaway's approach is to double the amount of time each learner spent (he studied how long they took reading his lecture scripts): first they read the scripts, then they attended the classes anyway. In fact they spent a little more than twice as long in total. Similarly JITT (Just-in-Time Teaching) takes the same teacher time, but twice the student time.

    2. Processing the material in different ways. It probably isn't only total time, but (re)processing the concepts in more than one way e.g. not only listening and understanding, but then re-expressing in an essay. That is why so many courses require students not just to listen or read, but to write essays, solve written problems etc. However these methods are usually strongly constrained by the amount of staff time available to mark them. Here MacManaway got students to discuss the issues with each other, as do the IE (Interactive Engagement) and JITT schemes. Discussion requires producing reasons and parrying the conflicting opinions and reasons produced by others. Thinking about reasons and what evidence supports what conclusions is a different kind of mental processing than simply selecting or calculating the right answer or conclusion.

    3. Metacognition in the basic sense of monitoring one's degree of knowledge and recognising when you don't know or understand something. We are prone to feeling we understand something when we don't, and it isn't always easy to tell. The best established results on "metacognition" (Hunt, 1982; Resnick, 1989) show that monitoring one's own understanding effectively and substantially improves learning. Discussion with peers tests one's understanding and often leads to changing one's mind. The quizzes in the OU, JITT and the IE methods also perform this function, because eventually the teacher announces the right answer, and each student then knows whether they had got it right.
      Brain teaser questions also do this, partly because they frequently draw wrong answers and so force the learner to reassess their grasp of a concept, but for good learners the degree of uncertainty they create, even without the correct solution being announced, is alone enough to show them their grasp isn't as good as it should be.

    The Laurillard model

    The Laurillard (1993) model asserts that for satisfactory teaching and learning, 12 distinct activities must be covered somehow. Exposition is the first; and in considering its wider place, we are concerned with the first 4 activities: not only exposition by the teacher, but re-expression by the learner, and sufficient iteration between the two to achieve convergence of the learner's understanding with the teacher's conception.

    Re-expression by learners (Laurillard activity 2) is achieved in peer discussion in the MacManaway and Interactive Engagement schemes, and by the quizzes in the OU and JITT schemes. Feedback on correctness (Laurillard activity 3) is provided by peer responses in the IE schemes and by the quiz in the JITT and IE schemes. Remediation more specifically targeted at student problems by the teacher (a fuller instantiation of Laurillard activity 3) is provided in the JITT scheme (because class time is given to questions sent in in advance), and often in the IE schemes in response to the voting results.

    Thus in terms of the Laurillard model, instead of only covering activity 1 as a strictly expository lecture does, these schemes offer some substantial provision of activities 2,3 and 4 in quantities and frequency approaching that allocated to activity 1, while using only large group occasions and without extra staff time.

    The management layer

    I argue elsewhere that the Laurillard model needs to be augmented by a layer parallel to the one of strictly learning activities: one that describes how the decisions are made about what activities are performed. At least in HE, learning is not automatic but on the contrary, highly intentional and is managed by a whole series of decisions and agreements about what will be done. Students are continually deciding how much and what work to do, and learning outcomes depend on this more than on anything else. In many cases lectures are important in this role, and a major reason for students attending lectures is often to find out what the curriculum really is, and what they are required to do, and what they estimate they really need to do. One reason that simply telling students to read the textbook and come back for the exam often doesn't work well is that, while it covers the function of exposition, it neglects this learning management aspect. Lectures are very widely used to cover it, with many class announcements being made in lectures, and the majority of student questions often being about administrative issues such as deadlines.

    The schemes discussed here (apart from the OU) do not neglect this aspect, so again we can expect them to succeed on these grounds. They do not abolish classes, so management and administrative functions can be covered there as before. In fact the quizzes and to some extent the peer discussion offer better information than either standard lectures, a textbook or lecture script about how a student is doing both in relation to the teacher's expectations and to the rest of the class. They also do this not just absolutely (do you understand X which you need to know before the exam) but in terms of the timeline (you should have understood this by today).

    In addition to this, these schemes also give much superior feedback to the teacher about how the whole course is going for this particular class of students. This equally is part of the management layer. However standard lectures are never very good for this. While a new, nervous, or uncaring lecturer may pick up nothing about a class's understanding, even a highly skilled one has difficulty, since at best the only information is a few facial expressions and how the one self-selected student answers each question from the lecturer. In contrast most of the above methods get feedback from every student, and formative feedback for the teacher is crucial to good teaching and learning. What I have found in interviewing adopters of EVS is that while many introduced it in order to increase student engagement, the heaviest users now most value the way it keeps them in much better touch with each particular class than they ever had without it.

    This formative feedback to teachers is important for debugging an exposition they have authored, but is also important for adapting the course for each class, dwelling on the points that this particular set find difficult.

    Other functions of lectures

    Arguments attacking the use of lectures have been made before (Laurillard, 1993). Those seeking to defend them generally stress the functions other than simple exposition that lectures may perform. One of these is learning management, as discussed in the previous section. Some others are:

    Conclusion

    We began by considering some schemes for replacing the main function of lectures -- exposition -- and then used various pieces of theory to discuss whether the proposed schemes would be likely to be successful at replacing all the functions of a lecture. Overall, while providing exposition in other media alone might be worse than lectures because of neglecting other functions, the proposed schemes should be better because they address all the identified functions and address some important ones better than standard lectures do.

    Thus we can replace some or all exposition in lectures. Furthermore, we can re-purpose these large group meetings to cover other learning activities significantly better than usual. We can feel some confidence in this from a careful analysis of the functions covered by traditional lectures, and of the ones thought important in general, showing how each is covered in the proposed new teaching schemes. This in turn leads to two further issues to address.

    Firstly: which functions can in fact be effectively covered in large group teaching with the economies of scale that allows, and which others must be covered in other ways? Besides exposition, and the way the schemes above address Laurillard's activities 1 to 4, other functions that can be addressed in large groups in lecture theatres include:

    Secondly, some aspects of a course can use large group teaching (see above), but all the rest must be done in smaller groups. How small, and how to organise them? One of the most interesting functions to notice is that many of the schemes above use peer discussion, coordinated by the teacher but otherwise not supervised or facilitated by staff. For this the effective size is no more than 5 learners, and 2 or 4 may often be best. Both our experience and published research on group dynamics and conversation structures support this. Instead of clinging to group sizes dictated either by current resources or by what staff are used to (which often leads to "tutorial" group sizes of 6, 10, or 20), we should consider what is effective. When the learning benefit is in the student generating an utterance, then 2 is the best size, since then at any given moment half the students are generating utterances. Where spontaneous and flowing group interaction is required, then 5 is the maximum number. For creating and coordinating a community, then it can be as large as you like provided an appropriate method is used e.g. using EVS to show everyone the degree of agreement and diversity on a question, or having the lecturer summarise written responses submitted earlier.

    However forming groups simply by dividing the number of students by the number of staff is a foolish administrative response, not a pedagogic one. What is the point of groups of 10 or 20? Not much. If the model is for a series of short one to one interactions (which may be relevant for pastoral and counselling functions), then consider how to organise this. Putting a group of students in the same room is obviously inappropriate for this, and ICT makes this less and less necessary. If the model is for more personalised topics e.g. all the students with trouble over subtopic X go to one group, then we need NOT to assign permanent groups, but should organise ad hoc ones based on that subtopic. In general, what the schemes above suggest for the future is to consider a course as involving groups of all sizes, not necessarily permanent, not necessarily supervised; and organised in a variety of ways, including possibly pyramids and unsupervised groups. This is after all only an extension of the eternal expectation that learners will do some work alone: the ultimate small unsupervised group.

    In the end, we should consider:

    References

    Draper, S.W. (1997) Adding (negotiated) learning management to models of teaching and learning http://www.psy.gla.ac.uk/~steve/TLP.management.html (visited 24 Feb 2005)

    Dufresne, R.J., Gerace, W.J., Leonard, W.J., Mestre, J.P., & Wenk, L. (1996) Classtalk: A Classroom Communication System for Active Learning Journal of Computing in Higher Education vol.7 pp.3-47 http://umperg.physics.umass.edu/projects/ASKIT/classtalkPaper

    Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand student survey of mechanics data for introductory physics courses. American Journal of Physics, 66, 64-74.

    Hake, R.R. (1991) "My Conversion To The Arons-Advocated Method Of Science Education" Teaching Education vol.3 no.2 pp.109-111 (online PDF copy)

    Hunt, D. (1982) "Effects of human self-assessment responding on learning" Journal of Applied Psychology vol.67 pp.75-82.

    Laurillard, D. (1993), Rethinking university teaching (London: Routledge)

    MacManaway,M.A. (1968) "Using lecture scripts" Universities Quarterly vol.22 June pp.327-336

    MacManaway,M.A. (1970) "Teaching methods in HE -- innovation and research" Universities Quarterly vol.24 no.3 pp.321-329

    Mazur, E. (1997). Peer Instruction: A User’s Manual. Upper Saddle River, NJ: Prentice-Hall.

    Meltzer,D.E. & Manivannan,K. (1996) "Promoting interactivity in physics lecture classes" The physics teacher vol.34 no.2 p.72-76 especially p.74

    Novak,G.M., Gavrin,A.D., Christian,W. & Patterson,E.T. (1999) Just-in-time teaching: Blending Active Learning and Web Technology (Upper Saddle River, NJ: Prentice-Hall)

    Novak,G.M., Gavrin,A.D., Christian,W. & Patterson,E.T. (1999) http://www.jitt.org/ Just in Time Teaching (visited 20 Feb 2005)

    Resnick,L.B. (1989) "Introduction" ch.1 pp.1-24 in L.B.Resnick (Ed.) Knowing, learning and instruction: Essays in honor of Robert Glaser (Hillsdale, NJ: Lawrence Erlbaum Associates).


    Last changed 15 Oct 2009 ............... Length about 1700 words (13,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/local.html.

    Using EVS at Glasgow University c.2005

    (written by Steve Draper,   as part of the Interactive Lectures website)

    This page is about the use of EVS (electronic voting systems) in lectures at Glasgow University. It was written a few years ago, and assumes the use of the old IR equipment, though most of the rest of the advice is still reasonable. More up-to-date advice about using the current equipment is here.

    Questions and answers

    Brief introduction

    If you haven't already read a passage explaining what these EVS are about, a brief general account is here.

    To date, student response, and lecturers' perceptions of that, have been almost entirely favourable in an expanding range of trials here at the University of Glasgow (to say nothing of those elsewhere), already involving students in levels 1, 2, 3 and 4, diverse subjects (psychology, medicine, philosophy, computer science, ...), and sequences from one-off uses to every lecture in a term.

    The equipment is mobile, and so can be used anywhere with a few minutes setup. It additionally requires a PC (laptops are also mobile, and we can supply one if necessary), and a data projector (the machine for projecting a computer's displayed output on to a big screen).

    In principle, the equipment is available for anyone at the university to use, and there is enough for the two largest lecture theatres to be using it simultaneously. In practice, the human and equipment resources are not unlimited, and advance arrangements are necessary. We can accommodate any size audience, but there is a slight chance of too many bookings coinciding for the equipment, and a considerable chance of us not having enough experienced student assistants available at the right time: that is the currently scarcest resource.

    Why would you want to use EVS in your lectures?

    Want to see them in action?

    Find out who is using them, and go and see them in use.

    If it's one of mine you needn't ask, just turn up; and probably other users feel the same. We are none of us expert, yet we all seem to be getting good effects and needn't feel defensive about it. It usually isn't practicable to get 200 students to provide an audience for a realistic demonstration: so seeing a real use is the best option.

    What's involved at the moment of use?

    What's involved at the lecture?

    Ideally (!):

    One way of introducing a new audience to the EVS is described here.

    What preparation is required by the lecturer?

    Equipment?

    There are several alternative modes you could use this in.

    Human resources

    It is MUCH less stressful for a lecturer, no matter how practised at this, if there are assistants to fetch and set up the equipment, leaving the lecturer to supervise the occasion. We have a small amount of resource for providing these assistants.

    What has experience shown can go wrong?

    Generally both the basic PRS equipment, and the PRS software itself have proved very reliable, both here and elsewhere. Other things however can go wrong.

    Unnecessary technical details

    Most lecturers never need to know about further technical details. But if you want to know about them, about the log files PRS creates, etc., then read on here.

    [ Long past bookings | Past workshops for prospective users | Past uses | Interim evaluation report ]


    Last changed 3 Nov 2010 ............... Length about 1600 words (10,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/start.html.

    Introducing the EVS to a new audience

    (written by Steve Draper,   as part of the Interactive Lectures website)

    Here is one possible way of introducing the EVS (electronic voting system), and in particular the PRS IR (infra-red) equipment, to a new audience. Below is the script for you, the presenter, to act on; and below that, a slide to use.

    Script for the presenter

    Comments in (small font and parentheses) are optional: you may or may not make them to your audience. Comments in [italics and square brackets] are for you alone from me.

    Assuming the handsets have been distributed, and the time (not necessarily the start of the session) has now come to use or comment on them.

    Slide for use during the introduction

    Here's an HTML impression of the slide, also ready to print (for an OHP). It should really be in PowerPoint, sorry.
    Using the handsets

    A. Check handset is turned on -- green light on?

    B. Turn it over and read the 3 digit ID number

    C. Point at a receiver (small box with red light on)

  • Can press H(igh) or L(ow confidence) first

    D. Press the number of your choice
    -- see your ID come up on the screen

  • If your ID doesn't come up, wait a few seconds then try again.

  • Can change your vote, but don't keep sending unnecessarily as you will obstruct others' votes.


    Startup questions

    Problems occasionally observed in audience operation of handsets

    Don't comment on these to the whole audience, but be aware of them in case you see them. These are all problems that have actually been seen, if only in perhaps 1 in 50 audience members.

    Problems occasionally observed in lecturer operation of PRS

    The importance of getting every single vote in on the first question(s)

    Finally, I just want to repeat the importance, in the first question or two, of being patient and getting every single audience member's vote to register successfully. If it doesn't work for them on the first question, that person will probably never participate throughout the rest of the session or even the course: for them, the moment will have passed when they feel able to ask for help. Furthermore being seen to take such care about this probably sets a valuable tacit precedent that sets everyone up to expect to vote on every question.

    In almost every group we have run, about 1 in 50 of the audience fail to get it to work for them despite considerable effort. However we have failed to identify a pattern, either of the type of person or the type of problem. Furthermore hardly anyone ever asks for help (they are seeing hundreds around them succeed without effort) until they have been explicitly asked several times. Even though it feels like it's holding up the whole session, it is really only a few more minutes. Just keep asking until the total distinct handset IDs counted on the screen display matches your count of the people/handsets handed out. Keep asking, search the audience with your eyes, run up and down the aisles (carrying a spare handset or two) to attend to whoever lets slip they have a problem. It may be anything, or even something you can't fix: but usually it's turning the handset on, a handset battery being flat, not pointing the handset at a receiver (but at the screen, or into the head of the person in front of them); not being able to recognise their ID number on the screen.


    Last changed 25 Jan 2003 ............... Length about 300 words (3,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/question.html.

    Presenting a question

    (written by Steve Draper,   as part of the Interactive Lectures website)

    What is involved in presenting each question?

    How to present a question

  • Display the question (but don't start the PRS handset software)
  • Explain it as necessary
  • "Are you ready to answer it? Anything wrong with this question?" and encourage any questions, discussion of the question.
  • Only then, press <start> on the computer system.
  • Audience answers: wait until the total of votes reaches the full audience total.
  • Display answers (as a bar graph).
  • Always try to make at least one oral comment about the distribution of answers shown on the graph. Partly for "closure"/acknowledgement; partly to slow you up and let everyone see the results.
  • State which answer (if any) was right, and decide what to do next.

    What the presenter does in essence

    The presenter's function is, where and when possible, to:

    What each learner does in essence

    For each question, each learner has to:



    Last changed 6 June 2004 ............... Length about 300 words (2500 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/length.html.

    Length and number of questions

    (written by Steve Draper,   as part of the Interactive Lectures website)

    How many questions? How long do they take?
    A rule of thumb for a 50 minute lecture is to use only 3 EVS questions.

    In a "tutorial" session organised entirely around questions, you could at most use about 12 if there were no discussion: 60 secs to express a question, 90 secs to collect votes, 90 secs to comment briefly on the responses gives 4 minutes per question if there is no discussion or detailed explanation, and so 12 questions in a lecture.

    Allowing 5 mins (still very short) for discussion by audience and presenter of issues that are not well understood would mean only 5 such questions in a session.

    It is also possible, especially with a large collection of questions ready, to "use up" some by just asking someone to shout out the answer to warm up the audience, and then vote on a few to make sure the whole audience is keeping up with the noisy few. It would only take 20 seconds rather than 4 minutes for each such informal use of a question. Never let the EVS become too central or important: it is only one aid among others.

    Thus for various reasons you may want to prepare a large number of questions from which you select only a few, depending on how the session unfolds.





    Last changed 13 April 2022 ............... Length about 1500 words (12,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/qdesign.html.

    Question formats

    (written by Steve Draper,   as part of the Interactive Lectures website)

    There is a whole art to designing MCQs (multiple choice questions). Much of the literature on this is for assessment. In this context however we don't much care (as that literature does) about fairness, or discriminatory power, but instead will concentrate on what will maximise learning.

    Here I just discuss possible formats for a question, without varying the purpose or difficulty. I was in part inspired by Michele Dickson of Strathclyde University. The useful tactic implied by her practice is to vary the way questions are asked about each topic.

    A common type of MCQ concerns one relationship e.g. (using school chemistry as an example domain) "What is the chemical symbol for gold: Ag, Al, Au, Ar ?"

    Reversing the relationship

    You can equally, and additionally, ask about the same relationship in reverse: "Which metal is represented by the symbol 'Au'? Gold, silver, platinum, copper?"

    Multiple types of relationship

    When you have several relationships, the alternative question types multiply. Consider these 3 linked pieces of information: a photo of a gold nugget or ring; the word (name) "Gold"; and the symbol "Au". These 3 pieces of information each have a relationship with the other 2, giving 3 types of relationship; and each has 2 directions, giving 6 question types in all:

    Applied to statistics this might be:

    The idea is to require students to access knowledge of a topic from several different starting points. Here I exercised three kinds of link, and each kind in both directions. Exercising these different types and directions of link is not only important in itself (because understanding requires understanding all of these) but keeps the type of mental demand on the students fresh, even if you are in fact sticking on one topic.

    Types of relationship to exercise / test

    In the abstract there are three different classes of relationship to test:

    The first is that of linking ideas or concepts to particular examples or instances of them e.g. is a whale a fish or a mammal? Another form of this is linking (engineering or maths) problems with the principle or rule that is likely to be used to solve it. However both concepts and instances are represented in more than one way, and practice at these alternative representations and their equivalences is usually an essential aspect of learning a subject. Thus concepts usually have both a technical name, and a definition or description, and testing this relationship is important. Similarly instances usually have more than one standard method of description and, although these are specific to each subject, learners need to master them all, and questions testing these equivalences are important. In teaching French language, both the spelling, the pronounciation, and the meaning of a word need to be learned. In statistics, an example data set should be represented by a graph, a table of values, as well as a description such as "bell shaped curve with long tails". In chemistry, the name "copper sulfate" should be linked to "CuSO4" and a photograph of blue crystals, and questions should test these links. (See Johnstone, A.H. (1991) "Why is science difficult to learn? Things are seldom what they seem" Journal of computer assisted learning vol.7 no.2 pp.75-83 for an argument related to this based in teaching Chemistry. See also Roy Tasker's group: http://visualizingchemistry.com/research.)

    These relationships are all bidirectional, so questions can (and should) be asked in both directions e.g. both "which of these is a mammal" and "to which of these categories do dolphins belong?". Thus a subject with three standard representations for instances plus concept names and concept definitions will have five representations, and so 20 types of question (pick one of five for the question, and one of the remaining four for the response categories). Additional variations come from allowing more than one item as an answer, or asking the question in the negative e.g. "which of these is not a mammal?: mouse, platypus, porpoise?".

    The problem of technical vocabulary is a general one, and suggests that the concept name-definition link should be treated especially carefully. If you ask questions that are problems (real-world cases) and ask which concept applies but use only the technical names of the concepts, then students must understand perfectly both concept and the vocabulary; and if they get it wrong you don't know which aspect they got wrong. Asking concept-case questions using not technical vocabulary but paraphrased descriptions of the concepts can separate these; and separate questions to test name-definition (i.e. concept vocabulary).

    Further Response Options

    The handsets do not directly allow the audience to specify more than one answer per question. However you can offer at least some combinations yourself e.g.
    "Is a Black Widow:
    1. A spider
    2. An insect
    3. An arachnid
    4. (1) and (2)
    5. (2) and (3)
    6. (1) and (3)
    7. (1) and (2) and (3)
    8. None of the above

    It may or may not be a good idea to include null responses as an option. Against offering them is the idea that you want to force students to commit to an answer rather than do nothing, and also the observation that when provided usually few take the null option, given the anonymity of entering a guess. Furthermore, a respondent could simply not press any button; although that, for the presenter, is ambiguous between a decision rejecting all the alternatives, the equipment giving trouble to some of the audience, or the audience getting bored or disengaged. However if you do include them as standard, it may give you better, quicker feedback about problems. In fact there are at least three usually applicable distinct null options to use:

    Assertion-reason questions

    I particularly commend asking MCQs that, instead of asking which fact is true, ask which reason for a given fact is the right one.

    An extension of this are: Assertion-reason questions.

    Covertly related questions: Using 3 questions to make a strong test of understanding one concept

    Mark Russell suggests using 3 (say) alternative questions all testing the same key concept. With MCQs with 4 response options, 25% of students will get a question right by accident if they answer at random: not a strong test. He suggests having 3 alternative questions testing exactly the same concept, and only students who get all 3 of these correct should be regarded as having learned the concept. The questions are tacitly linked (by being about the same concept), but not listed adjacently and not using similar structure. He found that students who did not have a sound understanding of the concept did not even recognise that the 3 questions were linked: the disguise does not need to be elaborate (contrary to expert / staff perceptions, who naturally see the 3 questions as "about the same thing" exactly because they grasp the concept).

    Russell, Mark (2008) "Using an electronic voting system to enhance learning and teaching" Engineering Education vol.3 no.2 pp.58-65 doi:10.11120/ened.2008.03020058

    Some references on MCQ design

  • CAAC (Computer Assisted Assessment Centre) website advice on MCQ design

  • Johnstone, A. H. (1991) "Why is science difficult to learn? Things are seldom what they seem" Journal of Computer Assisted Learning vol.7 no.2 pp.75-83 doi:10.1111/j.1365-2729.1991.tb00230.x
  • See also Roy Tasker's group: http://visualizingchemistry.com/research

  • McBeath, R. J. (ed.) (1992) Instructing and Evaluating Higher Education: A Guidebook for Planning Learning Outcomes (New Jersey: ETP)

  • Russell, Mark (2008) "Using an electronic voting system to enhance learning and teaching" Engineering Education vol.3 no.2 pp.58-65 doi:10.11120/ened.2008.03020058


    Last changed 13 April 2022 ............... Length about 4,000 words (29,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/qpurpose.html.

    Pedagogical formats for using questions and voting

    (written by Steve Draper,   as part of the Interactive Lectures website)

    EVS questions may be used for many pedagogic purposes. These can be classified in an abstract way: discussed at length elsewhere and summarised here:

    1. Assessment
      • Confidence (or certainty) based marking (CBM) for summative assessment. While the rest of the purposes addessed on this page are about using handsets in a large classroom, CBM is for summative assessment and solo study online. Gardner-Medwin developed this well and a lot of his work is still on the web. Bear in mind three things:
        1. He taught medical students: very bright, and very motivated to maximise marks. But also (a2) with two drives: to sound completely certain to patients; but also very aware that it is dangerous to bet a patient's life on a decision, so how certain you are really matters professionally. (Programmers mostly don't care about their users.)
        2. He made them practise on this format for tests before doing tests that counted, so they could get used to it.
        3. The bit of CBM which is not quite obvious is the exact marking scheme. It is here, among other places: https://tmedwin.net/cbm/

        • Issroff K. & Gardner-Medwin A.R. (1998) "Evaluation of confidence assessment within optional coursework" In : Oliver, M. (Ed.) Innovation in the Evaluation of Learning Technology, University of North London: London, pp 169-179
        • Gardner-Medwin, A. R. (2006). "Confidence-based marking: towards deeper learning and better exams" In C. Bryan & K. Clegg (Eds), Innovative assessment in higher education. London: Routledge
        • His web site: https://tmedwin.net/cbm/
        • His papers: https://tmedwin.net/~ucgbarg/pubteach.htm
        • My website on question design

          In theory, I might bet that using CBM would work as well for (deep) learning INSTEAD of Mazur's PI. I believe both work the same way in learners: forcing them to think about whether they are sure of their answer, and then self-correcting by thinking up reasons for and against it. See:
          Draper,S.W. (2009a) "Catalytic assessment: understanding how MCQs and EVS can foster deep learning" British Journal of Educational Technology vol.40 no.2 pp.285-293 doi: 10.1111/j.1467-8535.2008.00920.x

      • Diagnostic SAQs i.e. "self-assessment questions" (formative assessment). These give individual formative feedback to students, but also both teacher and learners can see what areas need more attention. The design of sets of these is discussed further on a separate page, including working through an extended example (e.g. of how to solve a problem) with a question at each step. SAQs are a good first step in introducing voting systems to otherwise unmodified lectures.

    2. Initiate a discussion. Discussed further below.
    3. Formative feedback to the teacher i.e. "course feedback".
      1. In fact you will get it anyway without planning to. For instance SAQs will also tell you how well the class understands things.
      2. To organise a session explicitly around this, look at contingent teaching;
      3. To think more directly about how questioning students can help teachers and promote learning directly, look at this book on "active assessment": Naylor,S., Keogh,B., & Goldsworthy,A. (2004) Active assessment: Thinking, learning, and assessment in science (London: David Fulton Publishers)
      4. The above are about feedback to the teacher of learners' grasp of content. You can also ask about other issues concerning the students' views of the course as in course feedback questionnaires (which could be administered by EVS).
      5. Combining that with the one minute paper technique would give you some simple open-ended feedback to combine with the "numbers" from the EVS voting.
      6. A more sophisticated (but time consuming) version of this would combine collecting issues from the students, and then asking EVA survey questions about each such issue. This is a form of of having students design questions where this is described further.
    4. Summative assessment (even if only as practice) e.g. practice exam questions.
    5. Peer assessment could be done on the spot, saving the teacher administrative time and giving the learner much more rapid, though public, feedback.
    6. Community mutual awareness building. At the start of any group e.g. a research symposium or the first meeting of a new class, the equipment gives a convenient way to create some mutual awareness of the group as a whole by displaying personal questions and having the distribution of responses displayed.
    7. Experiments using human responses: for topics that concern human responses, a very considerable range of experiments can be directly demonstrated using the audience as participants. The great advantage of this is that every audience member both experiences what it is to be a "subject" in the experiment, and sees how variable (or not) the range of responses is (and how their own compares to the average). In a textbook or conventional lecture, neither can be done experientially and personally, only described. Subjects this can apply in include:
      • Politics (demonstrate / trial voting systems)
      • Psychology (any questionnaire can be administered then shared)
      • Physiology (Take one's pulse: see class' average; auditory illusions)
      • Vision science (display visual illusions; how many "see" it?)
      • Maths/statistics/physics: Illustrate Benford's law by collecting data on the first digit of almost anything (train ticket serial number, house address, ...)
    8. Having students design questions: this is relatively little used, but has all the promise of a powerfully mathemagenic tactic. Just as peer discussion moves learners from just picking an answer (perhaps by guessing) to arguing about reasons for answers, so designing MCQs gets them thinking much more deeply about the subject matter.

    However pedagogic uses are probably labelled rather differently by practising lecturers, under phrases like "adding a quiz", "revision lectures", "tutorial sessions", "establishing pre-requisites at the start", "launching a class discussion". This kind of category is more apparent in the following sections and groupings of ways to use EVS.

    SAQs and creating feedback for both learner and teacher

    Asking test questions, or "self-assessment questions" (SAQs) since only the student knows what answer they gave individually, is useful in more than one way.

    A first cautious use of EVS

    The simplest way to introduce some EVS use into otherwise conventional lectures is to add some SAQs at the end so students can check if they have understood the material. This is simplest for the presenter: just add two or three simple questions near the end without otherwise changing the lecture plan. Students who get them wrong now know what they need to work on. If the average performance is worse than the lecturer likes, she or he can address this at the start of the next lecture. Even doing this in a simple, uninspired way has in fact consistently been viewed positively by students in our developing experience, as they welcome being able to check their understanding.

    Extending this use: Emotional connotations of questions

    If you put up an exam question, its importance and relevance is clear to everyone and leads to serious treatment. However, it may reduce discussion even while increasing attention, since to get it wrong is to "fail" in the terms of the course. Asking brain teasers is a way of exercising the same knowledge, but without the threatening overtones, and so may be more effective for purposes such as encouraging discussion.

    Putting up arguments or descriptions for criticism may be motivating as well as useful (e.g. describe a proposed experiment and ask what is faulty about it). It allows students to practise criticism which is useful; and criticism is easier than constructive proposals which, in effect, is what they are exclusively asked for in most "problem solving" questions, and so questions asking for critiques may be a better starting point.

    Thus in extending beyond a few SAQs, presenters may like to vary their question types with a view to encouraging a better atmosphere and more light hearted interaction.

    Contingent teaching: Extending the role of questions in a session

    Test questions can soon lead to trying a more contingent approach, where a session plan is no longer for a fixed lecture sequence of material, but is prepared to vary depending upon audience response. This may mean preparing a large set of questions, those actually used depending upon the audience: this is discussed in "designing a set of questions for a contingent session".

    This approach could be used, for instance, in:


    Designing for discussion

    Another important purpose for questions is to promote discussion, especially peer discussion. A general format might be: pose a question and take an initial vote (this gets each person to commit privately to a definite initial position, and shows everyone what the spread of opinion on it is). Then, without expressing an opinion or revealing what the right answer if any is, tell the audience to discuss it. Finally, you might take a new vote, and see if opinions have shifted.

    The general benefit is that peer discussion requires not just deciding on an answer or position (which voting requires) but also generating reasons for and against the alternatives, and also perhaps dealing with reasons and objections and opinions voiced by others. That is, although the MCQ posed only directly asks for an answer, discussion implicitly requires reasons and reasoning, and this is the real pedagogical aim. Furthermore, if the discussion is done in small groups of, say, four, then at any moment one in four not only one in the whole room is engaged in such generation activity.

    There are two classes of question for this: those that really do have a right answer, and those that really don't. (Or, to use Willie Dunn's phrase, those that concern objects of mastery and those that are a focus for speculation.) In the former case, the question may be a "brain teaser" i.e. optimised to provoke uncertainty and dispute (see below). In the latter case, the issue to be discussed simply has to be posed as if it had a fixed answer, even though it is generally agreed it does not: for instance as in the classic debate format ("This house believes that women are dangerous."). Do not assume that a given discipline necessarily only uses one or the other kind of question. GPs (doctors), for instance, according to Willie Dunn in a personal note, "came to distinguish between topics which were a focus for speculation and those which were an object of mastery. In the latter the GPs were interested in what the expert had to say because he was the master, but with the other topics there was no scientifically-determined correct answer and GPs were interested in what their peers had to say as much as the opinion of the expert, and such systems [i.e. like PRS] allowed us to do this."

    Slight differences in format for discussion sessions have been studied: Nicol, D. J. & Boyle, J. T. (2003) "Peer Instruction versus Class-wide Discussion in large classes: a comparison of two interaction methods in the wired classroom" Studies in Higher Education. In practice, most presenters might use a mixture and other variations. The main variables are in the number of (re)votes, and the choice or mixture of individual thought, small group peer discussion, and plenary or whole-class discussion. While small group discussion may maximise student cognitive activity and so learning, plenary discussion gives better (perhaps vital) feedback to the teacher by revealing reasons entertained by various learners, and so may maximise teacher adaptation to the audience. The two leading alternatives are summarised in this table (adapted from Nicol & Boyle, 2003).

    Discussion recipes
    "Peer Instruction":
    Mazur Sequence
    "Class-wide Discussion":
    Dufresne (PERG) Sequence
    1. Concept question posed.
    2. Individual Thinking: students given time to think individually (1-2 minutes).
    3. [voting] Students provide individual responses.
    4. Students receive feedback -- poll of responses presented as histogram display.
    5. Small group Discussion: students instructed to convince their neighbours that they have the right answer.
    6. Retesting of same concept.
      [voting] Students provide individual responses (revised answer).
    7. Students receive feedback -- poll of responses presented as histogram display.
    8. Lecturer summarises and explains "correct" response.
    1. Concept question posed.
    2. Small group discussion: small groups discuss the concept question (3-5 mins).
    3. [voting] Students provide individual or group responses.
    4. Students receive feedback -- poll of responses presented as histogram display.
    5. Class-wide discussion: students explain their answers and listen to the explanations of others (facilitated by tutor).
    6. Lecturer summarises and explains "correct" response.

    Questions to discuss, not resolve

    Examples of questions to launch discussion in topics that don't have clear right and wrong answers are familiar from debates and exam questions. The point, remember, is to use a question as an occasion first to remind the group there really are differences of view on it, but mainly to exercise giving and evaluating reasons for and against. The MCQ, like a debate, is simply a conventional provocation for this.

    "Brain teasers"

    Using questions with right and wrong answers to launch discussion is, in practice, less showing a different kind of question to the audience and more a different emphasis in the presenter's purpose. Both look like (and are) tests of knowledge; in both cases if (but only if) the audience is fairly split in their responses then it is a good idea to ask them to discuss the question with their neighbours and then re-voting, rather than telling them the right answer; in both cases the session will become more contingent: what happens will depend partly on how the discussion goes not just on the presenter's prepared plan; in both cases the presenter may need to bring a larger set of questions than can be used, and proceed until one turns out to produce the right level of divisiveness in initial responses.

    The difference is only that in the SAQ case the presenter may be focussing on finding weak spots and achieving remediation up to a basic standard whether the discussion is done by the presenter or class as a whole, while in the discussion case, the focus may be on the way that peer discussion is engaging and brings benefits in better understanding and more solid retention regardless of whether understanding was already adequate.

    Nevertheless optimising a question for diagnosing what the learners know (self-assessment questions), and optimising it for fooling a large proportion and for initiating discussion are not quite the same thing. There are benefits from initiating discussion independently of whether this is the most urgent topic for the class (e.g. promoting the practice of peer interaction, generating arguments for an answer probably improves the learner's grasp even if they had selected the right answer, and is more related to deep learning, and promotes their learning of reasons as well as of answers, etc.).

    Some questions seem interesting but hard to get right if you haven't seen that particular question before. Designing a really good brain teaser is not just about a good question, but about creating distractors i.e. wrong but very tempting answers. In fact, they are really paradoxes: where there seem to be excellent reasons for each contradictory alternative. Such questions are ideal for starting discussions, but perhaps less than optimal for simply being a fair diagnosis of knowledge. In fact ideally, the alternative answers should be created to match common learner misconceptions for the topic. An idea is to use the method of phenomenography to collect these misconceptions: the idea here would be to then express the findings as alternative responses to an MCQ.

    Great brain teasers are very hard to design, but may be collected or borrowed, or generated by research.

    Here's an example that enraged me in primary school, but which you can probably "see through".

    "If a bottle of beer and a glass cost one pound fifty, and the beer costs a pound more than the glass, how much does the glass cost?"
    The trap seems to lie in matching the beer to one pound, the glass to fifty pence, and being satisfied that a "more" relation holds.

    Here is one from Papert's Mindstorms p.131 ch.5.

    "A monkey and a rock are attached to opposite ends of a rope that is hung over a pulley. The monkey and the rock are of equal weight and balance one another. The monkey begins to climb the rope. What happens to the rock?"
    His analysis of why this is hard (but not complex) is: students don't have the category of "laws-of-motion problem" like conservation of energy problem. I.e. we have mostly learned Newton without having really learned the pre-requisite concept of what IS a law of motion. Another view is that it requires you to think of Newtons 3rd law (reaction), and most people can repeat the law without having exercised it much.

    Another example on the topic of Newtonian mechanics can be paraphrased as follows.

    Remember the old logo or advert for Levi's jeans that showed a pair of jeans being pulled apart by two teams of mules pulling in opposite directions. If one of the mule teams was sent away, and their leg of the jeans tied to a big tree instead, would the force (tension) in the jeans be: half, the same, or twice what it was with two mule teams?
    The trouble here is how can two mule teams produce no more force than one team, when one team clearly produces more than no teams; on the other hand, one mule pulling one leg (while the other is tied to the tree) clearly produces force, so a second mule team isn't necessary.

    Another one (taken from the book "The Tipping Point") can be expressed:

    Take a large piece of paper, fold it over, then do that again and again a total of 50 times. How tall do you think the final stack is going to be?
    Somehow even those who have been taught better, tend think it will be about 50 times the thickness of a piece of paper, whereas really it is doubled 50 times i.e. it will be 2 to the 50th power thicknesses, which is a huge number; and probably comes out as about the distance from here to the sun.

    Brain teasers seem to relate the teaching to students' prior conceptions, since tempting answers are most often those suggested by earlier but incorrect or incomplete ways of thinking.

    Whereas with most questions it is enough to give (eventually) the right answer and explain why it is right, with a good brain teaser it may be important in addition to explain why exactly each tempting wrong answer is wrong. This extra requirement on the feedback a presenter should produce is discussed further here.

    Finally, here is an example of a failed brain teaser. "Isn't it amazing that our legs are exactly the right length to reach the ground?" (This is analogous to some specious arguments that have appeared in cosomology / evolution.) At the meta-level, the brain teaser or puzzle here is to analyse why that is tempting to anyone; something to do with starting the analysis from your seat of consciousness in your head (several feet above the ground) and then noticing what a good fit from this egocentric viewpoint your legs make between this viewpoint and the ground.

    May need a link here on to the page seq.html about designing sequences with/of questions. And on from there to lecture.html.

    Extending discussion beyond the lecture theatre

    An idea which Quintin is committed to trying out (again, better) from Sept. 2004 is extending discussion, using the web, beyond the classroom. The pedagogical and technical idea is to create software to make it easy for a presenter to ship a question (for instance the last one used in a lecture, but it could be all of them), perhaps complete with initial voting pattern, to the web where the class may continue the discussion with both text discussion and voting. Just before the next lecture, the presenter may equally freeze the discussion there and export it (the question, new voting pattern, perhaps discussion text) back into powerpoint for presentation in the first part of their next lecture.

    If this can be made to work pedagogically, socially, and technically then it would be a unique exploitation of e-learning with the advantages of face to face campus teaching; and would be expected to enhance learning because so much is simply proportional to the time spent by the learner thinking: so any minutes spent on real discussion outside class is a step in the right direction.

    Direct tests of reasons

    One of the main reasons that discussion leads to learning, is that it gets learners to produce reasons for a belief or prediction (or answer to a question), and requires judgements about which reasons to accept and which to reject. This can also be done directly by questions about reasons.

    Simply give the prediction in the question, and ask which of the offered reasons are the right or best one(s); or which of the offered bits of evidence actually support or disconfirm the prediction.

    Collecting experimental data

    A voting system can obviously be used to collect survey data from an audience. Besides being useful in evaluating the equipment itself, or the course in which it is used (course feedback), this is particularly useful when that data is itself the subject of the course as it may be in psychology, physiology, parts of medical teaching, etc.

    For instance, in teaching the part of perception dealing with visual illusions, the presenter could put up the illusion together with a question about how it is seen, and the audience will then see the proportion of the audience that "saw" the illusory percept, and compare what they are told, their own personal perceptual experience, and the spread of responses in the audience.

    In a practical module in psychology supported by lectures, Paddy O'Donnell and I have had the class design and pilot questionnaire items (questions) in small groups on a topic such as the introduction and use of mobile phones, for which the class is itself a suitable population. Each group then submited their items to us, and we then picked a set drawing on many people's contributions to form a larger questionnaire. We then used a session to administer that questionnaire to the class, with them responding using the voting equipment. But the end of that session we had responses from a class of about 100 to a sizeable questionnaire. We could then make that data set available almost immediately to the class, and have them analyse the data and write a report.

    A final year research project has also been run, using this as the data collection mechanism: it allowed a large number of subjects to be "run" simultaneously, which is the advantage for the researcher.

    In a class on the public communication of science, Steve Brindley has surveyed the class on some aspects of the demonstrations and materials he used, since they are a themselves a relevant target for such communciation and their preferences for different modes (e.g. active vs. passive presentations) are indicative of the subject of the course: what methods of presentation of science are effective, and how do people vary in their preferences. He would then begin the next lecture by re-presenting and commenting on the data collected last time.


    Last changed 6 Aug 2003 ............... Length about 1,600 words (10,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/contingent.html.

    Degrees of contingency

    (written by Steve Draper,   as part of the Interactive Lectures website)

    Besides the different purposes for questions (practising exam questions, collecting data for a psychological study, launching discussion on topics without a right or wrong answer), an independent issue is whether the session as a whole has a fixed plan, or is designed to vary contingent (depending) on audience responses. The obvious example of this is to use questions to discover any points where understanding is lacking, and then to address those points. (While direct self-assessment questions are the obvious choice for this diagnosis function, in fact other question types can probably be used.) This is to act contingently. By contingency I mean having the presenter NOT have a fixed sequence of stuff to present, but a flexible branching plan, where which branches actually get presented depends on how the audience answers questions or otherwise shows their needs. There are degrees of this.

    Contents (click to jump to a section)

    Implicit contingency

    First are simple self-assessment questions, where little changes in the session itself depending on how the audience answers, but the implicit hope is that learners will (contingently i.e. depending on whether they got a question right) later address the gaps in their knowledge which the questions exposed, or that the teacher will address them later.

    Whole/part training

    Secondly, we might present a case or problem with many questions in it; but the sequence is fixed. A complete example of a problem being solved might be prepared, with questions at each intermediate step, giving the audience practice and self-assessment at each, and also showing the teacher where to speed up and where to slow down in going over the method.

    An example of this can be found in the box on p.74 of Meltzer,D.E. & Manivannan,K. (1996) "Promoting interactivity in physics lecture classes" The physics teacher vol.34 no.2 p.72-76. It's a sample problem for a basic physics class at university, where a simple problem is broken down into 10 MCQ steps.

    Another way of looking at this is that of training on the parts of a skill or piece of knowledge separately, then again on fitting them together into a whole. Diagnostically, if a learner passes the test for the whole thing, we can usually take it they know it all. But if not, then learning may be much more effective if the pieces are learned separately before being put together. Not only is there less to learn at a time, but more importantly feedback is much clearer, less ambiguous if it is feedback on a single thing at a time. When a question is answered wrongly by everyone, it may be a sign that too much has been put together at once.

    In terms of the lesson/lecture plan, though, there is a single fixed course of events, although learners contribute answers at many steps, with the questions being used to help all the learners converge on the right action at each step.

    Contingent path through a case study

    Thirdly, we could have a prepared case study (e.g. a case presented to physicians), with a fixed start and end point; but where the audience votes on what actions and tests to do next, and the presenter provides the information the audience decided to ask for next. Thus the sequence of items depends (is contingent) on the audience's responses to the questions; and the presenter has to have created slides, perhaps with overlays, that allows them to jump and branch in the way required, rather than trudging through a fixed sequence regardless of the audience's responses.

    Diagnosing audience need

    Fourthly, a fully contingent session might be conducted, where the audience's needs are diagnosed, and the time is spent on the topics shown to be needing attention. The plan for such a session is no longer a straight line, but a tree branching at each question posed. The kinds of question you can use for this include:

    Designing a bank of diagnostic questions

    If you want to take diagnosis from test questions seriously, you need to come with a large set, selecting each one depending on the response to the last one. A fuller scheme for designing such a bank might be:
    1. List the topics you want to cover.
    2. Multiply these by several levels of difficulty for each.
    3. Even within a given topic, and given level of difficulty, you can vary the type of question: the type of link, the direction of link, the specific case. [Link back]

    Responding to the answer distribution

    When the audience's answers are in, the presenter must a) state which answer (if any) was right, and b) decide what to do next:

    Selecting the next question

    Decomposing a topic the audience was lost with

    While handset questions are MCQs, the real aim is (when required) to bring out the reasons for and against each alternative answer. When it turns out that most of the audience gets it wrong, how best to decompose the issue? My suggestion is to generate a set of associated part questions.

    One case is when a question links instances (only) to technical terms e.g. (in psychology) "which of these would be the most reliable measure?" If learners get this wrong, you won't know if that is because they don't understand the issues, or this problem, or have just forgotten the special technical meaning of "reliable". In other words, a question may require understanding of both the problem case, and the concepts, and the special technical vocabulary. If very few get it right, it could be unpacked by asking about the vocabulary separately from the other issues e.g. "which of these measures would give the greatest test-retest consistency?". This is one aspect of the problem of technical vocabulary.

    Another case of this was about the top level problem decomposition in introductory programming. The presenter had a set of problems (each of which requiring a program to be designed) {P1, P2, P3}. He had a set of standard top level structures {S1,S2, ... e.g. sequential, conditional, iteration} and the problem the students "should" be able to do is to select the right structure for each given problem. To justify/argue about this means to generate a set of reasons for {F1,F2, ...} and against {A1,A2...} each structure for each problem. I suggest having a bank of questions to select from here. If there are 3 problems and 5 top level structures then 2*3*5=30 questions. An example of one of these 30 would be a set of alternative reasons FOR using structure 3 (iteration) on problem 2, and the question asks the audience which (subset) of these are good reasons.

    The general notion is, that if a question turns out to go too far over the audience's head, we could use these "lower" questions to structure the discussion that is needed about reasons for each answer. (While if everyone gets it right, you speed on without explanation. If half get it right, you go for (audience) discussion because the reasons are there among the audience. But if all get it wrong, support is needed; and these further questions could keep the interaction going instead of crashing out into didactic monologue.)


    Last changed 27 May 2003 ............... Length about 900 words (6000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/feedback.html.

    Feedback to students

    (written by Steve Draper,   as part of the Interactive Lectures website)

    While the presenter may be focussing on finding the most important topics for discussion and on whether the audience seems "engaged", part of what each learner is doing is seeking feedback. Feedback not only in the sense of "how am I doing?", though that is vital for regulating the direction and amount of effort any rational learner puts in, but also in the sense of diagnosing and fixing errors in their performance and understanding. So "feedback" includes, in general, information about the subject matter, not just about indicators of the learner's performance.

    This can be thought about as levels of detail, discussed at length in another paper, but summarised here. A key point is that, while our image of ideal feedback may be individually judged and personalised information, in fact it can be mass produced for a large class to a surprising extent, so handset sessions may be able to deliver more in this way than expected.

    Levels of feedback (in order of increasing informativeness)

    1. A mark or grade. Handsets do (only) this if, with advanced software, they deliver only an overall mark for a set of questions.
    2. The right answer: a description or specification of the desired outcome. Handset questions do this if the presenter indicates which option was the right answer.
    3. Diagnosis of which part of the learner action (input) was wrong. When a question really involves several issues, or combinations of options, the learner may be able to see that they got one issue right but another wrong.
    4. Explanation of what makes the right answer correct: of why it is the right answer. I.e. the principles and relationships that matter. The presenter can routinely give an explanation (to the whole audience) of the right answer, particularly if enough got it wrong to make that seem worthwhile.
    5. Explanation of what's wrong about the learner's answer. Since handset questions have fixed alternatives, and furthermore may have been designed to "trap" anyone with less than solid knowledge, in fact this otherwise most personal of types of feedback can be given by a presenter to a large set of students at once, since at most one explanation for each wrong option would need to be offered.

    The last (5) is a separate item because the previous one (4) concerned only correct principles, but this one (5) concerns misconceptions, and in general negative reasons why apparent connections of this activity with other principles are mistaken. Thus (4) is self-contained, and context-free; while (5) is open-ended and depends on the learner's prior knowledge. This is only needed when the learner has not just made a slip or mistake but is in the grip of a rooted misconception -- but is crucial when that is the case. Well designed "brain teasers" are of this kind: eliciting wrong answers that may be held with conviction. Thus with mass questions that are forced choice, i.e. MCQ, one can identify in advance what the wrong answers are going to be and have canned explanations ready.

    Here are two rough tries, applying to actual handset questions posed to an introductory statistics class, at describing the kind of extra explanation that might be desirable here. Their feature is explaining why the wrong options are attractive, but also why they are wrong despite that.

    Example1. A question on sample vs. population medians.

    The null-hypothesis for a Wilcoxon test could be:
    1. The population mean is 35
    2. The sample mean is 35
    3. The sample median is 35
    4. The population median is 35
    5. I don't know
    Why is it that this vocabulary difference is seductively misleading to half the class? Perhaps because both are artificial views of the same real people: the technical terms don't refer to any real property (like age, sex, or height), just a stance taken by the analyst. And everyone who is in the sample is in the population. It's like arguing about whether to call someone a woman or a female, where the measure is the average blood type of a woman or of a female. And furthermore because of this, most investigators don't have a fixed idea about either sample or population. They would like their ideas to apply the population of all possible people alive and unborn; but know it is likely that it only applies to a limited population; but that they will only discuss this in the last paragraph of their report, long after getting the data and doing the stats. Similarly, they are continually reviewing whom to use as a sample. So not only are these unreal properties that exist only in the mind of the analyst, but they are continually shifting there in most cases. (None of this is about casting doubt on the utility of the concepts, just about why they may stay fuzzy in learners' minds for longer than you might expect.)

    Example2. Regression Analysis: Reading versus Motivation

    PredictorCoefSE CoefTP
    Constant2.0741.9801.050.309
    Motivati0.65880.36161.820.085
    The regression equation is Reading = 2.07 + 0.659 Motivation
    S = 2.782     R-Sq = 15.6%     R-Sq(adj) = 10.9%

    Which of the following statements are correct?
    a. There seems to be a negative relationship between Motivation and Reading ability.
    b. Motivation is a significant predictor of reading ability.
    c. About 11% of the variability in the Reading score is explained by the Motivation score.

    1. a
    2. ab
    3. c
    4. bc
    5. I don't know
    There was something cunning in the question on whether a correlation was significant or not, with a p value of 0.085. Firstly because it isn't instantly easy to convert 0.085 to 8.5% to 1 in 12. 0.085 looks like a negligible number to me at first glance. And secondly, the explanation didn't mention the wholly arbitrary and conventional nature of picking 0.05 as the threshold of "significance".

    For more examples, see some of the examples of brain teasers, which in essence are questions especially designed to need this extra explanation.


    Last changed 21 Feb 2003 ............... Length about 700 words (5,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/manage.html.

    Designing and managing a teaching session

    (written by Steve Draper,   as part of the Interactive Lectures website)

    Any session or lecture can be thought of as having 3 aspects, all of which ideally will be well managed. If you are designing a new kind of session (e.g. with handsets) you may want to think about these aspects explicitly. They are:

    Feedback to the presenter

    In running a session, the presenter has to make various judgements on the fly, because they must make decisions on:


    Last changed 21 Dec 2007 ............... Length about 200 words (3,000 bytes).
    (Document started on 6 Jan 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/qbanks.html. You may copy it. How to refer to it.

    Question banks available on the web

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page is to collect a few pointers to sets of questions that might be used with EVS that are available on the web. Further suggestions and pointers are welcome.

    For first year physics at University of Sydney: their webpage     and a local copy to print off as one document.

    The Galileo project has some examples if you regester online with them.

    The SDI (Socratic dialog Inducing) lab has some examples.

  • Physics: Joe Redish's list of Mazur type questions i.e. "ConcepTests"

  • Chemistry ConcepTests

  • Calculus questions

  • JITT: just in time teaching: example "warmup questions"

  • Canadian In-Class Question Database (CINQ-DB) for astronomy, mathematics, physics, psychology, and science.

    ?Roy Tasker


    Last changed 15 Feb 2005 ............... Length about 900 words (6000 bytes).
    (Document started on 15 Feb 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/evidence.html. You may copy it. How to refer to it.

    Kinds of evidence about the effectiveness of EVS

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    There are basically three classes of evidence to consider:

    .




    Last changed 23 Jul 2004 ............... Length about 900 words (6000 bytes).
    (Document started on 23 Jul 2004.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/faq.html. You may copy it. How to refer to it.

    Other frequent questions

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page is a place for other questions frequently asked by inquirers, but not answered elsewhere in these web pages.

    How many handsets do you lose?

    xx

    xx

    xx


    Last changed 25 June 2010 ............... Length about 7,000 words (49,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/tech.html.

    EVS technologies, alternatives, vendors

    (written by Steve Draper,   as part of the Interactive Lectures website)

    Contents (click to jump to a section)

    What are the alternative methods and technologies to the PRS handsets we've bought? In fact this is all part of a wider set of choices. Our own approach at Glasgow University adopted this position:

    The ideal system for this would allow huge groups of students to register a response to a MCQ (multiple choice question) with privacy (secret ballot), and have the results immediately summarised, and the summary displayed. Feedback to individual students (e.g. by LCDs on handsets) could be nice. Best of all would be to escape the MCQ strategy and have open-ended input from each audience member.

    Besides these web pages containing our views, there are some other reports on what technology to adopt:

    Non-MCQ

    There are many interactive techniques other than MCQs. See for example:
  • Summary map of techniques
  • Notes from the Social Policy and social work LTSN / Bristol.
  • More pointers.
  • A journal article: Charman, D.J. & Fullerton, H. (1995) "Interactive Lectures: a case study in a geographical concepts course" Journal of Geography in Higher Education vol.19 no.1 pp.41-55
  • "Interactive lectures: 20 years later" by Thiagi (2002).
  • Steinert,Y. & Snell,L.S. (1999) "Interactive lecturing: strategies for increasing participation in large group presentations" Medical teacher vol.21 no.1 pp.37-42.
  • Bligh,D. (2000) What's the use of lectures? (Jossey-Bass: San Francisco)
  • Edwards,H., Smith,B.A. & Webb,G. (2001) Lecturing: Case studies, experience and practice (Kogan Page: London)
  • MacGregor,J., Cooper,J.L., Smith,K.A. & Robinson,P. (2000) Strategies for energizing large classes: From small groups to learning communities (Jossey-Bass: San Francisco)

    Non-electronic, but MCQ

    Given MCQs as the way to involve student interaction, there are other, non-electronic ways of collecting the responses that are possible, and that in fact have been heavily used in the recent past and present.

    Electric but not wireless (MCQ)

    There have been, and perhaps still are, cases where particular rooms have had systems installed based on wiring rather than wireless technology. Some examples of this are described in the History section at the end of this page. Nowadays the installation costs, or even just the cabling costs, of such systems would outweigh those of wireless ones; and a wired system is tied to a single room.

    Electronic voting (MCQ) technologies

    This section lists special-purpose electronic handset voting systems (as opposed to general-purpose computers) that support MCQs. Non-MCQ systems, which allow open-ended responses from the audience, are discussed in a later section. And don't forget the alternative of using computers: one PC per student (discussed in this paper, and in a section below).

    The price information below is out of date, but as of March 2008 I cleaned up a lot of the rest of it. Still, even an out of date starting point is better than none for people looking for a vendor.

    For another view you could look at this 5 Aug 2005 news article by news.com, and 3 ads. The article also reports that "U.K market research firm DTC Worldwide, which tracks the global market for education technology, expects that 8 million clickers ... will be sold annually by 2008".

  • Comparing different hardware: list of reviews on the web.

    Open-ended audience input

    The key function that MCQ-oriented technology cannot cover is allowing open-ended (e.g. free text) input from each audience member, rather than just a selection from a small, fixed set of alternatives. However it is important to consider, not just that MCQs are a limited way of asking questions, but what on earth a presenter in front of an audience of several hundred could possibly do with hundreds of free-text inputs. The great virtue of MCQs is that great numbers of answers can be summarised in a single, simple display (e.g. a bar or pie chart), whereas it would take a human, not a computer, and considerable time, to group free-text answers into "similar" points.
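
    To make the contrast concrete, here is a minimal sketch (Python, with invented vote data; none of it comes from any particular EVS product) of how trivially MCQ votes summarise:

        from collections import Counter

        # Invented example data: one letter choice per audience member.
        votes = ["A", "C", "B", "C", "C", "A", "D", "C", "B", "C"]

        def summarise_mcq(votes):
            """Tally MCQ votes and print a crude text bar chart."""
            counts = Counter(votes)
            total = len(votes)
            for option in sorted(counts):
                n = counts[option]
                bar = "#" * round(40 * n / total)  # scale bars to 40 characters
                print(f"{option}: {bar} {n} ({100 * n / total:.0f}%)")

        summarise_mcq(votes)

    No such few-line summary exists for free text; that is the whole problem.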

    For that reason, I long assumed that EVS couldn't usefully do open-ended text input, because the presenter and audience couldn't do anything with it. However Promethean have gone a considerable way to proving me wrong. Their handsets allow text input similar to that on mobile phones, and crucially their software supports the presenter in using it. One mode provides a set of 8? blank boxes on the screen and, as the words (or possibly phrases) come in, the presenter uses the mouse to sort them into the boxes (thus grouping variant spellings, synonyms etc. together); then a followup MCQ vote can be taken directly from that screen, with each box being one of the response options. With an audience of 30 this only takes a few seconds and works very well. This allows a two-phase student feedback quiz to be done very fast: you first ask (in free-text mode) "what is the thing you struggle with most on the course?", quickly group similar answers, then re-vote to check which really is the top issue for the group.

    Even with 300 something can be done: it provides a list of the received words with a frequency count against each word. With big numbers it doesn't matter losing a few percent to deviant spellings etc., or ignoring words that only one person put in: you probably only want the popular (high frequency) ones anyway.
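
    Here is a sketch of that frequency-count mode (my own illustration with invented answers; I'm assuming single-word responses, which the real software need not require):

        from collections import Counter

        # Invented answers to "what do you struggle with most on the course?"
        answers = ["recursion", "Recursion", "pointers", "recursion ",
                   "pointers", "recusion", "proofs"]

        def popular_words(answers, min_count=2):
            """Normalise raw words, count them, and keep only the popular
            ones; lone entries and deviant spellings drop out, which
            matters little once the audience is large."""
            counts = Counter(a.strip().lower() for a in answers)
            return [(w, n) for w, n in counts.most_common() if n >= min_count]

        print(popular_words(answers))  # [('recursion', 3), ('pointers', 2)]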

    However if you do want this function, then the most obvious method is to teach in rooms or labs with a computer per student (or at least per group of students); and use the network instead of infrared to interact. If the computers use wireless networking, then the system could be mobile and flexible. (See this discussion.)

    Other specialised equipment, however, allows some of this.

    Voting by SMS (mobile phone) texting

    For some years, in nearly every talk or workshop on EVS, someone would suggest that it could all be done using texting on the mobile phones that very nearly all students carry. In 2008 an MIT startup company, Poll Everywhere, created by Jeff Vyduna (jvyduna AT sloan.mit.edu), offered this service (which is not yet available in the UK). Here is a discussion about the apparent prospects and problems with this approach.

    The service provided seems to be:

  • A short text number to dial (41411)
  • Message content each voter types in, of the form "CAST 10082" (a toy parsing sketch follows this list)
  • Votes are caught, processed, and put on a web page that can be displayed in the talk, and embedded in PowerPoint.
  • Votes from SMS and the web can be combined.
  • (The data can also be downloaded in spreadsheet form)
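
    As promised above, a toy illustration of the tallying step (the "CAST" keyword and number format are taken from the list; everything else, including the function name, is my invention):

        import re
        from collections import Counter

        # Reported message format: "CAST <option-code>".
        CAST_RE = re.compile(r"^\s*CAST\s+(\d+)\s*$", re.IGNORECASE)

        def tally_sms(message_bodies):
            """Parse incoming SMS bodies and count the valid votes.
            Malformed messages are silently dropped (one plausible
            source of the "unheard" votes discussed below)."""
            counts = Counter()
            for body in message_bodies:
                m = CAST_RE.match(body)
                if m:
                    counts[m.group(1)] += 1
            return counts

        print(tally_sms(["CAST 10082", "cast 10083", "CAST10082", "hi"]))
        # Counter({'10082': 1, '10083': 1}); the last two messages are lost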

    First, the attractions are large: a speaker need only arrange and pay for the service in advance, and have a live internet (WWW) connection in the lecture theatre (actually still quite difficult and rare in my university), but they can reasonably assume most of the audience will come with their "handset" i.e. mobile phone, and no other equipment or setup is necessary on the spot. Poll Everywhere also says their software is integrated with PowerPoint.

    However there are several issues, in fact drawbacks, with this.


    Matt Jones (always@acm.org) has a paper on trying SMS mobile phone text messaging in this way:
    Jones,M. & Marsden,G. (2004) "'Please turn ON your mobile phone' -- First impressions of text-messaging in lectures" (University of Waikato, Computer Science Tech Report (07/2004))
    In that study only a minority of the attempted votes got through successfully. Nevertheless, the students were favourable to the method: so it is feasible, if you don't mind only a minority voting successfully.

    Summary: how to decide whether to adopt this technology

  • Much of the cost is still borne by the presenter/university, and students may additionally be charged for the texts by their mobile phone service provider.
  • It seems likely that a considerable proportion of the audience will not have their votes "heard", especially in large audiences. There isn't much useful data on this that I have found so far, and what there is doesn't look good. At first thought, missing some votes doesn't seem to matter: everyone knows what (they thought) they voted, and everyone sees how that relates to the group's votes. However, it quickly undermines the meaning: as people realise their votes aren't seen, they lose motivation, can't trust the summaries in the same way, etc. And the presenter too is increasingly misled, along with the audience; particularly if the missing votes are not random, but come from the last (and perhaps most thoughtful) section of the audience. (A toy simulation after this list illustrates that bias.)
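
    Here is that toy simulation (entirely invented numbers, purely to illustrate the argument): quick responders lean towards option A, slower and more deliberate responders lean towards B, and the last 30 votes never arrive.

        import random
        from collections import Counter

        random.seed(1)  # reproducible invented data

        # 70 quick voters leaning towards A, 30 slow voters leaning towards B.
        quick = [random.choice("AAB") for _ in range(70)]
        slow = [random.choice("ABB") for _ in range(30)]
        all_votes = quick + slow

        print("true distribution:", Counter(all_votes))
        print("votes heard      :", Counter(all_votes[:-30]))  # slow votes lost

    The heard summary overstates A, and neither the presenter nor the audience has any way of noticing that it does.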

    The paradox, or rather cleft stick, is that:

  • This technology is significantly easier to start up and set up, especially for one-off presenters and audiences.
  • BUT for such uses, the strong tendency is to use quick questions that require no thinking: and then the longer key-press sequences and potentially long latency times will matter more than they would for mature, regular educational uses, where longer thinking times are required of the audience and slower equipment response times become less problematic.

    Other near future

    In 2002, near-future solutions looked like including text messaging from mobile phones (see the previous section), or all audience members having a PDA (personal digital assistant, i.e. palm-top computer) with wireless networking. See also this system for voting from pocket PCs: class in hand.

    Mark Finn has a journal paper reviewing projects to date that have used PDAs in teaching.

    Historical cases of classroom voting technology

    I'm interested in having a history of classroom voting technology; and putting it here. I repeatedly hear rumours about a "system just like that": these (so far) turn out to be wired (as opposed to wireless) handset-equipped lecture theatres. Such systems are of course not mobile but tied to particular rooms, and support only MCQs, not open-ended responses from the audience.

    A comment suggested by John Cowan, and in different words by Willie Dunn, is that these systems represented a first effort towards engaging with learning rather than teaching. Their importance was perhaps more that shift than the technology: and when the equipment, or feedback classrooms more generally, were abandoned, it was less a step backwards than a move to take the underlying change in attitude further, into new forms.


    Last changed 18 May 2003 ............... Length about 900 words (6000 bytes).
    (Document started on 18 May 2003.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/techtech.html. You may copy it. How to refer to it.

    Some technical details on PRS

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page is about details that most lecturers using the PRS handset system will never need or want to know. For the rest of you, ....

    High/low confidence buttons
    ID numbers
    Time stamps
    Log files
    Range
    Angles
    How to install it; magic password; obscure error messages


    Last changed 8 Nov 2010 ............... Length about 2,000 words (23,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/people.html.

    UK handset users and sites

    (written by Steve Draper,   as part of the Interactive Lectures website)

    This page lists some sites and people I know of in the UK mainly in Higher Education who are interested in classroom handsets, PRS, or similar approaches to interactive teaching. (To see why we are interested in this technique, and other information about why you might be interested, look at the parent page to this one.)

    I suggest that if you are looking for a person or place, you use your browser's "Find" command to search this page for (part of) the name you are interested in.

    This page is organised firstly by (university) site with just a few key people mentioned: it would not be practical to mention them all. The order is idiosyncratic: expect to search using the Find command, not by scanning by eye. This page just contains people I happen to know about: it is not likely to be complete. If you would like to be added or removed, or if you can suggest someone else who should be listed here, please email me (s.draper@psy.gla.ac.uk) and I will act promptly on your request. People have found it useful to discover who is already interested in EVS or PRS near them, and conversely to advertise here their interest to others in their institution or city. Also, any pointers to papers and web documents on this would be gratefully received.

    (PRS is used widely in some places outside the UK, including Hong Kong University of Science and Technology, UMass/Amherst, Rutgers, University of British Columbia, North Dakota, and UC Berkeley. See also this list of mainly USA sites using PRS.)

    Strathclyde University

    (Photo: Jim Boyle at the engineering pub)

    Prof. Jim Boyle, in Mechanical Engineering, may be the longest standing practitioner in the UK, starting with non-technologically supported "peer instruction" in 1997, then using an earlier equipment system (Classtalk), and then being the first UK purchaser of PRS in 1998. He has by now modified not just his teaching practice but the timetable and the architecture of his teaching rooms, and been involved in project NATALIE. A number of papers based on evaluations by David Nicol are becoming available. The longest standing use is in the first year class in Mechanical Engineering, but there is some other use of the equipment at Strathclyde now, e.g. in maths (Geoff McKay), in psychology, in teaching foreign languages (Michele Dickson), and in student induction.

    University of Glasgow

    We have tried out the use of handsets since October 2001 in a variety of departments including Philosophy, Psychology, Computing Science, Biological sciences, Medicine, Vet School and the Dental School (with GPs), with audience sizes from 20 to 300, and with students in level 1 to level 4. See our interim report on this. See also the various papers published, listed on the main web page. Three contacts are Steve Draper (Psychology), Quintin Cutts (Computing Science), and Susan Stuart in Philosophy.

    Besides a central pool of mobile EVS equipment, available for any department or teacher to use, Physics have bought their own set of kit for first year lectures, Statistics have about 150 for their use, and Steve Brindley has a set for sessions aimed at those outside the university. Modern Languages use a set of 50 in smallish language teaching groups regularly; and a larger set for level 1 and 2 classes. Charles Higgins in Education is acquiring 300 partly for external use.

    University of Edinburgh

    Physics and Biology have embarked on serious PRS use in first year classes: Simon Bates is the person to talk to about this, and also see this site for other educational initiatives there. This followed preparatory work by Alistair Bruce, who wrote a short article on this. A set of PRS kit for loan to lecturers, now more than 500 handsets, has been purchased by the Media and Learning Technology Service. Contact Mark Findlay or Nora Mogey.

    University of Surrey

    Vicki Simpson and Paul Burt (E-learning unit) (or try here or here) have been advising on adoption of the handsets. They had an Interactive Presenter system with 250 handsets, which they used to pilot the delivery of lectures in the School of Biomedical and Molecular Sciences. They have now bought 1,000 RF handsets using TurningPoint.

    University of Bath

    Now has a particularly well integrated and organised system of supporting new EVS users, with multiple elements to the support including collecting evaluation data on the success of the new uses. They currently have TurningPoint kit. See here. Contact Nitin Parmar.

    Nick Vaughan (Mechanical Engineering) may have used handsets (perhaps the CPS system). He supervised an undergraduate project that did a study on potential use, comparing PRS, Classtalk, and CPS:
    J.Smith (2001) Dialogue and interaction in a classroom environment (Final year research project, School of Mechanical Engineering, University of Bath). Summary.

    University of Aberdeen

    Phil Marston of the learning technology unit has bought a small set, and is evaluating them. http://www.abdn.ac.uk/diss/ltu/pmarston/prs/. (or if you have already registered, then here).

    University of Wolverhampton

    Apparently they have used PRS to teach computing, but now have TurningPoint kit. Contact ILE.

    Wolverhampton is also the home of the large REVEAL project involving EVS use largely in schools: using Promethean equipment; and have reports available. Contacts there: Andrew Hutchinson and Diana Bannister, who are part of a learning technologies team.

    Glasgow Caledonian University

    A large set (many hundreds) of PRS-RF handsets has been bought and used in the business school for first year students.

    University of Central England (in Birmingham)

    Bill Madill (Bill.Madill@uce.ac.uk) of the School of Property and Construction wrote an MEd thesis on Peer Instruction. He has a case study of using PRS available on the web: synopsis and full case study.

    University of Wales, college of medicine (Cardiff)

    They bought a (non-PRS) system and used it for a while from 1997, but it may have fallen into disuse: see this paper by Joe Nicholls.
    Wendy Sadler in Physics and Astronomy is buying a set for school liaison as well as for students.

    University of Portsmouth

    Michael McCabe (michael.mccabe@port.ac.uk) was awarded a 3-year HEFCE National Teaching Fellowship for Project LOLA (Live and On-Line Assessment -- the proposal is available). The live part of the assessment relates to the use of interactive classrooms in face-to-face teaching, which includes PRS handsets as one approach. Other papers are listed on the main page.

    An unconfirmed report says the psychology group also bought PRS equipment.

    The ExPERT centre has bought about 80 IML handsets. For more information, contact Lesley-Jane Reynolds.

    University of Lancaster

    Caroline Elliott (Economics dept.) has done featured work on using handsets in 2000/1. The dept. of Accounting and Finance also uses them regularly (see here). There is a set of about 150 PRS handsets: contact Sue Armitage.

    University of Southampton

    Su White and Hugh Davis and others in Electronics and Computer Science have acquired some equipment and begun exploring its use in teaching from 2002.

    Ray d'Inverno in Maths has begun using PRS, and has an early report and pedagogic rationale for PRS use.

    Chemistry and Psychology have each purchased sets of 130+ TurningPoint RF handsets, and the university a further 240 for wider experimentation. David Read is using them in schools outreach.

    University of Nottingham

    Liz Sockett in Genetics is a big fan, and uses them extensively.

    Science Museum (London)

    Deborah Scopes (d.scopes@nmsi.ac.uk) has been exploring the use of handsets as an enhancement to public debates and lectures on science.

    University of Ulster

    Edwin Curran says PRS was installed ready for Sept 2003 in a 170 seat lecture theatre in Engineering, plus a portable system.

    University of Liverpool

    Both CPS and PRS used there. Doug Moffat (Mechanical Engineering).

    Liverpool John Moores University

    Laura Bishop (Palaeoanthropologist) and Clare Milsom (Geologist) are considering introducing EVS use there.

    University of Salford

    PRS used there. Elizabeth Laws, Engineering.

    Kingston University

    George Masikunas, Andreas Panayiotidis, and others at the Kingston University (Kingston Upon Thames) business school introduced PRS in 2003-4 and use it for first year classes of about 250 students, where small groups are required to discuss and agree answers to the questions posed. They now (Sept. 2005) have and use a set of 60 handsets of the PPVote system.

    University of Central Lancashire at Preston

    Mick Wood is leading the introduction of EVS (using IML not PRS kit) there, with a first application in Sports Psychology.

    University College London

    Martin Oliver produced a report on whether using handsets might be worthwhile at UCL.

    University of Keele

    Stephen Bostock in the Staff Development Centre got interested in using PRS, and in the meantime introduced the use of coloured cubes as a substitute; but he now has Promethean radio-connected voting handsets working with interactive whiteboards around campus.

    University of Northumbria

    Chris Turnock (or here) is currently co-ordinating staff development in the use of the system before undertaking further evaluation of its use within the university. Paul Barlow (paul.barlow@unn.ac.uk) in the School of Humanities is considering applying EVS in the Arts area.

    University of Leeds

    Leeds now has over 100 PRS handsets and a simple introductory website for EVS. Contact Tony Lowe.

    Robert Gordon University (in Aberdeen)

    Roger McDermott in the school of computing started to use them in various classes from October 2004, and there are more than 300 PRS handsets now. Contact Garry Brindley. The faculty of health and social care has also taken up their use.

    Kings College London

    Ann Wilkinson is looking into EVS use. Professor Simon Howell in Biomedical Sciences is believed to have used an EVS.

    Bangor University

    Paul Wood (r.p.wood@bangor.ac.uk) is installing a 120 set PRS system in a lecture theatre in November 2004 and plans to start trials with enthusiastic academics.

    Coventry University

    Anne Dickinson has been investigating possible use, particularly of the Discourse equipment, and has written a report. She also has a project on it: "Investigation of the potential use of a classroom communication system in the Higher Education context"

    Lewisham College, London

    Raja Habib and Raul Saraiva are looking into using PRS on behalf of their college.

    Brooklands College, Weybridge, Surrey

    Theresa Willis

    National University of Ireland, Galway

    Ann Torres has started (October 2004) using PRS for teaching Marketing Principles to a class of over 300.

    Army Foundation College in Harrogate

    Lesley Harburn (lesley.pete@tiscali.co.uk) is using Promethean portable pods as EVS in teaching 16 year-old Junior Soldiers as part of an Apprenticeship in IT at the Army Foundation College in Harrogate.

    Essex University

    Caroline Angus in Sports Science is a current big user (with PRS-RF kit).

    Bournemouth University

    Kimberley Norman in the Learning Design Studio is interested.

    Brighton University

    Gareth Reast is looking into purchasing a set of CPS (eInstruction); perhaps for use in the business school and applied social sciences.

    University of Hertfordshire

    They have some PRS and some Promethean equipment for audiences of 250. Contact: Andy Oliver.

    Roehampton University

    Andy Curtis is going to purchase some kit, and get it used.

    University of East Anglia

    Believed to have bought some kit for the medical school. Contact Ann Barrett.

    University of Bristol

    Have got the loan of 400 Promethean RF handsets. Contact Nic Earle.

    University of Durham

    Have 50-100 Qwizdom handsets for Maths. Contact James Blowey. Also interest from Stuart Jones, Geology dept.

    St. George's, University of London

    Have 50-100 Qwizdom handsets (medical students). Contact Philip Harvey.

    Queen's University Belfast

    Have 500 TurningPoint handsets. See here for the project. Contact David Robinson, or Prof. Brian Whalley, Geomorphology, for the original push.

    University of Newcastle

    Have 100 TurningPoint handsets as part of central support. Contact Az Mohammed (Az.Mohammed AT newcastle.ac.uk).

    Leicester University

    Has a set of RF KEEpad handsets plus TurningPoint software for use with first year biology students from Oct. 2007. Contact Jo Badge. Another set at the university has been used for school outreach.

    Gordon Campbell, professor of Renaissance Studies, uses them.

    Loughborough University

    Has now got 300 handsets plus TurningPoint software. These are held centrally by media services and booked out. Currently most used in teaching maths to engineers by the Mathematics Education Centre (contact Carol Robinson).

    City University

    Since about Easter 2008, they have 1000 RF handsets using TurningPoint software. See this page. Key contact: Sian Cox.

    Reading University

    Gan Niyadurupola (d.g.niyadurupola AT reading.ac.uk) has used them successfully in chemistry.

    Queen Margaret University, Edinburgh

    They are trialling TurningPoint in class sizes ranging from 10 to 180. Contact Graeme Ferris.

    Warwick University

    The library and medical school have had kit and TurningPoint software since 2006.

    Abertay University

    Rebecca Ross is in the process of purchasing.

    Have made enquiries

    Mike Watkinson (m.watkinson@qmul.ac.uk), Chemistry, Queen Mary, University of London


    Last changed 13 Nov 2009 ............... Length about 3821 words (36,000 bytes).
    (Document started on 6 Jan 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/bib.html. You may copy it. How to refer to it.

    Ad hoc bibliography on EVS

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page is an ad hoc bibliography of papers about EVS. I expect to paste in lists of references I come across without checking them: you use these at your own risk, but they could be a useful starting point. Please send in suggestions: great papers, useful papers, papers you have written, corrections to entries already here. Attached Word documents or HTML are preferred (or I may not do the formatting). I will probably only include either published journal articles and books or reports on the web; and I may exclude anything without explanation.

    A very few I have starred '**'. I think these are worth special attention.

    See also Will Thalheimer's annotated bibliography. While mine is an uncritical heap to use as a starting point, his selects often-cited papers AND provides some critical appraisal.


    ABRAHAMSON, A.L. (1998) An Overview of Teaching and Learning Research with Classroom Communication Systems. Paper presented at the International Conference of the Teaching of Mathematics, Samos, Greece.

    ANDERSON, T., HOWE, C., SODEN. R., HALLIDAY, J. & LOW, J. (2001). Peer interaction and the learning of critical thinking skills in further education students. Instructional Science, 29, 1-32.

    Angelo, T.A., and K.P. Cross (1993). Minute Paper. In Classroom Assessment Techniques: A Handbook for College Teachers, San Francisco: Jossey-Bass, 148-153.

    Annett, D. (1969): Feedback and Human Behaviour. New York, Penguin.

    ** Banks,D.A. (ed.) (2006) Audience response systems in higher education: Applications and cases (Information Science Publishing)

    Barak, M., Lipson, A., & Lerman, S. (2006). Wireless laptops as means for promoting active learning in large lecture halls. Journal of Research on Technology in Education, 38(3), 245-263.

    Beatty, I. D., Gerace, W. J., Leonard, W. J., & Dufresne, R. J. (2006). Designing effective questions for classroom response system teaching. American Journal of Physics, 74(1), 31-39.

    N Beatty, I.D., Gerace, W.J., Leonard, W.J., Dufresne, R.J., Am. J. Phys xxx xxxx (2006)

    Bjork, R. A., & Richardson-Klavehn, A. (1989). On the puzzling relationship between environmental context and human memory. In C. Izawa (Ed.) Current Issues in Cognitive Processes: The Tulane Floweree Symposium on Cognition (pp. 313-344). Hillsdale, NJ: Erlbaum.

    Bligh, D. (2000), What’s the use of lectures? San Francisco: Jossey-Bass.

    Bloom’s Taxonomy : http://www.officeport.com/edu/blooms.htm

    Bloom, B. S. (1956). Taxonomy of Educational Objectives, the Classification of Educational Goals — Handbook I: Cognitive Domain. New York: McKay.

    Boyle, J.T. & Nicol, D.J. (2003) "Using classroom communication systems to support interaction and discussion in large class settings" Association for Learning Technology Journal vol.11 no.3 pp.43-57

    Bransford, J., Brophy, S., Williams, S. (2000) "When Computer Technologies Meet the Learning Sciences - Issues and Opportunities" Journal of Applied Developmental Psychology Volume 21, Number 1, January 2000, pp. 59-84

    Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and school. Washington, D. C.: National Academy Press.

    Brookfield, S. D. (1995), Becoming a critically reflective teacher, San Francisco: Jossey-Bass.

    Brown, J.S., Collins, A. & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher. 18, 32-42.

    Brumby, M.N. (1984), ‘Misconceptions about the concept of natural selection by medical biology students’, Science Education, 68(4), 493-503.

    Bruner, J. (1985), ‘Vygotsky: a historical and conceptual perspective’ in Wertsch, J.V. (ed), Culture, communication and cognition. Cambridge: Cambridge University Press.

    Burnstein, R. A. & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics Teacher, vol.39 no.1 pp.8-11.

    N CAL-laborate, UniServe Science International Newsletter, 14 (2005) available online http://science.uniserve.edu.au/pubs/callab/vol14/cal14_bates.pdf

    Carnaghan, C., & Webb, A. (2005). Investigating the effects of group response systems on learning outcomes and satisfaction in accounting education. Downloaded January 2007 from www.learning.uwaterloo.ca/LIF/responsepad_june20051.pdf .

    Champagne, A. B., Klopfer, L. E. and Anderson, J. H. (1980). Factors influencing the learning of classical mechanics. American Journal of Physics, 48, 1074-1079.

    Chickering, Arthur W. and Gamson, Zelda F. (1987) Seven Principles for Good Practice in Undergraduate Education - http://aahebulletin.com/public/archive/sevenprinciples1987.asp

    Christianson, R. G. & Fisher, K. M. (1999). Comparison of student learning about diffusion and osmosis in constructivist and traditional classrooms. International Journal of Science Education, 21, 687-698.

    Cohen, E.G. (1994), ‘Restructuring the classroom: Conditions for productive small groups’, Review of Educational Research, 64(1), 3-35.

    Comlekci, T., Boyle, J.T., King, W., Dempster, W., Lee, C.K., Hamilton, R. and Wheel, M.A. (1999), ‘New approaches in mechanical engineering education at the University of Strathclyde in Scotland: I — Use of Technology for interactive teaching’, in Saglamer, G.(ed), Engineering Education in the Third Millenium, Leuchtturm- Verlag.

    Cristle, C. A., & Schuster, J. W. (2003). The effects of using response cards on student participation, academic achievement, and on-task behavior during whole-class math instruction. Journal of Behavioral Education, 12(3), 147-165.

    Crombie, W. J. (2006). Harvard launches wireless classroom. Harvard University Gazette (February 23, 2006), Downloaded February 2007 from www.news.harvard.edu/gazette/2006/02.23/05-eclassroom.html .

    ** Crouch, C.H. and Mazur, E. (2001), ‘Peer Instruction: Ten years of experience and results’, American Journal of Physics, 69, 970-977

    Crouch,C., A. Fagen, P. Callan & E. Mazur (2004) "Classroom demonstrations: learning tools or entertainment" Am. J. Physics vol.72 no.6 pp.835-838

    Cutts,Q. Carbone,A., & van Haaster,K. (2004) "Using an Electronic Voting System to Promote Active Reflection on Coursework Feedback" To appear in Proc. of the Intnl. Conf. on Computers in Education 2004, Melbourne, Australia, Nov. 30th — Dec 3rd 2004.

    Cutts,Q. Kennedy,G., Mitchell,C., & Draper,S.W. (2004) "Maximising dialogue in lectures using group response systems" Accepted for 7th IASTED Internat. Conf. on Computers and Advanced Technology in Education, Hawaii, 16-18th August 2004

    Cutts,Q.I. & Kennedy, G.E. (2005) "Connecting Learning Environments Using Electronic Voting Systems" Seventh Australasian Computer Education Conference, Newcastle, Australia. Conferences in Research and Practice in Information Technology, Vol 42, Alison Young and Denise Tolhurst (Eds)

    Cutts,Q.I. & Kennedy,G.E. (2005) "The association between students’ use of an electronic voting system and their learning outcomes" Journal of Computer Assisted learning vol.21 pp.260–268

    Davies, G. (1986). Context effects in episodic memory: A review. Cahiers de Psychologie Cognitive, 6, 157-174.

    DeCorte, E. (1996), ‘New perspectives on learning and teaching in higher education’, in Burgen, A. (ed.), Goals and purposes of higher education, London: Jessica Kingsley.

    DOISE, W & MUGNY, G. (1984). The social development of the intellect. Oxford: Pergamon.

    Dori, Y. J., & Belcher (2005). How does technology-enabled active learning affect undergraduate students’ understanding of electromagnetism concepts? The Journal of the Learning Sciences, 14(2), 243-279.

    Draper, S. W., Cargill, J. and Cutts, Q. (2002). Electronically enhanced classroom interaction. Australian Journal of Educational Technology, 18, 13-23.

    Draper, S.W. (1998) "Niche-based success in CAL" Computers and Education vol.30, pp.5-8

    Draper,S.W. & Brown,M.I. (2004) "Increasing interactivity in lectures using an electronic voting system" Journal of Computer Assisted Learning vol.20 pp.81-94

    Draper,S.W., Brown, M.I., Henderson,F.P. & McAteer,E. (1996) "Integrative evaluation: an emerging role for classroom studies of CAL" Computers and Education vol.26 no.1-3, pp.17-32

    Draper,S.W., Cargill,J., & Cutts,Q. (2002) Electronically enhanced classroom interaction Australian journal of educational technology vol.18 no.1 pp.13-23. [This paper gives the arguments for interaction, and how EVS might be a worthwhile support for that.]

    Druckman, D., & Bjork, R. A. (1994). Learning, remembering, believing: Enhancing human performance. Washington, D. C.: National Academy Press.

    ** Dufresne, R.J., Gerace, W.J., Leonard, W.J., Mestre, J.P., & Wenk, L. (1996) Classtalk: A Classroom Communication System for Active Learning Journal of Computing in Higher Education vol.7 pp.3-47 http://umperg.physics.umass.edu/projects/ASKIT/classtalkPaper

    Duncan, D. (2005). Clickers in the classroom: How to enhance science teaching using classroom response systems. San Francisco: Addison-Wesley.

    Eich, J. E. (1980). The cue dependent nature of state dependent retrieval. Memory and Cognition, 8, 157-173.

    Educational Technologies at Missouri, University of Missouri, Introducing Student Response Systems at MU http://etatmo.missouri.edu/toolbox/doconline/SRS.pdf

    Edwards, H., Smith, B.A. and Webb, G. (2001), Lecturing: Case studies, experience and practice, London: Kogan Page.

    N El Rady, J., Innovate, 2(4) 2006. Available online at http://www.innovateonline.info/index.php?view=article&id=171

    Elliott, C. (2003) Using a personal response system in economics teaching. International Review of Economics Education. Accessed 11 Nov 2004 http://www.economics.ltsn.ac.uk/iree/i1/elliott.htm

    Elliott,C. (2001) "Case Study: Economics Lectures Using a Personal Response System" http://www.economics.ltsn.ac.uk/showcase/elliott_prs.htm

    Fjermestad, J., & Hiltz, S. R. (2001). Group support systems: A descriptive evaluation of case and field studies. Journal of Management Information Systems (JMIS) vol.17 no.3 pp.115-160

    Frey, Barbara A. and Wilson, Daniel H. (2004) Student Response Systems, Teaching, Learning, and Technology: Low Threshold Applications - http://jade.mcli.dist.maricopa.edu/lta/archives/lta37.php

    Gardner, R., Heward, W. L., & Grossi, T. A. (1994). Effects of response cards on student participation and academic achievement: A systematic replication with inner-city students during whole-class science instruction. Journal of Applied Behavior Analysis, 27, 63-71.

    Glaser, R. (1990), ‘The re-emergence of learning theory within instructional research’, American Psychologist, 45(1), 29-39.

    Godden, D. R., and Baddeley, A. D. (1975). Context-dependent memory in two natural environments: On land and underwater. British Journal of Psychology, 66, 325-331.

    Good, T. L., & Brophy, J. E. (2002). Looking in classrooms (9th edition). Boston: Allyn & Bacon.

    Guthrie, R. W., & Carlin, A. (2004). Using technology to engage passive listeners. The International Principal, vol.8 no.2

    N Hake, R. Conservation Ecology, 5, 28 (2002)

    ** Hake, R. R. (1998). Interactive-engagement versus traditional methods: A six-thousand student survey of mechanics data for introductory physics courses. American Journal of Physics, 66, 64-74.

    N Hake, R.R. Am. J. Phys. 66, 64-72 (1998)

    Hake,R.R. (1991) "My Conversion To The Arons-Advocated Method Of Science Education" Teaching Education vol.3 no.2 pp.109-111

    Halloun, I.A. and Hestenes, D. (1985), ‘The initial knowledge state of college physics students’, American Journal of Physics, 53, 1043-1055.

    Hansen, E. J. (1998). Creating teachable moments…and making them last. Innovative Higher Education, 23(1), 7-26.

    Horowitz,H.M. (1988) "Student Response Systems: Interactivity in the Classroom Environment" IBM Learning Research http://www.qwizdom.com/fastrack/interactivity_in_classrooms.pdf

    N Hestenes, D. Am. J. Phys., vol.66, 465-7 (1998)

    N Hestenes, D. and Wells, M. Phys. Teach. 30, 141-158 (1992)

    N Hestenes, D., Wells, M., Swackhamer, G. Phys. Teach. 30, 159-166 (1992)

    Horowitz,H.M. (2003) "Adding more power to powerpoint using audience response technology" http://www.socratec.com/FrontPage/Web_Pages/study.htm

    Horowitz, H. M. (1998). Student response systems: Interactivity in a classroom environment. Proceedings of Sixth Annual Conference on Interactive Instruction Delivery. Society for Applied Learning Technology, 8-15. This article has been updated in 2003 with a new title, Adding more power to PowerPoint using audience response technology, available at: www.audienceresponseinfo.com/audienceresponse-info/power-to-powerpoint.html

    Howe, C. J. (1991) "Explanatory concepts in physics: towards a principled evaluation of teaching materials" Computers and Education vol.17 no.1 pp.73-80

    Hunt, D. (1982) "Effects of human self-assessment responding on learning" Journal of Applied Psychology vol.67 pp.75-82.

    Inverno, R. "Making Lectures Interactive", MSOR Connections Feb 2003, Vol.3, No.1 pp.18-19

    Irving,A., M. Read, A. Hunt & S. Knight (2000) Use of information technology in exam revision Proc. 4th International CAA Conference Loughborough, UK http://www.lboro.ac.uk/service/fi/flicaa/conf2000/pdfs/readm.pdf

    N See the collection of Peer Instruction questions maintained by Joe Redish at http://www.physics.umd.edu/perg/role/PIProbs/ProbSubjs.htm

    Judson, E., & Sawada, D. (2006). Audience response systems: Insipid contrivances or inspiring tools? In David A. Banks (Ed.) Audience Response Systems in Higher Education: Applications and Cases (pp. 26-39).

    Kearney, M. (2002). Description of Predict-observe-explain strategy supported by the use of multimedia. Retrieved April 8, 2004, from Learning Designs Web site: http://www.learningdesigns.uow.edu.au/exemplars/info/LD44/

    Kellum, K. K., Carr, J. E., & Dozier, C. L. (2001). Response-card instruction and student learning in a college classroom. Teaching of Psychology, 28(2), 101-104.

    Klaas and Baggaley (2003) The International Review of Research in Open and Distance Learning, Vol 4, No 2, ISSN: 1492-3831. Accessed online 26th September 2007 at: http://www.irrodl.org/index.php/irrodl/article/view/138/218

    Kohn, A. (1999). Punished by rewards: The trouble with gold stars, incentive plans, A's, praise, and other bribes. Boston: Houghton Mifflin.

    Kolb, D. A. (1984): Experiential Learning: Experience as the source of learning and development. Englewood Cliffs, NJ, Prentice Hall.

    Kolikant, Y., McKenna, A., & Yalvac, B. (2005) "Using the personal response system as a cultural bridge from silent absorption to active participation" in Kommers, P., & Richards, G. (Eds.), Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2005 (pp. 2660-2667) Chesapeake, VA: AACE.

    Kourilsky, M. L., & Wittrock, M. C. (1992). Generative teaching: An enhancement strategy for the learning of economics in cooperative groups. American Educational Research Journal, 29, 861-876.

    Langer, E. J. (1997). The power of mindful learning. Reading, MA: Addison-Wesley.

    Langer, E. J. (2001). Mindful Learning. Current Directions in Psychological Science, 9, 220-223.

    Laurillard, D. (1993), Rethinking university teaching, London: Routledge.

    Lave, J. & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge: Cambridge University Press.

    Ledlow, Susan, 2001, Center for Learning and Teaching Excellence http://clte.asu.edu/active/lesspre.htm

    Liu, T. C., Liang, J. K., Wang, H. Y., Chan, T. W., & Wei, L. H. (2003). Embedding EduClick in Classroom to Enhance Interaction. In Proceedings of International Conference on Computers in Education (ICCE) pp.117-125.

    Liu, Y. (2003). Developing a scale to measure the interactivity of websites. Journal of Advertising Research vol.43 no.2 pp.207-218.

    MacGregor, J., Cooper, J.L., Smith, K.A. and Robinson, P. (2000), Strategies for energizing large classes: From small groups to learning communities, San Francisco: Jossey-Bass.

    MacManaway,M.A. (1968) "Using lecture scripts" Universities Quarterly vol.22 no.June pp.327-336

    MacManaway,M.A. (1970) "Teaching methods in HE -- innovation and research" Universities Quarterly vol.24 no.3 pp.321-329

    Marmolejo, E. K., Wilder, D. A., & Bradley, L. (2004). A preliminary analysis of the effects of response cards on student performance and participation in an upper division university course. Journal of Applied Behavior Analysis, 37, 405-410.

    Marton, F., and Säljö, R. (1976). On qualitative differences in learning: I — outcome and process. British Journal of Educational Psychology, 46,4-11.

    Matthews, R.S. (1996), ‘Collaborative Learning: creating knowledge with students’, in Menges, M., Weimer, M. and Associates. Teaching on solid ground, San Francisco: Jossey-Bass,

    Mayer, R. E., Heiser, J., & Lonn, S. (2001). Cognitive constraints on multimedia learning: When presenting more material results in less understanding. Journal of Educational Psychology, 93, 187-198.

    Mayes, T. (2001), ‘Learning technology and learning relationships’, in J. Stephenson (ed), Teaching and learning online, London: Kogan Page.

    Mazur, E. (1997). Peer Instruction: A User’s Manual. Upper Saddle River, NJ:Prentice-Hall.

    McBeath, R. J. ed. (1992) Instructing and Evaluating Higher Education: A Guidebook for Planning Learning Outcomes. New Jersey: ETP.

    McCabe et al. (2001) The Integration of Group Response Systems into Teaching, 5th International CAA Conference, http://www.lboro.ac.uk/service/fi/flicaa/conf2001/pdfs/d2.pdf

    N McDermott, L. Am. J. Phys. 61, 295-298 (1993).

    McDermott, L.C. (1984), ‘Research on conceptual understanding in mechanics’, Physics Today, 37 (7) 24-32.

    Meltzer,D.E. & Manivannan,K. (1996) "Promoting interactivity in physics lecture classes" The physics teacher vol.34 no.2 p.72-76

    Moreno, R., & Mayer, R. E. (2000). A coherence effect in multimedia learning: The case for minimizing irrelevant sounds in the design of multimedia instructional messages. Journal of Educational Psychology, 92, 117-125.

    Narayan, J. S., Heward, W. L., Gardner, R., Courson, F. H., & Omness, C. K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23, 483-490.

    Nicol, D. J. & Boyle, J. T. (2003) "Peer Instruction versus Class-wide Discussion in large classes: a comparison of two interaction methods in the wired classroom" Studies in Higher Education vol.28 no.4 pp.457-473

    Novak,G.M., Gavrin,A.D., Christian,W. & Patterson,E.T. (1999) Just-in-time teaching: Blending Active Learning and Web Technology (Upper Saddle River, NJ: Prentice-Hall)

    Novak,G.M., Gavrin,A.D., Christian,W. & Patterson,E.T. (1999) http://www.jitt.org/ Just in Time Teaching (visited 20 Feb 2005)

    Palincsar, A.S. (1998), "Social constructivist perspectives on teaching and learning", Annual Review of Psychology, 49, 345-375.

    Panetta, K.D., Dornbush, C. and Loomis, C. (2002), "A collaborative learning methodology for enhanced comprehension using TEAMThink" Journal of Engineering Education, 223-229.

    Paschal, C. B. (2002). Formative assessment in physiology teaching using a wireless classroom communication system. Advances in Physiology Education, 26(4), 299-308.

    Philipp, Sven and Schmidt, Hilary (2004) Optimizing learning and retention through interactive lecturing: Using the Audience Response System (ARS) at CUMC, http://library.cpmc.columbia.edu/cere/web/facultyDev/ARS_handout_2004_overview.pdf

    Pickford, R. and Clothier, H. (2003) "Ask the Audience: A simple teaching method to improve the learning experience in large lectures", Proceedings of the Teaching, Learning and Assessment in Databases conference, LTSN ICS.

    ** Poulis, J., Massen, C., Robens, E. and Gilbert, M. (1998). Physics lecturing with audience paced feedback. American Journal of Physics, 66, 439-441.

    N Raine, D. Physics World, December 2000 http://physicsweb.org/articles/world/13/12/2/1

    Reiter, S. N. (1994). Teaching dialogically: its relationship to critical thinking in college students. In P. R. Pintrich, D. R. Brown & C. E. Weinstein (eds). Student motivation, cognition and learning. Lawrence Erlbaum, New Jersey.

    Resnick, L.B. (1989). Knowing, learning and instruction: Essays in honour of Robert Glaser. Lawrence Erlbaum Associates, Hillsdale, New Jersey.

    Resnick,L.B. (1989) "Introduction" ch.1 pp.1-24 in L.B.Resnick (Ed.) Knowing, learning and instruction: Essays in honor of Robert Glaser (Hillsdale, NJ: Lawrence Erlbaum Associates).

    Roediger, H. L., III, & Guynn, M. J. (1996). Retrieval processes. In E. L. Bjork & R. A. Bjork (Eds.), Memory (pp. 197-236). San Diego, CA: Academic Press.

    Roschelle,J., Penuel,W.R. & Abrahamson,L. (2004) "Classroom Response and Communication Systems: Research Review and Theory" American Educational Research Association, San Diego, CA, April 2004 http://ubiqcomputing.org/CATAALYST_AERA_Proposal.pdf

    Roselli, R.J. and Brophy, S.P. (2002) "Exploring an electronic polling system for the assessment of student progress in two biomedical engineering courses " Proceedings of the American Society for Engineering Education (CD-ROM DEStech Publications) Session 2609 http://www.vanth.org/docs/008_2002.pdf

    Rothkopf, E. Z. (1965). Some theoretical and experimental approaches to problems in written instruction. In J. D. Krumboltz (Ed.). Learning and the education process (pp. 193-221). Chicago: Rand McNally.

    Rothkopf, E. Z. (1966). Learning from written instructive materials: An exploration of the control of inspection behavior by test-like events. American Educational Research Journal, 3, 241-249.

    Schwartz, D. L., & Heiser, J. (2006). Spatial representations and imagery in learning. In R. Keith Sawyer (Ed.) The Cambridge Handbook of the Learning Sciences. Cambridge, UK: Cambridge University Press.

    Shapiro, J. A. (1997). Student response found feasible in large science lecture hall. Journal of College Science Teaching, vol.26 no.6 pp.408-412. http://www.physics.rutgers.edu/~shapiro/SRS/instruct/index.html

    Sharma,M. (2002a). Interactive lecturing using a classroom communication system. Proceedings of the UniServe Science Workshop, April 2002, University of Sydney, 87-89. http://science.uniserve.edu.au/pubs/procs/wshop7/

    Slavin, R. E., Hurley, E. A., & Chamberlain, A. (2003). Cooperative learning and achievement: Theory and research. In Reynolds, William M. (Ed.); Miller, Gloria E. (Ed.). Handbook of psychology: Educational psychology, Vol. 7. (pp. 177-198). Hoboken, NJ: John Wiley & Sons, Inc.

    Smith, S. M. (1988). Environmental context-dependent memory. In G. M. Davies & D. M. Thomson (Eds.) Memory in Context: Context in Memory (pp. 13-34), Chichester, UK: Wiley.

    Smith, S. M., & Vela, E. (2001). Environmental context-dependent memory: A review and meta-analysis. Psychonomic Bulletin & Review, 8, 203-220.

    Smith, S. M., Glenberg, A., & Bjork, R. A. (1978). Environmental context and human memory. Memory and Cognition, 6, 342-353.

    Sokoloff, D. R. and Thornton, R. K. (1997). Using interactive lecture demonstrations to create an active learning environment. The Physics Teacher, 35, 340-347.

    Springer, L., Stanne, M.E., and Donovan, S. (1999), ‘Effects of small group learning on undergraduates in science, mathematics, engineering and technology: A meta-analysis’, Review of Educational Research, 69(1), 50-80.

    Stolovich, H. D., & Keeps, E. J. (2002). Telling Ain’t Training. Alexandria, VA: ASTD Press.

    Stuart,S.A.J., & Brown,M.I. (2003-4) "An electronically enhanced philosophical learning environment: Who wants to be good at logic?" Discourse: Learning and teaching in philosophical and religious studies vol.3 no.2 pp.142-153

    Stuart,S.A.J., & Brown,M.I. (2004) "An evaluation of learning resources in the teaching of formal philosophical methods" Association of Learning Technology Journal - Alt-J vol.11 no.3 pp.58-68

    Stuart,S.A.J., Brown,M.I. & Draper,S.W. (2004) "Using an electronic voting system in logic lectures: one practitioner's application" Journal of Computer Assisted Learning vol.20 pp.95-102

    Teaching, Learning, and Technology Center, University of California, 2001, Educational Technology Update: Audience Response Systems Improve Student Participation in Large Classes, http://www.uctltc.org/news/2004/03/ars.html

    Thalheimer, W. (2002). Simulation-like questions: How and why to write them. Available at www.work-learning.com/catalog .

    Thalheimer, W. (2003). Research that Supports Simulations and Simulation-Like Questions. Available at www.work-learning.com/catalog .

    Thalheimer, W. (2004). Bells, whistles, neon, and purple prose: When interesting words, sounds, and visuals hurt learning and performance - a review of the seductive-augmentation research. Retrieved January 2007 from http://www.work-learning.com/catalog/

    Thalheimer, W. (2006). Spacing learning over time: What the research says. Available at www.work-learning.com/catalog .

    Thalheimer, W. (forthcoming). Audience response learning: Using research-based questioning and discussion techniques to improve your classroom instruction.

    Thornton, R. K. and Sokoloff, D. R. (1998). Assessing student learning of Newton’s laws: The Force and Motion Conceptual Evaluation and the evaluation of active learning laboratory and lecture curricula. American Journal of Physics, 66, 338-352.

    Tobias, S. (1994). They’re Not Dumb, They’re Different: Stalking the Second Tier. Tucson, USA: Research Corporation, a Foundation for the Advancement of Science.

    Topping,K., The effectiveness of peer tutoring in further and higher education: A typology and review of the literature. Higher Education 32(3), 1996, 321-345.

    Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students' misconceptions in science. International Journal of Science Education, 10, 159-169.

    Tyson, L. M. and Bucat, R.B. (1995). Chemical equilibrium: Using a two-tier test to examine students' understanding. Western Australia Science Education Association Annual Conference.

    Uhari,M., Marjo Renko and Hannu Soini (2003) "Experiences of using an interactive audience response system in lectures" BMC [BioMed Central] Medical Education vol.3 article 12 http://www.biomedcentral.com/1472-6920/3/12#IDATWC2D

    Valacich, J. S., Dennis, A. R., & Nunamaker Jr., J. F. (1992). Group size and anonymity effects on computer-mediated idea generation. Small Group Research 23 (1) 49-73.

    Van Dijk, L. A., Van Den Berg, G. C. and Van Keulen, H. (2001). Interactive lectures in engineering education. European Journal of Engineering Education, 26, 15-28.

    Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.

    West, L.H.T. and Pines, A.L. (1985), Cognitive structure and conceptual change, New York: Academic Press.

    N Wieman, C., & Perkins, K. (2005). Transforming physics education. Physics Today, vol.58 no.11 pp.36-41. Also available (as of March 2007) at www.colorado.edu/physics/EducationIssues/papers/PhysicsTodayFinal.pdf

    Wiggins, P, undated, An evaluation of the potential role and effectiveness of the use of EVS in academic law classes, Accessed online 26th September 2007 at: http://perseus.herts.ac.uk/uhinfo/library/y37860_3.pdf

    Wiske, M. S. (1997). Teaching for Understanding: Linking Research with Practice. San Francisco: Jossey-Bass.

    Wit,E. (2003) "Who wants to be... The use of a Personal Response System in Statistics Teaching" MSOR Connections Volume 3, Number 2: May 2003, p.5-11 (publisher: LTSN Maths, Stats & OR Network)

    Wolfman,S., Making lemonade: exploring the bright side of large lecture classes, Proc. SIGCSE '02, Covington, Kentucky, 2002, 257-261.

    Woods, H. A., & Chui, C. (2003). Wireless response technology in college classrooms. Downloaded February 2007 from www.mhhe.com/cps/docs/CPSWP_WoodsChiu.pdf .


    Redirect to:

  • http://www.gla.ac.uk/services/learningteaching/learningtools/classroomvotingsystem/

    Last changed 20 Feb 2002 ............... Length about 900 words (6000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/bspecial.html.

    Week 17 plan

    Monday 18 Feb

    1. 8:50am Marek,Anton arrive at Chemistry Stores; carry stuff up to ChemMain LT and set up. (MargaretM will get students to distribute the handsets. Steve will be there the first time to introduce students to them.)
    2. 10am Marek,Anton disassemble and return to Chemistry Stores.
    3. 4:55pm Chris,Julie Anne Get stuff from BO janitors, set up Margaret Martin in BO-LT-1.
    4. 6pm: Chris,Julie Anne return stuff to janitors.

    Tuesday 19 Feb

    1. 8:50am Chris,Anton arrive at Chemistry Stores; carry stuff up to ChemMain LT and set up. (MargaretM will get students to distribute the handsets.)
    2. 10am Chris,Anton disassemble and return to Chemistry Stores.
    3. 1pm T3 (Chris??) set up Steve in BO-LT-B (30 students): Handsets, cables, receivers from BO janitor lot. But also for this: laptop from Margaret/Julie office.
    4. 3pm Disassemble.
    5. 4:55pm Marek,Julie Anne Get stuff from BO janitors, set up Margaret Martin in BO-LT-1.
    6. 6pm: Marek,Julie Anne return stuff to janitors.

    Wednesday 20 Feb

    1. Chris,Miguel,Anton arrive at Chemistry Stores; carry stuff up to ChemMain LT and set up. (MargaretM will get students to distribute the handsets.)
    2. 10am Chris,Anton,Miguel disassemble and move it all to WLT janitors (trolley: either borrow from Chemistry and return; or from DCS and return.)
    3. 3pm WLT Chris fetch laptop; Chris,Anton setup from stuff already with WLT janitors; Chris stay to 5pm to operate PRS for Marjorie.
    4. 4:55pm Marek,Julie Anne,Jennifer Get stuff from BO janitors, set up Margaret Martin in BO-LT-1.
    5. 5pm: at WLT: Chris,Anton,Miguel return stuff to Julie,MargaretB office in Lilybank. (trolley?)
    6. 6pm: at BO: Marek,Julie Anne,Jennifer return stuff to Julie,MargaretB office. (trolley?)

    Thursday 21 Feb

    1. MargaretB picks out and packs PRS stuff for Leeds trip. Problem: finding a laptop to take.
    2. Sometime, T5 move stuff to WMB janitors (I need to get permission for this) (Trolley?) 200 handsets, Laptop.

    Friday 22 Feb

    1. 8:50am Marek,Julie Anne,Jennifer arrive at WMB janitors, set up
    2. 10am Marek,Julie Anne,Jennifer disassemble, return to janitors
    3. 11:45 Chris,Julie Anne,Jennifer collect from WMB janitors, move and setup in Graham Kerr.
    4. 1pm Chris,Julie Anne,Jennifer disassemble and return to Lilybank.

    AND before Fri 22, T5 needs to do a rehearsal in both WMB and Graham Kerr.


    Lecturers who have used the handsets Oct 2001 - March 2003
    (Only one use per person shown)

    Lecturer(s)                                Department           Level  Target class size  Lectures x repeats
    Diane Addie                                Veterinary Medicine  4      100                1
    Marjorie Allison                           Medicine             3      250                3
    Barbara Cogdell, Rob Smith                 IBLS                 2      300                1 x 2
    Quintin Cutts                              Computing Science    1      450                20 x 2
    Steve Draper                               Psychology           4      40                 3
    Jason Leitch                               Dental School        CPD    18                 1
    Sarah MacKay                               IBLS                 2      150                1
    Colin Martin                               Medicine             4      250                1
    Margaret Martin                            Psychology           1      500                3 x 2
    Paddy O'Donnell                            Psychology           3      100                5
    Susan Stuart                               Philosophy           2      100                9
    Ernst Wit, John McColl, Nicole Augustin,   Statistics           1/2    200                9
      Nial Friel


    Handset uses between Oct.2001 and March 2003
    (* = some evaluation carried out)

    Department              Class                     Target number in class  Lectures x repeats
    * Computing Science     Level 1 2001-02           450                     20 x 2
      Computing Science     Level 1 2002-03           300                     20 x 2
      Computing Science     Level 4                   70                      1
    * Psychology            Level 4 Education         40                      3
    * Psychology            Level 4 HCI               30                      8
    * Psychology            Level 1                   500                     3 x 2
    * Philosophy            Level 2 Logic             100                     9
    * Philosophy            Level 1 Mind & Body       260                     1
    * Medicine              Level 3                   250                     3
    * IBLS (Biology)        Level 2                   300                     1 x 2
    * IBLS (Biology)        Level 2                   150                     1
    * Veterinary Medicine   Level 4                   100                     1
    * Dental School         GPs (short course)        18                      1
      Medicine              Level 4                   250                     1
    * Statistics            Level 1/Level 2           200                     9


    Last changed 13 July 2005 ............... Length about 300 words (2,000 bytes).
    (Document started on 21 Mar 2003.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/gregor.html. You may copy it. How to refer to it.

    Gregor Kennedy

    Gregor Kennedy has been investigating students' learning processes and outcomes in technology-enhanced learning environments for a number of years. From July 2002 to March 2003 he was a visiting fellow in the Department of Psychology at the University of Glasgow, and again for a week in July 2005 visiting the Department of Computing Science. He has worked with Steve Draper and Quintin Cutts on the use of handsets in lectures. His shared interest is in developing ideas about how we can evaluate students' learning processes in large group lectures that use electronic handsets. His personal foci are students' engagement, interest and motivation in handset-based classes and whether this has any bearing on the way they think about course material; and the use of logs (computer records) as a tool of educational evaluation: hence an interest in whether meaningful and useful patterns of usage emerge from the vast amount of data that the PRS system generates.


    Last changed 5 Aug 2001 ............... Length about 900 words (6000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/hake.html.

    Interactive teaching

    (written by Steve Draper,   as part of the Interactive Lectures website)

    These are some notes following on from Hake's papers on improving learning by interactive lectures.

  • These interactive methods all work. Other methods that merely rely on great teachers, carefully performed lectures and demonstrations etc. have failed, relatively speaking.

  • Electronic gadgets like the handsets may support some of these methods. They are NOT the most important element, but they do seem to add value. Still, the learning depends sensitively not on the use of the technology but on whether crucial pedagogic elements are present: e.g. whether the learners really try to explain and justify their views to someone.


    Last changed 9 June 2002 ............... Length about 4,300 words (30,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/handsets.html. You may copy it. How to refer to it.

    Electronically enhanced classroom interaction

    (written by Steve Draper,   as part of the Interactive Lectures website)

    Contents (click to jump to a section)

    By Stephen W. Draper, Department of Psychology,
    Julie Cargill, and Quintin Cutts, Department of Computing Science,
    University of Glasgow.

    This is a version of a paper given at ASCILITE2001; and published as:
    Draper,S.W., Cargill,J., & Cutts,Q. (2002) "Electronically enhanced classroom interaction" Australian journal of educational technology vol.18 no.1 pp.13-23.

    Abstract

    A design rationale for introducing electronic equipment for student interaction in lecture theatres is presented, linking the instructional design to theory. The effectiveness of the equipment for learning depends mostly on what pedagogic method is employed: various alternative types are introduced. Prospective studies are outlined for exploring its use over new ranges of application. Rival views of the concept of interactivity are one way to organise the evaluation of this learning technology.

    Introduction: the design

    This paper describes the design rationale for introducing electronic equipment for student interaction in lecture theatres, and the studies now in prospect of the use of this equipment.

    The equipment is essentially that of the TV show "Who wants to be a millionaire?": every member of the audience (i.e. each learner in a lecture theatre) has a handset similar to that of a TV remote control, the presenter displays a multiple choice question (MCQ), each learner transmits the digit corresponding to their chosen answer by infrared, a small PC (e.g. a laptop) accumulates the answers, and it displays, via the room's projection system, a bar chart representing the distribution (totals) of the responses to audience and presenter alike.

    This may be called (following Michael McCabe) a "Group Response" (GR) system. Its essential feature is that, regardless of group size, both audience and presenter get to know the distribution of responses (alternatives chosen), and how their own personal response relates to that distribution, but without knowing who chose what. This means everyone contributes, and the representativeness of each response is also exactly known. On the other hand, the privacy of the choice means that, unlike in face to face groups, each individual can express the choice they incline to, rather than only a choice they feel able to explain and justify to others. These are quite often different both in science learning and in social processes.
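
    To make the data flow concrete, here is a minimal sketch in Python of the tallying step. It is purely illustrative, not the PRS software itself, and the function names are invented:

        from collections import Counter

        def tally(responses):
            """Count anonymous single-digit answers; who pressed what is never stored."""
            return Counter(responses)

        def show_bar_chart(counts, options="12345"):
            """Print one bar per answer option, scaled by its share of the responses."""
            total = sum(counts.values()) or 1
            for option in options:
                n = counts.get(option, 0)
                pct = 100 * n // total
                bar = "#" * (pct // 2)
                print(f"Option {option}: {bar:<50} {n:3d} ({pct}%)")

        # Example: seven learners answer a five-option MCQ.
        show_bar_chart(tally(["2", "2", "3", "2", "1", "2", "3"]))

    The point is that only the distribution survives: individual identities are discarded at the moment of counting, which is what produces the anonymity described above.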

    The main pedagogic categories of use of the equipment are:

    1. Assessment, both formative and as practice for summative assessment. Here the MCQs are meant to test content knowledge, and perhaps are drawn from a bank used for formal assessment on the course. The advantages of the equipment here are that "marking" is fully automatic, each learner can know immediately if they gave the right or wrong answer, how their performance on the question compares to the group as a whole, tailored explanations may be given by the presenter, and the presenter equally sees immediately how well the class measures up on that question (feedback from learners to teacher). The feedback cycle here takes about two minutes per item (somewhat longer if explanations are given). Any kind of MCQ may be used, provided the response is a single selection from a small fixed set: whether the usual rather shallow item, or one designed to probe understanding more than information retention (possibly by prior use of phenomenography (Marton, 1981; Marton & Booth, 1997) to map the common misconceptions).
    2. Formative feedback on learning within a class (i.e. within a contact period). Similar items might be used, but in order to discover and demonstrate what points should be focussed on during the class. Thus one or several such question items at the start of a class could be used to select a topic for detailed coverage, while the same or similar items at the end could demonstrate to what degree the group now understood the topic.
    3. Formative feedback to the teacher on the teaching i.e. "course feedback". While the standard questionnaire at the end of a term, semester, or course has in general only a small effect on changing anything (Cohen, 1980) and takes a year to do so, a quick on the spot anonymous poll half way through a class (e.g. on whether the pace is too fast or too slow, the jokes too numerous or infrequent, the examples too many or few) can be used to change things immediately. Making adjustments to the teaching every 30 minutes, instead of only once a year, and furthermore making them for the particular group that gave the feedback, is much more likely to be effective than the usual practice.
      Even better on the spot evaluation might be done by asking students what the best and worst issues are in the teaching at present. Assuming that even a handful are willing to mention an issue to the teacher's face, these can then be put as questions to the class, and an accurate secret ballot taken on the breadth of support for each one. This cycle of an open-ended evaluation probe, followed by systematic (and quantitative) measures of the issues thus identified, is the best evaluation practice: much better than using standard course questionnaires for all classes, learners, teachers, and contexts. Normally it would take days or weeks: but the whole 2-phase cycle could be done within 10 minutes.
    4. Peer assessment could be done on the spot, saving the teacher administrative time and giving the learner much more rapid, though public, feedback. For example if each student has to give a verbal presentation and this is peer assessed, then at the end of their talk the teacher can display (say) each of 10 criteria in turn, and get the other students to enter their mark for this anonymously but on the spot, with the totals displayed.
    5. Community mutual awareness building. At the start of any group e.g. a research symposium or the first meeting of a new class, the equipment gives a convenient way to create some mutual awareness of the group as a whole by displaying personal questions and having the distribution of responses displayed. For example, at a research meeting start by asking people's ages (which illustrates the advantage of anonymity), and the kind of department or institution they come from, and some alternative reasons for attending. At the start of a class, I might ask whether each student is straight from school or not, their gender, which faculty they belong to, whether they signed up for the course because it is their main interest, a side interest, or are just making up the number of courses they do.
    6. Experiments using human responses: for topics that concern human responses, a very considerable range of experiments can be directly demonstrated using the audience as participants. For instance visual illusions may be displayed and the equipment used to show what degree of uniformity of response is found. Priming effects can be shown, where the perception of an ambiguous word or display is affected by what was shown before. The performance of witnesses to a crime (including the effects of some well known biasses) can be explored by showing a short film, followed by various questions about what was shown. Social psychology effects, e.g. on conformity, could be demonstrated if responses to early questions were faked to see whether the class then changed their responses to later questions. In general, experiments that rely only on a stimulus and a forced choice response, but not on accurate measurements of reaction times, can usually be demonstrated in this way. Thus for the particular case of psychology, but also for parts of physiology, medicine, economics, and so on, direct demonstrations of relevant effects can be mounted.
    7. Possibly the most productive application, however, and the one with the largest body of existing research, is in using the equipment to initiate a discussion. Here, a carefully chosen MCQ is displayed and the learners register an answer, thus privately committing to a definite opinion. The presenter then, however, does not indicate the "right" answer but directs the class to discuss their answers with each other. Having to produce explanations and reasons is powerfully "mathemagenic" (conducive to learning), which of course is why researchers learn so much from giving talks and writing papers, and why teachers make their students write essays and answer questions. The equipment can be a significant help in introducing this, even into large classes.
      This method of teaching by questions has been widely used and researched, although mostly without electronic aids (Hake, 1998a, 1998b).

    Justification or design rationale

    Although techno-enthusiasts, and indeed many government agencies or departments, have been pushing the use of computers and other technologies in education, and there are now many people whose job is essentially this and who are therefore necessarily aligned with this indiscriminately positive attitude, there is still very little good evidence of benefits. Perhaps this is not surprising: Landauer (1995) found it very hard to discover evidence of economic benefits for using computer technology in general. Besides suggesting that developing evaluation methods powerful enough to test this may be a more important, if more difficult, research task than generating yet another application of technology to learning, this does mean that each application should be carefully justified. In a review of a number of applications (Draper 1998), I argued that most applications showed no significant improvements over what they replaced, but that the few striking positive exceptions were characterised by "niche-based design": by a good fit between a particular learning situation and a specific technical solution. They were projects that had been inspired by identifying a specific weakness in current delivery, and had focussed technology on solving that problem rather than on replacing what had been adequately done before. Can the use of the classroom equipment described above meet the implied standard of justification?

    In considering large classes in large lecture theatres, the main problem is usually analysed as to do with the lack of interaction and the consequent extreme passivity imposed on the audience. In terms of Laurillard's model of the learning and teaching process (Laurillard, 1993, p.103), this situation fails to support the iterative interaction between learner and teacher that is one of her underlying principles, and more specifically does not support even activity 2: the "re-expression" by the learner of what the teacher has expressed. (This can be seen as corresponding to the constructivist requirement that learners acquire knowledge by rebuilding it on their own personal, mental foundations. Redescribing it in their own terms is an activity that powerfully promotes this.) Actually, with highly skilled learners and a teacher reasonably in tune with the group, this can nevertheless take place: for instance, where the learners take notes that are not mere dictation, but substantial re-formulations of what is being talked about. (This is a reasonable theoretical analysis of the considerable benefits I have often obtained from listening to talks at conferences where I have not asked questions, but have nevertheless learned something useful.) However this degree of skilled, silent interaction is not often present in undergraduate teaching, and large numbers usually prevent learners asking sufficient questions to repair the attunement between speaker and audience, from both a pragmatic (there isn't time for many people to ask questions) and a social (it just feels too embarrassing) viewpoint.

    That, then, is the diagnosis offered here of the chief weakness of lecturing to large groups. The handsets and associated equipment offer a way of tackling that weakness by (a) allowing each learner independently to generate an answer (at least a partial instantiation of activity 2), whereas otherwise only the handful who put their hands up really do this; and (b) to register that answer and so maintain the motivation for doing it; and in so doing (c) to affect the course of what happens next. This contingency (dependence of the teacher's behaviour on what the learners do) is true interactivity: one of the underlying principles of Laurillard's model, represented there by the to and fro repetition of activities between learner and teacher. The summed responses are real feedback to the teacher, that naturally leads to adjustments and reattunement if required, and in fact do this better than questions and answers from any subset of individuals. Furthermore the equipment offers an anonymity of response that addresses the shyness that additionally inhibits any interaction.

    As mentioned in passing, there are some other reasons for expecting benefits with the types of pedagogic use other than initiating discussions. Formative, summative, and peer assessment could be made more convenient and quicker (and so more affordable for both learners and teachers in terms of time). Starting to build a sense of a learning community could get off to a quicker start, especially in large groups. Demonstrating experimental effects instantly connects the abstract overview given to a personal perception and experience of it: something very helpful to learning both for retention, comprehension, and for a fuller content of learning. The biggest learning gains, however, are likely to come from the much better and quicker feedback from learners to teachers, allowing better attunement of the delivery; and from the method of teaching by questions i.e. of discussions in class (whether in small groups, plenaries, or a combination) initiated by well designed questions and by getting each individual to start by committing to an initial position.

    Is the equipment really likely to be any better than the alternatives? The simplest alternative is getting students to give a show of hands. This equipment crucially offers more privacy (it's a secret ballot, and important for just the same reasons). Other rival technologies are to issue each student with a cardboard or plastic cube with a different colour on each face, to be turned to show their "vote"; or with a large sheet of paper divided into a few squares each with a digit in, that the student can hold up in front of their bodies and point to the digit they select. These methods allow only near neighbours to see a student's selection. Thus the electronic equipment offers somewhat better privacy, but the difference may only be crucial with new classes: it is quite possible that with a class grown comfortable with the electronic version, moving over to a cheaper but less private version might not destroy the interactivity. The electronic version also provides faster and more accurate counting of the results: most presenters will only estimate shows of hands to about the nearest 20%, unless they have the patience to pursue and count exactly even with large groups. The accuracy may have a small but not negligible value in making all participants feel their views count, and are not just lost in crudely approximate estimates.

    In scrutinising this instructional design rationale, note that it does not feature computers in a starring role (although actually one is crucial to tabulate the results): the instructional design mostly isn't in the equipment or software, but in how each teacher uses it. That is a lesson which perhaps the rest of the learning technology field should take more to heart if the aim is in fact to improve learning rather than to promote the glamour of machines. On the other hand, note too that this design does not fit with a simplistic interpretation of the slogan "learner-centered". Improved learning and the learners are the ultimate intended beneficiaries, but one of the important ways that end is achieved is by first serving the teachers better, by giving them much better, faster, and more detailed information on what the learners are thinking now, and where their problems are at each point.

    Prospective explorations

    There is a considerable history and community of practice in using such equipment in the specific area of promoting discussion (the last of the pedagogic uses listed above) and so improving student understanding in science and engineering at the school and early university levels (e.g. Hake 1998a, 1998b). The authors have obtained sufficient equipment for several lecture theatres, and are about to begin exploratory studies, particularly with a view to exploring the range of applications, and how far its utility can be demonstrated beyond its best established application area. We hope to trial its use in all of the pedagogic modes listed in the first section, in two universities (Glasgow and Strathclyde), in at least two disciplines (psychology and computer science) in both universities together with several others as opportunities arise, at various levels (years) in undergraduate programmes, and in a range of group sizes from 300 students downwards. (The biggest need and the biggest potential gains are in the largest group sizes, but innovation is of course a lot "safer", i.e. easier to manage, in smaller groups.)

    The exploratory studies should yield practical knowledge such as question banks for the participating disciplines, and how much support is needed for first time use (a new lecturer and students who haven't used the equipment) and for regular use. They will also yield evaluation results on what benefits can be demonstrated. We hope to use a version of the method of Integrative Evaluation (Draper et al. 1996) to address both these aspects.

    Interactivity

    According to Jim Boyle (personal communication), students are generally, although not universally, enthusiastic about this approach, even over long periods (e.g. regular use throughout a year). When asked if they regard the interactive equipment as an advantage or not, classes typically show a spread of opinion such as 70% for it, 20% indifferent, 10% definitely opposed to it. Investigating more deeply than general student preferences will require more, and more sophisticated, measures.

    Some of the most important evaluation issues can be organised around the notion of interactivity. Some researchers tend to an almost mechanical interpretation of interactivity e.g. counting the number and branching ratio of choice paths for users in multimedia learning software (Sims, 1997; Hoyet, 2000). With this equipment, that corresponds to the number of questions put to the learners for them to respond to, regardless of their content. It also corresponds to the effects we may well see of novelty, of the perception that the teachers are taking special trouble over the teaching (the Hawthorne effect; Mayo, 1933), or simply of physiological arousal (the physical activity involved in pressing buttons i.e. mechanical interactivity) which has led to the heuristic rule of not lecturing for more than 20 minutes without a pause, having the audience move around periodically, etc. On the other hand, if we believe in the Laurillard model, then the important factor would probably be the amount of time each learner spends on activity 2 ("re-expression"): so using the handsets should be better than a non-interactive monologue, but not as good as time spent in peer discussion (open-ended verbal responses rather than selecting one of the digits on the handsets). In other words, the measure of it would be the number of mental and verbal responses a learner makes (in discussion) rather than the number of button presses on the handset. On the other hand again, if what is important about "interactivity" is actually changing what happens by visibly affecting the teacher (i.e. genuine human-human interaction with the actions of one party being contingent on those of the other), then it will be changes to what the session is used for as a result of responses to questions near the start that predict the largest learning gains. Varying approaches in classes, and taking independent measures both of learning and of enjoyment or alertness should eventually allow such questions to be decided. Measures taken over time (e.g. weeks) should allow any halo and Hawthorne effects to be independently identified, if they are present, with enthusiasm decaying as the novelty wears off, or performance being independent of the learning activity tried and only dependent on the perceived interest of the researchers.
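
    The crudest of these measures, the mechanical count of interactions, is trivial to compute from a session log. Here is a minimal sketch in Python, assuming a hypothetical log format of (handset ID, question number, answer) triples; neither the format nor the function name comes from the actual equipment:

        # Hypothetical session log: (handset_id, question_no, answer) triples.
        log = [(17, 1, "3"), (42, 1, "2"), (17, 2, "1"), (42, 2, "2"), (99, 2, "4")]

        def mean_presses_per_handset(log):
            """Mechanical interactivity: mean number of button presses per handset."""
            presses = {}
            for handset, _question, _answer in log:
                presses[handset] = presses.get(handset, 0) + 1
            return sum(presses.values()) / len(presses)

        print(mean_presses_per_handset(log))  # 5 presses over 3 handsets, about 1.67

    On the Laurillard view sketched above, the more telling measure would instead be the amount of discussion each question provokes, which no button log captures directly.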

    Other technical details

    There are some further detailed issues that arise, and could be investigated. The particular equipment used transmits not only a digit to signal the learner's selected response to the question, but also a confidence level (high, medium, or low), and an ID for that handset which may or may not have been arranged with a known mapping to the student's identity. Furthermore the number of attempts each learner makes at the question before the cutoff time may be recorded. The GRUMPS (2001) project is interested in exploring data mining of records of such student interactions, though that involves negotiating issues of privacy and data protection with the students. We are writing software to smooth the integration of the equipment with other lecture facilities (e.g. the use of powerpoint presentations), and with keeping records of the interactions.
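
    As a sketch of what one logged response might contain, and of extracting each learner's final answer before the cutoff: the field names below are invented for illustration, since the actual PRS file format is not described in this paper.

        from dataclasses import dataclass

        @dataclass
        class Response:
            handset_id: str   # may or may not be mapped to a student identity
            answer: str       # the digit selected
            confidence: str   # "high", "medium", or "low"
            timestamp: float  # seconds since the question was opened

        def final_answers(responses, cutoff):
            """Keep each handset's last attempt made at or before the cutoff time."""
            latest = {}
            for r in sorted(responses, key=lambda r: r.timestamp):
                if r.timestamp <= cutoff:
                    latest[r.handset_id] = r
            return latest

    Retaining the earlier attempts as well, rather than only the final answer, is what makes data mining of the kind GRUMPS envisages possible.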

    There seems to have been a variety of particular equipment used in the past, and more than one type is currently available. For instance a one-button system has been used (Massen et al., 1998), though that required each response option for a question to be attended to separately. Various numbers of buttons are offered in other equipment, and sometimes the ability to enter multi-digit responses and transmit them as one number. Wired, radio, and infrared implementations have been used. Currently infrared proves cheapest. Already technically feasible, though not yet financially attractive, is the solution of equipping every student with a radio-linked PDA (e.g. palmtop computer). Functionally, the features that can matter to further pedagogical tactics include: entering multidigit numbers (e.g. to identify the student), entering a sequence of digits to specify a sequence or set of response options rather than exactly one as an answer, and free text entry. When the latter becomes widely available, we can at last address a fundamental problem of discussion groups (such as research seminars) where many people want to ask a question: which is the best question to take for the group as a whole? Using only voice, we cannot know what the set of candidate questions is without having them asked. With textual group responses, everyone's questions could appear in front of the speaker and/or facilitator, and could then be grouped, sequenced, and sorted by priority. Meanwhile, as the technology (especially radio communication techniques) advances rapidly, we can focus on how we would use additional functions, and what their pedagogic rationale is.
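
    One conceivable way to handle that prioritisation is sketched below, purely as an illustration; it adds the assumption, which the paper does not make, that learners can endorse one another's submitted questions:

        def prioritise(submitted):
            """Order free-text questions so the most widely shared concern comes first."""
            return sorted(submitted, key=lambda item: item[1], reverse=True)

        # Each pair is (question text, number of endorsements) -- an invented format.
        submitted = [("What exactly counts as 're-expression'?", 12),
                     ("Will this topic be examined?", 30)]
        for text, votes in prioritise(submitted):
            print(f"{votes:3d}  {text}")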

    Conclusion

    The studies in prospect with this equipment should eventually allow us to pronounce on the validity of the design rationales presented in this paper. These studies will use measures of learning outcome, attitudes, and engagement as dependent (i.e. output) variables. They may range over, as independent (i.e. input) variables, two or more universities, three or more levels of university class and so student experience, two or more academic subjects, class sizes up to 300, and all the pedagogic strategy types described above.

    Acknowledgements

    This work is being supported in part by the EPSRC funded grant to GRUMPS (GR/N38114), and also in large part by the University of Glasgow, both directly and through the TLC project. Many thanks to Prof. Jim Boyle of Strathclyde University, whose rich existing experience with this equipment gives us confidence in proceeding, and many practical tips. We look forward to collaboration with him, and with David Nicol who has done a pioneering evaluation of Jim's work. Thanks also to Michael McCabe, Mae McSporran, and the anonymous referees for points that have been incorporated into this version.

    References

    Cohen,P.A. (1980) "Effectiveness of student-rating feedback for improving college instruction: a meta-analysis" Research in Higher Education vol.13 pp.321-341

    Draper, S.W. (1998) "Niche-based success in CAL" Computers and Education vol.30, pp.5-8

    Draper,S.W., Brown, M.I., Henderson,F.P. & McAteer,E. (1996) "Integrative evaluation: an emerging role for classroom studies of CAL" Computers and Education vol.26 no.1-3, pp.17-32

    GRUMPS (2001, May 30). The GRUMPS research project [WWW document]. URL http://grumps.dcs.gla.ac.uk/ (visited 2001 June 1)

    Hake,R.R. (1998a) "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses" Am.J.Physics vol.66 no.1 pp.64-74

    Hake,R.R. (1998b) "Interactive-engagement methods in introductory mechanics courses" submitted to J.of Physics Education Research

    Hoyet,H. (2000) "Graphing Interactivity in technology-based training" TechTrends vol.44 no.5 pp.26-31.

    Landauer,T.K. (1995) The trouble with computers: Usefulness, usability, and productivity (MIT press; Cambridge, MA)

    Laurillard, D. (1993) Rethinking university teaching: A framework for the effective use of educational technology (Routledge: London) p.103.

    Marton,F. (1981) "Phenomenography -- describing conceptions of the world around us" Instructional Science vol.10 pp.177-200.

    Marton,F. & Booth,S. (1997) Learning and awareness (Mahwah, New Jersey: Lawrence Erlbaum Associates)

    Massen, C., Poulis, J., Robens, E. & Gilbert, M. (1998) "Physics lecturing with audience paced feedback" American Journal of Physics vol.66 pp.439-441.

    Mayo,E. (1933) The human problems of an industrial civilization (New York: MacMillan) ch.3.

    Sims, R. (1997) "Interactivity: A Forgotten Art?" Computers in Human Behavior vol.13 no.2 pp.157-180.


    Last changed 27 Mar 2002 ............... Length about 2,000 words (18,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/interim.html. You may copy it. How to refer to it.

    Use of the PRS (Personal Response System) handsets at Glasgow University

    INTERIM EVALUATION REPORT: MARCH 2002

    by
    Stephen W. Draper and Margaret I. Brown
    Department of Psychology
    University of Glasgow

    Contents (click to jump to a section)

    Overview

    In the current academic year (Oct 2001 - March 2002), interactive handsets have been trialled at the University of Glasgow by lecturers in Philosophy, Psychology, Computing Science, IBLS, Medicine, the Vet School and the Dental School (with GPs), with audience sizes from 20 to 300, and with students in level 1 to level 4. Handsets have been used in lectures and formative assessment sessions. They have been well received by students in all but one case, as judged by responses to our key evaluation question about whether, in each student's view, there was a net gain in benefits over disadvantages. The lecturers who used them have also been asked about their views, and again in all but one (different) case, felt the benefits outweighed the difficulties. Our evaluations, conducted by Margaret Brown, have also amassed a list of benefits and of disadvantages mainly from the student view, which we will be writing up soon.

    As we begin the task of writing up the evaluation studies (March 2002), our initial impression is thus that the handsets do indeed support learning gains in the ways discussed in Draper et al. (2001), but that benefits depend, not directly on the mere technology, but on how well the particular way they are used on each occasion elicits thought and reflection in the learners.

    As only limited time was available [2 contracts (GRUMPS; TLC), each of 10%, but since 12/3/02 only one contract (GRUMPS) of 10%] the main focus of the evaluation so far has therefore been on observation and collection of data, and not on the analysis of data and the production of written reports. Immediate feedback (including comments from the students) after a session involving handset use was given to the lecturers either verbally or in a written report.

    Collation and analysis of all the data and information we have collected, together with further evaluation is necessary in order to ensure the effective use of the PRS handsets in Glasgow University. The amount of work which can be carried out will depend on the funding available.

    Methods used in the evaluation

    The following methods have been employed in the evaluation to date.

    After each session or series of sessions we now ask students to answer the following core question using their handsets. In addition they are sometimes asked to give written comments, and sometimes to answer further questions via the handsets.

    What was, for you, the balance of benefit vs. disadvantage from the use of the handsets in your lectures?

    1. Definitely benefited
    2. Benefits outweigh any disadvantages
    3. Neutral
    4. More disadvantage than benefit
    5. Definite negative net value
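
    Summarising the handset votes on this question is a simple aggregation, sketched below in Python; the function name and grouping are invented for illustration and are not part of the PRS software:

        from collections import Counter

        def net_benefit_summary(votes):
            """Group 5-point votes: 1-2 = net benefit, 3 = neutral, 4-5 = net disadvantage."""
            counts = Counter(votes)
            total = sum(counts.values())
            return {"benefit %": 100 * (counts["1"] + counts["2"]) / total,
                    "neutral %": 100 * counts["3"] / total,
                    "disadvantage %": 100 * (counts["4"] + counts["5"]) / total}

        print(net_benefit_summary(["1", "2", "2", "3", "2", "4", "1", "2"]))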

    The main problem identified by students is that setting up the equipment sometimes takes too much time in lecture theatres other than the Boyd Orr, where the receivers are permanently installed. The setting-up time has been reduced as we gain more experience. The single most frequent type of setup problem has been with the data projectors. Factors like these can affect the views of students and lecturers.

    Further work has now to be done to look more closely at the information we have collected: voting figures from the lecturers' questions and the evaluation questions; comments from students and lecturers etc. In addition it should be possible to look at the data in the PRS files from questions asked in a specific lecture and identify if it is the same students that are experiencing problems with every question.
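
    That last check is straightforward once the PRS files are parsed. The sketch below assumes invented formats (a list of (handset, question, answer) records plus an answer key), since the real file layout is not described in this report:

        from collections import Counter

        def struggling_handsets(log, correct, threshold=0.8):
            """Return handset IDs whose proportion of wrong answers reaches `threshold`.

            `log` holds (handset_id, question_no, answer) tuples; `correct` maps
            question_no to the right answer. Both formats are invented for this sketch.
            """
            wrong, seen = Counter(), Counter()
            for handset, question, answer in log:
                seen[handset] += 1
                if answer != correct.get(question):
                    wrong[handset] += 1
            return [h for h in seen if wrong[h] / seen[h] >= threshold]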

    Feedback from six lecturers who had used the PRS handsets in their lectures

    1. The essential feature of the use of this equipment is that both students and lecturer get to know the distribution of responses and, in confidence, how their own response relates to that. The element of anonymity encourages everyone to contribute and, unlike in face-to-face groups, each individual can express the choice they incline to, rather than the choice they would feel able to explain and justify to others.

      I have been using this equipment in an Introductory Logic course with a class of about one hundred students, and intend to use it in the forthcoming term with an Introductory Philosophy of Mind course. There have been two noticeable results so far. The first is that, if the students are to answer the questions in a way that will be helpful to them, they have to reflect more on what they have learnt and how they are learning. The second is that my teaching is being directed more by what the students need, or at least say they need, rather than by what I think they need. This means that I am not second-guessing or making unwarranted assumptions about their progress.

    2. I found the handsets very beneficial in my lecture, and speaking with some students afterwards, they also appreciated them. In the 3rd year I have asked questions by way of a written test, which the students hand in to me at the end. They mark it during the lecture, so they get to see where they have gone wrong, but I don't until later - so I can't modify the lecture instantly, only for the next year. With the handsets I could see exactly which points I had not conveyed clearly and could rectify them straight away, the major example being when I asked the students what I thought was a simple question - identifying the FCoV carrier cat! Although most (68%) got it right, an astonishing number chose one of the other cats. I could see that they hadn't fully understood that many antibody-positive cats are not infected. It was great, because the students who got the wrong answer are very likely the same ones who never utter a word in interactive lectures, and it gave them a chance to participate anonymously.

      I wish I could use handsets at all my lectures - is that ever a possibility?

    3. The feeling was that the idea worked well, but that the time it took with a large group was too long. This meant that students lost the thread. The group that we had were generally very good, polite and responsive, and some of those that we have lectured to in previous years might have been more difficult to keep in order.

      Our general conclusion was that the system would work well for groups of up to about 50 in number, but for a group of this size [200] a setup with buttons that responded instantaneously would be required. When equipment of this type starts to be installed as a fixture in lecture rooms, we will use it. We will think about using it for some update courses that we give which have about 50 participants.

    4. I think (and the results also showed this) that the students liked both the experience and the fact that they could test their understanding of the topics as they went along.

      The results of their tests gave me some idea of how they had understood the concepts, and if it had been obvious that they were not following what was going on it would have allowed me to reprise the previous section (as it was I didn't have to do this). It also gave me some information that will help me to plump up the slides on the web to include extra, helpful information.

      As far as the technical side is concerned, I found it extremely easy to use, especially the PRS interface with Powerpoint. The Chemistry Lecture Theatre is certainly not ideal for testing new technology but I think the system stood up to it very well. Even when we were delayed in getting the equipment to the hall, the set up time did not encroach too much on the lecture.

      In total I think it was a worthwhile experience, both for me and for the students. I would recommend it to others, and I would use it again.

    5. I used the handsets in a level 4 option class in social psychology. The class size is about 50. The use of the sets is easy. A slight problem is the time it takes to register the students' answers, and you have to time this into the lecture. The students, on the other hand, do not mind the delay. The consumer report indicates (informally) that they enjoy the use of the device. From the staff point of view, getting the level of the questions right takes time and experience. My questions were too easy (or else I explain things very well). What I notice as a social psychologist is that there is a level of group effect to be seen as the scores come up. People do not feel individually exposed because the replies are anonymous, but they do watch the distribution of answers as it appears on the screen. That by itself may be a learning experience, as they then consider other possible answers. I would then need to decide what to do when the students are having difficulties. I would need a plan B which would involve a fuller explanation. So it would affect the way I plan lectures. But why not?

    6. We used the handsets for a prelab tutorial session with about 100 students in each sitting. Slides of photomicrographs were displayed using a slide projector. Multiple choice questions were displayed using an overhead projector. The students were asked questions on each of the photomicrographs, and then we displayed their responses and Rob went through the correct answers. I felt the session went well, although we definitely needed two people to cope with the slides, overheads and the computer. It was also a bit hectic handing out the handsets and a handout at the same time. All the students that I have been able to ask enjoyed the session, and several commented on how they felt it was useful in finding out how much they knew (without me prompting such a response). Yes, I think we would consider using it again, perhaps for a revision session when we could go over their class test. I think the system also has potential for monitoring lecture attendance, which seems to be becoming more and more of a problem.

    Feedback from students

    Data from an evaluation in the Dept. of Philosophy

    Some of the evaluation was carried out within an additional relevant contract in the Philosophy Dept. (15%), and the following data were gained from the questionnaire in Level 2 (Logic).

    61 students reported using handsets in Logic lectures (level 2) and rated their usefulness.
    The percentage of students who rated them in each category is shown below:
    Extremely useful 18.0%
    Very useful 21.3%
    Useful 37.7%
    Not very useful 21.3%
    Not at all useful 0.0%
    No rating 1.6%
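    As a check on the figures: out of 61 respondents, these percentages correspond to raw counts of 11, 13, 23, 13, 0 and 1 respectively. (The counts are inferred here from the published percentages, not taken from the original data.) A minimal Python sketch of the calculation:

        # Rating distribution for the 61 Logic students. The raw counts are
        # inferred from the published percentages, not from the original data.
        counts = {
            "Extremely useful": 11,
            "Very useful": 13,
            "Useful": 23,
            "Not very useful": 13,
            "Not at all useful": 0,
            "No rating": 1,
        }
        total = sum(counts.values())  # 61
        for label, n in counts.items():
            print(f"{label:<20}{100 * n / total:.1f}%")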

    Comments from students who rated handsets "Extremely useful"

    The anonymity allows the student to show he/she is unsure of the subject without embarrassing themselves. Lecturer can gauge their method of instruction and ensure all students are absorbing the subject matter. Philosophy courses/lectures could use this method. As they explain and discuss various theories they could confirm that the subject matter is getting across and how well/badly it is going across.

    Allows us to see where we are in relation to class mates - lecturer knows what to cover. Takes up time.

    Fun. Lack of reliability disrupts lecture.

    Comments from students who rated handsets "Very useful"

    To see how students are coping with what's being taught. Performed in a discreet way.

    Students: encourages us to participate; more likely we will be forced to listen this way. Lecturer: lets her know what we do and don't understand. Can be a bit time-consuming setting it all up. Are definitely useful. Would be good if system was inbuilt.

    Good to know if on right track. Takes time. Would be better if set up in advance of lecture.

    Comments from students who rated handsets "Useful"

    Students: Know how I am doing compared to other students. Interesting. Lecturer: Indication of students' gaps, not what lecturer thinks might be gaps. Students: Can distract from the learning point entirely. Lecturer: Has to be able to give clear instructions on what I am voting for. 3 options are too many for voting if comparing 1 and 2 to be used only and vary the 1/2 . Encourage speed in giving vote to avoid lengthy intervals between "lecture" and voting. The intention is not to give a diversion from the lecture, welcome though it may be.

    Good fun. Quite good for gauging how many others are as lost as I am! Time consuming at first, but getting better. Keep using them please!

    Lets you see if you're on the same level as the rest of the class. It takes time to organise it, which could be used for lecturing. I think the benefits outweigh the disadvantages, and therefore it is a good idea.

    Allows lecturer to know if they are making it clear enough but only useful to students if they follow up.

    Comments from students who rated handsets "Not very useful"

    Find out what sections are difficult and how I am doing in relation to rest of class. Very time consuming. Could sometimes be substituted for a show of hands as confidence on material increases.

    Students - it's fun, like "millionaire"! Lecturer - can see how class are coping. More useful for lecturer but the colours are nice

    Instant idea of understanding. Amusing distraction. V. fiddly! Clearer labelling on display

    Some benefits and problems collected from students

    From the comments made by students on different courses (including those studying Logic), we identified the following suggested benefits and problems of using handsets in lectures. Students in some classes have helped us identify which of these are important (this data is still to be processed), and these should be addressed by lecturers using, or intending to use, handsets.

    Benefits

    1. Using handsets is fun and breaks up the lecture.
    2. Makes lectures more interactive/ interesting and involves the whole class.
    3. I like the ability to contribute opinion to the lecture and it lets me see what others think about it too.
    4. The anonymity allows students to answer without embarrassing themselves.
    5. Gives me an idea of how I am doing in relation to rest of class.
    6. Checks whether you are understanding it as well as you think you are.
    7. Allows problem areas to be identified.
    8. Lecturers can change what they do depending on what students are finding difficult.
    9. Gives a measure of how well the lecturer is putting the ideas across.

    Problems

    1. Setting up and use of handsets takes up too much time in lectures.
    2. Can distract from the learning point entirely.
    3. Sometimes it is not clear what I am supposed to be voting for.
    4. Main focus of lecture seems to be on handset use and not on course content.
    5. The questions sometimes seem to be for the benefit of the lecturer and future students and not us.
    6. Annoying students who persist in pressing their buttons and cause problems for people trying to make an initial vote.
    7. Not completely anonymous in some situations.
    8. Some students could vote randomly and mislead the lecturer.
    9. Sometimes the lecturer seems to be asking questions just for the sake of it.

    Questions currently used in evaluations

    Here are the current versions of questions used in evaluating the handsets by displaying the questions and having students (the audience) use the handsets themselves to indicate their answers. In addition, we would normally hand out a small piece of paper to each person asking for any further comments on the handsets or the questions.

    Here we indicate the text of the questions between horizontal lines. It would be displayed on an overhead projector slide or in Powerpoint. Participants respond by pressing the corresponding digit on their handsets.

    Core question


    What was, for you, the balance of benefit vs. disadvantage from the use of the handsets in your lectures?
    1. Definitely benefited
    2. Benefits outweigh any disadvantages
    3. Neutral
    4. More disadvantage than benefit
    5. Definite negative net value


    Variant question


    1. Extremely useful
    2. Very useful
    3. Useful
    4. Not very useful
    5. Not at all useful
    6. No rating

    Questions about particular pros and cons

    These questions were done by displaying two slides: one with a numbered list of advantages or disadvantages, and a second asking questions about the list.


    Benefits

    1. Using handsets is fun and breaks up the lecture.
    2. Makes lectures more interactive/ interesting and involves the whole class.
    3. I like the ability to contribute opinion to the lecture and it lets me see what others think about it too.
    4. The anonymity allows students to answer without embarrassing themselves.
    5. Gives me an idea of how I am doing in relation to rest of class.
    6. Checks whether you are understanding it as well as you think you are.
    7. Allows problem areas to be identified.
    8. Lecturers can change what they do depending on what students are finding difficult.
    9. Gives a measure of how well the lecturer is putting the ideas across.


    Problems

    1. Setting up and use of handsets takes up too much time in lectures.
    2. Can distract from the learning point entirely.
    3. Sometimes it is not clear what I am supposed to be voting for.
    4. Main focus of lecture seems to be on handset use and not on course content.
    5. The questions sometimes seem to be for the benefit of the lecturer and future students and not us.
    6. Annoying students who persist in pressing their buttons and cause problems for people trying to make an initial vote.
    7. Not completely anonymous in some situations.
    8. Some students could vote randomly and mislead the lecturer.
    9. Sometimes the lecturer seems to be asking questions just for the sake of it.


    Acknowledgements

    This evaluation was supported in part by the EPSRC funded grant to GRUMPS (GR/N38114). It was also supported by a grant from the Philosophy LTSN to Susan Stuart.


    Last changed 15 Feb 2005 ............... Length about 200 words (2,000 bytes).
    (Document started on 15 Feb 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/lecture.html. You may copy it. How to refer to it.

    Designing whole sessions around EVS

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    List the alternative whole session plans:
    Meltzer p-solve
    CT diagnostic
    Class tests
    JITT: reading before, ...
    Mazur: lecture, but with brain teasers
    


    Last changed 23 Jan 2005 ............... Length about 300 words (9,000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/notes.html.

    Undigested notes on Web references on Handsets and teaching with them

    (written by Steve Draper,   as part of the Interactive Lectures website)

    These are undigested notes, especially URLs, related to both interactive lectures, and handset use itself.

  • Leeds simple introduction.

    Teaching / learning gains and methods

    Hake:
  • Home page/ list of relevant papers by him
  • SDI (Socratic Dialogue-Inducing) site
  • Hake on Interactive-engagement methods: the meta-analysis
    Hake,R.R. (1998a) "Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses" Am.J.Physics vol.66 no.1 pp.64-74

  • Hake on Interactive-engagement methods: more details about implementation

  • Shapiro, Rutgers, SRS

  • First year chemistry at Berkeley

  • online physics

  • another paper

  • bedu list
  • A short article on Mazur and his method
  • Another short article on Mazur and his method
  • An IBM study of this stuff, stored by CPS.
  • An APF study of this stuff, stored by CPS.
  • Eric Mazur project site
  • Mazur group education research
  • Mazur's key book: Peer Instruction: A User's Manual by Eric Mazur (Prentice Hall Series in Educational Innovation, Upper Saddle River, NJ, 1997) ISBN 0-13-565441-6
  • Amherst group on (interactive) physics teaching especially this, and their online papers.

    There are some papers from MIT who have been using PRS a bit - papers are from the American Society of Engineering Education conferences:

  • http://www.asee.org/conferences/default.cfm
    This work is part of the MIT CDIO initiative:
  • http://web.mit.edu/aeroastro/www/cdio/index.html
    and also involves others - in particular Swedish institutions which, we have also been told, use PRS:
  • http://www.cdio.org/

    Interactive lectures:

  • Social Policy and social work LTSN / Bristol
  • Interactive lectures

    Publicity:

  • David Williams Tuesday May 21, 2002 The Guardian
  • Gerard Seenan January 04 2000 The Guardian
  • Tuesday January 11, 2000 The Guardian

    Which equipment?
    J.Smith (2001) Dialogue and interaction in a classroom environment (Final year research project, School of Mechanical Engineering, University of Bath; supervised by Nick Vaughan). An undergraduate project that studied potential use, comparing PRS, Classtalk, and CPS.
    See also

  • http://www.bath.ac.uk/~ensrfn/
  • http://www.bath.ac.uk/e-learning/JulyNews.htm
  • http://horizon.unc.edu/TS/default.asp?show=article&id=864

    University of Liverpool

    PRS or handset or equipment refs:

  • R.F.Ngwompo@bath.ac.uk Postal Address: Dept of Mechanical Engineering, University of Bath . He is said to have done a report comparing different handset technologies, and CPS in particular. A mention of this.
  • PRS site, and updated software
  • Short note "How to PRS"
  • the Respondex Palm system
  • the software-only version of PRS from Cue*s US company
  • Banxia: New software?

    undigested

  • http://www.mecheng.strath.ac.uk/natalie.asp#
  • http://www.strath.ac.uk/Departments/CAP/courses/interactive/powerpoint/sld001.htm
  • http://www.ust.hk/celt/ideas/prs/
  • http://www.hku.hk/caut/scholar/abstracts/037_snider.htm
  • http://www.economics.ltsn.ac.uk/showcase/elliott_prs.htm

  • Discourse
  • http://www.bedu.com/Publications/Samos.html
  • Dufresne, R.J. et al. (1996) "Classtalk: A Classroom Communication System for Active Learning" Journal of Computing in Higher Education vol.7 pp.3-47
  • Irving, A. et al. (2000) Use of Information Technology in Exam Revision 4th International CAA Conference
  • Nicholls, J., (1999) The Web between Lectures and Self-Managed Learning 3rd International CAA Conference
  • Thalheimer (2003) The Learning Benefits of Questions
  • RxShow
  • Mitchell
  • IML

    B. Shneiderman, M. Alavi, K. Norman, and E. Borkowski. Windows of Opportunity in the Electronic Classroom. Communications of the ACM, 38(11):19-24, Nov. 1995.

    Meltzer, David E. and Kandiah Manivannan. "Transforming the Lecture Hall Environment: The Fully Interactive Physics Lecture," American Journal of Physics 70:6, pp. 639-654. June 2002.


    Last changed 11 May 2003 ............... Length about 700 words (5000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/bookings.html.

    Past uses of lecture theatre handsets

    This is an approximate list of old bookings: occasions where the handsets were used.



    Last changed 19 Aug 2007 ............... Length about 700 words (6,000 bytes).
    (Document started on 11 Aug 2007.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/ped.html. You may copy it. How to refer to it.

    Summary of pedagogic purposes for EVS

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page is an attempt to keep in one place an up-to-date, complete list of the different pedagogical purposes that EVS can be used for. For more detailed explanations of what is meant, you'll have to look either at a 2001 paper presenting some of them here, or at a page on (some) alternative question types for different purposes, and how to create them.

    High level purposes or functions

    Techniques

    1. Diagnostic SAQs ("self-assessment questions"), to allow each student to measure the state of their understanding at the moment.
    2. Questions to drill down to find and address problems. Essentially a tree of questions that changes what the teacher does in the session: "contingent teaching". (A minimal sketch of such a tree appears after this list.)
    3. To initiate a discussion. I.e. ask a brain teaser, do not tell the audience what the right answer is. As in Mazur's method of "peer instruction" or more generally what Hake calls "Interactive engagement". This is one of the few uses of educational technology that has been shown to produce large positive effects on learning measured in objective tests.
    4. A problem or proof is worked through by the presenter, subdivided into a number of stages. At the end of each stage, the audience answers a question e.g. on what the result of that step should be. It's a way of keeping the audience and presenter in step through a long multi-stage argument.
    5. Course feedback: asking students about aspects of the course: Formative feedback to the teacher.
    6. Practice exam, where the answers are keyed in rather than written by students, and the results can be compiled and commented on, and discussed by students with staff all in one session.
    7. Peer assessment could be done on the spot, saving the teacher administrative time and giving the learner much more rapid, though public, feedback.
    8. Community mutual awareness building. At the start of any group e.g. a research symposium or the first meeting of a new class, the equipment gives a convenient way to create some mutual awareness of the group as a whole by displaying personal questions and having the distribution of responses displayed.
    9. Collecting data in experiments using human responses:
      • Politics (demonstrate / trial voting systems)
      • Psychology (any questionnaire can be administered then results shared)
      • Physiology (Take one's pulse: see class' average; auditory illusions)
      • Vision science (display visual illusions; how many "see" it?)
    10. Have student (groups) design EVS questions and present them as part of talks given to the rest of their class. The discussion that ensues during the design of the questions, alternative responses, and justifications to be used when explaining the answers, can be highly generative of learning.
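    To make technique 2 above concrete, here is a minimal sketch in Python of a contingent question tree. Everything in it is hypothetical: the questions, the option numbers, and the rule of branching on whether at least half the class voted for the correct option are all invented for illustration.

        # Sketch of "contingent teaching": a tree of questions in which the
        # next step depends on the distribution of the class's votes.
        # Questions, options and the 50% threshold are all hypothetical.
        class Node:
            def __init__(self, question, correct_option,
                         if_mostly_right=None, if_mostly_wrong=None):
                self.question = question
                self.correct_option = correct_option
                self.if_mostly_right = if_mostly_right  # next Node, or None
                self.if_mostly_wrong = if_mostly_wrong  # next Node, or None

        def next_node(node, votes):
            """votes maps option number -> number of handset responses."""
            total = sum(votes.values())
            right = votes.get(node.correct_option, 0)
            if total and right / total >= 0.5:
                return node.if_mostly_right   # most got it: move on
            return node.if_mostly_wrong       # drill down to the problem

        remedial = Node("Which premise does the counterexample break?", 2)
        root = Node("Is this argument valid?", 1,
                    if_mostly_right=None,     # None: proceed with the lecture
                    if_mostly_wrong=remedial)

        print(next_node(root, {1: 12, 2: 40, 3: 8}).question)
        # -> Which premise does the counterexample break?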

    Social and individual benefits

    Besides the lists above, another way of looking at it that may be fruitful is to consider that EVS generally serves two different kinds of function simultaneously: promoting individual learning, and promoting an integrated, functioning learning community. Most applications do both to some extent. For instance, a set of quiz questions might seem to be designed to allow individuals to check their own learning privately and anonymously: but it actually simultaneously makes each learner aware of how they compare to the class as a whole, overall and on each question. This is often important in making learners feel comfortable asking for explanations from either teachers or peers (many others clearly need it too); and it makes the teacher feel in touch with the current level of understanding in the class.

    But perhaps it isn't exactly "social". There are three parties here: the learner as individual, the teacher, and the group as a whole. EVS keeps them mutually aware of each other's position.


    Peer instruction

    The sequence of activities in Peer Instruction and in class-wide discussion; originally presented in Nicol and Boyd (2003).

    "Peer Instruction": the Mazur sequence

    1. Concept question posed.
    2. Individual thinking: students given time to think individually (1-2 minutes).
    3. [voting] Students provide individual responses.
    4. Students receive feedback -- poll of responses presented as histogram display.
    5. Small-group discussion: students instructed to convince their neighbours that they have the right answer.
    6. Retesting of same concept. [voting] Students provide individual responses (revised answer).
    7. Students receive feedback -- poll of responses presented as histogram display.
    8. Lecturer summarises and explains "correct" response.

    "Class-wide Discussion": the Dufresne (PERG) sequence

    1. Concept question posed.
    2. Small-group discussion: small groups discuss the concept question (3-5 mins).
    3. [voting] Students provide individual or group responses.
    4. Students receive feedback -- poll of responses presented as histogram display.
    5. Class-wide discussion: students explain their answers and listen to the explanations of others (facilitated by tutor).
    6. Lecturer summarises and explains "correct" response.
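    The "poll of responses presented as histogram display" step in both sequences is easy to picture with a small sketch. The Python fragment below, with invented vote data, simply tallies the handset responses and prints the kind of text histogram the class would be shown:

        # Sketch: tally handset responses and display them as a histogram.
        # The list of responses is invented for illustration.
        from collections import Counter

        responses = [1, 3, 3, 2, 3, 1, 3, 4, 3, 2, 3, 3, 1, 3, 2]
        tally = Counter(responses)
        total = len(responses)

        for option in sorted(tally):
            n = tally[option]
            print(f"option {option}: {'#' * n:<12} {n:2d} ({100 * n / total:.0f}%)")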


    Last changed 13 Nov 2009.............Length about 1873 words (25,000 bytes).
    (Document started on 18 Aug 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/pi.html. You may copy it. How to refer to it.

    Website (permuted) index of page titles

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page lists the titles of all pages in the "ilig" website on interactive lectures and electronic voting systems (EVS), and gives a permuted index of the words in those titles.

    Index of web page titles

  • A video of using voting equipment in class
  • Ad hoc bibliography on EVS
  • Authoring new outlets for video
  • Bookings for lecture theatre handset use
  • Change of comment on RxShow
  • Compilation of some interactive lecture web pages
  • Compilation of some interactive lecture web pages
  • Compilation of web pages for PRS video 1
  • Designing a contingent question set
  • Designing and managing a teaching session
  • Designing teaching sequences with questions
  • Designing whole sessions around EVS
  • EVS technologies, alternatives, vendors
  • Electronic Voting Systems and interactive lectures
  • Electronically enhanced classroom interaction
  • Feedback on the video of handset use
  • Feedback to students
  • Gregor Kennedy
  • Having students design the questions
  • Increasing interactivity in lectures using an electronic voting system
  • Increasing interactivity using a voting system
  • Interactive Lectures
  • Interactive lectures interest group
  • Interactive teaching
  • Introducing the EVS to a new audience
  • Kinds of evidence about the effectiveness of EVS
  • Length and number of questions
  • Main EVS question types
  • Newsletter ad
  • Other common questions
  • Page for Glasgow Caledonian EVS users
  • Past uses of lecture theatre handsets
  • Pedagogical formats for using questions and voting
  • Peer instruction
  • People receiving copies of video
  • Presenting a question
  • Question banks available on the web
  • Question formats
  • Rationale for video
  • Second Newsletter ad
  • Some pictures of PRS in action
  • Some tactics when using EVS questions
  • Some technical details on PRS
  • Summary of pedagogic purposes for EVS
  • Support for startup uses of electronic voting in lectures
  • THES letter on interactive lectures
  • Terms for EVS
  • Transforming lectures to improve learning
  • Turning Point equipment
  • UK handset users and sites
  • Use of PRS handsets at Glasgow University
  • Using EVS at Glasgow University c.2005
  • Using EVS for interactive lectures
  • Using an electronic voting system in logic lectures
  • Video Programme notes
  • Video streaming testbed and tech. notes
  • Web references on Handsets and teaching with them
  • Website (permuted) index of page titles
  • Website by links
  • Website table of contents (list of pages)
  • Week 17 plan
  • Why use EVS? the short answer
  • Workshop on handset equipment

    Permuted index of web page title words

  • about the effectiveness of EVS, Kinds of evidence
  • action, Some pictures of PRS in
  • Ad hoc bibliography on EVS
  • ad, Newsletter
  • ad, Second Newsletter
  • alternatives, vendors, EVS technologies
  • an electronic voting system in logic lectures, Using
  • an electronic voting system, Increasing interactivity in lectures using
  • answer, Why use EVS? the short
  • around EVS, Designing whole sessions
  • audience, Introducing the EVS to a new
  • Authoring new outlets for video
  • available on the web, Question banks

  • banks available on the web, Question
  • bibliography on EVS, Ad hoc
  • Bookings for lecture theatre handset use
  • by links, Website

  • c.2005, Using EVS at Glasgow University
  • Caledonian EVS users, Page for Glasgow
  • Change of comment on RxShow
  • class, A video of using voting equipment in
  • classroom interaction, Electronically enhanced
  • comment on RxShow, Change of
  • common questions, Other
  • Compilation of some interactive lecture web pages
  • Compilation of some interactive lecture web pages
  • Compilation of web pages for PRS video 1
  • contents (list of pages), Website table of
  • contingent question set, Designing a
  • copies of video, People receiving

  • design the questions, Having students
  • Designing a contingent question set
  • Designing and managing a teaching session
  • Designing teaching sequences with questions
  • Designing whole sessions around EVS
  • details on PRS, Some technical

  • effectiveness of EVS, Kinds of evidence about the
  • electronic voting in lectures, Support for startup uses of
  • electronic voting system in logic lectures, Using an
  • electronic voting system, Increasing interactivity in lectures using an
  • Electronic Voting Systems and interactive lectures
  • Electronically enhanced classroom interaction
  • enhanced classroom interaction, Electronically
  • equipment in class, A video of using voting
  • equipment, Turning Point
  • equipment, Workshop on handset
  • evidence about the effectiveness of EVS, Kinds of
  • EVS at Glasgow University c.2005, Using
  • EVS for interactive lectures, Using
  • EVS question types, Main
  • EVS questions, Some tactics when using
  • EVS technologies, alternatives, vendors
  • EVS to a new audience, Introducing the
  • EVS users, Page for Glasgow Caledonian
  • EVS, Ad hoc bibliography on
  • EVS, Designing whole sessions around
  • EVS, Kinds of evidence about the effectiveness of
  • EVS, Summary of pedagogic purposes for
  • EVS, Terms for
  • EVS? the short answer, Why use

  • Feedback on the video of handset use
  • Feedback to students
  • formats for using questions and voting, Pedagogical
  • formats, Question

  • Glasgow Caledonian EVS users, Page for
  • Glasgow University c.2005, Using EVS at
  • Glasgow University, Use of PRS handsets at
  • Gregor Kennedy
  • group, Interactive lectures interest

  • handset equipment, Workshop on
  • handset use, Bookings for lecture theatre
  • handset use, Feedback on the video of
  • handset users and sites, UK
  • Handsets and teaching with them, Web references on
  • handsets at Glasgow University, Use of PRS
  • handsets, Past uses of lecture theatre
  • Having students design the questions
  • hoc bibliography on EVS, Ad

  • improve learning, Transforming lectures to
  • Increasing interactivity in lectures using an electronic voting system
  • Increasing interactivity using a voting system
  • index of page titles, Website (permuted)
  • instruction, Peer
  • interaction, Electronically enhanced classroom
  • interactive lecture web pages, Compilation of some
  • interactive lecture web pages, Compilation of some
  • Interactive Lectures
  • Interactive lectures interest group
  • interactive lectures, Electronic Voting Systems and
  • interactive lectures, THES letter on
  • interactive lectures, Using EVS for
  • Interactive teaching
  • interactivity in lectures using an electronic voting system, Increasing
  • interactivity using a voting system, Increasing
  • interest group, Interactive lectures
  • Introducing the EVS to a new audience

  • Kennedy, Gregor
  • Kinds of evidence about the effectiveness of EVS

  • learning, Transforming lectures to improve
  • lecture theatre handset use, Bookings for
  • lecture theatre handsets, Past uses of
  • lecture web pages, Compilation of some interactive
  • lecture web pages, Compilation of some interactive
  • lectures interest group, Interactive
  • lectures to improve learning, Transforming
  • lectures using an electronic voting system, Increasing interactivity in
  • lectures, Electronic Voting Systems and interactive
  • Lectures, Interactive
  • lectures, Support for startup uses of electronic voting in
  • lectures, THES letter on interactive
  • lectures, Using an electronic voting system in logic
  • lectures, Using EVS for interactive
  • Length and number of questions
  • letter on interactive lectures, THES
  • links, Website by
  • logic lectures, Using an electronic voting system in

  • Main EVS question types
  • managing a teaching session, Designing and

  • new audience, Introducing the EVS to a
  • new outlets for video, Authoring
  • Newsletter ad
  • Newsletter ad, Second
  • notes, Video Programme
  • notes, Video streaming testbed and tech.
  • number of questions, Length and

  • Other common questions
  • outlets for video, Authoring new

  • Page for Glasgow Caledonian EVS users
  • page titles, Website (permuted) index of
  • pages for PRS video 1, Compilation of web
  • pages), Website table of contents (list of
  • pages, Compilation of some interactive lecture web
  • pages, Compilation of some interactive lecture web
  • Past uses of lecture theatre handsets
  • pedagogic purposes for EVS, Summary of
  • Pedagogical formats for using questions and voting
  • Peer instruction
  • People receiving copies of video
  • pictures of PRS in action, Some
  • plan, Week 17
  • Point equipment, Turning
  • Presenting a question
  • Programme notes, Video
  • PRS handsets at Glasgow University, Use of
  • PRS in action, Some pictures of
  • PRS video 1, Compilation of web pages for
  • PRS, Some technical details on
  • purposes for EVS, Summary of pedagogic

  • Question banks available on the web
  • Question formats
  • question set, Designing a contingent
  • question types, Main EVS
  • question, Presenting a
  • questions and voting, Pedagogical formats for using
  • questions, Designing teaching sequences with
  • questions, Having students design the
  • questions, Length and number of
  • questions, Other common
  • questions, Some tactics when using EVS

  • Rationale for video
  • receiving copies of video, People
  • references on Handsets and teaching with them, Web
  • RxShow, Change of comment on

  • Second Newsletter ad
  • sequences with questions, Designing teaching
  • session, Designing and managing a teaching
  • sessions around EVS, Designing whole
  • set, Designing a contingent question
  • short answer, Why use EVS? the
  • sites, UK handset users and
  • some interactive lecture web pages, Compilation of
  • some interactive lecture web pages, Compilation of
  • Some pictures of PRS in action
  • Some tactics when using EVS questions
  • Some technical details on PRS
  • startup uses of electronic voting in lectures, Support for
  • streaming testbed and tech. notes, Video
  • students design the questions, Having
  • students, Feedback to
  • Summary of pedagogic purposes for EVS
  • Support for startup uses of electronic voting in lectures
  • system in logic lectures, Using an electronic voting
  • system, Increasing interactivity in lectures using an electronic voting
  • system, Increasing interactivity using a voting
  • Systems and interactive lectures, Electronic Voting

  • table of contents (list of pages), Website
  • tactics when using EVS questions, Some
  • teaching sequences with questions, Designing
  • teaching session, Designing and managing a
  • teaching with them, Web references on Handsets and
  • teaching, Interactive
  • tech. notes, Video streaming testbed and
  • technical details on PRS, Some
  • technologies, alternatives, vendors, EVS
  • Terms for EVS
  • testbed and tech. notes, Video streaming
  • theatre handset use, Bookings for lecture
  • theatre handsets, Past uses of lecture
  • them, Web references on Handsets and teaching with
  • THES letter on interactive lectures
  • titles, Website (permuted) index of page
  • Transforming lectures to improve learning
  • Turning Point equipment
  • types, Main EVS question

  • UK handset users and sites
  • University c.2005, Using EVS at Glasgow
  • University, Use of PRS handsets at Glasgow
  • use EVS? the short answer, Why
  • Use of PRS handsets at Glasgow University
  • use, Bookings for lecture theatre handset
  • use, Feedback on the video of handset
  • users and sites, UK handset
  • users, Page for Glasgow Caledonian EVS
  • uses of electronic voting in lectures, Support for startup
  • uses of lecture theatre handsets, Past
  • using a voting system, Increasing interactivity
  • Using an electronic voting system in logic lectures
  • using an electronic voting system, Increasing interactivity in lectures
  • Using EVS at Glasgow University c.2005
  • Using EVS for interactive lectures
  • using EVS questions, Some tactics when
  • using questions and voting, Pedagogical formats for
  • using voting equipment in class, A video of

  • vendors, EVS technologies, alternatives
  • video 1, Compilation of web pages for PRS
  • video of handset use, Feedback on the
  • video of using voting equipment in class, A
  • Video Programme notes
  • Video streaming testbed and tech. notes
  • video, Authoring new outlets for
  • video, People receiving copies of
  • video, Rationale for
  • voting equipment in class, A video of using
  • voting in lectures, Support for startup uses of electronic
  • voting system in logic lectures, Using an electronic
  • voting system, Increasing interactivity in lectures using an electronic
  • voting system, Increasing interactivity using a
  • Voting Systems and interactive lectures, Electronic
  • voting, Pedagogical formats for using questions and

  • web pages for PRS video 1, Compilation of
  • web pages, Compilation of some interactive lecture
  • web pages, Compilation of some interactive lecture
  • Web references on Handsets and teaching with them
  • web, Question banks available on the
  • Website (permuted) index of page titles
  • Website by links
  • Website table of contents (list of pages)
  • Week 17 plan
  • when using EVS questions, Some tactics
  • whole sessions around EVS, Designing
  • Why use EVS? the short answer
  • Workshop on handset equipment


    Last changed 15 Feb 2005 ............... Length about 900 words (6000 bytes).
    (Document started on 15 Feb 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/seq.html. You may copy it. How to refer to it.

    Designing teaching sequences with questions

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This will move out from single question design.
    Qu plus expl
    2 tier questions
    Mazur Dufresne (move that here)
    ?Meltzer
    Quintin: web <-> class
    


    Last changed 19 Sept 2008 ............... Length about 1,000 words (9,000 bytes).
    (Document started on 28 Dec 2007.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/studdesign.html. You may copy it. How to refer to it.

    Having students design the questions

    (written by Steve Draper,   as part of the Interactive Lectures website)

    Another use of MCQs, and hence perhaps of EVS, is to get learners not just to answer questions, but to design (write) them. I myself only realised this because Andy Sharp (then at Glasgow Caledonian University) did it: but with hindsight, this is obviously a powerful idea. A number of people have done it and published about it, but it is still a very uncommon piece of good practice.

    It can work well without EVS, but may work better with EVS, since that makes it faster and cheaper to administer the questions to the whole class in one way or another.

    Designing questions is likely to be strongly productive of (better) learning because:

    Support for such an exercise

  • Group work: when the activity is new to the students, then doing it in groups is generally a comfort and support for them. If this became a familiar and frequent activity, then they might do it faster and get more from it by doing it solo.
  • Train/brief them on Bloom's taxonomy.
  • Have the questions tried out on the whole class with EVS: their peers' responses are powerful feedback on the design process.
  • If the questions produced are "marked", then give more credit for:
    1. Higher levels in Bloom's taxonomy
    2. Specifying which response option is correct, and giving clear reasons (in their documentation) for the correctness or wrongness of each answer option.
    3. Questions that discriminate within the class i.e. that some other students get right and some get wrong.
    4. Strong relationship to the learning aims and objectives of the course.

    Motivations and contexts for student question design

  • Require students (or groups of students) to write and deliver a talk to the whole class, and to include some EVS questions as part of their presentation (Sharp & Sutherland).
  • Examine the whole course by MCQs. Tell the students that the final exam will be composed of the teacher's selection from the set of questions composed by the students during the course.
  • One useful activity for student study groups, especially during revision time, is to test each other on questions. Designing questions for use in this context is a further improvement.
  • Weekly quizzes for the class (e.g. using EVS), with students in turn providing questions for these. Thus students cooperate in creating a question bank for the class.

    Having students create the topics rather than the wording of questions

    Nick Bowskill has developed an application of EVS that has students provide the content of the questions, and does so for the purpose of an in-depth course feedback exercise.

    The elicitation session he has developed is really a form of the pyramid evaluation technique for course evaluation in depth: a mixture of solo, small-group and plenary work, first to identify issues, then to identify which are common to many learners. (In fact, instead of applying it to a single module, he applied it to the whole first year of a programme, which makes it a great attack on improving induction, and makes induction responsive to individual student cohorts and departments.)

    It is also a kind of halfway case of student-generated EVS questions: they provide the material for the subject of the questions, which staff then actually edit and deliver. You could argue that this is actually better (more student-active) than students inventing MCQs to test staff-specified subject matter.

    A rough recipe for such a session: begin by asking each student to write down, on a slip of paper, the one or two most important issues or problems for them with the course. Then have them discuss, in groups of five, which one or two issues the group as a whole would suggest as most important. Then, as a plenary of the whole class, have each group call out its issue, which is typed into the EVS software as an option. Finally, display the list of issues as the options to the EVS question "Which for you is the most important issue or problem?". (If you are using software that allows the EVS handsets to be used to express rankings, then you could collect these instead and get more information.)
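    A minimal sketch of that plenary stage in Python, assuming (hypothetically) that the groups' issues have been typed in as plain strings and that each handset press comes back as one digit:

        # Sketch of the plenary stage: turn the groups' issues into the
        # options of one EVS question, then tally the vote.
        # All of the issues and votes below are invented for illustration.
        issues = [
            "Too little feedback on essays",
            "Lecture notes appear late on the web",
            "Timetable clashes with lab sessions",
        ]

        print("Which for you is the most important issue or problem?")
        for i, issue in enumerate(issues, start=1):
            print(f"{i}. {issue}")

        votes = [2, 1, 2, 3, 2, 2, 1, 2]   # one digit per handset press
        for i, issue in enumerate(issues, start=1):
            print(f"{issue}: {votes.count(i)} votes")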

    References

    Here's a list of papers I know about, dealing with students designing questions (not necessarily with EVS).

    Arthur, N. (2006) "Using student-generated assessment items to enhance teamwork, feedback and the learning process" Synergy: Supporting the Scholarship of Teaching and Learning at the University of Sydney Issue 24, pp.21-23

    Bali, M. & Keaney, H. (2007) "Collaborative Assessment Using Clickers" From the REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May, 2007. Available at: http://ewds.strath.ac.uk/REAP07

    Fellenz, M.R. (2004) "Using assessment to support higher level learning: the multiple choice item development assignment" Assessment and Evaluation in Higher Education vol.29 no.6 pp.703-719

    Sharp, A. & Sutherland, A. (2007). "Learning Gains...My (ARS)S - The impact of student empowerment using Audience Response Systems Technology on Knowledge Construction, Student Engagement and Assessment" From the REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May, 2007. Available at: http://ewds.strath.ac.uk/REAP07

    Bloom

    Bloom, B. S. (1956) Taxonomy of educational objectives: the classification of educational goals (London, Longmans)

    See also this page.


    Last changed 11 Sep 2003 ............... Length about 900 words (6000 bytes).
    (Document started on 11 Sep 2003.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/tech.side.html. You may copy it. How to refer to it.

    Change of comment on RxShow

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    My original entry on RxShow (up to about 8 Sept 2003) read like this:

  • RxShow (PC only). Unlike the others, you have to pay for this software. After a recent price drop it is something like 2k for each room you use it in, and 80 for a copy to use on your personal laptop to write the questions. However they are obviously ashamed because their website doesn't tell you this, nor give any emails or phone numbers for enquiries.
    When I wrote it I believed it was accurate, but I have been sent this rebuttal by Socratec, and have corrected my entry:


    REBUTTAL DATED 11 Sept 2003

    Dr. Draper's earlier descriptions of RxShow contained erroneous information about our Company and RxShow products. After we brought this to his attention, most of the inaccuracies were corrected. For those who read his earlier versions, significant points about RxShow are clarified below -- plus other important factors should be considered when comparing RxShow to other alternatives:
    1. RxShow interfaces seamlessly with PowerPoint

    2. RxShow processes interactive questions far beyond just MCQ

    3. Companies charge for MCQ software either separately or part of a bundle

    4. Prices that were originally cited by Dr. Draper were more than double our actual prices.
      - The Public area of our website does provide pricing information:
      (RxShow costs less than $1,000 and RxShow Lite costs $299)

    5. Interactive Questions can be prepared on any PC without requiring special licensing.

    6. Home Page of our website gives detailed address and phone information for inquiries

    For more information, please visit our website at www.rxshow.com  or  www.socratec.com



    Last changed 24 Aug 2001 ............... Length about 900 words (6000 bytes).
    This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/thes1.html.

    Draft letter to THES on interactive lectures

    Possible titles:
    Ask the audience
    Anti-Passivity lecture technology
    Anti-Passivity technology
    Who wants to defeat lecture passivity?
    The 64,000 answer question

    [An edited version of this draft appeared on the Times Higher Education Supplement letters page, 24 Aug 2001, p.15]

    Agony aunt Kate Exley offered advice for combatting passivity in large first-year lectures (THES 6 July 2001, p.28). There is a technology that we believe can help scaffold, and so make much smoother, just the steps that she highlights. It is possible to buy equipment functionally similar to that in the TV show "Who wants to be a millionaire?" that allows all students in an audience to register their own response to a question privately, and for the aggregated results to be immediately displayed. This allows the implementation of Kate Exley's tactic of getting students to answer questions, while addressing the issues she alludes to: the problem of privacy (that otherwise inhibits most from answering), of giving everyone the mental exercise of answering instead of only one out of the group, and yet of building a sense of (learning) community by the shared display of the range of responses.

    Used in Hong Kong and the USA, its introduction to the UK has been pioneered by Prof. Jim Boyle of Strathclyde University where it is regularly used in first year classes of about 100. Glasgow University has now invested in it, and Quintin Cutts plans to use it in classes of 300 from October, while others will explore its use in a variety of other classes. For more on why we believe it is useful see
    http://www.psy.gla.ac.uk/~steve/ilig/

    Steve Draper, University of Glasgow


    Last changed 1 Feb 2005 ............... Length about 900 words (6000 bytes).
    (Document started on 1 Feb 2005.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/turningpoint.html. You may copy it. How to refer to it.

    Turning Point equipment

    This information from Mitt Nathwani, February 2005.

    TurningPoint is simply a toolbar designed for Microsoft Office 2000 onwards, rather than a full application in itself. It is accessed predominantly from within PowerPoint. TurningPoint is very feature-rich, but the most striking thing is that it takes just seconds to learn how to use. To create an interactive slide, the user clicks on 'Insert Slide' and chooses one of the interactive templates. They get a blank PowerPoint slide which allows them to type in a question and a number of answer options. The chosen display (bar chart, pie chart, etc.) automatically reformats itself depending on how many options the lecturer provides the students. The slide is then ready: this entire process can take just 2 mouse clicks and 1-2 seconds. Everything else is just PowerPoint (equation editor, embedded videos, etc.).

    With respect to hardware, TurningPoint is distributed in the UK at the moment with an infra-red keypad known as the ResponseCard. This is a credit-card-sized pad with 10 option buttons on it. The buttons are sealed, making them resistant to everyday grease, which normally limits the life of a handset. The ResponseCard receiver can process one response every 27 ms, which means it can deal with roughly 36 responses per second (in theory). Of course, in reality infrared responses can only be processed one at a time, which means that in real life it would achieve around half to two thirds of that number of responses in a second. In a lecture hall this means that 200 responses should take no longer than 10 seconds, and this can be improved by adding more receivers (each receiver can deal with up to 80 keypads). More importantly, there is no limit (again, in theory) to the number of ResponseCards that can be used with TurningPoint - practical limitations get in the way most of the time. However, it has been used in real conferences with up to 1500 people voting at the same time. The ResponseCard's battery life is extremely long too, having been tested to well over 250,000 keypresses on a single set of batteries.
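    The arithmetic behind those throughput figures can be checked with a short Python sketch. The 27 ms per response comes from the paragraph above; the "practical fraction" of 0.55 (i.e. somewhere between half and two thirds) and the single receiver are this sketch's own assumptions:

        # Rough check of the ResponseCard throughput figures quoted above.
        ms_per_response = 27
        theoretical_per_sec = 1000 / ms_per_response    # ~37 per second
        practical_per_sec = theoretical_per_sec * 0.55  # "half to two thirds"
        receivers = 1                                   # more receivers scale this up

        students = 200
        seconds = students / (practical_per_sec * receivers)
        print(f"theoretical: {theoretical_per_sec:.0f} responses/second")
        print(f"practical:   {practical_per_sec:.0f} responses/second")
        print(f"{students} responses in about {seconds:.0f} seconds")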

    ResponseCard isn't the only hardware choice, though. Should the customer decide that they want to use H-ITT, CPS or some versions of the Reply hardware, they can do so without any trouble, since TurningPoint is designed to work with them also. Using an applet called vPad, the university can also put this onto PDAs or PCs instead of using handsets. In summary, the extremely simple yet feature-rich software, the robust and compact hardware, and the choice of hardware make TurningPoint a pretty serious consideration in the UK. In terms of price, simple kits of 32 handsets are £1699 to education, but site licences (hardware and software) for over 1000 users come down to £35 per handset.

    Mitt Nathwani
    Product Manager
    Tel: 0208 213 2100
    Mob: 0784 172 1436


    Last changed 20 March 2004 ............... Length about 300 words (2,500 bytes).
    (Document started on 7 March 2004.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/vfeedback.html. You may copy it. How to refer to it.

    Feedback on the video of handset use

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    We have made a video "An example of using the PRS voting equipment". Please send me feedback about it, after watching it through once, by email to Steve Draper: s.draper@psy.gla.ac.uk

    You may find it more convenient to copy-and-paste the text from this page into an email; or to find and reply-to an email I may have sent you in advance with these questions.

    What I'd like to know from you is:

    1. What you found most useful/best about it
    2. What you found worst about it
    3. How easy was it for you to find a machine to play it on
    4. What you think of the format(s): were we right to choose DVD, ...
    5. What were the programme notes like? Best and worst features; other comments? ( http://www.psy.gla.ac.uk/~steve/ilig/video1/video1notes.html)
    6. What else should be added to the programme notes?
    7. If we video some more (which we are considering), what would you most like to see included in future?
    8. Any other comments?

    Last changed 6 June 2004 ............... Length about 700 words (5,000 bytes).
    (Document started on 20 Mar 2004.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/vidauthors.html. You may copy it. How to refer to it.

    Authoring new outlets for video

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    I have recently, Feb-March 2004, had a video produced, and by accident found myself as a very early adopter of distributing this in (for academics) new formats: DVD, streaming video, CD, downloaded video files. I would recommend these formats, and here are a few notes on why for those at Glasgow University. I expect I will not update these notes and that they will quickly go out of date.

    Firstly, this particular video was based on filming a lecture, and was done at rather short notice by Media Services, who were also very quick at editing the finished version. They also, having shot and edited in a mixture of analogue and digital, offered the result on any or all of a variety of formats, including VHS tape and DVD. I worked with Colin Brierly (cb3v@udcf.gla.ac.uk).

    From the viewpoint of an academic and talk presenter, DVD as a format has the disadvantage that you can't integrate clips directly into a powerpoint presentation. On the other hand:

    The video can also be converted for streaming video: where the video is accessed from a web page and played over the internet with little download delay before playing begins. This means you can offer it round the world and/or to students without sending disks. New staff machines are typically set up already for this. New student cluster machines have the power for this, but their configuration may need to be adjusted after negotiation with support staff. The file conversion to Windows Media streaming format was done by Steven Jack (s.jack@compserv.gla.ac.uk), and then mounted on his server all within a day or two. The file conversion to Quicktime streaming was done by Colin Brierly (cb3v@udcf.gla.ac.uk) and John Morrison (j.morrison@psy.gla.ac.uk), and the streaming hosted by John McClure (j.mcclure@psy.gla.ac.uk) in psychology. In theory it is unnecessary to offer both formats, as any fairly new machine can be configured to receive both formats. In practice, machines are more likely to be already set up only for their "native" format: Quicktime on Macs, AVI on PCs.

    It is also possible to convert the same video to either QuickTime (Mac; .mov) or Windows Media (PC; AVI; .wmv) format; and offer these on CD or for downloading on the web.

    You can access some of these different versions of my video from here.


    Last changed 2 Nov 2005 ............... Length about 1400 words (12,000 bytes).
    (Document started on 28 Feb 2004.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/videos.html. You may copy it. How to refer to it.

    A video of using voting equipment in class

    By Steve Draper,   Department of Psychology,   University of Glasgow.


    See also videos of another product in action: Discourse from ETS
    See also the JISC video of Strathclyde PRS usage, with interviews.


    We have made a video "An example of using the PRS voting equipment" showing EVS use in one class, and have some free copies available. Filmed on 19 Feb 2004, it shows the 8 votes and 5 distinct questions used in one session of an introductory statistics course with 61 students, complete with short interviews with students and the lecturer. The point is to convey what it might feel like to use electronic voting within a university class, and so to supplement the other material on this web site.

    [Pictures: the PRS audience; the PRS screen; a PRS question. Enlarged versions are on the web page.]

    There are full programme notes about, and to accompany, this video.

    There is a 30 second trailer illustrating what is on the video, which you can watch as streaming video for PCs or Macs (click, and if your machine is set up for this, it should start to play after a short delay); or alternatively download it.
    30 sec trailer
                                    Download                  Streaming
    Picture size                    Small        Medium
    Quicktime MPEG1 .mov (Macs)     650 kbytes   2.7 Mbytes   QT
    Windows Media .wmv (PCs)        554 kbytes   6 Mbytes     wmv

    The main video is 36 minutes long. Both the main video and the trailer are available in these formats:

    (If you are interested, you can read brief notes on authoring these formats at Glasgow University.)

    DVD

    This format gives the best quality and resolution on screen. Secondly, most new desktops and laptops (i.e. purchased since about Jan. 2003), whether PC or Mac, have a DVD-ROM drive if they have any CD drive at all, so they should be able to play this: consequently most academics should be able to find a machine to play it on for themselves, and will be able to show it in most talks/seminars by taking in a suitable laptop and using a data projector. Thirdly, the DVD format allows us to provide an index, making it easy for you to jump about to the points you want within the video. The DVDs I have are in PAL format (not NTSC for the USA), though it might be possible to make NTSC ones; but I believe this will not matter if they are played on a computer (with DVD drive) as opposed to a domestic player.

    Streaming video

    If you watch the video, please email me comments: see these questions.

    If your machine is set up right for the format you click on, it will just start to play after a short delay. Unfortunately, if it is not set up right, you may not get any sensible error message; it may even hang for several minutes and then end by doing nothing. If it doesn't work, sensible people will waste no more time on it, at least on that machine. However, if you are determined to spend time reconfiguring your machine, I have a few hints and technical notes.

    Streaming video
    Time length                    36.5 minutes         30 seconds
    Picture size                   Medium     Small     Medium     Small
    Quicktime MPEG1 .mov (Macs)    QT         QT        QT         QT
    Windows Media .wmv (PCs)       wmv        wmv       wmv        wmv

    Download the files

    If you watch the video, please email me comments: see these questions.

    Download video files
    Time length                    36.5 minutes                  30 seconds
    Picture size                   Medium         Small          Medium       Small
    Quicktime MPEG1 .mov (Macs)    (152 Mbytes)   35 Mbytes      2.7 Mbytes   650 kbytes
    Windows Media .wmv (PCs)       (402 Mbytes)   (171 Mbytes)   6 Mbytes     0.5 Mbytes

    Picture sizes: "Medium" is 640 X 480. "Small" is a quarter the area: 320 X 240.

    A Mac is likely to be set up for playing Quicktime files, and a (fairly new) PC for playing Windows Media format. (But you can get both players free for both types of machine, and Real player will play both formats. See my hints and technical notes.)

    For comparison, the DVD version takes about 2Gbytes.

    CD disk

    CD of video files. Both the MPEG1 Quicktime (.mov) and the Windows Media (.wmv) versions are on the same CD. It contains all four of the larger picture size versions in the download table above.

    VHS video cassette

    If you really need this.

    Requesting copies

    People who have been sent a copy are shown on my list of recipients.

    We will post you a copy provided:


    Last changed 2 Nov 2005 ............... Length about 1400 words (15,000 bytes).
    (Document started on 18 Mar 2004.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/videotech.html. You may copy it. How to refer to it.

    Video streaming testbed and tech. notes

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    This page is only a few points I had to pick up for myself, and may as well store here as anywhere. No guarantee of accuracy: for more and better information, you need to go elsewhere.

    First, here is a set of test cases to work with (return to my main video page for more cases).

    Test cases
                                File download                            Streaming 1      Streaming 2
    Quicktime .mov (Macs)       our trailer (650 kbytes)                 John's demo      Calum
    Windows Media .wmv (PCs)    our trailer (554 kbytes; may not work)   our trailer 2    Jack's demo

  • LTDF streaming video seminars

    URLs look like:
    Quicktime: rtsp://www.psy.gla.ac.uk/steve/streaming/sample_300kbit.mov
    Windows AVI/.wmv: http://130.209.38.90/commemorationday.asx which calls "mms://commsvs1.cent.gla.ac.uk/Psychology/PRS/PRS-Clip-LAN.wmv"

    The indirection via a .asx file allows information on titles, copyright etc. to be added (although these are also usually encoded in the .wmv file); and may allow CompServ to move servers around without disturbing people's bookmarks to the .asx files.
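
    For illustration, such a .asx redirector is just a small XML-style text file. A minimal sketch (the title text is hypothetical; the mms URL is the one quoted above):

        <asx version="3.0">
          <title>PRS video clip (hypothetical title)</title>
          <entry>
            <ref href="mms://commsvs1.cent.gla.ac.uk/Psychology/PRS/PRS-Clip-LAN.wmv" />
          </entry>
        </asx>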


    Temporary test

    Test file download:
    qt
    qt
    wmf
    wmf.czc
    avi

    web header get

  • http://www.dcs.gla.ac.uk/~mitchell/hdr.cfm

  • web server logs

    Points about configuring your machine

    If you want to test machines on their ability to play videos, particularly streaming videos, and to reconfigure them to do so, then you want to check the following points.

    Hardware

    One page to look at on hardware requirements (to support the latest Windows Media player):

    At least for some player software, the sound hardware must be enabled even if you don't want to play the sound, or else the player refuses to work.

    Operating system patches / extensions

    I have heard that on a PC, a widely available OS patch is needed to get Quicktime to work.

    Movie player

    Player software is mostly free, and there seem to be versions of each player for both Macs and PCs.


    Quicktime players: Download page   (General website). At least for my video, you seem to need at least version 6.x: otherwise it plays sound but no picture.

    Windows Media Player. At least for my video, you seem to need at least version 7.x: otherwise the error message says it can't download the right codec.

    Real player

    Browser protocols

    For Quicktime streaming video, your browser must handle the rtsp protocol (as well as http). For WMV streaming video, your browser must handle the mms protocol (as well as http).

    Problems / symptoms

    Symptoms, with the actual problem and solutions:

    Symptom: Windows Media Player on a PC refused to play the vision.
    Problem/fix: no sound hardware was enabled on the machine. Change the OS settings so it at least looks as if there were sound.

    Symptom: movie plays sound but no vision (Quicktime).
    Fix: upgrade Quicktime, e.g. from v.4 to v.6.

    Symptom: movie plays sound but no vision (wmv).
    Fix: upgrade Windows Media Player, e.g. from v.7 to v.9. (In the player, the File menu's Get Info item shows you the codecs used in the movie being played; the Apple menu's About Media Player item shows you the version number of the player.)

    Symptom: movie plays vision but no sound (small PC).
    I have seen this, but don't know the details or the solution.

    Symptom: errors about being unable to download a codec (Windows movie).
    Fix: a later version of Windows Media Player is needed: at least v.7.

    Symptom: the IE browser wouldn't play streaming QT.
    Fix: manually add a protocol helper mapping RTSP -> Quicktime player.

    Symptom: "Can't do this format" when the Quicktime Open command on a file is attempted.
    Actually: just the Mac file type not being set on the file.

    Symptom: hangs silently when trying to play QT streaming video.
    Problem: protocol failure in the browser. Set its mappings so that the rtsp protocol is handled by something, e.g. the Quicktime plugin.

    Symptom: browser download fails, with no apparent result.
    Fix: upgrade Quicktime to v.6.

    Symptom: IE browser error page "This page cannot be displayed" followed by "page may be missing".
    Although it looks like a missing page / bad URL error, this is actually a browser (not server) error page saying it cannot understand what was sent. It means the browser can't do the rtsp protocol: you need to install the Quicktime plugin and/or change the settings to point to it.

    Symptom: Netscape shows a "page missing" page, apparently from the server.
    Actually: the browser is unable to do Quicktime streaming (the rtsp protocol).

    Symptom: download of a wmv file fails and displays garble as text.
    IE fix: edit the configuration so that mime type text/plain with suffix wmv is mapped to the Windows player. (This doesn't work in Netscape, because that mime type is "taken".)

    Symptom: "The playlist format not recognised" from Windows Media Player for WMV streaming video (in IE on a Mac).
    In fact an obscure failure of the WMP plugin to parse .asx files and also do http redirection. Solution: configure a browser helper to redirect .asx to WMP (the Windows player).

    Symptom: "Switching transports" message on the QT player (Mac OS-X, MAG, IE): streaming opens QT but it never plays.
    Possible fixes: a newer version of QT? McClure's fix to the browser page?

    Symptom: "connecting..." message from the browser, but it never does connect (IE, Stuart's machine, attempting to download a .wmv file).
    The machine recognises the file suffix type, but the browser doesn't: probably the MIME type mapping is missing.

  • Sound not working on a machine that actually had it.
  • Page not found: actually, PC browser unable to do quicktime streaming.
  • On Mac OS9, the IE browser wouldn't play streaming QT. Fix: manually add a protocol helper mapping RTSP -> Quicktime player; on the second attempt, that got streaming Quicktime to work.
  • On the res.rm PC in IE, you must say "yes" to "do you want to reassign mime types", and say "yes" to "do you want to play the movie inside the browser".
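
    Several of the symptoms above come down to missing MIME type mappings. On the server side, the corresponding precaution is to declare the types explicitly. A minimal sketch, assuming an Apache web server (nothing above says which server software was actually used):

        # Sketch only, assuming Apache: map file suffixes to the MIME types
        # that tell browsers to hand these files to the right player.
        AddType video/quicktime .mov
        AddType video/x-ms-wmv  .wmv
        AddType video/x-ms-asf  .asx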

    Server stats on viewing the streaming video versions

    Windows streaming

  • http://www.gla.ac.uk/services/computing/video/reports/Report26/
    Hit the link "video content" e.g. http://www.gla.ac.uk/services/computing/video/reports/Report26/Week21Mar2004/VideoContent.html

    Quicktime streaming


    DVD higher level structure

    DVDs are divided into "titles", and then into "chapter points". However their movie parts may be divided into separate "cells", with user interaction (typically on graphics screens with active link points) deciding which cell to play next. In fact I understand there is essentially a programming language for these control flows, with branch and test, access to some parameter values and to a clock for timing, and 16 registers (variables) to use e.g. for keeping a score, counting the number of attempts at something, etc.

    Thus it would be possible to build a piece of CAL (Computer Assisted Learning): play a segment of movie; ask a question (in text or sound or both); and let which button the user presses determine which cell they get next, e.g. "Rubbish", "Good", explanations of what was wrong, try again, etc.
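
    As an illustration of how such branching might be written down, here is a minimal sketch in the XML of the open-source dvdauthor tool. This is an assumption for illustration only: no authoring tool is named above, and the file names and register use here are hypothetical.

        <dvdauthor dest="dvd">
          <vmgm />
          <titleset>
            <menus>
              <pgc>
                <vob file="question.mpg" />             <!-- the question screen -->
                <button>g0 = 1; jump title 1;</button>  <!-- button for the right answer -->
                <button>g0 = 2; jump title 1;</button>  <!-- button for a wrong answer -->
              </pgc>
            </menus>
            <titles>
              <pgc>
                <!-- branch on the register set by the button press -->
                <pre> if (g0 == 1) jump chapter 2; </pre>
                <vob file="explain-wrong.mpg" />        <!-- chapter 1: what was wrong; try again -->
                <vob file="well-done.mpg" />            <!-- chapter 2: "Good" -->
                <post> jump menu; </post>               <!-- back to the question -->
              </pgc>
            </titles>
          </titleset>
        </dvdauthor>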


    Tech. Facts

                               MPEG 1                       MPEG 2
    Medium                     CD                           DVD
    Capacity / playing time    650 Mbytes, 74 mins          4.38 Gbytes (15.9), 75 mins at max. quality
    TV standard                PAL or NTSC, but not both    PAL or NTSC, but not both
    Picture                    quarter-area TV screen       full TV screen size/resolution
    Frame size                 352 x 288                    720 x 576
    Bit rate                   1.5 Mbps (bits)              10 Mbps (bits)

    Picture sizes

    NTSC vs. PAL

    The Windows format has neither (and no interlacing). Both MPEGs, and therefore Quicktime, have either NTSC or PAL built in (and with it, the resolution and frame rate). However, QT can play any QT file on any computer. The problem is probably only with domestic equipment feeding a TV monitor without much conversion.

    Presumably, then, a DVD (which is definitely either NTSC or PAL) would play on a laptop regardless, but not on a domestic/specialised DVD player of the other standard.

    Other things

    CD disks: good for both Macs and PCs at once.
    Quicktime seems much better at compression than Windows Media.

    DVDs hold files with 3 extension types:
    .IFO (control flow data)
    .BUP (backup duplicates of the .IFO files)
    .VOB (the video chunks)
    The VOBs are actually a subspecies of MPEG2 file.
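
    For example, the top level of a video DVD typically looks like this (a typical layout only; the exact file names vary with the number of titlesets):

        VIDEO_TS/
            VIDEO_TS.IFO    control flow data for the whole disk
            VIDEO_TS.BUP    backup duplicate of that .IFO
            VTS_01_0.IFO    control flow data for titleset 1
            VTS_01_0.BUP
            VTS_01_1.VOB    the video chunks, split into pieces of about 1 Gbyte
            VTS_01_2.VOB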


    Wider points

    GIF movie (Netscape only??): press.

    OS "edge" movie:
    original page

    <img src="http://www.g-intelligence.co.uk/emails/select/150104/email/i/2.jpg" alt="" border="0" width="393" height="144" />

    still only

    The code: <img src="http://www.g-intelligence.co.uk/emails/select/150104/email/i/vid.jpg" alt="" border="0" width="192" height="144" dynsrc="http://www.g-intelligence.co.uk/emails/select/150104/email/bike.mal" codectype="raolb" xrate="15" cntval="true" />

    local relink


    Last changed 29 June 2003 ............... Length about 400 words (3,000 bytes).
    (Document started on 11 May 2003.) This is a WWW document maintained by Steve Draper, installed at http://www.psy.gla.ac.uk/~steve/ilig/workshop.html. You may copy it. How to refer to it.

    Workshop on handset equipment

    By Steve Draper,   Department of Psychology,   University of Glasgow.

    Why come?

    Quintin and I are proposing to hold a series of workshops to introduce possible new lecturer-users to the handset equipment at Glasgow University. Thinking now about possible places to use it in your teaching next session means you can be ready by the new session (Sept. 2003).

    We are submitting a Newsletter article to reach a wider audience. To see the argument there, and so why to come to the workshops, have a look at it.

    The dates

    The first proposal for the workshop was: Tuesday 20 May, 2-4pm, at TLS (Florentine house). This has now run.

    The second was at 2pm on Tuesday 27 May, in the conference room (F121) in Computing Science. Let Quintin know ASAP if you would like to attend, and when suits you (Quintin Cutts, email: quintin@dcs.gla.ac.uk, ext. 5691). This has now run.

    Thirdly, we ran a workshop on Wed 18 June, to cater for those responding to the Newsletter piece. This has now run.

    The format

    Our thoughts on the format are as follows, but we are very open to suggestions from the audience.
    1. Quick demonstration of the equipment in use
    2. List of possible pedagogic uses
    3. List of some users so far
    4. Discuss with audience members their particular cases, contexts, proposed uses.
    5. If time permits, then for anyone who has decided to use it and has their laptop with them, we could try installing the software on the spot, and testing it with the equipment.

    More information about the equipment and using it can be found through the main page.