BestChoice: Learning how to teach interactively over the web

Author(s): 

Sheila Woodgate and David Titheridge
The University of Auckland
Auckland, New Zealand

Abstract: 

The BestChoice web site (bestchoice.net.nz) was born in 2002 out of a desire to offer additional learning support to students in large first-year university courses. The aim was to create web-based activities that modeled a one-on-one interchange with an experienced teacher. Thus it was intended that content be developed systematically, using both information and question pages, and that users receive instructive feedback in response to their answers.

The activities currently on the web site contain 24 000 opportunities for users to interact with the system and receive feedback. A selection of 75 activities is available in a menu accessed by clicking on DEMO MODE at bestchoice.net.nz. The design of these activities, the tools used to create them and the system that delivers them have been driven by the pedagogical requirements of the teaching model, direct feedback from our users, and analysis of their usage data. The purpose of this communication is both to introduce you to the nature of the BestChoice activities and to share with you insights gained from our analyses of their usage by various cohorts of students.

THE IMPORTANCE OF HAVING A VERSATILE SET OF AUTHORING TOOLS

Our previous experience with developing computer-based learning activities led us to envisage a text-based system in which images were used only when the pedagogy demanded them. We chose to develop our own authoring tools so that these could be modified as required by the teaching model. To achieve the goal that we had set ourselves, the author needed the capability to

  • place answers and feedback in the flow of text anywhere on the page

(so that multi-step problems can be developed on a single screen page)

  • mix-and-match answer input styles on the page and use any number of these

(so that the answer style could be chosen to suit the objective of the question)

  • hide page sections on loading and have their appearance triggered by a correct response

(so that in development of multi-step problems, the user is not overwhelmed by complexity at the outset)
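The third of these capabilities, hiding page sections until a correct response has been entered, can be sketched roughly as follows. This is a minimal illustration in TypeScript against the browser DOM, not the BestChoice implementation; the element ids, the expected value and the helper name are all hypothetical.

    // Illustrative only: reveal a hidden page section once the preceding
    // answer is correct. The element ids, the expected value and the
    // helper name are hypothetical, not taken from the BestChoice source.
    function revealOnCorrect(answerId: string, sectionId: string, correctValue: string): void {
      const answer = document.getElementById(answerId) as HTMLInputElement | null;
      const section = document.getElementById(sectionId);
      if (!answer || !section) {
        return;
      }
      answer.addEventListener('change', () => {
        if (answer.value.trim() === correctValue) {
          // Show the next part of the multi-step problem only after the
          // current part has been answered correctly.
          section.style.display = 'block';
        }
      });
    }

    // Example wiring: the part of the page that introduces water stays
    // hidden until the chromium coefficient has been entered correctly.
    revealOnCorrect('chromium-coefficient', 'water-section', '2');

Applied repeatedly, the same idea lets each correct answer expose the next step of a multi-step problem.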

The necessity for all of this (and more) is illustrated by a BestChoice question page for a drill-and-practice exercise where the user constructs and balances the half equations and the overall equation for the reaction between iron(II) ions and dichromate ions. The page loads as shown below. 



The objective was to create an on-screen version of the pen-and-paper approach (balance atoms other than O and H, then balance O, then balance H, ...). This requires 23 answers: 10 are formulae for the species involved and 13 are numbers.

Students learning to balance redox equations with pen and paper would have all chemical formulae available so that they can focus on the balancing. Thus, in the corresponding web-based exercise, the formulae are provided on dropdown lists.



To simulate the balancing sequence, the completion of each part (for example, the balancing of chromium) triggers appearance of the next part (for example, the list from which water would be chosen). 

On paper, once the correct formula for a species is written, the stoichiometric coefficient is entered. The screen shot shows that in the web-based balancing, choosing a formula from the dropdown triggers appearance of a text box into which a number can be entered. 



Once the half-equations are complete, they are added to give the overall equation. In the overall equation, the order in which the reactants (and products) are chosen from the dropdowns should not matter. In the web-based balancing, this requires that each coefficient-answer is coupled to its formula-answer. 
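One way such order-independent coupling could be marked is sketched below in TypeScript. The data shapes and the marking function are our own illustration, not the BestChoice code; the expected reactant terms are simply those of the standard overall equation for the iron(II)/dichromate reaction, and the formula strings stand in for whatever values the dropdowns actually carry.

    // Illustrative marking of one side of the overall equation, where
    // reactants may be entered in any order provided each coefficient
    // is coupled to the formula chosen alongside it.
    interface Term {
      formula: string;      // value chosen from a dropdown
      coefficient: number;  // number typed into the adjacent text box
    }

    function markSide(entered: Term[], expected: Term[]): boolean {
      if (entered.length !== expected.length) {
        return false;
      }
      const remaining = [...expected];
      for (const term of entered) {
        const i = remaining.findIndex(
          (e) => e.formula === term.formula && e.coefficient === term.coefficient
        );
        if (i === -1) {
          return false; // wrong formula, or coefficient not coupled to its formula
        }
        remaining.splice(i, 1); // each expected term may be matched only once
      }
      return true;
    }

    // Reactant side of 6Fe2+ + Cr2O7(2-) + 14H+ -> 6Fe3+ + 2Cr3+ + 7H2O,
    // entered in a different order from the expected list: still correct.
    const expectedReactants: Term[] = [
      { formula: 'Fe2+', coefficient: 6 },
      { formula: 'Cr2O7(2-)', coefficient: 1 },
      { formula: 'H+', coefficient: 14 },
    ];
    const enteredReactants: Term[] = [
      { formula: 'H+', coefficient: 14 },
      { formula: 'Fe2+', coefficient: 6 },
      { formula: 'Cr2O7(2-)', coefficient: 1 },
    ];
    console.log(markSide(enteredReactants, expectedReactants)); // true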



Finally, for the completed page to have a textbook appearance, all buttons used to access the dropdown lists, all text-surround boxes and all coefficients of 1 must disappear once the correct response is displayed.

While the page above simulates the pen-and-paper balancing of a redox equation fairly well, a significant difference is that, in the web-based exercise, incorrect answers are exposed immediately and can be corrected. Thus the user has an opportunity both to learn from his or her mistakes and to go on to complete a correctly balanced equation.

While every page authored informs the authoring of future pages, our priority has been to let the content drive the style of question page created. Some page styles (like the redox equation one) are re-used. However, each system has its unique features and pedagogical possibilities, and the point of BestChoice is to go beyond drill and practice to expose the thinking behind the problem, placing a greater emphasis on the problem-solving process than on the overall answer.

THE IMPORTANCE OF COLLECTING DATA AND ANALYSING THAT DATA

The users of BestChoice are in equal parts New Zealand high school students, UK high school students and New Zealand first-year university students. 80 000 users have entered 30 000 000 correct answers on BestChoice pages. We have learned that students can be a great help in designing systems to teach them. With the best will in the world, we cannot put ourselves in their shoes. We can, based on our experience as teachers, create activities that we think that students will like and deliver these in what we consider to be a learner-friendly way, but students should be the judge and jury. They are the customers.

With a view to encouraging users to give us feedback, a survey was placed on the last page of every BestChoice activity. Users may enter a Likert scale rating (1 to 6) and/or a text comment. This is one of the best things that we ever did. 24 000 comments relevant to teaching and learning have been entered, along with 122 000 Likert scale ratings. Overall, 79% of the comments are compliments or suggestions and 21% are criticisms. 30% of the Likert scale module ratings are the maximum rating, and 79% of responses are positive.

Completion of the survey is voluntary, so it is important also to collect usage data for all users. A companion application (BestChoice Reports) was developed to view these data. This application is used by the developers and is also made available to teachers of students who are BestChoice users.
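As an indication of the kind of record that makes such reporting possible, a hedged sketch in TypeScript is given below; the field names, the event shape and the collection endpoint are our own assumptions and are not the BestChoice Reports schema.

    // Hypothetical shape for one logged answer attempt: enough context to
    // reconstruct, per user and per question, first-attempt success,
    // later corrections and Give-Ups after the fact.
    interface AnswerEvent {
      userId: string;
      moduleId: string;
      pageId: string;
      answerId: string;
      attempt: number;    // 1 for a first attempt, 2+ for corrections
      correct: boolean;
      gaveUp: boolean;    // the user asked to be shown the answer
      timestamp: string;  // ISO 8601
    }

    // Send each event to a collection endpoint as it happens.
    // The URL is a placeholder, not a real BestChoice route.
    async function logAnswerEvent(event: AnswerEvent): Promise<void> {
      await fetch('/api/usage/answer-events', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify(event),
      });
    }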

User comment has informed the approach taken when constructing the activities.

The first activities developed used an enquiry approach. The user was given minimal information on screen pages preceding the question pages, and content was developed through the questions and feedback. Comments from users indicated that they liked the integration of information pages, questions and feedback.

13 Mar 06 I really like this because you have a review and the questions are right after it and everyone knows you need to do alot of questions to know the material. Keep it up! 

31 Mar 05 This is really an awesome site, it has helped so much especially with the step by step instructions and the helpful explanations after you get a question right. The exclamation mark when you ask for help is good too cos its like ah hello, its this obvious. I love this site.I can see it is going to help me immensly this year


However, users did not like being asked about something that had not featured in earlier information pages.

13 Mar 06 i really liked how there were notes to read throughout the quizes. Maybe for future quizzes notes could be added before the question is asked to give hints as to what the answer will be


This feedback led us to alter our module format so that users are supported by information pages placed before the relevant question pages. This type of activity has a better completion percentage and receives more favourable student comment. Students like being able to flick back to information pages in BestChoice without losing their previously entered answers.

In a similar vein, carefully worded instructions are important so that users know what to expect when they embark on entering or choosing answers on a question page. Even with experience, the author does not always anticipate the detail of instruction needed, as shown by the comment below, which pertains to a recently authored page. The student was emailed; it transpired that he had a perfectly valid point, and a further instruction was added.

3 Oct 10 instructions on what is required to be done is highly unclear. This causes some answers to be marked wrong despite the answers technically being correct, only because the answers have been entered incorrectly. I repeat, the INSTRUCTIONS on what the question requires, particularly on HOW the question needs to be answered, are usually unclear and/or incomplete. Highly frustrating...

 

User comment has had considerable impact on our marking practices.

Scores are important to students, even though completion of BestChoice activities commonly does not contribute to course assessment. Our first scheme was one mark for each BestChoice answer irrespective of whether it was entered correctly on the first attempt or on a subsequent one. Each answer was marked instantly and feedback was displayed adjacent to the answer. The feedback feature was (and is) very popular. The instant marking was much less popular for reasons pointed out in the two comments below.  As a consequence, instant marking is no longer our default marking style.

9 Oct 10 for the last question i think it would be best if the answer didnt auto correct as with the previous ones. Sometimes you dont understand the reaction scheme until youve played with combinations for a while. 

9 Oct 10 i prefer it when i can manually press 'mark answer' so i can correct myself when i accidentally click the wrong answer


Additionally, users told us that they did not think it was fair that they got the same mark for a first-right answer as they did for a changed-right answer. Thus, in 2007 we put in place the system that currently operates. The overall score is still for completion, but this is now the sum of first-right answers and changed-right answers, with the marking bar displaying both of these.
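A minimal sketch of this scoring scheme follows; it is our own reconstruction from the description above, written in TypeScript, not the production marking code.

    // Once a page is complete, each answer falls into one of three states.
    type AnswerOutcome = 'first-right' | 'changed-right' | 'not-completed';

    interface PageScore {
      firstRight: number;    // correct on the first attempt
      changedRight: number;  // corrected after an initial mistake
      total: number;         // completion score: first-right + changed-right
    }

    function scorePage(outcomes: AnswerOutcome[]): PageScore {
      const firstRight = outcomes.filter((o) => o === 'first-right').length;
      const changedRight = outcomes.filter((o) => o === 'changed-right').length;
      return { firstRight, changedRight, total: firstRight + changedRight };
    }

    // A marking bar could then display both components, e.g. "18 + 5 of 23".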

This development, driven by student feedback, has been a huge bonus for our evaluation program because the percentage of first-right answers, in conjunction with the number of Give-Ups, gives us evidence for (a) whether a question is achieving its objective for most users and (b) whether the feedback for a wrong answer is sufficiently instructive to enable most students to recover from their initial mistake.
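In rough terms, the two indicators could be derived from logged attempts as in the sketch below (again TypeScript, and again an assumed record shape rather than the BestChoice Reports implementation).

    // Per-question evidence aggregated over every user who attempted it:
    // the proportion right first time, and how many gave up.
    interface UserAttempt {
      firstAttemptCorrect: boolean;
      gaveUp: boolean;
    }

    interface QuestionStats {
      users: number;
      firstRightPercent: number;
      giveUps: number;
    }

    function questionStats(attempts: UserAttempt[]): QuestionStats {
      const users = attempts.length;
      const firstRight = attempts.filter((a) => a.firstAttemptCorrect).length;
      const giveUps = attempts.filter((a) => a.gaveUp).length;
      return {
        users,
        firstRightPercent: users === 0 ? 0 : (100 * firstRight) / users,
        giveUps,
      };
    }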

Survey and usage data guide us in providing scaffolding appropriate for a particular topic.

Our intention was to create activities to help students bridge the gap between being presented with information (listening to a lecture or reading a book) and flying solo, where they solve multi-step problems or apply concepts without support. Thus BestChoice activities ask students lots of little questions that would not normally feature in written exercises. The aim is to probe whether the student understands the background to the system in the problem. This approach means that there are lots of opportunities to give users feedback at a point when they are receptive, having just engaged with the content. The usage data that accrue give us clues about where understanding breaks down.

It is important that the questions asked include content that authors perceive to be obvious, because what is obvious to us is not necessarily obvious to learners. These questions also build student confidence, showing them that they do have some of the prior knowledge required to be successful. However, surprises abound when one takes this approach and then examines the data.

It is pleasing, for example, to know that users do not find balancing equations difficult, provided they are given the reactants and products. On the other hand, it was disturbing to find that 30-40% of students do not know that 2Cl(g) is a higher-energy system than Cl2(g). Likewise, who would have believed that students who had happily identified tetrahedral, trigonal planar and bent shapes from static pictures of mononuclear species would not cope nearly so well with identifying the same shapes at particular atoms in a multi-nuclear Jmol structure of an amino acid?

It is important to go beyond simply looking at the data, to using it to inform modifications that may make the question more accessible. This could involve building in repetition where knowledge of concepts is probed from various perspectives. For example, users can be asked to identify the nucleophile and identify the electrophile many times before they start to get the idea. 

Despite our general emphasis on supporting students, there are areas where, based on usage data, we have removed scaffolding. Stoichiometry is one of these areas. In our early stoichiometry question pages, users worked through problems in a linear way, choosing/constructing the relationships required, entering the given data into the relationships on-screen and then calculating and entering the answers. 

Our first attempt to go beyond what was a somewhat over-scaffolded approach was to introduce problem-planning pages before the number-work pages. This was consistent with our philosophy of fostering the development of transferable skills. Users were asked to identify the unknown and the known, as well as the quantities that needed to be calculated and the relationships required to calculate them. Usage data revealed that students found the planning pages much harder than the number-work pages.

The introduction of planning pages had significantly increased the number of answers that needed to be entered to solve the problem, so the next step was to reduce the scaffolding on the number-entry section, on the assumption that if students had in front of them a simple mathematical relationship (x = yz or x = y/z) and values for y and z, they could calculate the answer. This achieved a reduction in the number of answers to be entered and enabled integration of the number-entry and planning sections onto the same page.

We have also moved away from revealing answer fields one at a time (as in the redox equation page), to revealing answers in blocks: first the known and unknown, second all of the intermediate quantities, third all of the relationships, fourth all of the calculation fields. This gives users the possibility of working forward from the known or backward from the unknown. The screen shots show, for one page, various stages in the answering process. 
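A rough sketch of this block-by-block reveal, in TypeScript against the browser DOM, is given below; the block ids and the helper are hypothetical and stand in for however the page actually groups its answer fields.

    // Illustrative staged reveal: each block of answer fields is shown only
    // once the previous block has been completed, but every field within a
    // visible block can be answered in any order.
    const blockOrder = ['known-unknown', 'intermediates', 'relationships', 'calculations'];

    function revealNextBlock(completedBlockId: string): void {
      const position = blockOrder.indexOf(completedBlockId);
      if (position === -1 || position === blockOrder.length - 1) {
        return; // unknown block, or the final block is already on screen
      }
      const element = document.getElementById(blockOrder[position + 1]);
      if (element) {
        element.style.display = 'block';
      }
    }

    // Called when every answer in the 'known-unknown' block is correct,
    // which then exposes all of the intermediate-quantity fields at once.
    revealNextBlock('known-unknown');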

On loading:


Planning complete:


Calculation complete:
Note the annotations that are displayed on screen as feedback for correct answers.



We have data from a variety of cohorts for a three-problem sequence: the one above and two others based on reactions with different stoichiometries. The screen shot shows, for a class of first-year university students working through the sequence, that % first-right answers and % completion improve, and the time taken diminishes.



While the data above are encouraging, the general area of stoichiometry continues to challenge us, as both % completions and survey feedback on this type of activity are somewhat below BestChoice norms. We continue to experiment with different approaches.


IN CONCLUSION
The possibilities that the web offers for experiments in instructional design, like BestChoice, are beyond what can be imagined. There is so much potential to complement, extend and inform conventional modes of instruction. Our voyage of discovery in the area of interactive learning has given us a different perspective on teaching and learning, and enabled us to connect with our students in ways that we would not have previously thought possible. The result is a reciprocal learning situation where, in return for our efforts to support their learning, students support us by providing good suggestions and encouraging comments, such as those below.

12 Oct 10 Extremely helpful in discovering my weaknesses in this topic and also at helping me overcome them.
6 Oct 10 That was really good! I had no idea on this topic, and by slowly going through it, I was learning then practising and now I mastered it! Thanks!


We would encourage you to have a look at BestChoice (bestchoice.net.nz). In addition to the 75 learning activities mentioned above, there are two presentation modules in the DEMO MODE menu, under the heading BCCE. These show in more detail how we use our data. The blue text on the pages provides links to more information or data (sometimes at the bottom of pages). If you choose to SIGN UP in order to have a more valid user experience, choose Other as your Institution and check out the General courses.

Lastly, if you are involved in any web-based education initiatives, do set up automatic data collection.  It is like opening your eyes after years of working blind.

We would like to acknowledge financial support from The University of Auckland, the New Zealand Ministry of Education (2006-9) and the Royal Society of Chemistry (2006-10).

Date: 
12/13/10 to 12/15/10