
Preliminary results using CPR (web-based Calibrated Peer Review) for pre- and post-lab writing assignments in general chemistry

Author(s): 

Lawrence D. Margerum and Maren Gulsrud, Department of Chemistry, University of San Francisco (USF), 2130 Fulton Street, SF, CA 94117

04/04/03 to 04/10/03
Abstract: 

Chemistry students at the University of San Francisco (USF) complete many traditional wet chemistry techniques in their introductory lab courses. What is missing is exposure to quantitative analysis using modern analytical instrumentation. This project makes use of an atomic absorption spectrophotometer (AAS) and a web-based writing-to-learn technology (CPR) to explore the question, "Is there lead in my house?" The objectives of the project are to improve conceptual understanding of technical subjects, to develop skills in reading for content and in scientific writing, and to link the writing assignment to hands-on environmentally significant experiments using AAS.

This paper will describe the structure and timing of a CPR assignment, give examples of the pre-lab and post-lab writing assignments, and present preliminary assessment from the pilot program completed in the fall of 2002.

Paper: 

I.        Introduction: Learning through writing about environmentally significant analysis using AAS

A long-standing major challenge in chemical education is how to develop better conceptual understanding of the topics covered in introductory courses. The traditional chemistry lab experience and the classroom lecture are now being supplemented or replaced by active learning strategies and alternate assessments, as exemplified by the NSF systemic chemistry reform projects. A well-developed body of literature now exists for educators interested in implementing such strategies as in-class concept tests[1], cooperative learning groups (guided inquiry)[2-7], peer-led review groups[8], writing across the curriculum[9-13] and modular chemistry[14]. Many faculty members are skeptical of the outcome of these changes, or lack the time, tools or training necessary to implement them. At the same time, most faculty members recognize the value of improved science communication skills for our students. At the University of San Francisco (USF), a national liberal arts university of 3800 undergraduates, we are concerned that many of our beginning students are not well prepared, or do not have the depth of understanding, to express technical concepts clearly in oral and written form.

 

One major challenge is to implement new technologies that a diverse set of students and faculty can use to address these shortcomings. One such technology is the web-based writing-to-learn tool called Calibrated Peer Review (CPR), developed by the Molecular Science Project[11, 15]. This internet-based instructional tool enables students to learn course material by electronically submitting short essays written with the help of guiding questions. Students then demonstrate competency by reading and assigning scores to three calibration essays on-line. Finally, they score three anonymous essays from their peers and rate their own essay. We believe this technology may lower the barrier that faculty see in producing and grading writing assignments in large classes, since the CPR program takes over most of these functions. There is strong agreement among scholars that "writing is thinking"[16], and it makes sense to help our students develop this important learning skill. To make the link between chemical principles and real-world problems clearer, this project aims to locate source material and produce writing assignments that enrich what is available from the course textbook and lab manual.


 

A second major challenge facing many chemistry departments is giving beginning science students exposure to modern analytical instruments. The objective is to stimulate student interest in chemistry and to give students more confidence with quantitative chemistry. A well-designed quantitative analysis laboratory using real samples is thought to increase the quality and relevance of the undergraduate experience[17]. We chose to purchase an automated Atomic Absorption Spectrophotometer (AAS) as an entry point for introductory experiments on environmentally relevant samples. The first experiment was adapted from Professor Peter Markow's excellent article "Determining the Lead Content of Paint Chips: An Introduction to AAS"[18].

 

This project seeks to address both major challenges at once: improving conceptual understanding of introductory topics and giving students exposure to modern instrumentation. The desired outcomes of the project are: 1) to improve conceptual understanding of introductory chemistry concepts via the writing-to-learn technology, 2) to increase student interest and confidence in chemistry by doing an environmentally significant experiment using an AAS, and 3) to develop student skills in reading for content and technical writing. The major groups targeted for the project are first- and second-year science students in general and analytical chemistry, plus a group of non-science majors in a general education course called Toxic World? The total number of students in these courses at USF is approximately 200 per year.


II.      Details of one writing-to-learn assignment

A.    The pilot study

Most faculty members share the feeling that clear writing is a direct result of a clear and deep understanding of the subject. A large body of literature associated with the writing across the curriculum movement supports this feeling[19]. Recently, Kovac and Sherwood published a handbook called Writing Across the Chemistry Curriculum that shares three insights into the subject[20]. First, writing is thinking. Second, writing facilitates conceptual understanding of a technical subject. Finally, chemistry instructors need practical tools both for using writing effectively in chemistry and for grading writing assignments. The web-based CPR technology is a tool that implements these insights for relatively large courses.

 

In the fall of 2002, one group of 18 general chemistry students from one lab section started a series of assignments over the course of three weeks. In the first week, they participated in an instructor-led tutorial on CPR (how to create an account, log on to the course, and start a practice CPR on atomic absorption and emission). In the second week, they wrote a pre-lab CPR essay on the workings of an AAS (see below for details) before completing a four-hour lab experiment on the determination of lead in paint chips. The following week was spent completing a post-lab CPR report using data from an AAS analysis for lead in paint. These writing assignments were meant to replace the traditional pre-lab questions and post-lab written report (other lab sections in the course completed a pre-lab and a traditional lab report from a visible spectroscopy experiment).

B.    The pre-lab CPR: Is there lead in my paint?

In a CPR assignment, the instructor locates source material (on-line or paper) and generates guiding questions to help students study the subject. Once students can answer all of the questions, they write a short essay (200-300 words in HTML format) that answers them and submit it on-line for anonymous distribution to fellow students.

 

During the summer of 2002, we developed three writing assignments and tested the AAS experiment. The first CPR assignment was a practice exercise on atomic absorption and emission (using a computer simulation found on the General Chemistry Interactive CD-ROM, Version 3.0)[21]. The second CPR was a pre-lab on how an AAS detects lead in a sample, while the third CPR was a post-lab for reporting AAS results of lead in paint from a calibration curve. To help the reader understand the level of detail involved in a student assignment, we present a somewhat shortened version of what the student encounters after logging on to the pre-lab CPR. A tutorial for the CPR program is also available on-line[22].
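To give a concrete sense of the data analysis behind the post-lab CPR, the short Python sketch below fits a linear calibration curve (absorbance versus ppm Pb) by least squares and interpolates an unknown paint-digest solution. All of the numbers are invented for illustration and are not student data from the experiment.

    # Illustrative only: fit a linear calibration curve (absorbance vs. ppm Pb)
    # and interpolate an unknown paint digest. All numbers are invented examples,
    # not actual student data from the experiment.

    def linear_fit(x, y):
        """Ordinary least-squares slope and intercept for y = m*x + b."""
        n = len(x)
        mean_x = sum(x) / n
        mean_y = sum(y) / n
        m = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / \
            sum((xi - mean_x) ** 2 for xi in x)
        b = mean_y - m * mean_x
        return m, b

    # Hypothetical standards: concentration (ppm Pb) and measured absorbance
    std_ppm = [0.0, 2.0, 4.0, 8.0]
    std_abs = [0.002, 0.085, 0.168, 0.330]

    m, b = linear_fit(std_ppm, std_abs)

    # Hypothetical unknown: absorbance of the digested paint-chip solution
    unknown_abs = 0.210
    unknown_ppm = (unknown_abs - b) / m

    print(f"Calibration: A = {m:.4f} * ppm + {b:.4f}")
    print(f"Unknown solution: {unknown_ppm:.2f} ppm Pb")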


1.      Text (or essay) submission  (20% of grade)

Pre-lab: Is there lead in my paint?

Goals

When you have finished this pre-lab assignment for your experiment called "Is there Lead in my Paint?" you should be able to:

1. Describe the goal or objective of the experiment (i.e. Why are we doing this experiment?).

2. List the major parts of an Atomic Absorption Spectrophotometer (AAS) and describe how it detects lead from your samples (What will we use to measure lead and how does it work?).

3. Briefly describe the procedure to be used in the experiment: "The Lead Content of Paint Chips" (How will we do it?)

 

Source Material

You should review your lab handout and the on-line source material before beginning to draft your essay. Links to the on-line sites are provided in the "Resources" section.

 

Using a Pipette and Pipette Pump

An important step in your lab is to make accurate dilutions of a standard lead (Pb2+) solution. This page from York University shows how to use volumetric pipettes. Be able to describe how you would use a Mohr pipette to deliver a known volume of lead solution to a clean flask.

 

Introduction to Atomic Absorption Spectroscopy (AAS)

Read the entire page first, focusing on learning the different parts of the AAS equipment and the "job" of each part. Pay special attention to the section called "The Monochromator and PMT", since you will need to explain how the AAS detects lead in your sample.

After reading the page, click on the link in the Introduction called: "animation on AAS here." This takes you to a page of Chemistry-Based animations from Professor Thomas Chasteen at Sam Houston State University. Scroll down the first column labeled "QuickTime Movies" and click on "Atomic Absorption Spectroscopy (3.9 MB)".


 

Source Material Resources:

Using a Pipette and Pipette Pump - Focus on the Mohr pipette. The volume dispensed is established by calculating the difference, much like the case when using a burette.

URL: http://www.chem.yorku.ca/courses/chem1000/equipment/pipette.html

Introduction to Atomic Absorption Spectroscopy - A website by Professor Thomas Chasteen with a brief outline of the job of each component in an AAS.

URL: http://www.shsu.edu/~chemistry/primers/AAS.html

 

Guiding Questions

In studying your lab handout and the web resources, and in writing your text, consider the issues raised by the following questions:

1. What are the main children's health concerns with paint that contains lead? What is the current upper limit of lead in paint (reported as weight percent)?

2. How will you use a Mohr pipette (pipette pump already attached) to make your calibration standards?

3. What are the main components of an AAS and how are they arranged?

4. What is the job of the Hollow Cathode Lamp and what element will the cathode have for your experiment?

5. What happens when analyte atoms (lead atoms) are present in the flame and how are they detected by the AAS?

 

Writing Prompt

It turns out that your professor has received a grant from the National Science Foundation (NSF) to buy a new AAS and develop a lab called "Writing-to-Learn using an AAS to Measure Lead in Paint". USF gave your high school our old AAS (still working!). Your former chemistry teacher, Ms. Leona Leadbetter, wants you to send her a brief written description of your experiment on finding lead in paint chips using AAS (she doesn't know the first thing about AAS, but she does know her chemistry!).

 

Please write a letter of the required length to Ms. Leadbetter in which you explain the experiment you are about to carry out. Start with a salutation (e.g., Dear Ms. Leadbetter) and close with a formal sign-off (e.g., Sincerely or Best Regards), BUT DO NOT TYPE YOUR NAME. Be sure to include the following four paragraphs (each starting with a topic sentence):

 

1) Write a paragraph that starts with the purpose or goal of the experiment, followed by a few sentences on the health concerns of lead in paint.

2) Write one paragraph describing how to use a Mohr pipette (with attached pipette pump) to make the 8.0 ppm Pb-containing calibration standard.

3) Write one paragraph describing how the AAS detects lead in your unknown paint sample.

4) Write a summary paragraph of at least two sentences that concludes your letter to Ms. Leadbetter.

 

Students are encouraged to create a draft essay and to use software spelling and grammar checks before submitting their essay. The instructor sets a word limit (150-300 words for this letter) and a deadline. The majority of students submit their text within 30 minutes of the whole class deadline (usually midnight on a Friday).
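Writing-prompt paragraph 2 above asks students to describe preparing the 8.0 ppm calibration standard. As a quick check on the dilution arithmetic (C1V1 = C2V2) behind that paragraph, here is a minimal Python sketch; the 100 ppm stock concentration and 50.00 mL flask volume are assumed values for illustration, not figures taken from the lab handout.

    # Illustrative dilution arithmetic (C1*V1 = C2*V2).
    # The 100 ppm stock and 50.00 mL flask are assumed values for illustration,
    # not necessarily those specified in the lab handout.

    stock_ppm = 100.0      # assumed Pb2+ stock concentration
    flask_mL = 50.00       # assumed volumetric flask size
    target_ppm = 8.0       # calibration standard named in the writing prompt

    # Volume of stock to deliver with the Mohr pipette
    pipette_mL = target_ppm * flask_mL / stock_ppm
    print(f"Deliver {pipette_mL:.2f} mL of stock and dilute to {flask_mL:.2f} mL "
          f"to obtain an {target_ppm} ppm Pb standard.")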


2.      Calibrations (30% of grade)

After the text deadline, students read and rate three benchmark essays of high, medium and low caliber (written by the instructor). These benchmarks are presented in random order and are not identified as high, medium or low. Students answer a list of content and style questions to help them read for content, then rate the essay on a scale of 1 to 10. The goal is for the student rating to match the instructor rating. The CPR software generates a report card of the results for the student that includes instructor-written feedback on the calibration answers. A calibration must be repeated if several answers are incorrect. Here is an example of a medium caliber essay with the associated content and style questions.

 

The reader should confirm the score on this essay (6 points), keeping in mind that correct content (answers to the guiding questions) is worth the majority of the points. In our experience, it is very hard for students to grade non-specific style questions; an examination of student responses to the last question in the peer-review phase shows little feedback on grammar mistakes. The following table is taken from the CPR reporting function for one student in the course.

 

Calibrations Stage (% correct answers)

                          %Style     %Content    Rating    Dev     Retake    Score
Key                       50.00%     50.00%                3.00              30.00
Calibration 1 Retake      33.33%     100%        4         2.00    Yes        0.00
Calibration 2             100%       83.33%      10        1.00    No        10.00
Calibration 3 Retake      66.67%     50%         6         0.00    Yes        5.00

 

The grading key shows that a student must correctly answer 50% or more of the questions on the first try to score 10 out of 10 on each calibration. This student did not answer more than 50% of the questions correctly on Calibrations 1 and 3 and was required to redo those calibrations. Even after the retake of Calibration 1, the student failed to get 50% of the style questions correct and received zero points.
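The grading logic just described can be summarized in a few lines of Python. The sketch below is our reading of the report card and grading key above (in particular, the half-credit score for a passed retake is inferred from Calibration 3); it is not the actual CPR scoring code.

    # A minimal sketch of the calibration grading logic as we read the report
    # card above; this is an interpretation, not the actual CPR scoring code.

    def calibration_score(style_pct, content_pct, retaken):
        """Return points out of 10 for one calibration essay."""
        passed = style_pct >= 50.0 and content_pct >= 50.0
        if not passed:
            return 0.0                      # still below 50% after a retake
        return 5.0 if retaken else 10.0     # retakes appear to earn half credit

    # Rows from the example report card (Calibration 1 retake, 2, 3 retake)
    report = [(33.33, 100.0, True), (100.0, 83.33, False), (66.67, 50.0, True)]
    for i, row in enumerate(report, start=1):
        print(f"Calibration {i}: {calibration_score(*row):.2f} / 10")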


3.      Reviews (30% of the grade)

After the calibration step, the student uses the same questions to rate three essays from other class members. The software distributes the peer essays randomly and anonymously; thus, each student rates three essays and receives three peer ratings on their own essay. The grade for reviews is based on correctly answering the questions and matching the weighted average of the three peer reviews. The grade a student receives on their own essay (the text submission above) is a weighted average (based on calibration scores) of the reviewers' ratings.

 

Review Stage    Student's Rating    Weighted Average from Peer Group    Dev     Score
Key                                                                     3.00    30.00
Review 1               6                         6.58                   0.58    10.00
Review 2               5                         4.67                   0.33    10.00
Review 3               3                         5.50                   2.50    10.00

 

For example, the student rated essay #3 as 3 out of 10, while the weighted average of this essay from all three reviews was 5.50. This is a deviation of 2.50, but the student receives full credit because the instructor allowed a deviation of up to 3.0 (this setting can be changed in the software).
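A minimal Python sketch of this credit rule follows, assuming that ratings outside the tolerance earn no points (the penalty is not stated above) and leaving aside how CPR weights the peer average internally.

    # Minimal sketch of the review-stage credit rule described above: full credit
    # when a student's rating falls within the instructor-set tolerance of the
    # weighted peer average. This is an illustration, not the CPR source code.

    def review_credit(student_rating, weighted_peer_avg, tolerance=3.0):
        """10 points if the deviation is within tolerance, else 0 (assumed)."""
        deviation = abs(student_rating - weighted_peer_avg)
        return (10.0 if deviation <= tolerance else 0.0), deviation

    reviews = [(6, 6.58), (5, 4.67), (3, 5.50)]   # rows from the table above
    for i, (rating, avg) in enumerate(reviews, start=1):
        score, dev = review_credit(rating, avg)
        print(f"Review {i}: deviation {dev:.2f}, score {score:.2f} / 10")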

4.      Self Assessment (20% of the grade)

Finally, the student goes back and rates their own essay using the same criteria as above. The essay submitted by this student can be viewed through the link: Student pre-lab text submission.

 

Self-Assessment Stage    Rating    Peer Rating    Dev          Score
Key                                               2.00/3.00    20.00
Self-Assessment            5          5.00        0.00         20.00

 

At this grading level, the student's self-rating must fall within 2.0 points of the peer rating of their text to receive full credit, or within 3.0 points to receive half credit. This student correctly recognized the shortcomings of his own essay; an attempt to give himself a perfect score would have resulted in zero credit for this stage.
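The same rule can be expressed as a short Python sketch. The zero-credit case beyond 3.0 points is our assumption, based on the statement that a perfect-score attempt would earn nothing; this is an interpretation rather than CPR's own implementation.

    # Sketch of the self-assessment credit rule stated above: full credit within
    # 2.0 points of the peer rating, half credit within 3.0, otherwise zero
    # (the zero case is our assumption). Not the actual CPR implementation.

    def self_assessment_score(self_rating, peer_rating, max_points=20.0):
        deviation = abs(self_rating - peer_rating)
        if deviation <= 2.0:
            return max_points
        if deviation <= 3.0:
            return max_points / 2
        return 0.0

    print(self_assessment_score(5, 5.00))   # the student in the example: 20.0
    print(self_assessment_score(10, 5.00))  # claiming a perfect score: 0.0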

 

In total, the student examined seven essays on this pre-lab topic (three calibration essays, three peer essays, and their own). The CPR program tracks all of the timing and informs students of their progress. Most importantly, the instructor has CPR assign weighted grades to each part of the assignment (student text, calibrations, peer review and self-review). The program takes into account how well each student answers the content and style questions for all seven essays; it also shows students how close their ratings are to those of the peer group.
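Combining the four stages with the weights given in the section headings (text 20%, calibrations 30%, reviews 30%, self-assessment 20%), a hypothetical overall grade for the example student might be assembled as follows. The essay (text) score is an assumed value, and the exact way CPR normalizes each stage is our guess.

    # Hypothetical assembly of an overall CPR grade from the stage weights given
    # in the section headings (text 20%, calibrations 30%, reviews 30%,
    # self-assessment 20%). Stage scores (0-100 scale) use the example student:
    # calibrations 0 + 10 + 5 points of 30, reviews 30/30, self-assessment 20/20;
    # the text (essay) score is assumed for illustration.

    weights = {"text": 0.20, "calibrations": 0.30, "reviews": 0.30, "self": 0.20}
    stage_pct = {
        "text": 70.0,                   # assumed essay score (percent)
        "calibrations": 15 / 30 * 100,  # 0 + 10 + 5 points earned of 30
        "reviews": 30 / 30 * 100,       # three reviews at 10 points each
        "self": 20 / 20 * 100,          # full self-assessment credit
    }

    overall = sum(weights[k] * stage_pct[k] for k in weights)
    print(f"Overall CPR grade: {overall:.1f}%")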


III.    Evaluation

A.    Improve conceptual understanding

This first stage of our project (only 18 students) does not include a wide-ranging evaluation plan. We are still evaluating the best ways to measure the long-term project outcomes. We plan to measure the first outcome, improved conceptual understanding of introductory chemistry topics, through final exam questions and student-reported learning gains (see below). Most of the topics for this project (atomic absorption, atomic spectroscopy and instrument-based analysis of lab data) were not previously taught in detail, so it is difficult to establish baseline data from students. In addition, other students in the course are doing visible spectroscopy experiments, making direct comparison of gains somewhat difficult.

 

B.    Increase student interest and confidence

The second expected outcome, to increase student interest and confidence in chemistry, is measured by the on-line survey called Student Assessment of Learning Gains (SALG), developed by Elaine Seymour for the NSF Chemistry Coalition projects[23]. Closely related information is available at the FLAG (Field-Tested Learning Assessment Guide) site at the University of Wisconsin[24]. The SALG survey is designed to help faculty understand the effects of course innovations on students' learning. For example, it measures the relative impact of factors in the course on student perception of learning, understanding concepts, and improving specific skills. We added several questions to a general chemistry survey to probe student assessment of the writing done in the general chemistry course. In the fall of 2002, we gave this survey to three lecture sections taught by two professors (the response rate was about 80%). The pilot group of 18 CPR students was spread throughout these sections, and we were not able to identify individual responses. All students completed the same lab experiments and assignments except for the AAS experiment. That fall we also required students to use an on-line homework package[25] that could be influencing some of the responses (it was well received by the majority of students). Here are responses to selected questions in the SALG:


How much did each of the following aspects help your learning in the course?
(1 = no help, 2 = little, 3 = moderate, 4 = much, 5 = very much help)

                                                        Average (std dev)
                                                        Section 1 (n=40)    Sections 2/3 (n=62)
1. Pre-lab/post-lab CPR essays (Tues. PM lab only)*     3.58 (0.82)         3.07 (1.09)
2. Post-lab written reports (all students)              3.29 (1.07)         2.60 (1.07)
3. The pre-lab questions (all students)**               3.30 (0.90)         2.93 (1.17)

*More than 18 responses, indicating some responses by non-CPR students.
**All experiments required answers to pre-lab questions.

 

How much has this class added to your skills?
(1 = not at all, 2 = a little, 3 = somewhat, 4 = a lot, 5 = a great deal)

                                        Average (std dev)
                                        Section 1 (n=40)    Sections 2/3 (n=62)
4. Solving problems                     3.51 (1.08)         3.59 (1.08)
5. Writing lab reports or essays        2.56 (1.10)         2.33 (1.00)
6. Carrying out lab experiments         3.49 (0.81)         3.57 (1.08)
7. Finding trends in data               3.23 (0.95)         3.22 (0.97)

 

To what extent did you make gains as a result of what you did in this class?
(1 = not at all, 2 = a little, 3 = somewhat, 4 = a lot, 5 = a great deal)

                                                                      Average (std dev)
                                                                      Section 1 (n=40)    Sections 2/3 (n=62)
8. Understanding the main concepts                                    3.64 (0.83)         3.64 (0.91)
9. Understanding the relevance of this field to real world issues     3.56 (0.90)         3.46 (1.08)
10. Confidence in your ability to do this field                       3.59 (0.81)         2.97 (1.16)

 

Examination of the average response to question #5 for the entire class suggests a need for more guidance and practice in writing lab reports or essays. The more positive response to question #1 (CPR students) compared to #2 (all students) and #3 (all students) argues for an expansion of the CPR essay project to more students. 


C.    Develop student skills in reading for content and technical writing

The final project outcome, to develop student skills in reading for content and technical writing, is currently evaluated using the CPR results. We are attempting to develop short tests to evaluate reading-for-content and technical writing skills before and after a course that uses CPR. The three CPR assignments in this pilot study varied in length and difficulty, making cross-comparison difficult. However, the class average for peer-rated text improved from 5.6 out of 10 in the practice CPR to 6.1 in the post-lab CPR. This may indicate improvement in writing as measured by peers searching the text for content and style problems. However, the rating on the first CPR could be lower simply because expectations for a new type of assignment were unclear. More work is planned with the USF Expository Writing Department to evaluate these writing samples.

 

The student calibration scores give an indication of improvements in reading for content. The software-generated Reviewer Competency Index (RCI) is a measure of how well a student completes the calibrations. The class averages were 4.7, 4.4 and 5.1 for CPRs 1, 2 and 3, respectively. Again, one must be careful about claiming an improvement here, since one CPR may be more complex or longer than another. The average scores (after retakes) on the high, medium and low exemplars remained steady at 79-83% correct for content. There was a noticeable improvement in scores for style questions (from 71% on the first CPR to 85% on the third). The average scores for students completing the three assignments (using the least difficult grading criteria from the CPR software) were:

Average class grades

CPR 1: Atomic absorption and emission      82 (+/- 12)%
CPR 2: Pre-lab on how an AAS works         83 (+/- 7)%
CPR 3: Post-lab report on Pb in paint      89 (+/- 7)%


IV.  Future considerations

Crafting CPR assignments requires a considerable amount of faculty time. One needs to write clear learning goals, search for source material, and write calibration questions, calibration answers and calibration exemplars. The authors spent many hours searching the Internet for appropriate source material; our selection criteria became clear, concise scientific explanations and appropriate use of visual learning aids or animations (not just reproductions of textbooks). In our experience, it was much harder and more time-consuming to create the practice CPR described above, because it was not linked to a lab experiment. In this regard, the pre-lab and post-lab CPRs for the AAS experiment are more detailed than "normal" assignments of this type, but were easier to generate due to the clear goals of the given experiment. The real time savings come during the assignment itself: no faculty time was spent grading essays (this pilot project generated 54 student essays).

 

Another consideration for successful CPR projects is the amount of student training. We walked the class through the logon procedure and the on-line CPR tutorial using a multimedia computer lab with 20 networked computers. This step becomes problematic for 10 lab sections. We believe that some sort of in-lab computer training will be required. We also provided handouts to students on expectations for grading and the timing of assignments. The co-author (Gulsrud) was the TA for the lab section and she gave constant reminders of the due dates. Despite all of this, two of the 18 students submitted text for the practice CPR 1, but forgot to complete the calibrations on time (thus shutting them out from the rest of the assignment). Four students forgot to submit anything for the pre-lab CPR 2 and three students did not complete the post-lab CPR.


 

For these computer-based assignments, the student needs to access the software several different times over the course of a one-week CPR assignment. Computer access is not a problem on our campus; however, we need to take a hard look at the timing of the various steps. One solution is to insert an extra day or two beyond the text cutoff time before starting the calibrations. This would allow a late student to post the text (with an appropriate penalty) and participate in the remainder of the assignment. Ultimately, the student needs to take responsibility for completing the assignment.

 

Finally, the lead-in-paint AAS experiment was originally planned for two 3-hour lab sessions, but was condensed into a single 4-hour lab due to a restructured curriculum. The original plan was to help students plot their results (or rerun poorly prepared samples) in the second lab period. Instead, we had groups complete the experiment in one lab period, but with only one replicate of their paint sample and one replicate of an NBS standard sample. This led to large uncertainty in some group results. Student groups observed a demonstration of the AAS, but did not get hands-on use of the instrument. Instead, the TA loaded all the samples into the autosampler later in the day to obtain the results (six lab groups generated 42 samples). Students had to come back and pick up the raw results in order to make their calibration curves and calculate their answers.

 

The pilot phase of this project continues this spring with students in a second-year analytical chemistry course (20 students), who will get hands-on experience with the AAS. Current plans are to expand the lead-in-paint experiment to all lab sections of general chemistry in the fall of 2003. We conclude with an unsolicited student comment from the SALG survey:

 

"The calibrated peer review pre/post-lab essays were fun. I think that they better your understanding of what you are really doing because you have to explain it in a way to make others understand, which in return shows what you truly know. I would recommend this to anyone in any chem class/lab."

 

Copyright 2003 by Lawrence D. Margerum and Maren Gulsrud, all rights reserved.


 

V.   References

 

1.            Mazur, E., Conceptests. 1996, Englewood Cliffs, N. J.: Prentice-Hall.

2.           Pavelich, M.J. and M.R. Abraham, Guided inquiry laboratories for general chemistry students. J. Coll. Sci. Teach., 1977. 7(1): p. 23-6.

3.           Smith, M.E., C.C. Hinckley, and G.L. Volk, Cooperative learning in the undergraduate laboratory. J. Chem. Educ., 1991. 68(5): p. 413-15.

4.           Dougherty, R.C., et al., Cooperative learning and enhanced communication: effects on student performance, retention, and attitudes in general chemistry. J. Chem. Educ., 1995. 72(9): p. 793-7.

5.           Wenzel, T.J., Cooperative group learning in undergraduate analytical chemistry. Anal. Chem., 1998. 70(23): p. 790A-795A.

6.           Farrell, J.J., R.S. Moog, and J.N. Spencer, A guided inquiry general chemistry course. J. Chem. Educ., 1999. 76(4): p. 570-574.

7.           Bowen, C.W., A quantitative literature review of cooperative learning effects on high school and college chemistry achievement. J. Chem. Educ., 2000. 77(1): p. 116-119.

8.           Gosser, D.K., V.S. Strozak, and M.S. Cracolice, Peer-Led Team Learning: General Chemistry. 2001, Upper Saddle River, N. J.: Prentice-Hall.

9.           Zimmerman, S.S., Writing for chemistry. Food for thought must be appetizing. J. Chem. Educ., 1978. 55(11): p. 727.

10.         Cooper, M.M., Writing: An approach for large-enrollment chemistry courses. J. Chem. Educ., 1993. 70: p. 476.

11.         Russell, A.A., O.L. Chapman, and P.A. Wegner, Molecular science: network-deliverable curricula. J. Chem. Educ., 1998. 75(5): p. 578-579.

12.         Kovac, J. and D.W. Sherwood, Writing in chemistry: an effective learning tool. J. Chem. Educ., 1999. 76(10): p. 1399-1403.

13.         Feldman, S., V. Anderson, and L. Mangurian, Teaching Effective Scientific Writing. J. Coll. Sci. Teach., 2001. 30(7): p. 446.

14.         Stewart, J.L. and V.L. Wilkerson, ChemConnections: A Guide to Teaching with Modules. 2000, New York: John Wiley and Sons.

15.         http://cpr.molsci.ucla.edu/cpr_info/index.asp, Calibrated Peer Review, Last accessed February 14, 2003.

16.         Eisenberg, A., J. Chem. Educ., 1980. 59: p. 566.

17.         Boehnke, D.N. and R. Del Delumyea, Laboratory experiments in environmental chemistry. 2001, Upper Saddle River, N. J.: Prentice-Hall. 279.

18.         Markow, P.G., Determining the lead content of paint chips. An introduction to AAS. J. Chem. Educ., 1996. 73(2): p. 178-9.

19.         Shires, N., Teaching Writing in College Chemistry: A Practical Bibliography. J. Chem. Educ., 1991. 68: p. 476.

20.         Kovac, J. and D.W. Sherwood, Writing Across the Chemistry Curriculum: An Instructor's Handbook. 2001, Upper Saddle River, N. J.: Prentice Hall. 91.

21.         Vining, B., J. Kotz, and P. Harman, General Chemistry Interactive CD-ROM, Version 3.0. 2003, Brooks-Cole Publishing.

22.         http://cpr.molsci.ucla.edu/, Molecular Science Project, Last accessed February 15, 2003.

23.         http://www.wcer.wisc.edu/salgains/instructor/default.asp, Student Assessment of Learning Gains, Last accessed February 15, 2003.

24.         http://www.wcer.wisc.edu/nise/cl1/flag/tools/Tframe.asp, A Field-Tested Learning Assessment Guide (FLAG), Last accessed February 15, 2003.

25.         http://www.brookscole.com/chemistry_d/, OWL (Online Web-Based Learning), Last accessed February 15, 2003.
