
Building Student Confidence With Chemistry Computation


Peter R. Craig

11/08/17 to 11/16/17

I work at a liberal arts college as a chemistry professor.  Not educated in the US, I have been welcomed into my adopted culture through opportunities to teach that have allowed me to learn about a system I knew little about beforehand.  At the college level, chemistry can be portrayed as applied algebra.  In my experience, algebra takes on a whole new level of difficulty and significance for students when it underpins the comprehension of a subject for which it is a prerequisite.  This appears to be exacerbated by the following factors: a lack of continuity between high school algebra and chemistry prior to entering college, an increased emphasis on attaining confidence at the cost of learning content in high school, and the number of students gaining access to college who don’t know how to study.  This paper looks at attempts to redevelop the robustness of students’ chemistry read-only memory (ROM) – their ability to identify and apply appropriate computational methods to solve problems without much thinking or hesitation.  With the confidence of a reliable ROM, students are better able to learn chemistry at college.



Many students in my first-year college chemistry classes struggle to solve chemistry problems.  Sometimes it is not knowing where to start – what is the question actually asking?  At other times, it is deciding which data matter in formulating a response to a question.  Most vexing to me as the teacher are students who have a solution strategy but whose math computation skills let them down when they try to execute it.  Not lacking in confidence outside of chemistry, these students rationalize the situation by informing me that they never took high school chemistry or took it too long ago, and in many cases the same goes for mathematics.  However, when I show students the relevant math in the context of a chemistry question, they stumble through the computation even with the aid of a non-programmable calculator.  I characterize these as students with rusty ROMs – read-only memories, to borrow the language of computer science.  Since 2011 I have been experimenting with ways to make students’ ROMs more reliable so that they improve in their ability to identify and, more importantly, apply appropriate computational methods to solve problems without much thinking or hesitation.  Students trained through positive reinforcement solve chemistry problems when they have confidence in the tools at their disposal.

The main approach I have used is quite simple.  I have required students to have a math workbook in addition to a standard first-year college chemistry text for my classes.  Students who use the math workbook get to (re-)learn, in a fundamental way, math operations regularly applied in the context of chemistry that don’t require a calculator.  This forces them to engage their brains to connect the numbers and units to the math operations needed to solve questions, ideally in their heads.  Such operations include working with exponents, significant figures, and logarithms, and undertaking dimensional analysis.  Selected parts of the workbook are assigned as out-of-class work each week.  I have the students write up the assigned work in a notebook, and I provide an evening study session each week for students to do the work and get my help where needed.  At the end of the week, I collect the notebooks and provide a grade based on how complete the work is, not on whether it was done correctly.  While I am grading their notebooks, the students take a quiz containing a subset of the work assigned for that week – this is graded for working and correctness.  Such math work parallels class work involving PowerPoint presentations, demonstrations and related class activities.  In fact, taking a significant amount of problem solving out of regular class time frees up time to make the class work more interesting and memorable.  Since 2011 I have incorporated such math work in my chemistry classes in much the same way as described above.

So are the ROMs of the students less rusty as a result?  In the rest of this paper I report my findings in relation to student performance in the American Chemical Society (ACS) 2003 General Chemistry Examination. 


Materials and Methods

Students were required to purchase the math workbook “Calculations in Chemistry”, 1st edition, Dahm & Nelson, W.W. Norton publishers.  For years before the workbook was published, or for class topics not contained in the published workbook, related modules provided by were used instead.  The ACS 2003 General Chemistry Examination was obtained from the ACS Exams Institute and used as received.  Data were tabulated and graphed using Microsoft Excel.  The z-test (and associated p-value) calculator used for two population proportions can be found at
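The calculation behind such an online tool is the standard pooled two-proportion z-test.  As a minimal sketch (the counts below are hypothetical, chosen only to match the class sizes n = 114 and n = 46 mentioned later; the actual per-bracket counts are not given in the text):

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-tailed z-test for the difference of two population proportions.

    x1 successes out of n1 trials vs. x2 successes out of n2 trials.
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-tailed p from normal CDF
    return z, p_value

# Hypothetical example: 30 of 114 pre-2011 students vs 20 of 46 students in
# 2011 scoring within a given raw scoring bracket.
z, p = two_proportion_ztest(30, 114, 20, 46)
print(f"z = {z:.3f}, p = {p:.4f}")
```

A p-value below 0.05 from this test is what the figure captions refer to as a statistically significant difference between the two proportions for a bracket.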



Two research questions were posed.  Firstly, does a math emphasis (as described in the introduction) to the teaching of first-year college chemistry improve student test scores? 

I gave the same examination to students in my classes in 2006, 2007, 2008, 2010 and 2011.  From 2006-2010, no math emphasis was provided (n = 114 students).  In 2011, math emphasis was provided (n = 46 students).  The graph of data obtained is given below (Figure 1):

Figure 1: Impact of Math Emphasis on ACS 2003 General Chemistry Exam Results.  The difference between the two proportions at each raw scoring bracket is statistically significant at the x% confidence level (p-values determined for two proportions, two-tailed, at the 0.05 significance level).

After 2011 at McDaniel, first-year college students taking chemistry classes were partitioned into Introductory Chemistry (math emphasis included) and General Chemistry (math emphasis absent) sections.  The section types differed in that students qualifying for General Chemistry placed out of the arithmetic and algebra placement tests that all students take prior to entering the college.  Both types of sections followed the same set of chemistry topics; only the depth of coverage in General Chemistry was greater, owing to the documented higher math proficiency.

Secondly, does partitioning of first-year college chemistry students into Introductory (math emphasis) and General (no math emphasis) Chemistry sections improve test scores?

I continued to give the same examination to students in my classes in 2012, 2015 and 2017.  In 2011, I emphasized the math to all students I taught (n = 46 students).  For 2012-2017, a math emphasis was provided only to students in Introductory Chemistry sections I taught (n = 43 students).  The graph of data obtained is given below (Figure 2):

Figure 2: Impact of Partitioning Students into Introductory (Math Emphasis) and General (No Math Emphasis) Chemistry Sections on ACS 2003 General Chemistry Exam Results.  The difference between the two proportions at each raw scoring bracket is statistically significant at the x% confidence level (p-values determined for two proportions, two-tailed, at the 0.05 significance level).



In comparing the two histograms of data related to the impact of applying a math emphasis to teaching first-year chemistry students (Figure 1), the relative proportion of students occupying the higher raw scoring brackets clearly improved.  The gray boxes above the two-colored sets of data for each raw scoring bracket signify the confidence level at which the two proportions are statistically significantly different.  The raw scoring brackets where this is greatest are 31-35 (95%), 51-55 (91%) and 61-65 (88%; all out of 70 possible questions correct).  This indicates that teaching with a math emphasis in the manner described in the introduction has a positive impact on first-year chemistry student test scores.  It also suggests that it helps students who score in the 50% range as well as those scoring in the 75% or 90% range: it helps both struggling and strong students.

In comparing the two histograms of data related to the impact of partitioning first-year chemistry students into Introductory (math emphasis) and General (no math emphasis) Chemistry sections (Figure 2), the picture is also clear.  Proportionally more students score in the lower raw scoring brackets when in separate Introductory and General Chemistry sections.  This is particularly evident for the 26-30 and 31-35 raw scoring brackets.  The pattern reverses in the higher raw scoring brackets (especially 51-55 and 61-65).  This indicates that the students in General Chemistry who routinely score in these higher brackets would still benefit from a math emphasis in the teaching of their section.



Having a math emphasis in the teaching of first-year chemistry students clearly benefits them from a test score perspective.  Separating the students (based on math placement test scores attained when entering college) into Introductory (math emphasis) and General (no math emphasis) Chemistry sections doesn’t appear to improve the situation.  These results call into question the benefit of partitioning students into the two types of sections offered at McDaniel, or at least suggest that another metric be used to do the partitioning.  A chemistry placement test may be a better option.  The results also suggest that even first-year chemistry students proficient in math can still benefit from a math emphasis in their learning that will make their ROM more reliable.




Dr. Craig –

1. To ask about your teaching situation (which may be different from that of faculty at larger schools):

a.) In many schools, students will have a different instructor for the first and second semester of general chemistry. Did you in large measure teach the same students for both semesters of first-year chemistry?

b.) Were the ACS scores you reported from a first-semester exam or a full-year exam?

2. Studies by Diane Bunce et al. in JChemEd have encouraged “frequent quizzes” as a way to encourage spaced study (“distributed practice”) rather than “massed practice” (cramming). Your system also included weekly quizzes (as well as help sessions and a reviewed problem notebook). Did you find good rates of on-time homework completion? Was there a difference in the homework completion rates in the non-partition and after-partition classes?

3. Figure 1 shows significant gains in ACS exam scores in 2011 -- the year of the “math review for everyone.” How likely would you say it is that the intervention produced the improvement, rather than a change in the background of the 2011 students compared to prior years? Was there a difference in dropout rates during the course in 2011 (when weekly quizzes on homework started) versus the prior years?

-- rick nelson

One thing that I have noticed with all the papers in this ConfChem is that we are looking for data to support changes in the curriculum and provide help for our students. This is an excellent policy.  Yes, we who have taught large numbers of students over many decades see things and know things, but it is the data supporting our findings that are important. Sharing our knowledge with other ChemEd researchers is one thing, but how do we share with the admin types that actually make decisions--even in some cases, the Chair of the department--and make effective changes?

I think it comes down to taking what you understand from years of experience and putting it to the test: statistical tests of hypotheses. Once you have the data, take them to the first level of people who can be convinced by reasonable data to take teaching in a different direction. For example, as a result of this paper, our (albeit small) department is revisiting chemistry placement options for incoming students, and examining whether what we offer our students as first-year chemistry choices meets their and our perceptions of what is best. I have found that, being scientists, presenting data to other scientists who are department chairs is a stepping stone to convincing administrators higher up to consider change.

1 a) It depends on the teaching demands for the semester. As I teach on a 3:3 fall:spring load, with two of my fall courses taken up by a sophomore-level analytical class and its lab sections (and the situation mirrors itself in the spring with inorganic chemistry), I seldom get the chance to teach the same group of students for the entire two-semester first-year chemistry sequence (as at the bigger schools).

1 b) The ACS exam results were always from the end of the second semester of the first-year chemistry sequence, which for me is in the spring.

2. Homework completion wasn't usually an issue, as students would get points merely for submitting work that I examined for completeness while they took their Friday quiz. I’d say that students submitted completed homework 90% of the time. The after-partition classes did their homework at comparable rates of completeness.

3. I think that the intervention produced the improvement. I did not ignore math in my teaching prior to providing a math emphasis in 2011; the students and I did problems related to class topics, akin to many other first-year chemistry classes. But we never directly examined the basics beyond re-emphasizing log rules, for example, when they were needed. I wouldn’t say dropout rates changed much upon adopting a deliberate math emphasis – but the students who stayed appeared to have more awareness of the math they needed to undertake, and thus less anxiety about what they were expected to do.

To me, these two words "Friday quiz" are painful.  My own academic experience, as a pupil in school and as a student at undergraduate and post-graduate levels in a superior Canadian university some decades ago, culminated in examinations at the end of an academic year.  In universities in Germany, the tradition in former decades was for students to seek examination at the end of two-year increments in their progression through the degree programmes.  The preparation for those examinations was a serious effort, not just the night before the event but through days and weeks of focused study, beyond the nightly review of that day's lecture notes and relevant textbooks.  My recent experience with courses in which a "Friday quiz" or equivalent is administered is that such episodes interfere with an enduring understanding of the course material in favour of a necessity to recall what the instructor said a few days ago.  A colleague even boasted that he administered a quiz at the beginning of each lecture period because only that way could he be sure that the students had 'understood' the preceding lecture.  Instructors of general chemistry in this conference have complained that their students have not learned, or are unable to recall, basic mathematics, or even arithmetic, that is essential for the solution of chemical problems within general chemistry; I have not noticed analogous complaints about the chemistry that their students would be supposed to have learned in school, likely both because the topic of this conference is mathematics and because general chemistry at the tertiary level typically makes little demand on preceding knowledge in chemistry.  The cumulative results of these "quizzes" might produce a (small) failure rate to satisfy administrators that the instructors are competent.
What instructors of general chemistry might not like to know is that instructors of subsequent courses in chemistry can be almost as vocal in their complaints about the knowledge of students who have been given credit for general chemistry, although for ethical reasons involving colleagues in the same department perhaps the latter complaints are not as strident as what I read here.

In this vein I relate an experience of a senior colleague in organic chemistry -- one who has no hesitation, in private conversation with me, in deriding his students' lack of understanding of the content of prerequisite general chemistry.  According to a typical pattern, he administered several tests during the semester in Organic Chemistry I, which provided the basis for the grade of the course.  At the first meeting of Organic Chemistry II he administered, without warning, a simple test covering what he had taught in the preceding course, of which the passing students had 'proven' their knowledge.  The results were disastrous, with a few exceptions.  These exceptions were some students who had originally failed Organic Chemistry I on the basis of their poor scores on the several tests, but who were permitted to take a supplementary examination after the semester covering the entire content of that course.  So the failing students succeeded whereas the successful students failed!  Do you comprehend my pain with "Friday quiz"?

rpendarvis:

In my many years of teaching organic chemistry at the College of Central Florida, I made it a common habit to have a quiz from the classroom material at the beginning of the laboratory period.  Sometimes I could get one of the faculty to monitor the lab while everyone had a reflux going and give it during that period.  Although I have no "control group" to compare with, it seems that frequent quizzes do have positive results.  Generally, more than half my class would surpass the 50th percentile on the ACS Standardized Final for Organic Chemistry.  I did the same thing with my Gen Chem classes but did not get that level of results.  Perhaps the second year in chemistry makes students more serious.  I really do not know.

Gregorius:

I tend to view end of topic quizzes, "Friday quizzes" if you will, as summative, more for grading purposes, and, coming at the end of a knowledge unit, of little use to advance student learning. However, I also provide group work, homework, recitation, etc. which can also be graded, but more for the purpose of giving students the opportunity to self-assess their understandings, correct misconceptions, and determine what in their knowledge needs shoring up.

I've found that if I do away with or reduce the amount of these formative assessments, there is a consequent poorer performance on summative assessments. Likewise, if I do away with summative quizzes, performance on the midterm exams suffers.

Although I've tried to make the formative assessments qualitatively different from summative questions, I can't be certain (and I don't know how to evaluate) whether the students are simply using the formative exercises as practice/drills for the summative quizzes or are using the mid-week exercises, as they were intended, to further their knowledge. The "Friday (summative) quizzes" are certainly being used as practice for the midterm exams, but I'm hoping that the midweek exercises are formative.

I feel your pain. I really do. When I started teaching in the US (not educated in the US at all, other than postdoc training), I was shocked that students expected to engage in "homework" at college - I thought it was something that ended at high school. It did for me. You could always do some on your own, but that was the basis of the tertiary education system - if you wanted help you could get it, or create it by doing exercises for yourself: you had to be much more of a self-driven student to succeed in that system. I reluctantly got used to this busy work and the expectation students had of doing it. An aspect of this was quizzes, or some sort of regular repeated assessment.

I have been concerned that what quizzes do is train the mind to have a memory only a week long. But basically I have learned to work within the system I have joined, with occasional outbursts in classes to bring in external, sometimes alien, ways to learn and to reinforce that learning in chemistry.

Layne Morsch:

There is actually a lot known about "The Testing Effect"; there are many journal articles you could look up, but a nice place to start is Chapter 2 of Make It Stick, by Brown, Roediger and McDaniel, titled "To Learn, Retrieve". They cite many studies showing that more frequent quizzing interrupts the forgetting process and leads to better retention.

I’d join Dr. Morsch in recommending the book “Make It Stick” on the science of learning. A review from the Chronicle of Higher Education is available at

A 9-page article PDF summarizing many of the “Make It Stick” findings, titled

Optimizing Learning in College

includes a one-page list of study tips for students. A free download is at

From the abstract:

“Every fall, thousands of college students begin their first college courses, often in large lecture settings. Many students, even those who work hard, flounder. What should students be doing differently? Drawing on research in cognitive psychology and our experience as educators, we provide suggestions about how students should approach taking a course in college. We discuss time management techniques, identify the ineffective study strategies students often use, and suggest more effective strategies based on research in the lab and the classroom. In particular, we advise students to space their study sessions on a topic and to quiz themselves, as well as using other active learning strategies while reading. Our goal was to provide a framework for students to succeed in college classes.”

-- rick nelson

Rich Messeder:

Friday quiz, or quizzes versus accumulated exams
To begin with, I will say again that I am a physicist. I have helped the odd chemistry student here and there as circumstances presented, and I assisted Cary a few times in one of his chem classes at university. Apart from context, undergrad physics math, chem math, mech eng math, etc., all look very similar to me.

One of the values that I am taking from this conference is reference materials. I am very interested in seeing other perspectives on what is needed and how to help students.

And I'll apologize right up front for being wordy. I hail from a story-telling culture, and being wordy comes all too naturally.

A few years ago, a nurse came to me for math help. She was a top performer, and thought of herself as one. Not arrogantly, but good self-image. She garnered ~3.95/4.0 GPA getting her BS. But she had been shifted (or so I recall) to ER work, and, under the gun, she needed to do concentration calculations. She was (naturally) afraid of getting it wrong. So I helped with her math. Basic math. How did she manage a 3.95 GPA without knowing basic chem math? I'll leave that for now, but the answer is really embedded in every comment I see in this conference.

Quizzes: The good, the bad, and the ugly. Before I had my first teaching experiences (simultaneously part-time at a HS and a college), I had the benefit of being the brother-in-law of an outstanding HS teacher. When she died unexpectedly a decade ago, hundreds of parents and students turned out for her memorial. Two lessons she taught me were:

1. Students need to see material repeatedly. Her students knew that at any time an exam might ask a question about material presented earlier in the year, or even previous years. The parallels in life are so strong that I am surprised that academia has shifted from this. At university, I repeatedly see students talking very frankly about studying for the short term, and cramming to pass an exam. I know top students who brag that they don't even crack a textbook, because they will be spoon fed the material in class.

2. Don't back down. Parents and staff will come whining to you that their child is not getting the grade they "deserve". Make darn sure that you are testing fairly and grading fairly (this requires a teacher to know their material, and I am conscious here of the comments on an earlier thread about grading TEACHERS' quals), and then stand up to everyone. When I did this in HS, I soon had a signup list outside my door that was longer than I could service. Why? Because those students (the great majority in my classes) WANTED to do better. They could tell immediately that they were being held to a high standard, and that the payoff was a better education for the time spent.

Follow on: I heard from some students about a math professor at a university. Every comment was complimentary (or at least not negative). They also commented that she didn't hand out end-of-term student appraisal forms, and I was under the impression that they are required at that university. So I looked her up for a chat. During the discussion, I asked about the forms. She confirmed that she did not hand them out. I asked why. She said that it was her job to be an excellent teacher, and students were not qualified to evaluate her. You can imagine that I agree. Or mostly agree. I also knew of a physics professor whose classes I had sat in on to widen my experiences. I thought that he did an excellent job. But he later told me that he had gotten dismal appraisals from those very students --- junior physics students --- all because his exams asked them to apply what he taught, rather than just write down a few equations, and do some calculator math.

So while I see the value of the cumulative exams John mentioned, I can also see the value of quizzes and exams. But I strongly oppose the type of quiz or exam that promotes cramming. I have seen way too many ostensibly "top" students who couldn't remember what had been taught even a month earlier, and all the while pulling down 3.5+ GPAs.

Along the lines of: quizzes need to repeat material, students need to repeatedly access material to cement it in place, and faculty don't have time for all this nonsense, how about using computers for what they do best? Require students to put in time on automated quizzing. One approach is to have the program keep track of right and wrong answers to the end that the program will randomly present questions answered incorrectly more often than those answered correctly. I believe that programs such as ALEKS do this now. It is really not difficult to compose such a program...the harder task is to populate the question and answer base. If rote math is being tested, all that is needed is a timeout to enter the answer, not multiple choice. The computer rooms need to be monitored to make sure that calculators are not used, if that is the intent, because cheating is all too prevalent on US campuses these days. Here, though "grades" are recorded so that faculty can see how students are progressing, the programs are really "teaching" programs. I see this approach as worth considering these days, because the general agreement here seems to be that /remedial math practice/ is required to make up for deficiencies at the secondary level.
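The weighted-selection idea above can be sketched briefly.  This is a hypothetical illustration only (the class name, weight values, and multipliers are all my own assumptions, not taken from ALEKS or any real product): missed questions have their sampling weight raised, mastered ones lowered.

```python
import random

class DrillBank:
    """Hypothetical adaptive drill scheduler: questions answered incorrectly
    are drawn more often than those answered correctly."""

    def __init__(self, questions):
        # every question starts with equal weight
        self.weights = {q: 1.0 for q in questions}

    def next_question(self, rng=random):
        # weighted random draw: higher weight -> more likely to be presented
        qs = list(self.weights)
        return rng.choices(qs, weights=[self.weights[q] for q in qs], k=1)[0]

    def record(self, question, correct):
        # a wrong answer doubles the weight (capped), a right answer halves it
        if correct:
            self.weights[question] = max(0.25, self.weights[question] * 0.5)
        else:
            self.weights[question] = min(8.0, self.weights[question] * 2.0)

bank = DrillBank(["log rules", "exponents", "significant figures"])
bank.record("log rules", correct=False)        # missed item: weight rises to 2.0
bank.record("significant figures", correct=True)  # mastered item: weight drops to 0.5
```

Populating the question-and-answer base remains the hard part, as noted above; the scheduling itself is only a few lines.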

A question for Dr. Mason: In “First Flight,” how many hours were spent in prep for gen chem, and how much of that was prep for “math without a calculator?”

A tougher question for Dr. Craig:

“Tracking” is a much studied and much debated practice. Students scheduled into a “lower track” may develop expectations about their performance before the teacher even enters the room. Studies also suggest that all sections need some “sparkplugs” that encourage others to keep up with a fast pace.

At the same time, if students do have two significantly different levels of preparation, offering different sections may allow the different levels to better be addressed.

Not an easy issue. But in your view, IF a placement test could accurately separate those who need some math review from those who need quite a bit of math review, for your population, do students divide into “two levels” of background such that it would likely be better to keep the two partitions and give each perhaps a different amount or type of math review?

Or is the spectrum of your student backgrounds narrow enough that it makes more sense to combine the two partitions and offer everyone the same math review, as was done in 2011 (and achieved notable results)?

-- rick nelson

We live in a society where there are perhaps too many venues for education (particularly tertiary), so prospective students are courted by many colleges and universities before they choose one from many. There are inducements to consider - the sticker price of the college/university (which may provide a perception of the relative value/reputation of the options available), scholarships to discount that sticker price that may extend into the sporting realm, and resources (such as additional academic support) put forward to help a student succeed at a particular college/university over the other options. It is not appropriate for me to address the question of whether all students who seek a college education should get one - in a competitive marketplace such as that in the US there is a venue for everyone. Market forces see to that. What I can try to address is the issue facing many colleges and universities: the admittance of increasing numbers of students lacking certain skills - in this forum's case the emphasis is on mathematics.

If a placement test were able to accurately discern between students who need a math refresher and those who need a significant boost to their math understanding to succeed in first-year chemistry, I think that even at the small college where I work there would be enough students in each camp to benefit from a tailored form of math re-education. Obviously such a decision would be very dependent on the math background trends of incoming students at your institution, and on your institution's ability to resource sections of classes/labs catering to the needs of the different student groups. But many colleges/universities succeed and create competitive advantage by meeting students where they are when they come in the door and motivating them to go further. In our case, we are seeing increasing numbers of students who need more than a math refresher, and for this coming spring semester we are offering a special topics class that is essentially an introduction to chemistry, to get students ready to cope better with the first-year chemistry courses they would take subsequently. Ideally we would like to offer a matching section of students who could, as in second-language departments, do an accelerated form of first-year chemistry such that they get a year's education in a semester. However, the limited high school laboratory exposure students obtain may make them too inexperienced to succeed in subsequent chemistry courses. More realistically, we would probably not have enough students prepared to take so intense a course if offered - but some colleges/universities might, and perhaps already do.

The likely outcome in my college's situation is that, whatever metric we use to assess incoming students for chemistry section placement, students who score below a certain threshold will take the introduction to chemistry course before tackling a two-semester introductory/general chemistry sequence. Students who score above the threshold would go straight into this sequence. Students who are particularly well prepared and do exceedingly well on the placement metric could be offered a place in a sophomore-level chemistry course. This would shorten their major significantly.

There are open questions that I can't answer (yet). For a student placed into the introduction to chemistry course: should they obtain college credit for it? And if this is effectively a pre-college course done at college, can they realistically graduate with a chemistry major in four years if this is their starting place?

We looked at a prep class for our gen chem sequence and decided against it, largely due to results in the literature stating it didn't have the desired effect (e.g., J. Chem. Educ. 2005, 82, 125). We changed the prerequisite for the course to require students to have completed college algebra with a C or better, or to place into pre-calc first. This decision was based on a correlation in our students between those who failed algebra and those who failed chemistry. We have seen a reduction in DWF rates from this, but I would attribute that to no longer having that selection of students in the class in the first place, as I am still regularly confronted with students who struggle to perform even basic algebraic tasks quickly and correctly.

rpendarvis's picture

At the College of Central Florida, we had a math prerequisite, but it was often ignored by counseling, and it did not seem to be that effective.  We also had a preparatory chemistry class for many years; those who took it generally were much better prepared than those who did not.  In fact, about half of the class was students taking it to meet requirements for various 2-year degrees.  We gave the Toledo placement test and had developed statistics showing that 80% of the students below a minimum score would be in the DWF category in General Chemistry I.  We wanted to get the minimum score written into the prerequisites because we thought it was terrible to put unprepared students in such a stressful and hopeless situation.  The administration seemed to be more concerned about other things, possibly enrollment numbers.

Rich Messeder's picture

+1 "We wanted to get the minimum score written into the prerequisites because we thought it was terrible to put unprepared students in such a stressful and hopeless situation."

I feel strongly that moral leadership in academia is largely absent (it seems to vary by school and by individual faculty), and that a great disservice, if not outright injustice, is done to students by letting them take courses where it is likely they will not do well. One of my primary goals in teaching is to have students leave my classes with a sense of confidence that they are successful, not because they got a good grade, but because they met the challenge fairly and were able to put their capacities and capabilities in perspective. Not every student is an A+ student, but every student needs to know where they honestly stand. I see too many students with pain on their faces as they struggle with material for which they are unprepared, yet they think that putting off for a year a class that they need for the schedule progression is too humiliating. Some students struggling thus have told me that their parents would never sanction a delay.

The sessions ran about 3 h/day for 4 days, ALL without the aid of a calculator! I wish the sample size were bigger, but by the coming summer we will have more data, and maybe then we will be able to exert a greater influence on First Flight.

What grade on the Toledo test was used as a cut-off between the prep chem course and gen chem?  


rpendarvis's picture

I last worked there over 10 years ago so I cannot remember the exact number very well.  I believe it may have been a raw score of something like 27-29 but we were using a very early version of the Toledo and I cannot remember which year it was.

Wish I could remember more.



Apart from any other aspects, using time for a 'quiz' within a lecture period, or even within a laboratory session, detracts from other uses of that time.

My devious mind wonders about a possible correlation between the perceived stature of post-secondary institutions in the USA and the tendency to impose frequent quiz evaluation in chemistry.  Does anybody have an idea about the present use of such frequent quizzes at Harvard or Stanford, for instance?  I am aware that some participants in this discussion might have obtained advanced degrees in elite institutions some decades ago, but my interest is in present practice.

We might also distinguish between evaluation based on frequent quizzes and on frequent assignment of exercises or problems to be undertaken by students outside class hours.  Such assignments might be more tedious for the instructors to administer, but more beneficial for the students.

I have tried to connect the work students do on math-workbook exercises outside of class to class time through the Friday quizzes taken in class. The next week they do a MasteringChemistry problem set outside of class that uses the math from the previous week's workbook exercises. After a number of repeated cycles, this culminates in a test. The goal is that students gain reinforcing experience in using math in chemistry, so that their confidence and ability improve.

Cary Kilner's picture

Wow! We’re certainly entertaining a wide variety of issues and options here! From my experience as a HS, private summer-school, community college, and 4-year college instructor, might I respectfully offer these suggestions drawn from the Chem-Math Project.


First I shall address the intervention. A larger university has the opportunity to offer an intro-to-chem course for the woefully underprepared alongside the gen-chem track. I would suggest that most schools would not give college credit for it for STEM majors, but perhaps would do so as a gen-ed for liberal-science majors. If the majors end up on the five-year plan, so be it. What is so sacrosanct about four years? Don’t we want well-prepared graduates? This system is the only fair way to give students majoring in engineering and the physical sciences the best value for their tuition dollars. When properly prepared and ready, they will be exposed to a genuine, rigorous, mathematics-driven general-chemistry course. There will also be students who are able but who had a disastrous HS experience; for these students, an assigned recitation or accompanying Q-course would be appropriate.


We cannot arbitrarily exclude students from gen-chem who did not have HS college-prep chemistry, because I have had such students who were precocious and/or had a very good physical-science course and who subsequently excelled going directly into gen-chem.


Then we have the life-science students, some of whom would have a GOB course for a year. I certainly don’t see the point of trying to cram G, O, and B into one semester, although many institutions do this! This would certainly foster memorization over true learning!


Regarding diagnosis, there would/could certainly be the precalculus placement test (or SAT-math) for assessing formal-math.

A chem-math instrument would contain ratios and proportions with units, whole-number exponents, scientific notation, unit-analysis for converting units and unitary-rates, and other aspects of chem-math as explicated in my Chem-Math Units (q.v.), i.e. chemistry applications of arithmetic, prealgebra, and algebra. The results from these two sources would measure numeracy.

The GALT or TOLT test could be used to measure the Piagetian stage of intellectual development and assess for formal-reasoning attributes.

And the Toledo test, California test, or one written by the instructor(s), could be used to assess for prior exposure and understanding of physical-science and chemistry concepts.


These four instruments should give a reliable and valid assessment of a student’s readiness for gen-chem. What remains is to find a way to integrate them into an effective placement score. In my Chem-Math Project research, I found structural equation modeling to be a useful tool as it allows the researcher to create a model for the mix of these four indicators he/she seeks in students for his/her courses.


Finally, the size of the tertiary institution would be a factor in what kind of assistance would be viable. And the freedom of the instructor(s) vis-à-vis the administration would be the final barrier to confront in solving the issue of matriculating underprepared students through a proposed program.

At my institution, we are now extremely focused on student retention and graduation rates, which used to be pretty abysmal.  If we accepted students and told them they would take 5 years to graduate, they would never matriculate with us.  I doubt we could even get them to come for the summer before freshman year for a program even if it was free, because they would be losing their summer job income.  We are stuck with trying to remediate their weaknesses during the semester.

About a decade ago, we changed our requirements for freshman chemistry (both the science/engineering 2 semester sequence and the 1-semester GOB for nurses, and when we added a 2-semester sequence for non-science pre-physical therapy majors we included them): all freshman chemistry students must have received a satisfactory score on the math SAT to be able to take freshman chemistry their freshman fall semester.  If they didn't, they would have to take a remedial math course, Math 101.  (Only 1 remedial course can count towards graduation).  That did improve our DFW rate somewhat, but we still struggle with students who are woefully unprepared and don't get what they need from Math 101 (see below).

We have been running a pilot program since 2011 to put Math 101 science majors in a cohort where we try to address these deficiencies.  There is way too much information for me to include here, but we have learned that Math 101 is too advanced (it does not address the freshman chemistry skills that have already been enumerated here over the past couple weeks).  We are now considering making Math 101 a 4-credit course for these particular freshmen with the 4th hour designed to work on fractions, percentages, solving for x in the denominator, using calculators, etc.  But we have also found that there are plenty of other deficiencies, particularly in reading, that also need to be addressed, and also systemic issues like family problems and insufficient funds to purchase textbooks and other supplies.

I could go on, but I will end with one comment on the 1-semester GOB for nurses.  It is a tremendous amount to throw at our students in one semester, but the nursing curriculum does not have room for additional chemistry (at least not chemistry taught by the chemistry department).  We tried a couple of integrated one-semester texts over the last 6-8 years, and we have been asked by our Nursing department to go back to the highly mathematical traditional general chemistry portion because the students will simply not make it through the program if their quantitative reasoning is weak.

I would like to thank everyone for some tremendously interesting papers and subsequent discussions.  If I have time, I may chime in more about calculators!

I do hope we can hear from Dr. Martin on calculators, and maybe flashcards? Either this week or next -- when we revisit the impact of the NCTM standards on quantitative reasoning? A special thanks for conveying the views from the School of Nursing.

-- rick nelson

And I totally forgot to mention that student financial aid ends after 8 semesters.  How will a student pay for a 5th year of college? 

I've looked at several of the one-semester GOB textbooks and didn't find a solution I liked to get what the students needed for the nursing program.  As a result, I have assembled/written a text on LibreText.  It let me focus on skills they needed and leave out non-essential content such as nomenclature of inorganic compounds.  You can see the topics covered in the book at 

FYI - I'm in the process of rearranging some content in the last few chapters. Also, I have it set up to cover one chapter per week so exam week chapters are shorter.

Math is one of many issues that students have, and trying to overcome issues like math and reading is not easy.  One approach is to see which courses these students are taking along with chemistry and try to embed material that addresses the deficiencies.  At my institution, my first experiment (called boot camp) is based partly on using Excel.  Most of my students are also taking biology, and the first biology lab also involves teaching them to use Excel.  In this way, you can divide up the work and also reinforce ideas.  This approach depends upon instructors working together for the common good.

This is strictly anecdotal evidence, but we require our Gen Chem I lab students to use Excel extensively, and it is surprising how many students have no experience with Excel.  Colleagues have commented to me that they feel student knowledge of Excel has actually decreased over the last few years.  We feel this is an extremely important tool for students to learn, so we actually do not allow the use of calculators in the chem lab.

We have not tried to do any remediation - for the most part there is usually at least one student in each lab group who has some proficiency - but when this is the case, we often find that the proficient student does the calculations all semester and the others in the group struggle on their lab exams and practical.  We try to minimize this by rotating lab roles, per the POGIL model, but I am not convinced we do as good a job of this as we need to.

I will say that when we do integrated rate laws in the second semester, I am pretty jealous when the students can calculate the zero-, first- and second-order data points and plot them in a span of 5 minutes, when I had to do the calculations either long-hand, with a slide rule, or with log tables, and then had to draw each plot on graph paper and calculate a slope manually!  As I recall, it was a week of work outside of lab to crunch the numbers.  So I am all for having these tools, but I am concerned about the numeracy (or lack thereof) of students who have never had to do much manipulation manually.  This brings me to a question that came to mind recently - are current college students so habituated to getting an answer in a matter of seconds that they do not have the ability to concentrate on problem-solving that requires minutes of mental effort?
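For readers who have not set up this exercise, the linearization the students carry out in Excel can be sketched in a few lines of Python. The rate constant, initial concentration, and time points below are illustrative values, not data from any actual lab:

```python
import numpy as np

# Hypothetical concentration-time data for a first-order decay
# (k = 0.05 s^-1, [A]0 = 1.0 M): illustrative values only.
t = np.arange(0.0, 60.0, 5.0)
A = 1.0 * np.exp(-0.05 * t)

def r_squared(x, y):
    """Coefficient of determination for a straight-line fit of y versus x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    return 1.0 - np.sum(residuals**2) / np.sum((y - y.mean())**2)

# The three standard linearizations students plot:
fits = {
    "zero":   r_squared(t, A),          # [A]   vs t
    "first":  r_squared(t, np.log(A)),  # ln[A] vs t
    "second": r_squared(t, 1.0 / A),    # 1/[A] vs t
}
best = max(fits, key=fits.get)
print(best)  # prints "first": the ln[A] vs t plot is the straight line
```

Plotting each transformed column against t in Excel and picking the straightest line is the same test; the r² that Excel's trendline reports plays the role of `r_squared` here.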

Excel and calculators are tools.  You would like the students to know how to use the tools, and you would also like them to understand how the tools produce the results.  If Excel is used to plot a graph, the student should know how to plot the graph and what information can be obtained from it.  The problem is that students struggle with usage of the tools; the earlier paper on how to use the calculator would help them there.  Unfortunately, students want answers in seconds.  I believe this is because students were told in their pre-college years to focus on the answers and not on the process of getting the answer.

Would anyone like to make a list of the skills you would like to see as prior knowledge for gen chem I?

Tony Mitchell's picture

Back in 1989 and 1991, I published results on this topic
"What Do Instructors Expect from Beginning Chemistry Students? Part 2", Journal of Chemical Education, 68, 1991, 116
"What Do Instructors Expect from Beginning Chemistry Students? Part 1", Journal of Chemical Education, 66, 1989, 562
I know there was at least one article published after mine (because it referenced them). As my own research shifted, I don't know what other individuals have done in this area, but these two articles should give a starting point.

And I would be interested in the study, both in terms of what is happening now and how this has or has not changed in the past 30 years.

Cary Kilner's picture

Somewhere in my notes I have such a list that I have studiously made up for some other purpose. Meanwhile I suggest that you go through my 27 Chem-Math Units and decide which of those skills YOU want to see in matriculating students. It totally depends upon the rigor of a given gen-chem course what its students should have “under their belts.” Were I to be teaching a gen-chem course for majors I would expect some expertise in ALL 27 Units! But this would certainly be unrealistic for many students coming into large state universities.

A colleague of mine has found that Google Sheets (and Docs) offer the significant advantage over Excel (and Word) that students can collaborate with each other and the instructor can offer real time guidance.

The list provided in the 1991 J Chem Ed article covers most of what is needed.  In my first year class, students struggle with reading, math, studying, note taking, using calculators and programs like Excel, time management, perseverance.  Students think that studying is memorization.  The process to get an answer is not as important to students as getting the right answer even if it means just following what they find on Yahoo answers.

Dr. Parshotam -

If you decided you wanted to be able to write chemistry articles in Chinese characters, how would you learn to do so? How much of your initial learning would be memorization?

For students starting out in learning to solve chemistry problems, how is it different from learning to write in a very unfamiliar language?

-- rick nelson

Dr. Craig observed that “[s]tudents in General Chemistry who routinely score in these higher [ACS] raw scoring brackets would still benefit from a math emphasis to the teaching in their section.”

That was also reported by Don Dahm using the same “just in time math review tutorials” at Rowan University. In an “Advanced Chemistry” course that taught two semesters of General Chemistry on the schedule of a one-semester course (primarily to students admitted to the Engineering School), scores for the one-semester course averaged at the 63rd percentile on the ACS Two Semester General Chemistry Examination (see Paper #4, November 2009 ConfChem). Anecdotally, students reported a positive impact on their work in subsequent courses.

When the lessons were used with sections of “regular” two semester general chemistry (primarily for bio/health sciences students), the assessment was that the majority benefited, but not as high a percentage as was the case for the mostly engineers. Even students who had not scored well on an initial test of “mental math fundamentals,” if they did the homework tutorials on time and came to the help sessions, earned good grades in the “standard paced” course, but for whatever reasons, the percentage who did so was lower than for the students admitted to the Engineering School.

Among the current generation of students, the “math review during chem” helped both groups, but helped a higher percentage in the group with the better math background.

-- rick nelson

I agree with you, Rick, that when learning something new you are going to memorize the characters and words.  The problem arises when you do not move beyond memorization.  For example, students can memorize how to calculate an amount in moles from a mass and also how to use stoichiometric coefficients in a balanced reaction.  As an instructor you can give a reaction equation and the mass of one of the reactants and show the students how to calculate the mass of a product obtained from that reaction.  This is no different from giving the students the mass of the product instead of the reactant and asking them to calculate the mass of reactant required: you are still using stoichiometric coefficients and mass-to-mole conversions.  All the student is doing is applying what they have memorized to a new situation.  Yet most of my students view the two stoichiometry questions as different and memorize each step of each calculation separately.  This year I have worked out more examples in class, trying to show the students that stoichiometry questions involve applying the same concepts they have seen before.  I will evaluate whether my change in delivery made a difference for the students.
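The symmetry described above can be made concrete with a short sketch: one function covers both directions of the mass-to-mass calculation. The reaction and numbers below are illustrative, not taken from the original post:

```python
# Illustrative reaction: 2 H2 + O2 -> 2 H2O (hypothetical example)
MOLAR_MASS_G_MOL = {"H2": 2.016, "O2": 32.00, "H2O": 18.02}
COEFFICIENT = {"H2": 2, "O2": 1, "H2O": 2}

def convert_mass(known, known_mass_g, wanted):
    """mass -> moles -> mole ratio -> mass, in either direction."""
    moles_known = known_mass_g / MOLAR_MASS_G_MOL[known]
    moles_wanted = moles_known * COEFFICIENT[wanted] / COEFFICIENT[known]
    return moles_wanted * MOLAR_MASS_G_MOL[wanted]

# Forward: mass of H2 reactant -> mass of H2O product
print(round(convert_mass("H2", 4.032, "H2O"), 2))  # 36.04
# Reverse: mass of H2O product -> mass of H2 required: same function
print(round(convert_mass("H2O", 36.04, "H2"), 2))  # 4.03
```

That the same three-step function answers both questions is exactly the point: the student who has learned the algorithm, rather than memorizing each worked example, sees one problem type where the memorizer sees two.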


Hi Umesh:

I agree with you; learning goes way beyond memorizing content. In my classes, for example while solving Boyle's law problems, I present a random temperature with the first part. The students get very perplexed by the problem and start using the temperature randomly. In my opinion, they have memorized how to do the problem and never really applied it. During problem solving we discuss what I call the "PAUSE method": every time you see a number, pause and write it down, and come up with a game plan - (A) given, (B) what needs to be determined, (C) which equation(s) you would use. Based on A and B, they learn when to use or not use extra information. This is helping me a lot (I tell them I could be tricking them with a question!). After instances like this they start paying attention to how to get where they need to go. Another example: while teaching stoichiometry, to strengthen their knowledge, I give moles of the ingredients as opposed to mass. They start dividing moles by molar mass, which tells me they have memorized steps. Trained with the PAUSE method, they learn that they can't blindly dive into a problem; they have to carefully plan out their steps. It is interesting to see them develop this attention to detail.
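As a sketch of the idea (the numbers are hypothetical, and the PAUSE bookkeeping is reduced to comments), a Boyle's-law problem with a distractor temperature looks like this:

```python
# Hypothetical Boyle's-law problem that includes a distractor temperature.
given = {"P1_atm": 2.0, "V1_L": 3.0, "P2_atm": 1.5, "T_C": 25.0}

# PAUSE: (A) given  -> P1, V1, P2, and a temperature
#        (B) wanted -> V2
#        (C) equation -> Boyle's law, P1*V1 = P2*V2
#            (T is constant, so the temperature is extra information)
needed = {"P1_atm", "V1_L", "P2_atm"}
distractors = set(given) - needed   # the temperature, deliberately set aside
V2_L = given["P1_atm"] * given["V1_L"] / given["P2_atm"]
print(distractors, V2_L)  # {'T_C'} 4.0
```

The student who has only memorized a calculation sequence has no slot for "data I was given but do not need"; the explicit (A)/(B)/(C) plan creates one.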


SDWoodgate's picture

I think that one of the ways we can get students beyond the idea that every stoichiometry problem is different is to get them to do the analysis, identifying knowns, unknowns, and relationships.  Distilling the words into symbols and relationships helps them realise that the same symbols and the same relationships pertain to cases where the words are different.  We should be giving them strategies to recognize the similarities.  Certainly there is memory work - there must be - but the key is getting the big picture so that a lot of things can be figured out from a relatively small amount of memorisation.  These are the types of things that should be specified in learning outcomes.  From my personal experience, writing learning outcomes (objectives) does help to focus the mind.



I like the "Pause Method".  I will try to implement it the next time I teach general chemistry.  I too add information the students don't need, to get them to sift through all the information and use only what they need to solve the question.


SDWoodgate's picture

The Pause method is a very good strategy for in-class work.  The issue is when students are alone in their room the night before the test, and their teacher is not looking over their shoulder.  This is where a good web-based system that simulates the whole process, lets them correct wrong answers, and gives them feedback as to where they went wrong comes in very handy.  What is more, based on the data, it is easily possible to see where interventions are needed across large numbers of students.  BTW I am not suggesting this as a replacement for the teacher; students need to be exposed to a variety of different learning opportunities, including face-to-face teaching in a classroom.


Dear Umesh, Jay, and Sheila,

I appreciate the thoughtful and detailed responses on the memorization issue.

Starting tomorrow, we look at two more papers. In the last paper of the conference, we will take a look at what science says about the necessity to memorize. I think you can see paper #8 now, but in response to your comments, permit me a sneak preview.

The scientists who study how the brain works say that students cannot solve problems by quantitative or any other kind of generalized reasoning strategies. Experts in a field can do that, but novice learners provably cannot, due to well-documented, easily measured limits in the working memory where the human brain solves problems.

Novice learners (undergraduates) are nearly always unable to solve problems of the complexity of first-year chemistry problems based on explicit conceptual understanding (“explaining why”). Trying to do so quickly overloads the limited slots in working memory for processing novel information.

I know that is not the answer any of us wanted to hear, but that’s the consensus of the scientific experts in the field of how the brain reasons. Our views and hopes must defer to experts in the scientific sub-discipline that studies how the brain works. The brain we have all been dealt by natural selection is not what we might wish it to be.

So how can students solve well-structured problems (those with clear right answers that can be solved by a systematic process, like the end-of-chapter gen chem textbook problems we assign)? They do so by the “fluent application of memorized facts and algorithms.” They choose the right algorithms based on their experience with the algorithms and with implicit, often unconscious, non-explicit, tacit, intuitive conceptual understanding.

What Jay is teaching with the “given, wanted, path” method is what science recommends we teach to students: After learning fundamental facts, to apply explicit, effective, and generally applicable algorithms. Several papers in this conference, including Dr. Craig’s, have presented such procedures and clear evidence that it is effective to do so.

After fundamental facts and algorithms have been “learned to automaticity,” the brain can begin to grow the connections that are the physiological substance of the brain's conceptual frameworks. But your brain cannot grow those connections until the fundamental elements of knowledge are recallable from memory (“well-memorized”), and that recall takes lots of initial practice to achieve.

So tomorrow, with lots of non-jargoned scientific references to check out, we will look in detail at the general principles that “learning scientists” have recently agreed upon that can help us to prepare our students for majors in the sciences and engineering.

Please put on your dispassionate, skeptical scientist hats and prepare some hard questions for the final papers.

-- Eric (rick) nelson