Since 2007, the reported SAT (reading + math) scores for the state of Texas have steadily fallen from a high of 999 to an all-time low of 944. Solving this problem requires a multifaceted approach. For our part as instructors of a known gateway course, general chemistry, we chose to focus on the most fundamental cross-cutting topic in STEM: arithmetic. Hence, the MUST-Know (Mathematics: Underlying Skills and Thinking) study was conceived and implemented. General chemistry is widely considered a gateway course because success in it provides entry into several STEM and some non-STEM careers. Failure to succeed in general chemistry has been linked to students' mathematics fluency, which other researchers have attributed to poor algebra skills. However, is it possible that this relationship should really be attributed to students' lack of "must-know" arithmetic skills? In Fall 2016–Spring 2017, a team of 11 chemical educators investigated the relationships between solving simple arithmetic problems and course grades for 2,127 students (60.3% female) enrolled in general chemistry I and II at six postsecondary institutions (three large public research universities, two Hispanic-Serving Institutions, and one four-year private university) from varied geographic locations in the heart of the state of Texas overlaying 32,000 square miles. The arithmetic concepts evaluated for this study are introduced to most Texas students starting at the 4th-grade level. The selected concepts include multiplication, division, fractions, scientific notation, exponential notation, logarithms, square roots, and balancing chemical equations. Results show that students, without the aid of a calculator, succeeded at the 40%-correct level (Chem I) and the 60%-correct level (Chem II). Students' algebra skills might be a better predictor of overall success, but we posit that the problem originates with a lack of automaticity and fluency in basic arithmetic skills.
Correlations between final course grades and mathematics fluency ranged from 0.2–0.5, with the Hispanic-serving classes among the weakest correlations and the research universities exhibiting the strongest. A profile of the successful general chemistry student is beginning to take shape from this continuing investigation. Future plans include implementation of High-Impact Practices (HIPs) to increase numeracy, followed by dissemination of outcomes and expansion of the study to include other needed success-producing skills such as logical thinking, spatial ability, and quantitative reasoning.
Declining numeracy in the U.S. is real and of growing concern. Curiosity for this investigation was piqued when a coauthor from the Naval Academy noticed that U.S. students were "calculator dependent" and had not received appropriate number-sense training in their K–12 studies (Hartman & Nelson, 2016). These authors offered a link to a quiz (http://bit.ly/1HyamPc) that the last author of this paper named the MUST (Math-Up Skills Test) and subsequently employed as part of a pilot study, the MUST-Know project, initiating a statewide investigation.
Introduction
Texas, we have a problem!
In general, college-ready Texas students are less prepared now than they have been over the last 30 years (Fig. 1).
Figure 1. SAT scores over 30 years (points 1–30) with demarcations indicating changes to state-adopted curriculum standards. [Note when Science Director Comer resigned: point 21 (2007).]
Some of the decline in scores is attributed to the 2010–11 academic year (AY), when the Texas Education Agency (TEA) funded free SAT exams. With lower-income students able to take the SAT, additional weaker students may have contributed to the decline. In AY 2011–2012 and thereafter, some districts began offering SAT exams during the school day, thereby increasing the number of less motivated test-takers (i.e., a get-out-of-class-free card!) and possibly attracting a population who had not completed the suggested college-prep curriculum. Related to the graph in Figure 1, separating Math SAT scores from SAT Reading + Math, a decline of 22 points occurred from 2010 to 2015 (504 to 482, respectively). However, on a positive note, there is a slight bump in AY 2012–2013, when the 4×4 curriculum was fully implemented. The Texas 4×4 program required all high school students to sit for four assessments in English, mathematics, science, and social studies, and to pass a minimum of three in each discipline in order to graduate. One can assume from this small upward movement that "when required" (i.e., when tested), students' understanding will improve. Since 2013, high-stakes testing is no longer required, and SAT scores in Texas have plummeted.
Texas Curriculum Assessments
Texas has changed the state-adopted curriculum four times over the last 30 years. Each change was accompanied by high-stakes assessments (estimated to cost about $1M each). TEA instituted a statewide testing program in 1979 for grades 3, 5, and 9. Prior to 1990, there was TABS (Texas Assessment of Basic Skills), and by 1986, TEA implemented TEAMS (Texas Educational Assessment of Minimum Skills); students who did not pass TEAMS were not eligible to receive a high school diploma, stemming from Governor White's "no pass, no play" policy. The curriculum was changed to TAAS (Texas Assessment of Academic Skills) in 1990 and then to TAKS (Texas Assessment of Knowledge and Skills) in 2003, with the latest version (2011–2012) becoming the STAAR (State of Texas Assessment of Academic Readiness) program, which was dismissed by the current governor as a requirement for graduation. Now, instead of four science assessments being required for graduation, there is only one required test in science (Biology STAAR), and poor performance no longer prevents a student from graduating. Another observation coinciding with the steady decline of SAT scores is the resignation of a dynamic TEA science director in 2007. Science Director Comer helped develop and promote the 4×4 curriculum as a strong advocate of advancing study in all sciences and recognizing the necessity of partnering with mathematics education.
Calculator Usage
Currently, Texas high school students take only one high-stakes science assessment plus STAARs in Algebra I and II. The calculator policy states that no calculators are permitted on STAARs in grades 3–7, but districts must ensure that each student has a graphing calculator to use on all STAARs starting with 8th-grade mathematics (both paper and online versions) and biology. For the biology assessment, there should be one calculator (four-function, scientific, or graphing) for every five students. Students may bring their own calculators to the assessments, but Internet capabilities must be disabled, and calculation applications on smartphones are not allowed. [There was at one time a graduation proposal that students' scores on end-of-course assessments would count as 15% of their final grade for that course, but this was rejected almost as soon as it was suggested!] Beginning in May 2018, the grade 8 science STAAR will require students to have access to calculators with four-function, scientific, or graphing capability (TEA, 2017).
MUST [Mathematics: Underlying Skills and Thinking] Know Pilot Study
Demographics: Institutions
One strength brought to this investigation of what arithmetic-fluency levels are necessary to succeed in general chemistry lies in the team's differences. Despite variations in required institutional prerequisites, class sizes, instructors, textbooks, teaching methods, information and communication technology (ICT) tools, etc., the evaluations have produced similar results, leading the team to a "value added" model that may contribute to curricular improvements.
Our research team consists of eight general chemistry instructors employed at six universities (three public research; two Hispanic-Serving Institutions (HSIs); and one four-year private) spread across 32,000 mi^{2}, about 12% of the state. All faculty team members have acquired IRB approval for this research at each institution.
Abilene Christian University (ACU) is a small private university in west Texas. The student body is ethnically diverse: of its ~4,500 full-time enrollees, 63% list Caucasian, while 37% are from underrepresented minority groups. Female students comprise 59% of the student population, and Texas residents make up 86% of the student body.
Texas A&M University–San Antonio (TSA), established in 2009, was the first Texas A&M University System institution located in a major urban center. Approximately 5,500 students are currently enrolled in both undergraduate and graduate-level classes. The Fall 2016 semester marked A&M–SA's first cohort of freshman and sophomore students. Of these students, 74% are first generation, 60% are female, and nearly 83% identify as Hispanic or Latino, qualifying A&M–SA as an HSI. Nearly 1 in 6 students is military-connected.
Texas State University (TSU), founded in 1899, is the fourth largest public university in the state of Texas and 34^{th} largest in the nation, with an enrollment of almost 40,000 students, over 34,000 of whom are classified as undergraduates. The university offers 98 bachelor's degrees, 91 master's degrees, and 13 doctoral degrees, and is in the top 6 in graduation rates among the 38 public universities in Texas. The population includes 57.9% female, 10.7% African-American, 34.7% Hispanic, and 48.1% white students, with the remaining 6.6% being Native American, Asian/Pacific Islander, or Non-Resident Alien. This HSI ranks 14^{th} in the nation for total number of bachelor's degrees awarded to Hispanic students. The reported six-year graduation rate stands at 54%, and the retention rate of returning freshmen is 77.4%.
Texas A&M University (A&M) opened its doors in 1876 as the state's first public institution of higher learning and is now its largest. TAMU is among the nation's five largest universities, with an enrollment of over 66,000 students. TAMU is one of only a few universities in the country designated a land-grant, sea-grant, and space-grant university, and is ranked by U.S. News & World Report second in the nation in the "Best Value Schools" category among public universities. Enrollment is 52% male, with 58% white, 20% Hispanic, and 22% other ethnic groups (black, Asian, international, Native American, etc.). The university offers more than 130 undergraduate degree programs, 170 master's degree programs, 93 doctoral programs, and 5 first-professional degrees as options for study. The reported six-year graduation rate for undergraduates stands at 79.5%.
The University of Texas at Austin (UTX) is a Tier One research institution, the flagship campus of The University of Texas System, and the second largest university in the state. Its enrollment of 51,000 students (40,000 undergraduates) represents all 50 states alongside 118 countries. Student demographics include 51.5% female, 43.3% white, 20.0% Hispanic, 17.8% Asian, 3.9% black, 10.1% foreign, and less than 5% other or a combination of these. UT continues to earn accolades, including Forbes' 17^{th} Best Value School and Kiplinger's #13 Best Value Public College. As one of the largest science colleges in the U.S., UT's College of Natural Sciences includes over 13,000 undergraduates. Many of these students participate in groundbreaking, nationally recognized programs such as the Freshman Research Initiative (FRI) and the Texas Interdisciplinary Program (TIP). UT currently reports a six-year graduation rate of 81.2% for undergraduates.
University of North Texas (UNT), established in 1890, is a four-year public R1 (Carnegie Classification) doctoral university with an enrollment of over 38,000 students (fifth largest in the state), 31,000+ of whom are classified as undergraduates. For 21 years in a row, UNT has been named one of America's Best College Buys®, with 16 programs (5 in STEM areas) ranked in the Top 100 by U.S. News & World Report. The reported ethnic makeup includes African-Americans (14.01%), Hispanics (22.12%), and white non-Hispanics (48.41%), with the remaining 15.46% being Native American/Alaskan, Asian and Pacific Islander, or Non-Resident Alien. The reported six-year graduation rate for the 2008 UNT undergraduate cohort stands at 59.1%.
Demographics: Students
The research team investigated relationships between solving arithmetic problems appropriate for success in general chemistry and the course grades of 2,127 students. The combined student population is 60.3% female and 85.4% freshmen and sophomores enrolled in general chemistry I and II (Chem I and Chem II) and engineering chemistry courses. Gathering data on the ethnicities within these classes proved problematic at different institutions given various IRB inclinations. However, we assume that the combined students' ethnicities mirror those of Texas given the wide geographic area involved.
Texas Student Profile 2015–2016 (2017 Texas Public Higher Education Almanac)
MUST Instrument: Statistically valid and reliable
The instrument chosen to assess the arithmetic skills of general chemistry students in the pilot study was published in a report by Hartman and Nelson (2016). This instrument, named the MUST (Math-Up Skills Test), contains a total of 16 items and has two versions. Both versions of the MUST were validated by two UNT mathematics professors. The MUSTs were shown to be highly reliable (KR-21 = 0.821), and no statistical differences between versions were found. The two mathematics professors noted that the concepts covered by this instrument are not taught at the college level because they have already been taught and assessed prior to postsecondary matriculation.
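A KR-21 reliability such as the 0.821 cited above can be estimated from total test scores alone. The sketch below is our own minimal implementation, not code from the study; it assumes the sample-variance convention, and the toy score list is invented for illustration:

```python
def kr21(total_scores, n_items):
    """Kuder-Richardson formula 21 reliability estimate, computed from
    total test scores on a test of n_items items worth one point each."""
    k = n_items
    n = len(total_scores)
    m = sum(total_scores) / n                                # mean total score
    var = sum((x - m) ** 2 for x in total_scores) / (n - 1)  # sample variance
    return (k / (k - 1)) * (1 - m * (k - m) / (k * var))

# Hypothetical totals on a 16-item test (illustrative, not study data)
r = kr21([4, 8, 12, 16, 0, 8], n_items=16)
```

Values near 1 indicate that students' total scores are internally consistent; the study's 0.821 falls in the range usually considered highly reliable.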
The MUST was given to students (n = 2,127) face-to-face during class, first without the use of a calculator (time limit of 12 min) and then with a calculator (time limit of 12 min). Each correct answer earned 1.0 point, and no points were awarded for an incorrect answer. Table 1 presents the grade level at which the various topics on the MUST are introduced to Texas students and the mean score for each question. The overall mean is X̄ = 7.36/16 = 46.0%.
Table 1. MUST questions (without use of calculator, 1.0 point each)
Question  Topic                                           Level Introduced (typical grade)  Average
1         multiplication of two, two-digit numbers        4th grade                         0.66
2         exponential notation multiplication             algebra I (8th or 9th grades)     0.52
3         exponential notation multiplication             algebra I (8th or 9th grades)     0.54
4a        division                                        6th grade                         0.46
4b        number raised to zero power                     algebra I (8th or 9th grades)     0.69
5         exponential notation division                   algebra I (8th or 9th grades)     0.24
6         exponential notation division                   algebra I (8th or 9th grades)     0.32
7         convert fraction to decimal                     6th grade                         0.60
8         convert fraction to decimal                     6th grade                         0.68
9         solve for an unknown variable                   algebra I (8th or 9th grades)     0.46
10a       determine base-10 logarithm                     algebra II (10th or 11th grades)  0.23
10b       determine base-10 logarithm                     algebra II (10th or 11th grades)  0.17
11        number in exponential notation squared          algebra I (8th or 9th grades)     0.38
12        square root of number in exponential notation   algebra I (8th or 9th grades)     0.30
13        balancing chemical equation                     chemistry (10th or 11th grades)   0.60
14        balancing chemical equation                     chemistry (10th or 11th grades)   0.49
As can be seen in Table 1, some of the topics are introduced as early as 4th grade, and all have been presented to students prior to high school completion. The last two questions cover balancing equations, technically an exercise in counting, but taught in a course not required of all high school graduates. Raising an integer to the zero power appears to be the best understood concept, and base-10 logarithms the least understood. A challenge to teaching general chemistry is presented when only 66% of the students assessed can correctly multiply two two-digit numbers (like 87 × 69).
Results
Data without student identifiers from each institution were sent to the research team leader (last author) for compilation. The data analyses to date include descriptive statistics, measures of reliability, correlations, and ttests. As the database grows and the study continues, more statistical evaluations are planned such as Spearman rho correlations and ANOVAs to compare relationships between groups.
Combined data (n = 2,127) from this pilot study were evaluated, then separated by course, institution, and semester, and reevaluated. Some team members presented the MUST with demographic information and IRB consent forms on different days, some gave the MUST without a calculator and with a calculator on different days, and some students did not answer all the demographic questions requested, reducing the population with complete data sets to n = 1,415, or 66.5% of the whole. However, for the purpose of this report, the larger population will be referenced most of the time.
One of the first observations made was how the scores on the MUST followed the same pattern across multiple classes at various institutions (Fig. 2). It is not that students at the various universities scored the same; rather, the rise and fall of the mean for each question, regardless of class (Chem I, II, Engineering), institution, or semester (fall, spring), appears to illustrate the same trends. The majority of these students were educated in Texas secondary schools, so it appears that many have garnered similar understandings.
Figure 2. Pattern produced by MUST scores across multiple settings. Y-axis: point value of 1.0 per question. X-axis: MUST question numbers.
By Student Success
The percentage of successful (grades of A–C) Chem I students (n = 482) is 66.1%; Chem II (n = 901), 79.9%; and Engineering Chem (n = 32), 68.8%. However, some of the successful students in these courses did poorly on the MUST, and vice versa. The percentage of successful Chem I students with a MUST score below the mean (i.e., MUST scores of 0–4) is 153/319 = 48.0%. The percentage of unsuccessful (grades of D–F) Chem I students with a MUST score below the mean is 119/163 = 73.0%, highlighting that a higher percentage of Chem I students with low MUST scores are unsuccessful in the course. The percentage of successful Chem II students with a MUST score below the mean (i.e., MUST scores of 0–8) is 280/720 = 38.9%, and the percentage of unsuccessful Chem II students with a MUST score below the mean is 145/181 = 80.1%. Yes, students can be successful with low MUST scores, and an above-average MUST score does not guarantee success, but the odds of success are better for a student with adequate arithmetic skills. In Chem II, the association is even more pronounced: over 80% of unsuccessful students entered with below-average MUST scores.
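Conditional percentages of this kind can be recomputed from raw (MUST score, successful) records. The sketch below is our own illustration; the function name and the toy records are invented, not the study's data:

```python
def success_rate_by_must(records, threshold):
    """Split (must_score, successful) records at `threshold` and return
    the success rate (fraction earning A-C) of each group."""
    low = [ok for score, ok in records if score <= threshold]
    high = [ok for score, ok in records if score > threshold]

    def rate(group):
        return sum(group) / len(group) if group else float("nan")

    return rate(low), rate(high)

# Toy records: (MUST score out of 16, passed the course with A-C?)
records = [(2, False), (3, False), (4, True), (9, True), (12, True), (14, True)]
low_rate, high_rate = success_rate_by_must(records, threshold=4)
```

Comparing `low_rate` with `high_rate` is exactly the kind of contrast drawn above for the Chem I and Chem II populations.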
By Course
As reported in Table 2, students without the aid of a calculator and with complete data sets (n = 1,415) scored at less than the 30%-correct level in Chem I (4.53/16) and slightly above the 50%-correct level in Chem II (8.38/16), with the engineering class's MUST score falling between (7.63/16). With the use of a calculator, students performed better, scoring approximately 70% correct in Chem I and 80% in Chem II. However, the correlation with course grades was higher without a calculator than with one: r = 0.451 (Fig. 3) versus r = 0.402, respectively. Even though the correlations are modest, the MUST was shown to be a consistent predictor of success despite variations between the classes; the combined-data relationship between MUST scores and course grades appears to be linear (Figs. 3 & 4).
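A correlation such as r = 0.451 is a Pearson coefficient between each student's MUST score and final course grade. A minimal self-contained sketch (variable names and toy data are ours, not the study's):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))   # unscaled covariance
    sx = sum((a - mx) ** 2 for a in x) ** 0.5              # unscaled std devs
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Toy data: MUST scores vs. course grades (illustrative only)
must = [2, 5, 8, 11, 14]
grades = [58, 70, 74, 85, 92]
r = pearson_r(must, grades)
```

The scaling factors cancel, so r is unchanged whether MUST is expressed as raw points or percent correct.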
Table 2. MUST without calculators by class
Class                  Number of students (n = 1,415)  MUST mean (SD), max = 16
Chemistry I            482                             4.53 (3.33)
Chemistry II           901                             8.38 (4.41)
Engineering Chemistry  32                              7.63 (3.62)

Course Grade     Chem I MUST mean (SD)  Chem II MUST mean (SD)  Engineering MUST mean (SD)
F: 0–59.4%       2.94 (2.50)            3.97 (3.57)             5.75 (5.50)
D: 59.5–69.4%    3.57 (2.95)            5.84 (3.78)             8.33 (4.46)
C: 69.5–79.4%    4.34 (2.97)            7.49 (4.14)             7.62 (3.69)
B: 79.5–89.4%    4.97 (3.46)            9.60 (3.77)             7.75 (2.05)
A: 89.5–100.0+%  6.73 (3.51)            10.71 (3.83)            10.00 (n/a)
Figure 3. Relationship between MUST (without calculator) and course grade. (Slope: m = 1.73)
Figure 4. Relationship between MUST (with calculator) and course grade. (Slope: m = 1.51)
Graphical representations of the data supporting Table 2 show a stronger linear relationship of course grades to the MUST without a calculator than to the MUST with a calculator, especially in Chem I (Fig. 5). Students who used a calculator show greater variance in success in Chem I, as noted by the percentage of students who did well on the MUST but not so well in the subsequent course. In Chem II (Fig. 6), both without and with a calculator, there appears to be less of a difference when compared to course grades.
Figure 5. CHEM I: MUST scores without and with the use of a calculator vs. grade.
Figure 6. CHEM II: MUST scores without and with the use of a calculator vs. grade.
By Institution and Semester
Table 3 separates the data by institution. The research universities in ranked order are UT Austin, A&M, and UNT; their spring-semester Chem II MUST scores are 11.41, 10.73, and 5.38, respectively. ACU, a private university in Abilene, performed very well on the MUST, and the two HSIs (TSU and TSA) reported the lowest MUST scores in both the fall and spring courses.
Table 3. Data without calculators by institution (n = 2,127)
Fall          n    MUST         Course
A&M           405  8.26         80.60
ACU           106  8.29         80.94
TSA           29   3.97         70.86
TSU           171  4.81         72.93
UNT           273  6.96         75.44
Average (SD)  984  7.18 (4.12)  78.62 (12.42)

Spring        n      MUST         Course
A&M           428    10.73        82.65
ACU           30     8.13         82.43
TSA           17     2.47         77.26
TSU           270    3.62         72.53
UNT           300    5.38         70.96
UTX           98     11.41        83.65
Average (SD)  1,143  7.51 (4.47)  77.19 (16.35)
By Gender and Classification
When the general chemistry data, separated by semester, were evaluated (Table 4), males outperformed females on the MUST without the use of a calculator (p < .05) in the fall, but not in the spring. As to course grades, no statistical difference was evident in the fall course averages, but in the spring, females statistically outperformed males.
Table 4. Combined data by gender (without calculator)
Fall     n    MUST*  Course
Males    402  8.18   77.4
Females  582  6.49   77.7

Spring 2017  n    MUST   Course**
Males        442  7.36   75.9
Females      701  7.75   78.0
*p < .05 (males outperformed females on MUST in fall 2016 without a calculator; no difference in spring)
**p < .05 (females outperformed males in course averages without statistical difference on MUST)
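The p < .05 comparisons above come from two-sample t-tests. The sketch below uses Welch's unequal-variance form, an assumption on our part since the study does not specify the variant, and the function name is our own:

```python
import math

def welch_t(x, y):
    """Welch's two-sample t statistic and degrees of freedom
    (does not assume equal group variances)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((a - mx) ** 2 for a in x) / (nx - 1)  # sample variances
    vy = sum((b - my) ** 2 for b in y) / (ny - 1)
    se2 = vx / nx + vy / ny                        # squared standard error
    t = (mx - my) / math.sqrt(se2)
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    return t, df
```

The resulting t and df would then be compared against the t distribution (e.g., via a statistics package) to obtain the p-value.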
No statistical differences were discovered between the various classifications (Table 5). Of interest, however, is that freshmen brought the highest MUST and course averages, while students classified as juniors had both the lowest MUST and the lowest course averages.
Table 5. Combined data by classification (n = 2,127)
Classification  n (%)          MUST (SD)    Course Average (SD)
Freshman        1,197 (56.3%)  7.87 (4.30)  79.33 (13.96)
Sophomore       620 (29.1%)    7.08 (4.22)  76.94 (15.03)
Junior          238 (11.2%)    5.87 (4.03)  70.14 (18.29)
Senior          72 (3.4%)      6.07 (4.68)  72.51 (18.56)
Successful and unsuccessful male MUST scores were statistically higher (p < .05) than those of the females (Table 6). Course averages presented no statistical differences for successful or unsuccessful students of either gender. A slightly greater percentage of females were successful in the classes on average than males, even though they entered with lower MUST scores. It is possible that this observation is due to the nature of the general chemistry curriculum: algorithmic assessments are paired with conceptual-understanding assessments, and females improve their overall grades because final grades are not solely based on mathematics fluency.
Table 6. Successful female and male students (without calculator)

Group                 n (%)        MUST Average (SD)*  Course Average (SD)
Successful Females    989 (77.1%)  7.76 (4.00)         84.17 (8.39)
Unsuccessful Females  294 (22.9%)  4.28 (3.36)         56.70 (12.85)
Total Females         1,283        6.96 (4.13)         77.87 (15.01)
Successful Males      618 (73.2%)  8.94 (4.33)         83.88 (8.37)
Unsuccessful Males    226 (26.8%)  5.25 (3.87)         56.72 (13.38)
Total Males           844          7.95 (4.51)         76.60 (15.61)
*p < .05 (Total males outperformed total females on MUST)
Conclusions
Based on pilot study data, the average successful (course grades of A–C) general chemistry student has the following profile: Chem I MUST score ≥ 32% correct and Chem II MUST score ≥ 58% correct. The team is in the process of building a more definable profile of what it takes to be a successful general chemistry student. In fall 2017, an algebra-skills assessment is being added to the investigation, and the MUST (with calculator) is being eliminated. The demographic section has been expanded to give us more information about students' experiences so that those in danger of not succeeding may receive informed advising and, we hope, avoid some of the noted attrition, thereby growing our understanding of the successful general chemistry student.
References
2017 Texas Public Higher Education Almanac.
Chem 13 News (September 2012). Why students fail in college.
De Vega, C. A., & McAnally-Salas, L. (2011). US-China Education Review, 1, 10.
Hartman, J., & Nelson, E. A. (2016). Automaticity in Computation and Student Success in Introductory Physical Science Courses. arXiv:1608.05006v2 [physics.ed-ph]. Paper presented as part of Chemistry & Cognition: Support for Cognitive-Based First-Year Chemistry.
Technical Digest 2008–2009, Chapter 1: Historical Overview of Assessment in Texas (pp. 1–8).
Texas Education Agency (TEA) Student Assessment Division. (2017). Updated calculator policy. http://tea.texas.gov/student.assessment/staar/
Texas Essential Knowledge and Skills (TEKS) Home Page. http://www.tea.state.tx.us/index2.aspx?id=6148 (accessed September 2017).
Texas CCRS. Website of the Texas Higher Education Coordinating Board, Texas College Readiness Standards. http://www.thecb.state.tx.us/collegereadiness/CRS.pdf (accessed Sep 2017).
Tai, R. H.; Ward, R. B.; Sadler, P. M. (2006). High School Chemistry Content Background of Introductory College Chemistry Students and Its Association with College Chemistry Grades. Journal of Chemical Education, 83, 1703–1711. DOI: 10.1021/ed083p1703
Zeegers, P. L. M. (2001). A Learningtolearn Program in a Firstyear Chemistry Class. Higher Education Research Development, 20, 35–52. DOI: 10.1080/07924360120043630
Supplemental Information for Discussion
For the past 25 years, academic statistics on college readiness have remained relatively constant (Tai, Ward, & Sadler, 2006; Chem 13 News, 1986 and 2012). On average, students take six years to complete a four-year college degree, and 30–60% of these students will require remedial coursework upon entering college (Tai, Ward, & Sadler, 2006). A more disturbing statistic is that roughly 30% of incoming first-year students consider terminating their academic studies entirely (Zeegers, 2001). Students face many challenges as they progress from secondary to postsecondary education. The failure of many freshmen comes from their inability to manage time, plan study time, handle a heavier reading load, work without reminders of tests and homework, balance work and play, and seek out help on their own (De Vega & McAnally-Salas, 2011).
College-Ready Students
"Why students fail in college" was published in Chem 13 News (September 2012), but originally appeared in the October/November 1986 issue of Chem 13 News, pages 10–11.
Which of the following remain true today?
1. Unprepared (or underprepared) to assume responsibility for their own learning.
2. Time management skills are lacking.
3. Lack of self-discipline needed to study effectively.
4. Do not understand whether or not they comprehend the material needed.
5. Lack of skills to find needed information or to separate out misleading or irrelevant information.
6. Difficulty in synthesizing information from several sources.
7. Failure to complete (and sometimes even begin) assignments.
8. Failure to interpret tables, diagrams, graphs, mathematical expressions, and specialized languages such as chemical equations.
9. Poor communication skills especially when attempting to express their own ideas.
10. Lack of originality needed to synthesize subject matter and draw conclusions.
11. Writing is often poorly organized, grammatically incorrect and riddled with contradictions.
12. Inability to evaluate facts, directions, or other information.
13. Lack of flexibility when faced with a poor instructional environment; unable to acquire useful knowledge on their own.
14. Meaning of memorized words remains unclear.
15. Failure to understand the logic behind algorithms or rules.
16. Proportional reasoning is lacking at the level required to understand most chemistry concepts and computations.
17. The logic inherent in mathematical and chemical language, spatial reasoning, mental constructs and how to think about chemical changes are at best in an immature stage.
18. Inability to retrieve information "taught" from long-term memory.
Transition from high school to college
Coursework in high school chemistry and in general chemistry is aligned (Texas College and Career Readiness Standards, 2009). Future employers expect diligence, persistence, reliability, problem-solving skills, and logical thinking, all of which can be promoted in chemistry courses. What is not in place is a way to help students adjust to a set of new expectations, but we can help here, too! Maybe when implementing your latest high-impact practices (HIPs), consider the following:
(1) High school expectations are mainly effort-based:
• students who make below 70% on a test can retake it
• make-up work is required when there is an excused absence
• extra credit is routinely available
• attendance is required
(2) Postsecondary expectations are performance-based.
Comments
Calculators: Negative Impact on Placement Tests?
Dear Texas Team –
Thank you for collecting so much data from such a wide sample of students. To start, let me ask about one graph in your paper.
A likely majority of US chem departments (the list includes Yale) currently use a placement test to steer students into either regular gen chem or an alternate course that provides extra help prior to or during gen chem.
If I understand your Figure 5 (with the red and blue bars) correctly, for your sample of over 400 first-semester gen chem students across six different universities:
• The test of simple math without a calculator, on average, was a good predictor of gen chem success. That agrees with other studies and the predictions of cognitive science.
• On the test of simple math allowing a calculator, the students who on average scored highest were those who scored lowest without a calculator, and ended with the lowest grades for the semester. I think that both correlations are unexpected, and “unexpected” piques interest.
• If your diverse sample is close to typical of US students seeking to enroll in gen chem, and if a placement test for chem in US schools includes math with calculators permitted, won’t the students who most need extra help tend to be the least likely to be scheduled to get it?
Even if “math with a calculator” would be found to be not correlated, rather than negatively correlated, with Chem I success, I suspect that would be something that chemistry departments might want to test in their own populations.
Is that a proper interpretation of your data? Are there factors in the testing that would undermine those conclusions for your sample?
If bullet 2 is a correct statement, what would be the speculation of the authors on the possible reasons? Why might simple calculations with a calculator negatively correlate with either calculations without a calculator or semester grades?
 Eric (rick) Nelson
Your Texas study
Diana,
This is a great study—very thorough and comprehensive. It provides information that reflects several smaller studies I have seen over the years. Kudos to you and your team! Now I am going to ask the “cosmic” questions:
I know you cannot speak to politics here, but I might assume from your Figure 1 data that your respected Comer resigned in discouragement in the midst of some improvement followed by a decline. This suggests what I’ve seen elsewhere: a dynamic public administrator being stymied in his efforts toward academic improvement and subsequently giving up. We simply cannot afford to throw away laudatory efforts.
Thank you!
The Texas Study
First, to everyone: Happy Mole Day!
Cary,
Thank you and Rick for bringing this very important forum to the forefront of discussion. The Texas SAT results are very troubling, and it is time that we do something. We can no longer just have the intradepartmental discussion in the coffee room. Our study across six universities in Texas documents that we have a problem. By presenting these data to ConfChem, maybe we will inspire others to do a similar study in their state. Texas has been below the SAT national mean for the past 30 years, but there are other states that have been above it; we need to hear from them. The prior knowledge that students bring with them to first-semester gen chem is important.
We have been in touch with the Assistant Commissioner of the Texas Higher Education Coordinating Board about our concerns. He noted at our F2F meeting that his data were similar and that he had also noticed the slight upward bump when the 4x4 was in place. Our students are very capable of learning, but the necessary classes have to be required for graduation. Yes, Texas is a very political state. Our latest House Bill (HB 2223) puts the onus on the community colleges that offer the developmental courses necessary for gateway courses. It basically states that we do not want the underprepared student to have to complete extra coursework (i.e., pay for extra classes). This HB wants the instructors teaching the gateway courses to incorporate the needed remedial skills into their courses. It went into effect Sept 1, 2017. No word on who is doing what.
Future plans are to add a new aspect to our study each year. This year we are adding algebra. Next year we might add spatial ability, logical thinking, etc. It is up to the group. We will hopefully uncover an academic issue, but if it is life issues, like having to work, then maybe we will gain an insight there, too.
If anyone else out there has insights to share and/or is doing similar studies, please let us know. This could be the start of a national movement.
Thanks!
STAAR data
Dear everyone
I just found this information tonight. It really supports the disconnect between what is happening at the precollege level in Texas and the SAT scores Texas students are receiving. The STAAR test is the "high-stakes" test students take to graduate from high school. I'm not sure what to say beyond the fact that Texas students are living up to expectations. If it is taught, it will be learned, and if "we" don't have a lot to learn, all the better!
Calculator Use
Rick,
You have a good eye; when I added Figure 5 to the paper, I saw the low MUST scores with a grade of F and higher MUST scores as grades improved when calculators were not used. I did not see a correlation when the calculators were used; I saw the negative trend, but thought it would be a very low negative and probably not worth looking into further. However, you actually did the correlations for us and showed that it was a relatively strong negative correlation! WOW! Now, what will be interesting is whether we see the same correlation this semester. I don't think we need to do more than acknowledge its existence at the moment, but if we see the same issue with this semester's data, then we have something that really needs to be analyzed more thoroughly.
Thank you very much!
Calculator Use
Our observation of a similar trend is what got us started: it appears that as calculator dependency increases, grades decrease. This suggested to us that it may not actually help students for instructors to offload calculations when doing chemistry problems. That is, it seems Texas was trying to help its students focus on the chemistry by automating calculations (via calculator usage). Rick, we agree that more investigation is needed to determine whether the problem is the calculator dependence or something else.
In practice, I observe that when they have the calculator in hand while doing chemistry problems, a lot of students back-calculate from every answer to determine which answer is correct.
My experience
First off, hi Dr. Mason and Cynthia! I have had similar experiences in my courses; my students are lacking basic math skills, especially in algebra. I get frustrated because I do not have the time to teach both the chemistry and the math! Do you think all colleges/universities should have students take a math placement exam, with minimum scores required to enter math-based classes? If they “fail,” they have to take a math remediation course. I have seen this at 2-year colleges, which makes me wonder if they see some of the same problems.
Remediation
Hi, Carissa!
I am not surprised to hear you state what you just did. Your students are (for the most part) also products of the Texas system. We have a problem, and your observations just add more fuel to the fire. Hopefully, we can hear from a few other states this week to see if these issues are pervasive.
This academic year we are adding the DAT to our instruments (Diagnostic Algebra Test by Cooper & Pearson, 2012). This semester, on both the MUST (mentioned in the ConfChem paper) and the DAT, we are not allowing students to use a calculator, because our MUST data showed that without the use of a calculator the MUST scores were more correlated with grades than when calculators were used. In Texas, students start using calculators as early as 5th grade. Is this hurting their number sense? Only 66% could correctly multiply 87 x 96 without a calculator. I'm sure at one time in their lives they knew what 7 x 6 was, but "if you do not use it, you lose it" comes to mind. From HB 2223, it appears that those of us who teach gateway courses might need to also teach the remedial skills that are lacking. Are the powers-that-be going to give us more time in the classroom with these students and pay us accordingly? Are students going to have to sign up (and pay) for a 5-hour gen chem course that will not transfer as such to another university? I am not sure that the sponsors of HB 2223 actually thought the whole thing through!
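For what it's worth, the 87 x 96 item is also a nice estimation exercise: round each factor to a convenient number, multiply mentally, and check how close you land. A quick sketch of that check (my own illustration, not part of the MUST):

```python
# Mental-math estimate of 87 x 96: round each factor, multiply, compare.
exact = 87 * 96        # the product students were asked for
estimate = 90 * 100    # rounded factors, easy to multiply in your head
rel_error = abs(estimate - exact) / exact
print(exact, estimate, f"{rel_error:.0%}")  # the estimate lands within ~8%
```

An estimate that close is enough to catch most calculator-entry slips, which is exactly the number sense at stake.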
As soon as we can find funding, we want to hold a statewide workshop for general chemistry instructors and see what we can do to help solve this very troubling problem. Any ideas?
Statewide workshop
I like your idea of having a statewide workshop for increasing math skills in general chemistry. It will be very helpful for all the instructors. I taught in Oklahoma before, and there were so many students struggling with chemistry. Could you please send me your contact information so we can keep in touch and see about the possibility of having this workshop? My email is erandi.mayadunne@lonestar.edu.
Workshop
Texas STEM BRANCHES Workshop
Texas Science, Technology, Engineering, and Mathematics for Boosting Retention and Numerical Competency for Higher Education Success Workshop
Here's my idea for the workshop. Two locations: Denton (for the north half of the state) and San Antonio (for the south half), held between the fall and spring semesters.
The aim of this workshop is to build a learning community of interested Texas chemistry instructors to improve the numerical competence of entering general chemistry students and, in turn, these STEM students' retention and success rates.
Correlating performance to subject matter/content
Hi All,
We often hear of studies correlating math abilities to success in general chemistry, and my worry (which has grown over the years) is that this is generally true because, over the years, we have steadily migrated general chemistry into a math course instead of a science course.
I was wondering if your team has data on student MUST scores as a function of the content or chapter being covered in general chemistry. I was hoping to evaluate whether students who do poorly on quizzes dealing with, say, unit conversion (heavy math) are also the students who do poorly on quizzes dealing with atomic models (mostly no math). Or maybe you could analyze general abilities on concept-type questions versus math/algorithm-type questions in the same chapter (say, on stoichiometry, where knowledge of chemistry concepts is evaluated alongside math protocols).
Thank you for your time.
Greg
Correlating performance to subject matter/content
Greg,
Part of what we are doing this semester is repeating the MUST, adding the DAT, and giving common paired algorithmic and conceptual questions as part of each exam. We might not be able to pinpoint the MUST results, but this study might allow us to track MUST scores as correlated to these paired questions. It will be interesting to see if the algorithmic or conceptual question causes students the greater concern.
What do you think about our plan?
Conceptual vs Algorithmic Questions
I would be very interested to see your data/results, and in particular whether an assessment can be developed that corrects for the inertia of teaching (and, consequently, learning) that is focused on algorithms.
I taught at UT Pan American (now UT Rio Grande Valley) for over 12 years, and my experience there indicated that students will generally do better on algorithm-based questions because that's what they've been trained to learn; that's what they've been taught to do in high school. The thought that chemistry could be learned as a series of mathematical protocols was so pervasive and ingrained that many high school and college instructors could not tell the difference between science and applied science. I was on the statewide committee developing the testing framework for what would become the TAKS, and I still, painfully, remember the heated debate on whether dimensional analysis was an integral part of chemistry or not, and whether it should be a part of the testing framework or not.
Might I suggest that you develop a parallel study focused on 4th graders? A small study I did indicated that students at this level have not yet been shaped to think of chemistry as math. Students at this level, given a set of interactive animations, give evidence of learning through inquiry and can actually develop concepts and models that are the foundation of chemistry as science. Not so much for high schoolers. It would be interesting to see at what point in students' development the chemistry-as-math mindset kicks in. This might guide us in developing appropriate interventions.
On Concepts and Algorithms
Greg –
In your question/comment, are you saying that general chemistry has too many quantitative topics, or are you saying that quantitative topics such as stoichiometry, pH, gas laws, equilibrium constants, and thermodynamics should be solved by procedures that do not involve mathematical procedures (algorithms)? Or are you saying both of those positions are true?
 Eric (rick) Nelson
Science vs Applied Science
Rick 
It's not that there are too many quantitative topics; quantifying what we understand or using data to support our theories must be a part of any natural or physical science. However, as pointed out by Nurrenbern and Nakhleh as far back as the 1980s, the math focus in gen chem has gotten us to the point where we have students who tend to be algorithm thinkers. More damning is the suggestion that just because a student can do intricate math calculations on a chemistry topic does not necessarily mean that he or she understands the underlying chemistry concepts.
Here's a f'rinstance: I gave my students a picture showing a beaker with 60 mL of a 10 M sugar solution and an inset "magnification" of what we imagine the solution might look like at the particulate level, and then asked what the magnified section would look like if a 20 mL portion of the original solution were obtained and similarly magnified. ... These are students who can do intricate dilution problems, who can do stoichiometry involving molarity with relative comfort. ... It is distressing how many students will choose a magnified picture where the sugar-to-water particle ratio has been reduced to 1/3 of the original image.
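The arithmetic behind the correct answer is worth making explicit: concentration is intensive, so the 20 mL portion is still 10 M, and only the amount of sugar changes, not the particle ratio. A minimal sketch using the numbers from the example above:

```python
# Sanity check for the dilution misconception: taking a portion of a
# solution changes the amount of sugar present, not the concentration.
molarity = 10            # mol/L, equivalently 10 mmol per mL
v_original_ml = 60       # mL in the beaker
v_portion_ml = 20        # mL removed

mmol_original = molarity * v_original_ml   # total sugar in the beaker
mmol_portion = molarity * v_portion_ml     # sugar in the 20 mL portion
molarity_portion = mmol_portion / v_portion_ml

print(mmol_original, mmol_portion, molarity_portion)  # 600 200 10.0
```

Students who pick the 1/3-ratio picture are, in effect, dividing the concentration by three when only the total amount has been divided by three.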
I think the math-in-chemistry issue is suggestive of a much larger issue ... but we can talk about that when the forum gets opened up on Thursday to the issues that the papers bring out. Suffice it to say that I'm all for reducing the math, if it means we get to go back to discussing science instead of problem solving, if it means we can go back to teaching students about concepts and models, and imparting a vision of science that is its own raison d'être.
I agree! Conceptual AND Algorithmic Capabilities needed!
From my perspective, good number sense and algebra skills allow students to tackle new Chemistry (and Mathematics) concepts. It's like learning how to walk before you can dance! When students do not understand the magnitude of numbers (in scientific notation or expressed as logs, for example), it is difficult for them to get their minds around the ideas that are central to Chemistry. When they are using up all of their working memory trying to remember NEW algorithmic approaches to problem solving taught in a Chemistry course, they can be blind to what is actually happening in a chemical system. But if they have practiced problem solving in a variety of settings often enough, and know that to solve for two unknowns you need two equations, or know the rules for manipulating logs, then they are free to use these tools and apply them in new settings, without the math process being a distraction. One of the topics where I see this challenge most often is in teaching about equilibrium and buffers! It's such an interesting topic, but students get so bogged down in trying to get the "math right" that they miss the beauty of what's happening in the system! For students who already understand the math well, this topic is a great deal of fun and sometimes the first window into the fascination of solution chemistry!
Math Skills and Concepts
Dr. Powell,
It is nice to see such a clear summary of what science says about the limitations of working memory, the implications for chem instruction, and ways that cognitive science says we can help students work around working memory limits.
Are there special strategies that you or your department use to help students “automate” their math skills before they need them for topics in chemistry?
 Eric (rick) Nelson
Math review...
This year we have ramped up our basic math review in response to what we saw on the assessments we gave during the pilot phase of this project. The students weren't thrilled, but we spent almost an hour of lab for the first several weeks doing routine math review worksheets WITHOUT CALCULATORS...to help them master some basic skills. We have also assigned math review problems at the beginning of each chapter as a homework assignment due BEFORE we start the chapter. These assignments are pure math-skill practice related to the types of mathematical manipulations they will use during the chapter. We hope this separates the "math skills" from the "chemistry content"....and so far we feel we've seen improvement. We'll see when the final stats are in on student performance, and at the end of the year on the ACS standardized exam we give each year!
Dedicated Math Review InClass
Rick, I have a specialized class with extra time per week to cover chemistry basics, study strategies, and math review. So I have created inclass examples and practice to build number sense.
I agree! Conceptual AND Algorithmic Capabilities needed!
Over the past few decades of working mostly in the private sector (engineering) and teaching (3 yrs HS, 1 yr college, 1 yr university physics), I conclude that students NEED an intuitive and reflexive grasp of math, so that they can be freed to focus on concepts as they progress. Chem, physics, bioengineering, welding (yep)...math is pervasive. Though my experiences are largely in nuclear and conventional engineering, I spend time with folks in different professions, and the symptoms are the same everywhere.

Do we really need more research projects while our students lag so far behind? Isn't it as clear to academia as it is to the business sector that students are dismal (generally...there are many exceptions) at math and math tools, and that their poor intuitive grasp of math fundamentals greatly handicaps them as they try to advance in their fields? Isn't that what these two papers show? We were //drilled// on basic math in grade school in the 1960s.

I emphasize in my classes, and with the engineers I supervise, that being able to clearly express a problem and potential solutions is more important than the calculation. In real life, really important calculations are independently verified anyway. I can find students who can tell me in precise expressions what is happening in a physics problem, but seldom can I find one who can tell me what is actually happening.

In the business sector we are expected to identify a problem, analyze it, find a solution, and implement it. Here we have clearly identified the problem, and we have lots of evidence that we have the solution at hand. University officials need to have the courage to stand up and say to incoming students that they are not prepared for freshman-level work, and that they must either expect an additional workload during the year, //which they must do well in to progress to later courses//, or spend part of the first year in remedial studies, slowing their progress through a 4-year program.
In 1975, a dean of a major university said to me that courses were being watered down because students could not do the entry-level work and needed the remedial work. Over the next decade, I saw applicants (with 3.5 GPAs) for entry-level engineering positions failing interviews at an alarming rate. One large company (20 000 employees) asked its professional staff to get active in local high schools and to consider "adopting" students and following them through their bachelor's degrees, encouraging them, guiding them, and so on. Why? Because in 1985 they could not find entry-level professionals to fill entry positions as the company's employees advanced and left holes at the bottom of the ladder. Universities need to send a clear message to high schools that their graduates need to be prepared to do the work OR be prepared to delay their academic program, at their own expense of time and money. Students' time is precious, and it is wasted when they are not prepared. THEY don't know any better. WE do. Observing my own career highlights and those of others, from machinists to space plasma physicists, I conclude that real progress happens only when fundamental building blocks are available at the level of intuition and reflex.
The use of ALEKS, et al.
Your discussion with Diana (as well as points others have broached) brings up the following thoughts.
My question is always about the balance of measurement vs. response.
We use ALEKS as a summer boot camp and get very good results. The boot camp is 2 weeks, 4 hours per day, with a faculty member to give a bump when needed. The real issue is getting the students to take the summer prep (and whether we have a self-selection issue in our results).
I am at the University of the Incarnate Word in San Antonio. The demographics of our students will be pretty much the same as at Texas A&M San Antonio. I have struggled with being able to measure a predictor (such as the MUST test) and with what to do with the results. I would like to implement some type of mandatory preparation, but the practicalities seem challenging (adjusting schedules at the last minute, what is an effective intervention, etc.).
I would like to try collecting the MUST data on my 3 Spring General Chemistry sections.
John
MUST Join in!
Hi John, it would be great to have you collect MUST data in the spring! We'll be in contact after SWRM. Also, as an aside, I have had good feedback on a workbook that I have my students do during the summer (due the first week of classes): Homework Helpers: Chemistry, by Curran. The students like that it is "gentle" and step-by-step. I like that it is inexpensive and helps my students ramp up. I do this in addition to ALEKS. I recommend that my students do the workbook before ALEKS. Those that do like it and recommend it to future students.
How do students actually solve problems
One of the frustrations that I found in dealing with the mathematical abilities of my students in New York state was that their method of working problems sometimes consisted of memorizing as many examples as possible, then comparing the problems on the tests to their set of memorized examples. This meant that they would attempt to apply their various examples to a problem one after the other, until they either found something that seemed to work or else quit in frustration.
By no means did all of the students do this, but it was more common than I would have hoped. This makes it hard for an instructor to provide assistance, since you are looking to see what procedures they are having trouble with, and the answer is that they are not using any standard mathematical procedures.
The amazing thing to me was that this approach often worked well enough to get them Cs or Bs on examinations.
P.S. When I would estimate the answer to a calculation without using a calculator, a number of my students suspected that I was using witchcraft or some other black art.
Estimation witchcraft
Harry, I see the same effect as well. Some students are amazed at simple estimations or calculations. I had a GOB student yesterday who had to use the calculator to solve 500 / 100 * 8 in a dimensional analysis problem.
Be a good witch
Many times during class when doing a problem, I would say, "Well, the answer has to be close to X," and then verbalize in detail how I got that without doing the detailed math. Sometimes I would be very mean and make a student walk through a Socratic estimation procedure. I don't know how well this worked, but there were occasional looks of amazement.
Math readiness for gen chem
Here at Colorado State University, we have observed the same trends that the MUST-Know study has revealed in TX (albeit on a much smaller scale). That is, for many years we have given Gen Chem 1 students the math section of the Toledo exam in their first recitation session. We correlated grades to the math scores, finding that students who scored below 65% on the Toledo exam had a very high probability of earning grades of D or F (or withdrawing from the course).
In response to this observation, we created "Chem Prep" as a prerequisite for Gen Chem 1. Chem Prep is similar to what the University of Texas at Austin uses to prepare students for gen chem. It uses the ALEKS homework program that the students will go on to use in the course. In our initial trial, we selected 104 topics, virtually all math and arithmetic, and students had to master at least 100 of them. We piloted the program last fall. It appears to have helped our students remember the math that they had been exposed to but had forgotten.
Of course, our Chem Prep does not address the calculator/no calculator issues but it does provide a drill opportunity for students to regain lost math skills prior to taking gen chem. The jury is still out as to its effect on student grades (although the initial data look good...)
Thanks for sharing your data!
Nancy
Is "Chem Prep" a course for
Is "Chem Prep" a course for credit? If it's a prereq for gen chem and you only give the placement at the first recitation session, how do you get students into this course? Also, is it all selfpaced using ALEKS?
Chem Prep
Chem Prep is not for credit; it is a prerequisite for Gen Chem 1. We treat it similarly to the way our math prereq works: students take a math placement exam and can place into a level that allows them to take Gen Chem (you might ask why this is not sufficient to ensure their math skills for Gen Chem; we are not sure about it, so stay tuned). The Toledo exam is something we did to collect data about our students' math readiness. We don't use it for anything except advising. With Chem Prep in place, we may drop the Toledo exam (except that it helps us see how the students are doing compared to previous years). ALEKS is self-paced.
Quick Prep for General Chemistry
Quick Prep for General Chemistry is available for both first- and second-semester general chemistry courses if you use either OWL or MindTap for homework. These are modules that can be assigned prior to the semester or made due at the end of the first week of class. The average student time is about 20 hours, but the range is wide. For gen chem I, the topics include both chem concepts and mathematics (matter, naming chemical compounds, measurement and calculations, calculations involving quantities of matter, chemical reactions, and algebra, temperature, density, graphs & logs). There have been a couple of papers on the use of Quick Prep to get students ready for the course, but not a good large-scale test with gen chem students.
Estimation Skills
Harry, you are right on. That's the reason I always ask students to make an estimation before a problem is "solved"; that is, they are asked to explain why the answer is reasonable, usually based on a rough estimation of what the answer value ought to be. This has been made explicit in the textbook that Conrad Stanitski and I wrote, Chemistry: The Molecular Science, in which every chapter has an Estimation box that shows how to estimate something related to the chapter content, and every one of the hundreds of example problems has a "Reasonable Answer Check" at the end of the solution section.
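A check of that kind can even be mechanized. Here is a minimal sketch (my own illustration, not taken from the textbook) of a helper that flags a computed answer falling outside a chosen factor of a rough estimate; the CO2 numbers below are just a convenient example:

```python
# Flag answers that disagree badly with a rough order-of-magnitude estimate.
def reasonable(answer, estimate, factor=3.0):
    """Return True if answer is within a multiplicative `factor` of estimate."""
    if estimate == 0:
        return answer == 0
    ratio = abs(answer / estimate)
    return 1.0 / factor <= ratio <= factor

# Example: molar mass of CO2 computed as 44.01 g/mol; mental estimate 12 + 2*16.
print(reasonable(44.01, 44))   # True: the answer agrees with the estimate
print(reasonable(440.1, 44))   # False: off by a factor of ten (decimal slip)
```

The factor-of-three tolerance is arbitrary; the point is that even a crude estimate catches the decimal-place and unit errors that calculators silently propagate.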
Supplements for numeracy (MUST)
Dear Texas Team,
Thank you for presenting such interesting and thorough work! It seems like everyone responding agrees that students have a difficult time with mental math, estimating, and moving away from their calculators. Do any of the authors (or others) have worksheets or supplemental materials that you provide for students to increase their numeracy?
Thank you,
Greg
The Texas project began
The Texas project began because of a paper I read whose lead author was from the Naval Academy and whose second author is our host for this conference, Rick Nelson. URL: https://arxiv.org/pdf/1608.05006.pdf
Here is the citation info:
Hartman, JudithAnn R.* and Nelson, Eric A.,† "Automaticity in Computation and Student Success in Introductory Physical Science Courses." *Department of Chemistry, United States Naval Academy, Annapolis, MD 21042 USA; †Fairfax County (VA, USA) Public Schools (retired). Corresponding author: Hartman@usna.edu
My first thought was: it's not just Texas students! BUT we need data to back up our observations and gut feelings, so why not get my ChemEd friends still employed at various universities in Texas to collect data? The arithmetic exam we used (and named) the MUST came from this publication. It has been tweaked a little for the continuation of the study this fall, but it basically covers the same topics.
Texas Project
Several of our team members are using e-learning "worksheets" to help improve students' number sense. Two universities use ALEKS (adaptive learning) at the start of the semester to try to level the playing field. Another professor has a math-prep program using Knewton, and a couple of professors are using math-prep Sapling Learning programs. AND, if I remember, we have others using OWL prep and Math Lab. We will have to wait until after the semester to see if any of these e-systems make a difference. It is interesting that, with all our differences in the Pilot Study, we had MUST results supporting the idea that students' prior knowledge is very indicative of success.
Standards to Measure Results
Dear Texas Team,
The paper mentions use of an ACS general chemistry final exam that mixes the traditional exam with the newer “conceptual focus” exam. Do all the schools in the study use those "paired question" exams at the end of each gen chem semester? Or do all of the study schools use one of the ACS exams at the end of each gen chem semester, but different versions of the exam? Or are there other practices?
 rick nelson
Paired questions
Last year (pilot study) we did not use any common questions, but this year we had a committee develop a set of 12 paired questions (6 for first semester topics and 6 for second semester topics). We are using these common questions this fall and spring, so the data from this activity will not be analyzed for another couple of months.
Estimation and other skills
Yes, the ability to estimate is a very important skill, but solving the problem means going back to the preparation students receive in their high schools (and before). And while estimation is a really important skill, not just for science classes but for life after college, there are two other skills that I believe are also important and related to a discussion of estimation skill: "precision" and "accuracy." These are skills that should be especially important in a science lab. Has anyone found students having difficulty not only calculating them, but also distinguishing between precision and accuracy?
I retired from the faculty in 2012 but maintained a presence in an NSF RET project until it ended in 2015. A couple of participating teachers brought some of their best students to participate in the research program. We were starting to find that the teachers, as well as their students, were unable to determine the accuracy or precision of the results of their experiments. As part of the professional development component of the project, we spent time "reviewing" the two terms with the teachers, with the expectation that they would take that back to their classrooms. We provided the teachers with a lesson in a spring workshop that focused on determining accuracy and precision, which they then had to bring back at the beginning of the summer program. My only regret is that we didn't think of working with the teachers on their estimation skills also.
While it is understandable that we must work with the students at the college level, we should also be giving thought to providing training for high school teachers on these skills, in a way that they can take back to their classrooms. And where there are teacher preparation programs, how many people know what is being taught to prospective science teachers in their college? Something to think about?
Now I will get off my “soapbox”.
Prep for teachers
Howard,
You bring up a very good point. Many secondary science teacher certification programs in Texas want their students to become composite-science certified so that they will be more employable—they can teach all sciences because they are certified to teach all. However, there is a limit to the number of hours one can accumulate to get a degree, so developing proficiency in any one science suffers.
My solution to this problem is to require that all secondary science teachers have an undergraduate degree in some science, with all certification at the post-bac level, giving it a more professional status. My "solution" would not cure all the issues, but what we are currently doing is not working. We have got to require more education or better professional development. Something has got to give!
Diana
Fix state standards
Teacher preparation is important, but I’d primarily look elsewhere to start solving STEM achievement issues.
In most US states between 1995 and 2014, state “math standards” required K-12 teachers to prepare students to use calculators for arithmetic beginning in third grade. Without math facts in memory, math examples after third grade often didn't make sense to those kids.
You can be the world's best-prepared high school or college chemistry educator, but if those curriculum-disabled students, the curriculum casualties, predominate in your classes, then without intervention, success will be limited at best.
The newer Common Core math standards are, on average, better, but they still do not call for teaching and learning math in ways that align with what science says about how the brain learns.
Lobbying for improvement, I’d suggest, needs to begin with state K12 math standards.
 rick nelson
Standards and Prerequisites
When I taught at the College of Central Florida, I learned that the state had once had a sort of standards system of its own, embedded in the course descriptions of the state course numbering system. The problem was that these were not enforced. In real life, standards are only set when faculty construct assessments, grade those assessments, and issue grades for the term. Standards are a great idea but depend heavily on the enforcement mechanism.

We also required math prerequisites for our courses, on the theory that math teachers might do a better job than we could, since we had so many other things to do during the term. I recall one student who was taking differential equations but could not determine the order of a single-reactant kinetics experiment from concentration and time data. There were a fair number who could not do basic algebra and said they had never been taught much about word problems.

Again, standards are only as good as their enforcement. I really do not see how there is time to do remedial math in any of the chemistry classes, particularly in our general chemistry programs, which are so content-heavy.

Best Wishes,
Richard
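For what it's worth, the skill Richard describes, finding a reaction order from reactant concentration vs. time data, amounts to checking which integrated rate law linearizes the data: [A] vs. t for zero order, ln[A] vs. t for first order, 1/[A] vs. t for second order. A rough sketch in Python (the function names and the synthetic data are mine, for illustration only):

```python
import math

def fit_r2(xs, ys):
    """Ordinary least-squares R^2 for y vs. x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

def reaction_order(times, concs):
    """Pick order 0, 1, or 2 by which transform of [A] is most linear in t."""
    transforms = {
        0: list(concs),                     # zero order: [A] linear in t
        1: [math.log(c) for c in concs],    # first order: ln[A] linear in t
        2: [1.0 / c for c in concs],        # second order: 1/[A] linear in t
    }
    return max(transforms, key=lambda k: fit_r2(times, transforms[k]))

# Synthetic first-order data: [A] = 1.0 * exp(-0.3 t)
times = [0, 1, 2, 3, 4, 5]
concs = [math.exp(-0.3 * t) for t in times]
order = reaction_order(times, concs)
```

The point of the exercise is not the code, of course, but the reasoning it encodes, which the differential-equations student could not produce.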
Can we back up for a moment?
I don't know if this is the appropriate time/place to ask this, and I certainly don't want to hijack the discussion (if so, just ignore this question), but it seems to me that lost in the discussion of math skills in General Chemistry is the question of whether (or how much) math should be in GenChem.
While I agree that quantifying known concepts is important to any science, I would argue that, as a skill (as opposed to knowledge), that is more in the realm of applied science than of science itself. I get the sense that the distinction steadily eroded post-Sputnik (and with the advent of multiple-choice, Scantron-checked questions), which favored the more applied assessments, with the assessments controlling the content. Now we are at the point where we've taken it for granted that General Chemistry is Math.
So what do the forum members think? How much math should be in GenChem? Can we even go back to a science-focused GenChem that emphasizes the development and learning of concepts and models? If we did, what would that look like?
Yes, let's back up:
Gregorius,
I wrote earlier this morning in response to your thread of Oct 24, and I follow up on the discussion here.
The amount of mathematics needed in gen-chem (“chem-math”) is what is required to quantify the core fundamental concepts for the population in question (GOB, general life-science service course, engineers, chem majors). In other words, the mathematics should be in the service of understanding and applying those concepts, no more and no less.
We have seen the evolution of “non-math” curricula such as ChemCom (Chemistry in the Community). However, I think most of us feel that our chemistry students need more than what is included in those courses, and thus we would add chem-math to them as necessary.
As I discuss in Paper #6, students DO equate chemistry with mathematics. Our role is to try to disabuse them of this and to show them that chem-math is not the formal math they study in their “pure” mathematics courses, where science applications, unfortunately, are minimal. We show them that chem-math is largely arithmetic, pre-algebra, and Algebra I, and that they CAN negotiate chemistry exercises and problems when given the requisite practice within the course.
Cary Kilner - I agree with
Cary Kilner, I agree with you that Chemistry in Society-type courses are not GenChem. I would go so far as to say that such courses are an application of chemistry principles to modern life, and as such they are, again, applied science and not science itself.
Can we agree, though, that the math in chem-math-type GenChems has taken over the science (even going so far as to rob the students of the joys of discovery, inquiry, and wonder)? I wonder if, instead of working to disabuse students of the idea that GenChem is math (or algebra), we might get better student buy-in by refocusing GenChem on the concepts rather than the quantification.
When I feel like driving myself crazy, here's what I imagine goes through a textbook preparer's mind: we need to teach stoichiometry; stoichiometry is best approached as a dimensional-analysis calculation (!), so we need to teach dimensional analysis; let's put that in the early chapters so we can shoehorn dimensional analysis as a technique into other calculations (unit conversion, molarity, unit cells, etc.). And then we, the instructors, have to deal with students who just don't get dimensional analysis, and so must not understand chemistry.
I mean, what does it say about GenChem when most of its textbooks begin with math (units and measures, sig figs, unit conversion, dimensional analysis)?
Keep the Math
Gregorius, I wonder if you have seen a pattern that I'm seeing: students with low math skills also don't do very well with Lewis dot structures or VSEPR. When the calculations and equations are removed, you are still left with pattern recognition, spatial reasoning, and even just the logical flow of argument, all still tied to chemistry. Just watching students attach four F atoms to C, then watching them struggle with attaching two O atoms to C: there's an aspect of math there. The same goes for high-energy states vs. low-energy states; even if one removes the numbers from the argument, the concept of size, an understanding of numbers, is still required. I do not think it possible to completely remove math from chemistry, as math seems to be the basis of language and communication in science. How much math should be in GenChem? Enough to give our students a fighting chance to succeed in their future careers. We just have to spend some time and effort to find out how much that is. (And it may be shifting due to changes in computers and technology.)
Reasoning skills
Based on my students, scientific-reasoning skills correlate with mathematical skills. Both math and reasoning require that students think about a concept and apply it, whereas many students want to just memorize and regurgitate facts and definitions. Taking out the math wouldn't resolve that issue, and it neglects the fact that math is part of science.
Scientific Reasoning Skills
Allison,
How do you measure your students' reasoning skills? This is one of the directions the Texas Team is currently discussing. We will determine from this year's MUST (arithmetic) vs. DAT (algebra) which is more correlated with students' success, and then go to the next step. Should that next step be logical thinking, analytical reasoning, precalculus, spatial ability, vocabulary, or reading? Anyone have any ideas?
Reasoning
We administered the "Classroom Test of Scientific Reasoning" last year but haven't done a complete comparison with math ACT scores or a similar metric. I should have clarified that my observation was based more on anecdotal evidence: talking to students and seeing that many of them do poorly on mathematical questions as well as conceptual questions on exams.
CTSR
Thank you; it sounds like a very valuable instrument.
Measuring reasoning skills
Dr. Mason,
I agree that measuring raw science-reasoning skills is difficult at best, especially if we are hoping for some kind of standardized test or Scantron-type multiple-choice questions. While I've had some success in getting a better picture of the scope of my students' understanding using something like paired-question (concept and algorithm) exams, it is difficult to come up with good, purely conceptual questions on a multiple-choice test spanning all of GenChem. And, after a while, as with most multiple-choice questions, the assessment still seems behavioristic, with a one-question-one-answer feel to it.
We experimented with multiple-choice questions where all the answers were correct to varying degrees, or correct within a particular concept domain (macroscopic, particulate, energy, etc.), and this allowed us to gauge the level of learning (using Rasch modeling). But we ended up using the data/results as a formative assessment so that students could self-remediate. We could not use the results to give a student a grade.
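For readers who haven't met Rasch modeling: the heart of the one-parameter (Rasch) model is a single logistic function relating a student's ability to an item's difficulty on a common logit scale. The analysis described above was of course more involved; this is only a minimal sketch of the core idea, with illustrative names and numbers:

```python
import math

def rasch_probability(theta, difficulty):
    """Rasch (one-parameter logistic) model: probability that a
    student of ability `theta` answers an item of the given
    `difficulty` correctly, both expressed in logits."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# When ability exactly matches item difficulty, the model
# predicts a 50% chance of a correct answer.
p_matched = rasch_probability(0.5, 0.5)

# A much stronger student facing the same item is far more
# likely to answer correctly.
p_strong = rasch_probability(2.0, 0.0)
```

Fitting the model means estimating the ability and difficulty parameters from the response matrix, which is what lets partially correct answer choices be placed on one learning scale.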
For assessments that avoid behavioristic principles of learning, open-ended questions seem best (and could be used for grading purposes), but then grading such tests becomes a full-time job. I guess it comes down to whether we subscribe to a cognitivist (vs. behaviorist) view of learning and, if so, are willing to face the morass that comes with open-ended questions.
Math skills as indicator of student quality (?)
At my home institution, we've used Math SAT scores as the main indicator of potential success in GenChem I (and we pre-register incoming freshmen into a one-year alternative to the first semester of GenChem). So, yes, in general, students with low Math SATs do not do well in any content area, whether the math-intensive stoichiometry calculations or the pattern-recognition Lewis structures.
But if we look closely, students with low Math SATs also have low scores on the Verbal and Critical Thinking SATs. So it's not that they can't do the math in GenChem; it's more that they don't have very effective learning skills.
Instead of trying to improve their math skills, I've taken the tack of trying to improve the learning habits of these low-performing students (by showing them that they can teach themselves with interactive animations, or by giving them a position of authority, as in a flipped classroom). I've "saved" a good 60% of such students this way, students who would otherwise have failed GenChem I.
I guess my point here is: if we want our GenChem to have less math and more conceptual understanding, on the principle that the math in GenChem is a tool and not chemistry itself, I think even our weakest students can succeed in that.