
Assessment for Active Learning


Pam Auburn

11/13/16 to 11/15/16

Effective assessment is a key component of student learning and motivation. It engages students in the learning process and pinpoints individual areas for improvement. Effective assessment also informs teaching. So how can we be sure that assessment is working for these ends and not against them? Assessment must be planned and built into instruction. In-class activities that target common misconceptions not only serve as formative assessment; they also stimulate peer interaction and build metacognitive skills. Assessment as, for, and of learning together can support and improve both learning and teaching. This presentation will discuss the roles of both formative and summative assessment in learning and instruction. Tools for planning and aligning assessment according to instructional goals will be provided.


When faculty get together to discuss assessment, too often the conversation is restricted to the evaluation of students. In a national survey of chemistry faculty, 90% reported that their motivations for assessment were external, relating to accreditation and certification.1 Only 7% believed that these assessment efforts were of value. Assessment in its various forms is much broader and can serve a much broader purpose. Frequently faculty and departments are required to report on assessments of student learning at both the course and program levels. At the same time, classroom assessment is used for a variety of diagnostic and evaluative purposes. Here it will be proposed that, through the use of hierarchically structured learning objectives, assessment can be aligned and integrated to serve all these needs. Moreover, informing students of these learning objectives at every stage of their learning engages them and allows them to better monitor their learning progress. Through the development of hierarchically aligned learning objectives there is a shift from a focus on assessment OF learning to one of assessment FOR learning.2 The core principles of assessment for learning include: (1) clearly communicating measurable learning objectives prior to instruction and (2) providing meaningful feedback that guides teaching and learning.

More meaningful assessments require a focus on objectives and outcomes. Often this process begins when a gap is noted between what faculty expect students to learn and what they do in fact learn. In many cases faculty expectations are implicit and are never clearly articulated or communicated. Instructors may not have clearly thought out exactly what it is they want students to be able to do as a result of instruction. Learning objectives are clear statements of what students should be able to do as a result of instruction. Having a clear set of learning objectives for a course, unit, or lesson allows one to better plan instruction. Learning objectives not only set a framework for instructional design; they also guide assessment.

Any effective assessment plan must begin with a clear articulation of what faculty expect of students. Learning objectives need to follow the S.M.A.R.T. framework: they must be specific, measurable, achievable, relevant, and completed within a specified timeframe. In addition, the number of learning objectives should be limited. Given that one of the goals of this form of structured hierarchical assessment is to focus teaching and learning, too many learning objectives will undermine this intent.

So where does one begin the process of developing aligned, hierarchical assessment? Begin at the top, with the end in mind. For most faculty this means beginning with course level learning objectives. In developing these, consider what underlying concepts connect the content in the course. If a student were asked to explain the content six months after completing the course, what enduring concepts should be included in the explanation? Course level learning objectives should be:

  • Central – What are the key concepts that thread through the entire class (no more than seven)? (e.g., stereochemistry, electronic structure, functional groups, thermodynamics, kinetics, mechanism)
  • Leveraged – What are the concepts that connect multiple units within the class? (e.g., acids and bases, nucleophilicity, electrophilicity…)
  • Enduring – What concepts in this class will students need in future classes or career applications?

The ideal number of course level learning objectives is somewhere between 7 and 10, and certainly not more than 15. For some state institutions course level learning objectives are mandated. Even if this is the case, going through the process of developing course level learning objectives forces a valuable reflection on the content and scope of a course. Once course level learning objectives are set, they are broken down into unit level and lesson level objectives.

At each stage learning objectives provide a measurable learning target and thus guide instruction and focus learning. The gold standard for the development of measurable learning objectives is Bloom’s taxonomy.3 To be of maximum utility as a guide for instruction, learning objectives should reflect how they will be measured. They must communicate both what is to be measured and the cognitive level at which the measurement will be structured. Bloom’s taxonomy provides a framework for accomplishing both these tasks. While it is certainly desirable for students to know or understand some fact or concept, understanding and knowledge are not directly measurable. Bloom’s taxonomy provides a mechanism for structuring learning objectives so that a measurable level of cognitive complexity is explicitly specified.

An example is provided below:

  • Course level: Analyze reaction mechanisms in terms of energetics, electron flow, reaction kinetics, and thermodynamics
  • Unit Level: Distinguish between kinetic and thermodynamic control in electrophilic addition reactions
  • Lesson Level: Predict the major product of an electrophilic addition to conjugated dienes

Learning objectives, once established, should be introduced early, preferably in the syllabus.4 The syllabus introduces the course structure to students. It typically contains a list of graded course requirements but rarely explains how these requirements will be assessed. A large body of research suggests that assessment information can significantly improve learning.5 It provides a benchmark against which students can measure their own progress. In so doing, students improve their ability to regulate their own learning process.

The ability to regulate one’s own learning process, or metacognition, has been shown to be an important factor in learning and problem solving. The metacognitive skills of novices are generally rather weak. Students do not accurately assess what they do and do not know.6 Providing measurable benchmarks in a syllabus, prior to a lesson, and in the form of a test blueprint prior to a test can significantly improve metacognitive skills. This process works best when learning objectives are reinforced often (before each lesson), recycled and integrated into new material, and used in the construction of test blueprints.

A test blueprint identifies the learning objectives that will be assessed, the level of cognitive complexity at which each will be tested, and the weight given to each objective.

The test blueprint should be specified before the test is written. The objectives assessed and the weights given to each objective should reflect actual classroom instruction in terms of content, cognitive complexity and emphasis. In the absence of a test blueprint, the development of an examination is often haphazard. Some learning objectives are over-emphasized and others neglected. Questions are selected based on cleverness or ease of writing rather than as an accurate reflection of learning goals. A test blueprint communicates to students the performance goals they are expected to achieve.
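The bookkeeping behind a blueprint can be represented as a small table in code. The sketch below is purely illustrative: the objective names, Bloom levels, and point values are invented, and the final check encodes the rule of thumb (described later in the discussion) that at least 50% of exam points should sit at or above the target level of cognitive complexity.

```python
# Hypothetical test blueprint: each row maps a learning objective to a
# Bloom level and an exam-point weight. All names and numbers are invented.
BLOOM_ORDER = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

blueprint = [
    # (learning objective, Bloom level, exam points)
    ("Rank carbocation stability", "understand", 8),
    ("Predict the major product of an electrophilic addition", "apply", 20),
    ("Draw curved-arrow mechanisms", "apply", 15),
    ("Distinguish kinetic and thermodynamic control", "analyze", 12),
]

def points_by_level(rows):
    """Total exam points at each cognitive level (the column sums)."""
    totals = {}
    for _objective, level, points in rows:
        totals[level] = totals.get(level, 0) + points
    return totals

def share_at_or_above(rows, target):
    """Fraction of exam points at or above the target Bloom level."""
    cutoff = BLOOM_ORDER.index(target)
    total = sum(points for _, _, points in rows)
    high = sum(points for _, level, points in rows
               if BLOOM_ORDER.index(level) >= cutoff)
    return high / total

print(points_by_level(blueprint))  # → {'understand': 8, 'apply': 35, 'analyze': 12}
print(f"{share_at_or_above(blueprint, 'apply'):.0%} at or above 'apply'")  # → 85% at or above 'apply'
```

With "apply" as the target level, this hypothetical blueprint clears the 50% rule of thumb, since 47 of the 55 points fall at the application level or above.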

Most importantly, the test blueprint should convey to students the behaviors they are expected to demonstrate on the examination. The blueprint should be provided to students at least one week prior to the examination. Too often students believe that re-reading a text or going over notes is an effective way to study for an examination. By specifying performance measures for an examination, a blueprint can guide students toward more effective study patterns. A test blueprint is not a study guide; it does much more than inform students of the content to be covered. Rather, it is a set of performance goals against which students can benchmark their own learning.

After exams are graded, students can compare their expected achievement on specified learning goals with their actual level of attainment. This feedback is helpful in focusing further study and in improving metacognitive skills. Cumulative data on class level achievement for each performance goal on the blueprint is also shared with students. This lets students see how they are doing relative to others on specific performance goals. In addition, when overall class achievement on a learning goal is low, this may point to a problem with instruction or to a misconception that has become entrenched. It can be very useful to discuss this with a class, as it may lead to instructional improvements. One possible improvement might be the introduction of formative assessments that catch identified misconceptions.
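The class-level aggregation described above can be sketched in a few lines of Python. Everything here is hypothetical: the objective labels, the per-student scores, and the 60% flagging threshold are invented for illustration.

```python
# Hypothetical graded-exam results: objective -> list of
# (points earned, points possible), one pair per student.
results = {
    "Predict the major product of an electrophilic addition":
        [(15, 20), (8, 20), (18, 20)],
    "Distinguish kinetic and thermodynamic control":
        [(4, 12), (5, 12), (6, 12)],
}

def class_achievement(exam_results):
    """Fraction of available points the class earned on each objective."""
    return {
        objective: sum(e for e, _ in pairs) / sum(p for _, p in pairs)
        for objective, pairs in exam_results.items()
    }

# Flag objectives with low class-level achievement; these are candidates
# for new formative assessments in future instruction.
for objective, fraction in class_achievement(results).items():
    note = "  <- consider a targeted formative assessment" if fraction < 0.60 else ""
    print(f"{fraction:.0%}  {objective}{note}")
```

In this invented data set the class earned 68% of the points on the first objective but only 42% on the second, so the second would be flagged for a targeted formative assessment.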

Classroom assessment can be broadly categorized as formative or summative. Summative assessments are typically exams given after completion of instruction. In contrast, formative assessment is often integrated with instruction, with the purpose of providing feedback to both instructor and student on progress toward a learning goal. Formative assessments frequently target common difficulties or misconceptions. The purpose is to bring these into the open and correct them during instruction. Thus, when a summative assessment reveals low achievement on a learning objective at the class level, the instructor may want to develop formative assessments on this objective for use in future instruction. Clickers, warm-up exercises, quizzes, and class discussions are common forms of formative assessment.

In summary, setting up aligned hierarchical learning objectives can provide data to improve both student learning and classroom instruction. Assessment data from measures that are aligned with hierarchical learning objectives further provide information that may be required for external reporting related to accreditation or certification. The aggregate data collected from examinations designed from blueprints provide information on class level achievement of learning objectives. Setting up this scheme is not an easy task, but the investment has substantial payoffs.

The process assures that there is alignment from course level objectives all the way to classroom assessments. Assessments designed with hierarchical learning objectives in mind better reflect instructor expectations and classroom instruction. Rich feedback is provided that can improve both teaching and learning. Communication to accreditation and certification agencies is facilitated.


  1. Emenike, M. E.; Schroeder, J.; Murphy, K.; Holme, T. J. Chem. Educ. 2013, 90, 561–567.
  2. Black, P.; Harrison, C.; Lee, C.; Marshall, B.; Wiliam, D. Phi Delta Kappan 2004, 86 (1), 8–21.
  3. Bloom’s Taxonomy taken from:
  4. Ludwig, M. A.; Bentz, A. E.; Fynewever, H. Journal of College Science Teaching 2011, 40 (4), 20–23.
  5. Black, P.; Wiliam, D. Phi Delta Kappan 1998, 80 (2), 139–148.
  6. McGuire, S. Y. Teach Students How to Learn: Strategies You Can Incorporate Into Any Course to Improve Student Metacognition, Study Skills, and Motivation; Stylus Publishing: Sterling, VA, 2015.




Thanks for your interesting and informative paper.  You have convinced me that having explicit learning objectives and test blueprints would improve learning in my courses.  I already use formative assessments like clicker questions and warm-up exercises in my courses.  But I have never written learning objectives.  The whole approach seems a bit overwhelming and I do not know where to start.  I'm also worried that I would be inclined to work backwards with the test blueprint.  I've been writing tests for 25 years and that feels familiar to me.  Would it be crazy to start by taking a test I have given before and writing the test blueprint for that one?  Do you have other advice for an experienced instructor who is a newbie in terms of assessment?



I would start by reading some of the work done by Grant Wiggins and Jay McTighe. You might pick up a copy of Understanding by Design and the partner guide book. Both should be available on Amazon. 

The problem with starting with the tests that you already have is that this would give you a picture of what you are assessing now. This may or may not be what you would want to focus on after going through a backwards design process. Last summer I took part in a workshop given by the Foundation for Critical Thinking. In one exercise we were asked to come up with the five to seven key concepts that provide the foundation for all learning in our class. This is hard. As a prompt we were asked to imagine meeting a former student six months after they took our course and asking them a question. What concepts would we expect them to mention in their response? Mastery of these concepts becomes the course level learning objectives. Drill down from there.

SDWoodgate

This is marvelous stuff!!  Assessment is such an important part of teaching and something which most university staff are expected to do without any prior instruction in good methods of doing so.

In my experience writing learning objectives focuses the mind, probably because it makes you think more deeply about what you are really trying to do and goes beyond the simple presentation of content.

I love the idea of a blueprint for the exam because creating a blueprint also makes you stand back and think about the questions more deeply, but I am afraid I don't quite understand your blueprint.  What do the numbers mean?  You mention something about weighting.  Are the numbers an arbitrary weighting scale?  Are the sums at the bottoms of the columns meant to show relative weights?  Does the total of 102 have any meaning?

The numbers are the actual exam points assigned to the items relating to those learning objectives. The rule of thumb is that at least 50% of the points in an exam should be at or above the target level of cognitive complexity. So for me, I think all my students should be able to achieve at the application level. Application is my target level. In the example above there are 102 possible points on the exam. The numbers at the bottom of the columns are the total number of points at each of the respective levels of cognitive complexity. I hope that helps. If not let me know and I will try again.

Hi Pam,

Very interesting article.  It certainly made me think about the structure of my courses over the weekend.

For someone who doesn't have specific learning objectives written yet, do you think it is possible to start with the goals/objectives/concepts listed at the beginning or end of a textbook chapter?  These lists are generally quite substantial and would certainly need to be condensed.

My second question is about the blueprint.  I think this is a great idea to focus the students' attention on the most important topics.  Do you have any feedback on what the students thought about it?  How helpful did they find it?



1. I am not seeing a way to share my syllabus here, but for OChem 1 and 2 I have course level and module level learning objectives in my syllabus. My belief is that one really must start with the end in mind. Earlier I mentioned that one way to think about this would be to consider the concepts you would hope a student would use in answering a question about your course content six months out. If the number of learning objectives gets too large, one defeats the purpose. These need to be guideposts against which learners can evaluate their own learning. If there are too many, one loses focus. You are correct in the observation that most textbook lists contain too many learning objectives, many of which are neither measurable nor skill based.

2. My students love the module level learning objectives and the blueprints. Because these are skill based and measurable, students are better able to assess their own learning. We all know students are notorious for not doing this well, as evidenced by the number of times they report thinking they did well when they did not. I see many copies of syllabi and blueprints with LOs checked off or edited with specific questions. As soon as a test date is set, my students start asking for a blueprint.

Pam, I think that we can probably get your syllabi posted to go along with your paper here.  Please send them to me and I'll work on it.



While not quite as structured as these blueprints, I provide my students with a file that I call "Need To Know", as in need to know for the test, which is keyed to the lectures and textbook chapters.  I try to provide a couple of questions from the text for each point.  My practice has been to provide this information two weeks before a test, but I see the advantage of providing it at the start of a section to guide the students' study.


Josh Halpern

Emily Alden

Hi Pam,

Great paper! As a student I can absolutely relate to what you're saying. Thinking back on the classes I've taken almost all of the syllabi have included course objectives but that is the first and only time they are mentioned. Conversely there have been a few classes where those objectives were broken down into learning goals for each lecture. Those were the classes where I felt like I was studying with a purpose and the exams reflected those expectations. 

I love the idea of the test blueprint. So many times we (students) get to the exams and realize we've studied the wrong materials or have glossed over a topic that is a large portion of the test. When this happens it makes studying for the next exam even more chaotic and disheartening. 

I know this is something I would take advantage of but have you seen students utilizing the test blueprints to evaluate their individual work on an exam and adjusting their study habits accordingly? Is this something done during the class period to encourage students to participate in the process? 




I generate a blueprint before I set the exam. When I write the exam, I do it with the blueprint so that questions map onto learning objectives. When I hand back exams, students can see not only how well they did on individual questions but also how well they did with respect to the learning objectives.

The state requires that I provide data on overall student achievement against course level learning objectives. Having drilled the course level learning objectives down to the point of test blueprints, the reporting is really very easy.

What is more useful to me is that I can look at the exam and know what learning objectives were missed at a class level, suggesting that I may need to work on how those learning objectives are taught in future classes.


I am interested in knowing how students might respond to receiving test blueprints when they have never previously heard of them.  Do other instructors at your campus use test blueprints?  Did these students have test blueprints to help guide them when they took general chemistry?  I'm guessing that you have probably used test blueprints the whole time you've been teaching.  But if I'm wrong about that, perhaps you can tell us how students responded to these new guides that help them focus their learning and self-assess.



On the first day of class I talk to students about how to learn. I have a short PowerPoint deck based on work by Saundra McGuire, and I also introduce Bloom's taxonomy. Before getting to my class, many students have only been required to learn at the lower levels of Bloom. My favorite comment this semester was, "Yeah, we did acids and bases in Gen Chem, but it was just a bunch of calculations. We never learned what it meant." Students say the darndest things.

At the beginning of each class there is a quiz or warm-up. I tell students the level of Bloom assessed in these and why the questions were included (usually based on some previously identified misconception or over-generalization). When we discuss answers, I remind them that if they missed a question they need to work more on the LO it assessed. Students catch on to this very quickly. I find that they get more engaged and ask for more feedback.

When students get their first blueprint, we discuss it. By the second exam they are asking for it. I also share these with the peer tutors in our study center.

My secret is that I actually have a second degree in education that focused on assessment. So yes, I was familiar with test blueprints and with ideas about strengthening metacognition from my graduate work.

Pam -- Among the scientists who study how the brain works, the consensus is that students, when solving problems, must rely primarily on information that has previously been well memorized.  As in most representations of Bloom's taxonomy, knowledge is the broad foundation for the pyramid.  In your system, how important is knowledge that can be quickly retrieved from memory?

-- rick nelson


It can be hard to solve problems if you do not have any tools in your tool chest. However, tools are more effectively learned in context. For example, knowing what a hammer is would fall at the knowledge level of Bloom's taxonomy; knowing how to use it is application. I suppose I would argue that one does not really "know" a hammer until one can use it properly. It is, after all, a tool. I would suggest the same is true of the reactions in OChem. Knowing what something is without knowing how to use it can get one into real trouble, as it did my sister who, as a child, used an obscene phrase she had heard in an argument with my father. Later she sheepishly asked me what it meant and followed with the comment, "No wonder he got mad."

I used to contribute to this conference and have lurked on it for many years. There are many comments made about learning, algorithmic learning, 'knowledge,' etc. In 2010 I wrote a scholarly book about learning (Unified Learning Model, Shell et al.). It's a heavy read. More recently I co-wrote a book that has much broader scientific roots and has the virtue of being online: Minds, Models, and Mentors (Brooks, Trainin, & Sayood). We update that book regularly and have adopted a formal method for cataloguing those updates so that readers can focus only on the changes after their first read-through. ALL of the assertions we make are supported by references, many of which are accessed easily through links. Also, the text has some links to sources that you really might not believe -- such as the McGurk effect or the time required to memorize the sequence of cards in a shuffled deck of 52 playing cards (< 22 seconds). The link is at: Be our guest. Best, Dave Brooks

Greg Baxley

Hello Pam (sorry I had the names mixed up at first),

Would you be willing to share your PowerPoint about helping students learn?

thank you,



I just sent this to Jennifer, who tells me that she can get it posted.