
Student Engagement with Flipped Chemistry Lectures

Author(s): 

Michael K Seery, Dublin Institute of Technology, Dublin, Ireland

05/09/14 to 05/15/14
Abstract: 

This project introduces the idea of "flipped lecturing" to a group of my second-year students. The aim of flipped lecturing is to provide much of the "content delivery" of the lecture in advance, so that the lecture hour can be devoted to more in-depth discussion, problem solving, and so on. As well as the development of the material, a formal evaluation is being conducted.

Fifty-one students from a Year 2 Chemical Thermodynamics module took part in this study. Students were provided with online lectures in advance of their lectures. Along with each online lecture, students were given a handout to work through as they watched the video. Each week, a quiz was completed before the lecture, which allowed students to check their understanding and contributed to their continuous assessment mark.

The evaluation is examining both the students' usage of materials and their engagement in lectures. This involves analysis of access statistics along with an in-class cognitive engagement instrument. The latter is measured by "interrupting" students as they work through a problem and asking four short questions, drawn from another study, that examine how students are engaging with the materials in that moment.

Results from this, along with access data, quiz scores, and student comments, are used to build up a profile of how the flipped lecture works for middle-stage undergraduate students.

Paper: 

Student Engagement with Flipped Chemistry Lectures

Michael K Seery, Dublin Institute of Technology, Dublin, Ireland.

This project introduces the idea of "flipped lecturing" to a group of my second-year students. The aim of flipped lecturing is to provide much of the "content delivery" of the lecture in advance, so that the lecture hour can be devoted to more in-depth discussion, problem solving, and so on. As well as the development of the material, a formal evaluation is being conducted.

Fifty-one students from a Year 2 Chemical Thermodynamics module took part in this study. Students were provided with online lectures in advance of their lectures. Along with each online lecture, students were given a handout to work through as they watched the video. Each week, a quiz was completed before the lecture, which allowed students to check their understanding and contributed to their continuous assessment mark.

The evaluation is examining both the students' usage of materials and their engagement in lectures. This involves analysis of access statistics along with an in-class cognitive engagement instrument. The latter is measured by "interrupting" students as they work through a problem and asking four short questions, drawn from another study, that examine how students are engaging with the materials in that moment.

Results from this, along with access data, quiz scores, and student comments, are used to build up a profile of how the flipped lecture works for middle-stage undergraduate students.

 

Introduction

Flipped lectures at university level began to be touted a few years ago and quickly became a popular choice among early-adopter innovators as a replacement for the traditional lecture. The concept of providing material in advance of the class so that more time can be spent on active learning during class is appealing to educators, and the flipped model provides a useful design framework for integrating in-class and online materials. However, despite a lot of attention in the "popular press" of blogs, online journals, and social media, there is scant detail in the education literature on the effect of flipped lectures on student learning. The study most often referred to is an article published in 2000 in The Journal of Economic Education.[1] That article describes the implementation of the inverted lecture, along with some evaluation based on student opinions. More recently, an article in Chemistry Education Research and Practice described an implementation in the context of a General Chemistry module, with the evaluation consisting of a student survey.[2]

Recent work by the author reported the effect of providing some material to students in advance of a lecture.[3] These pre-lecture activities aimed to introduce some core terminology and ideas prior to a lecture, with the aim of reducing the in-lecture cognitive load. That research found that introducing some terminology and structure in advance of the lecture improved grades for students both with and without prior knowledge of chemistry, and in addition narrowed the gap between these grades to a non-significant difference. While pre-lecture activities differ from flipped lectures in the amount of information provided in advance and the nature of the subsequent lecture hour,[4] they share an underlying rationale: providing students with material in advance of the formal teaching time may help reduce cognitive load, as students will have some familiarity with the material when it is discussed in class.

On the basis of this rationale, the current study aimed to examine the implementation of a flipped classroom model of delivery in place of a more traditional lecture model. The study aimed to address the following questions:

  1. Would students engage with the materials in advance of the class?
  2. Would students attend lectures for material that had “been delivered”?
  3. Would students engage with the more active approaches being taken in lectures?

 

Details of Implementation

Undergraduate chemistry degrees in Ireland typically consist of four years (stages) with 60 ECTS credits (European Credit Transfer and Accumulation System) per year. After a common science first year, students take modules in chemistry for the remainder of their degree. This module in Physical Chemistry was delivered to 51 students during their second stage (Year 2) in the first semester. The content consisted of introductory thermodynamics (First Law, Second Law, Solution Chemistry) and made up half of the module (2.5 ECTS). The other half of the module was delivered by another lecturer in the traditional manner. In total in Year 2, students take two modules (2 × 5 ECTS) in Physical Chemistry. The hours of delivery were 9 am and 3 pm on Wednesdays for 6 weeks over the second half of the semester (the final week consisted of tutorials). The modular exam for the semester 1 module is held in January and accounts for 50% of the mark. The remainder of the assessment is derived from laboratory work (30%) and continuous assessment (20%). Half of the continuous assessment (10% of the module mark) is derived from the Thermodynamics half of the module, and is referred to below.
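To make the weightings concrete, here is a minimal sketch of how the components combine into a module mark. Only the percentages come from the text; the function and the sample scores are hypothetical.

```python
def module_mark(exam, lab, thermo_quizzes, other_ca):
    """Combine component scores (each on a 0-100 scale) using the
    weightings above: exam 50%, laboratory 30%, continuous assessment
    20%, half of which (10% of the module) comes from the
    Thermodynamics pre-lecture quizzes."""
    return 0.50 * exam + 0.30 * lab + 0.10 * thermo_quizzes + 0.10 * other_ca

# Hypothetical student: 60% exam, 75% labs, 69% quizzes
# (the cohort quiz average reported below), 70% other CA.
print(round(module_mark(60, 75, 69, 70), 1))  # 66.4
```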

The twelve hours traditionally delivered to students were re-configured into five weekly screencasts, prepared especially for this purpose. Details of the design considerations for these screencasts are available.[5] Students were asked to watch the screencast before Wednesday of each week. Screencasts were typically 10 to 15 minutes long. While watching the screencast, students completed worksheets in which they had to write out explanations and attempt questions.[6] In addition, they were referred to the textbook at various points, where they were asked to work through worked examples and other questions to check their understanding in their own time. Once they had completed the screencast and worked on their questions, they completed a pre-lecture quiz. The questions in the quiz were devised by the author so that they followed on from the worked examples students were asked to work through. Each student completed similar questions, although the values in each question were different. The quiz had to be completed prior to the lecture, after which students could review their answers and the correct answers. The sum of the pre-lecture quiz marks was used to compute the continuous assessment mark (10%) drawn from this component of the module. On the whole, the entire pre-class work required approximately 45 minutes to an hour of work from students.
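As an illustration of questions that are "similar but with different values", the sketch below generates a per-student variant of a simple heat question. This is a hypothetical example rather than the actual quiz generator (the module's quizzes were built in the VLE); the question template and number ranges are invented.

```python
import random

def heat_question(student_seed):
    """Generate one student's variant of a q = m * c * dT question:
    the same structure for everyone, different numbers per student."""
    rng = random.Random(student_seed)   # seed per student, so the variant is reproducible
    mass = rng.randint(50, 250)         # mass of water in g
    dT = rng.randint(5, 40)             # temperature rise in K
    answer = mass * 4.18 * dT           # c(water) = 4.18 J g^-1 K^-1
    text = (f"Calculate the heat required to raise {mass} g of water "
            f"by {dT} K (c = 4.18 J g^-1 K^-1).")
    return text, round(answer, 1)

question, answer = heat_question(student_seed=7)
print(question)
print(f"Answer: {answer} J")
```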

The cognitive load considerations underlying the approach were used to design the lecture hour. As there were two lectures per day following on from an associated screencast, it was decided to use these two hours to progressively develop students' understanding and problem solving in relation to the material under consideration. Therefore, the first hour consisted of revisiting some core concepts in the introductory 10 minutes of the lecture, followed by a series of problem sets whose design followed directly from the pre-lecture quiz. Students worked on these in small groups, typically of three. The purpose of this first hour was to get students talking about the topics under consideration and working on algorithmic-style problems. This was to ensure that they had the knowledge and confidence to use the core equations and approaches in each of the topics. In the second lecture hour, the students were given more advanced problems to work through, again in groups. These often had missing data or data that needed to be estimated, and/or brought together related topics. The aim here was that students would move on to applying the core knowledge in each topic to a thermodynamics problem. The author circulated around the lecture hall, dealing with queries and prompting questions. Finally, after the lecture, some worked-example videos were placed on the virtual learning environment, covering the approach to some of the main problems for students who wished to revisit them at their own pace.

 

Observations from Implementation

As mentioned in the introduction, this pilot study was interested in three main questions. These are considered in turn below.

  1. Would students engage with the materials in advance of the class?

Screencasts and pre-lecture quizzes were made available for the week preceding each lecture, giving students seven days to complete the work required in advance of the lecture. As these students are quite busy with coursework (four chemistry practical reports per week to prepare, plus continuous assessment from other modules), there was a concern that, even though intentions might be good, students would not engage with the preparatory material.

Analysis of the access data for each screencast shows that, overall, 92% of students watched the screencast at least once each week (Figure 1). The non-viewers included three students who never logged in to watch the videos or came to lectures and, in weeks 1, 3, and 5, an additional three students. When the latter group were asked why they did not watch the video, the typical answer was that they forgot. Students who did not watch a video were automatically sent an email reminding them to do so the following week, which may explain the periodic nature of the data.

 

 

Figure 1: Proportion of students who watched and did not watch weekly screencast
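As a rough sketch of the access-log analysis behind Figure 1, the snippet below computes the proportion of students viewing each week and identifies the non-viewers who would receive the automated reminder. The log format, names, and data are all hypothetical, not Blackboard's actual export.

```python
from collections import defaultdict

# Hypothetical VLE access log: (student_id, week) pairs, one per viewing.
access_log = [("s01", 1), ("s02", 1), ("s01", 1), ("s03", 2), ("s01", 2)]
enrolled = {"s01", "s02", "s03", "s04"}
n_weeks = 2

viewers = defaultdict(set)              # week -> students who watched at least once
for student, week in access_log:
    viewers[week].add(student)

for week in range(1, n_weeks + 1):
    pct = 100 * len(viewers[week]) / len(enrolled)
    non_viewers = sorted(enrolled - viewers[week])
    print(f"Week {week}: {pct:.0f}% watched; reminders to {non_viewers}")
```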

The day on which students watched was also recorded, and it was found that most students watched on the day before the lecture, in the evening (Figure 2). There are some indications that, as the weeks went by, the time of day moved earlier in the evening, which suggests students began to build the viewing into their Tuesday-evening routine.

Figure 2: Access data showing days of the week (top) and time of the day (bottom)

 

In addition to watching the screencasts, students were required to complete a pre-lecture quiz. All students who watched the screencast completed the quiz; those who did not watch also did not complete the quiz, which helps to verify 'forgetfulness' as the reason for non-viewing. The overall average across all quizzes was 69%, excluding those who did not attempt (DNA) (Figure 3).

Figure 3: Scores achieved in weekly quizzes

(Topic 1: 1st Law, Topic 2: 2nd Law, Topic 3: Solution Thermodynamics).
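A small sketch of the quiz-average calculation reported above, treating a did-not-attempt (DNA) as missing rather than as zero; the scores here are invented for illustration.

```python
# Hypothetical quiz scores in %, with None marking a did-not-attempt (DNA).
scores = [72, 65, None, 80, 58, None, 71]

attempted = [s for s in scores if s is not None]  # DNA excluded from the average
average = sum(attempted) / len(attempted)
print(f"Average over {len(attempted)} attempts: {average:.0f}%")  # 69%
```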

 

  2. Would students attend lectures for material that had “been delivered”?

Having completed the online work in advance of the lecture, the next concern was that students might feel they had “covered” the content and therefore had no need to attend the lectures. As with the concerns about watching the screencasts, these fears were mostly without basis. The attendance at each lecture was logged and, overall, attendance was good. Attendance for the first four weeks was above 70% for morning lectures and 80% for afternoon lectures. Attendance in Weeks 4 and 5 was lower, probably due to a combination of the approaching end of semester and an increasing amount of coursework being due. Nevertheless, attendance for this module was above the average recorded by other lecturers who took attendance that semester.

 

 

  3. Would students engage with the more active approaches being taken in lectures?

One of the core aims of the flipped lecture is that students engage with material during the lecture in a more meaningful way. As well as the traditional measures of engagement considered above, it was decided to use an instrument to measure students’ cognitive engagement as they worked through some material in a lecture. This was achieved using the survey developed by Rotgans and Schmidt.[7] This survey asks students to consider, at a particular moment, four aspects of engagement: (i) whether they were engaged with the task at hand, (ii) whether they were putting in effort, (iii) whether they wished to continue working, and (iv) how deeply involved they were in the activity.

Mid-way through one of the afternoon lectures, students were interrupted while they were working on a problem during class time. They were handed a sheet of paper containing a table listing “Statement 1” through “Statement 4” in the left-hand column and a five-point Likert scale across the top, running from “Not true at all for me” on the left to “Very true for me” on the right (Figure 4). Students were then shown each of the four statements in turn and asked to rate whether that statement was true for them (i.e. students did not see a statement until it was displayed on the screen). The sheets were then collected and the students resumed work. The entire exercise took approximately five minutes.


Figure 4: Cognitive engagement instrument based on Rotgans and Schmidt.[7] The text of the statements was not on the student form.

 

In order to explore the students’ responses to the four statements, the difference between each student’s response and the “neutral” response was calculated, where the five response options were scored 1 to 5, with neutral being 3. The results of this analysis are shown in Figure 5.

The analysis shows that students agreed, to decreasing extents, with statements 1 – 3. This suggests that the students were actively engaged with the work to hand (Statement 1), were applying mental effort while doing so (Statement 2), and at least did not mind continuing to work on the problem (Statement 3). Strong disagreement with statement 4 (“I was so involved I forgot everything around me”) is probably to be expected given that students were actively encouraged to discuss and work through problems with their neighbours.

 


Figure 5: Responses to cognitive engagement instrument for each of the four statements
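A minimal sketch of the deviation-from-neutral scoring described above; the response data are invented, chosen only to mirror the pattern in Figure 5.

```python
# Likert options coded 1 ("Not true at all for me") to 5 ("Very true for me").
NEUTRAL = 3

# Hypothetical responses: one list per statement, one entry per student.
responses = {
    "Statement 1": [5, 4, 4, 5, 3],
    "Statement 2": [4, 4, 3, 4, 5],
    "Statement 3": [3, 4, 3, 2, 4],
    "Statement 4": [2, 1, 2, 2, 1],
}

for statement, scores in responses.items():
    deviations = [s - NEUTRAL for s in scores]   # > 0 agrees, < 0 disagrees
    mean_dev = sum(deviations) / len(deviations)
    print(f"{statement}: mean deviation from neutral = {mean_dev:+.2f}")
```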

 

Discussion

The flipped lecture approach was adopted with some trepidation. Fears that students would not, in the context of a busy academic week, have time to prioritise the work required to engage with the material prior to lectures, or would not attend the lectures themselves, prompted this pilot study to monitor pre-lecture work, attendance, and in-class activity. As can be seen from the data, the students taking the module embraced the flipped model whole-heartedly. They engaged with the material prior to the lecture, attended the lectures, and worked well on assigned tasks during the lecture. In this regard, the implementation was considered successful.

Some limitations may apply to these conclusions, and further research aims to use focus groups to learn from students whether, and to what extent, these are a factor. As an example, it is clear from the attendance data that average attendance falls over the course of the delivery of the material. It would be interesting to probe to what extent this was due to end-of-semester time constraints, and to what extent students were pushing the activity to a lower priority because the content could be “covered” online. In simple terms, it is worth examining whether flipped lectures are useful as a novel alternative for a short course or topic, or whether they could sustain a longer roll-out over the course of a semester or year.

Additionally, while the emphasis in this design was on preparing students before a lecture and working on material during a lecture, there was no formal follow-up on students’ understanding after a lecture until the end-of-module exam. While, anecdotally, the quality of work and performance in the end-of-module exam were much better, it would be interesting to examine this formally, as well as student understanding after each topic.

In this context, some interesting serendipitous observations were made about what students found difficult. At the end of every second quiz, which marked the end of a topic, students were asked to look back on the material in that section and say what they had found most difficult. As an example, a word-cloud of the responses on the first topic (Enthalpy and the First Law) is shown in Figure 6. Unsurprisingly, the areas of difficulty were “enthalpy” itself, “calculations”, and “energy”.


Figure 6: Word cloud of responses to question: “What was most difficult for you in the last two weeks?”, asked in Week 2.
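For readers who want to produce a similar figure from their own students' free-text responses, here is a minimal sketch using the open-source Python wordcloud package (assuming it and matplotlib are installed); the sample responses are invented.

```python
# pip install wordcloud matplotlib
from wordcloud import WordCloud
import matplotlib.pyplot as plt

# Hypothetical free-text responses to the end-of-topic question.
responses = [
    "enthalpy calculations were hard",
    "knowing which equation to use in the calculations",
    "energy signs, and when enthalpy is negative or positive",
]

# Word size scales with how often each word appears across responses.
cloud = WordCloud(width=600, height=400, background_color="white")
cloud.generate(" ".join(responses))

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```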

 

More interesting than the words themselves, however, were the different levels of student response observed. Surveying the responses as a whole, it was found that they could be assigned to one of three categories. Examples are shown in Table 1, and the responses suggest that this may be a fruitful area for examining student engagement with, and independence in, their own learning in a flipped lecture framework.

Table 1: Assigning categories to student responses to the question: “What was most difficult for you in the last two weeks?”

Category 1: There are difficulties and I don’t know how to address them.

Example: “Calculations mainly, just knowing what part to use of the information given. Theory is fine and that but mainly equations are what I find difficult.”

Category 2: There are difficulties, and I have pinpointed the area where I am getting stuck.

Example: “Trying to understand the signs for energy loss and gain was a little bit tricky. For example, how energy given off is a minus number and how energy taken in is a plus. I feel I've got a better hold of the concept now though.”

Category 3: There are difficulties, but I have independently worked out what was causing the difficulty.

Example: “I found that in lecture one the question sheet question 10 and 11 which were calculating the standard Enthalpy change and calculating the standard Enthalpy for glycolysis. These two questions required some attention but with the help of the [text] book and the internet I was able to understand the method behind the questions.”

 

Conclusions

The implementation of a flipped lecture model in a mid-stage undergraduate chemistry module was piloted over half a semester. The design of the implementation was grounded in cognitive load theory. Students engaged with the module and its online materials, and worked well on the active learning components in class. The implementation has shed some light on issues surrounding flipped lectures, and has generated some thoughts that may be useful in the delivery of online material in the future.

 

Acknowledgments

The author thanks the College of Sciences and Health and the Learning Teaching and Technology Centre, Dublin Institute of Technology for their support of this project. In addition, the author acknowledges Dr Roisín Donnelly and Dr Claire Mc Donnell for useful discussions and their input into this work.

 

References




[1] M. J. Lage, G. J. Platt and M. Treglia, Inverting the classroom: A gateway to creating an inclusive learning environment, The Journal of Economic Education, 2000, 31, 30-43.

[2] J. D. Smith, Student attitudes toward flipping the general chemistry classroom, Chemistry Education Research and Practice, 2013, 14, 607-614. 

[3] M. K. Seery and R. Donnelly, The implementation of pre-lecture resources to reduce in-class cognitive load: A case study for higher education chemistry, British Journal of Educational Technology, 2012, 43, 667–677.

[4] M. Seery, Jump-starting lectures, Education in Chemistry, 2012, 49, 22-25. 

[6] Permission was obtained from the textbook publisher to reuse images in the screencasts and handouts.

[7] J. I. Rotgans and H. G. Schmidt, Cognitive engagement in the problem-based learning classroom, Advances in Health Sciences Education, 2011, 16, 465-479.

 

 

Comments

I understand that the educational system in Ireland is different from that in the US, so that may be the reason for my question. I'm unfamiliar with the term "continuous assessment." It sounds like it might be what I would call a homework average or a quiz average. Am I on the right track?

My second question may also be related to differences in educational systems. You assigned pre-class activities once per week. How does that frequency compare to that of class meetings? Did the class meet one day per week, with separate lectures in the morning and afternoon? Or does the class meet on multiple days per week? Perhaps another way to state the question: were the pre-class activities assigned before every day with a class meeting, or only before some of them?

Oversby

This term has been in continuous (!) use now for decades. I have just Googled the term and had a number of very useful hits. It is described in good detail at http://en.wikipedia.org/wiki/Continuous_assessment. It is not the same as formative assessment, but has some common characteristics. This is not an issue of Ireland and the US being different; indeed, I have friends in both countries who use continuous assessment in higher education and at secondary schools. The second question is also not, IMHO, about differences in educational systems.

Michael Seery

Hello!

Many thanks for your questions. I appreciate there are differences in language - one of the nice things about travelling to the US is learning about different usages of words: trunk, elevator, faucet... to name a few! We also put "u" in a lot more words than US spelling does - so if you see "favourite" you know I mean "favorite".

Regarding the first question, continuous assessment is where students complete some activity during the semester (weekly homework, reports, presentations, etc.), and the grades for this work make some contribution to the final module mark. I think this is what the US calls "coursework", although I am not sure about that! Typically a chemistry module has three assessed components: the laboratory work (and reports), the continuous assessment, and the end-of-module exam. It's important in the context of the current study because continuous assessment is generally used to "encourage" students to do work during the semester - essentially it accords the work some value.

Regarding the second question, I saw the students twice per week - on the same day - for 6 weeks. All students came to both classes. Therefore, they had one pre-lecture task for every two lectures (and in turn, I only had to prepare one for every two lectures!). This was serendipitous, but it actually worked out quite well, as it meant the second of the two lecture slots could develop the content from the first - it gave the whole thing a nice flow. So in future I'll be requesting two slots on the same day, or at least just a day apart.

Hope that makes sense?
Thanks again,
Michael

rpendarvis

Could you clarify what you mean by "continuous assessment"? In the USA at least, I think most of us use quizzes, homework, various intermediate tests, etc. as a means of keeping the students on track. It is obvious that those must count in the final grade to keep the students involved. Who was it that said "non credit, forget it"? In any case, I am wondering how your methodology differs.

Thanks

Layne Morsch

Michael,

Thanks for sharing this analysis; it's already given me ideas for use in my own class. I have a few questions.

Attendance: Did you take attendance for the same course when you taught it previously? If so, did you see any changes in the attendance pattern?

Video format: Was the video format instructor-at-the-board type, just what you were writing and saying, or face and writing?

Have you looked at the number of times students view the videos? I required students to watch the videos I posted in my flipped class, but I was wondering if they might actually watch them more often if it were for their own learning rather than required for points. (Unfortunately, the way my videos were posted I can't get the number of views per student; I wish I had better analytics.)

Layne

Michael Seery

Hello Layne,

Thanks for your questions. Glad the article has given you some ideas.

Re: attendance - yes, I did take attendance previously, although when I gave this course last year it was once weekly for 12 weeks at 3 pm, so it's not an exact comparison. As a general rule in previous years, the attendance dropped as the semester went on, from a high of over 90% to a low of 60% - not quite linear, but pretty much a steady decline over the semester! This year, the attendance stayed pretty steady apart from the last week; morning attendance was lower than afternoon, but the overall trend was better attendance.

Re: video format - the lectures were presented as screencasts, with some embedded video demos. They weren't lecture capture or video of me lecturing; I made the screencasts for this purpose. These linked explicitly to sections in the textbook (this is quite unusual in our system - we use textbooks as a supplementary source rather than following one as a curriculum).

I was able to look at access - in the early weeks (1, 2), most students viewed videos multiple times; I figured this was partly curiosity - having a look to see what it was about, or what was required. As the weeks went on, about half the students viewed each video once, and the other half viewed them more, typically 2 - 3 times. I've no idea whether these multiple viewings were of the full video or just checking in on a particular part (perhaps when they were doing the quiz). I think the only real way to find out is to ask the students!

Hope that answers your questions somewhat,

Michael

I find this study interesting and thank you for sharing it with us, Michael. I'm hoping you can tell us more about the worksheets and the pre-lecture quizzes. Did the students submit the worksheets for grading/correcting? Or were the worksheets mostly to help them organize and focus their thoughts about the concepts, with the grade coming from the students' answers on the quiz?
Jennifer

Michael Seery

Hi Jennifer,

The worksheets were to help the students structure their notes as they watched the pre-lecture screencasts. I suppose I was worried in my first iteration that students would come out the other end without a clue and without a set of notes! So I wanted to structure their screencast viewing time. I arranged with the textbook publisher to use the images from the book in the screencasts and on the worksheets, so it would be easier for students to relate book, notes, and screencast. The worksheets had gaps where the students would try out questions, write out their understanding of a topic, and attempt questions from the textbook.

The latter was important, as when I walked around the lecture hall, I could see if there were gaps where the students had not tried the questions from the textbook. In these cases, I would prompt them to work on those first in class, and then carry on with the class work, which tended to build on these questions. So it was a useful guide as to how much work the students had done outside of the minimum of watching the screencast and doing the quiz.

The grade came entirely from the quiz. Quiz questions were designed so that they were similar to the questions in the textbook that the students were asked to do, so implicitly it was in their interest to try these questions first. I think next year, I will make that more obvious to them!

Hope that helps. Best for now,
Michael

Kelly Butzler

This is a great article! Thanks so much for sharing. I have been struggling to get students to "buy in" to this system of viewing the screencasts. Most students either do no preparation or "read" the textbook, and all my information about student engagement with the screencasts is anecdotal. I understand your screencasts are linked to the textbook. Are they embedded in a text? Or do students access them through an LMS? Can you expand a bit on how you made your screencasts (software, hardware, etc.)?

I like the worksheet idea. Do you have to teach students to take notes? Could you provide an example of your worksheets? I would really like to see how you formatted these. I am going to have students keep a "pre-class Lesson notebook" for the next iteration of the flipped gen chem course. The format of your worksheets may help guide the notebook requirements.

Thanks again! ~Kelly

Michael Seery

Dear Kelly,
Many thanks for your kind comments. To elaborate a bit more on the screencasts: I prepared them by making PowerPoint slides and using Camtasia Studio screencasting software. I tried to give the screencasts a consistent format, so that the position (topic) in the video was clear, and any links to the textbook were made explicit in a particular area of the screen. You can see a screen grab of a screencast in this blog post I wrote about preparing screencasts: http://www.rsc.org/blogs/eic/2014/02/preparing-screencasts-few-thoughts That article also alludes to some good design principles from an e-learning perspective (diagrams, amount of on-screen text, etc.).

The screencasts were then hosted in our VLE (Blackboard). The quiz was separate (my experience with SCORM has been... frustrating!) - I used my own questions in Blackboard's native quiz editor. So the students didn't have to watch the video to do the quiz, but this didn't seem to stop them watching it.

The worksheet was important. Another query I'm about to answer asks about students taking notes, so I will elaborate more there - I feel very strongly that we need to structure this independent time very carefully, especially with students who are not used to making their own notes. I don't think I can share the worksheets here as they contain images from the textbook, but please email me and I will send on examples. In general, they were structured to align with the textbook sections I wished to cover, while emphasising places for students to commit their own thoughts and answers to questions. I couldn't see myself using this method without them!

Sorry for very long answer, that's probably enough for now!
Regards,
Michael

Michael Seery

Dear Kelly (again!),

Excuse me, it was you who asked about teaching how to take notes; sorry, I thought it was elsewhere. That's a great question, and something I was conscious of in designing the worksheets. I don't think students get much exposure to real note-taking, so I did make the worksheets very well structured. Perhaps if I were a bit braver, I would slowly remove some of the prompts and supports as the module ran on, to challenge students a bit more in beginning to structure their own notes. The suggestion below about getting students to write a summary of a topic might be a good way to begin to introduce that.

Thanks again for prompting so much thought,
Michael

At the HS level, I have found that requiring the students to write a short summary of the video content that includes at least one question has been helpful for uncovering misconceptions, and, of course, for revealing which students are actually doing the homework. We run our flipped chemistry courses via Moodle.

Michael Seery

Thanks - that's a very simple but very effective sounding idea. Must try it!

Michael

Bob Belford

Hi Kristin,

When you state you have your students write a short summary of the video that includes at least one question, are you having them ask questions? Or are they answering questions that you have asked? If they are asking the question, which I could see leading to the identification of misconceptions, that leads me to ask: how do you handle the questions?
Thanks,
Bob

Hi,

The students submit questions. Often, similar questions arise, so it is easy to tell what the sticking points are with a given topic. I respond to students directly via Moodle Message, but will address overarching concerns with the class as a whole. I have found that requiring students to submit a question along with their video summary is a useful formative assessment.
Kristin

Your response says the students need to ask a question about the content after they view the video. This sounds like a practice others have used where students are required to identify the idea that was most difficult to grasp. As you say, this provides insight into what part of the material is most troublesome for students. How complex are the questions? How detailed does the summary need to be? How much do these count toward the course grade?

In my classes, I use very broad questions. This is typical:

"Please let me know what you need to know from the readings and video lectures on financial statement analysis. Specify what questions you have, what you find confusing, or what difficulties you are having with the materials."

Here's another:

"Read the file on PolyLearn about RoE at Goldman Sachs. You can see G-S uses a modified DuPont method. Comment on the differences you find in the G-S model. Why do you think they calculate RoE the way they do? Do you think the G-S method improves the RoE result, or reduces it? Why?"

I collect and grade each student's responses and tally them up for an "Assigned Reading Forum" grade that is worth 15% of their overall course grade.

Thanks for your replies and examples.

It looks like there are multiple ways to use questions. One way has students ask questions, while the other is more like a take-home and/or open-book quiz.

In my online classes, students post their 'answers' in our weekly discussion. I've asked students to identify the idea that was most troublesome for them in the week's work. I also asked them to explain an idea they felt they understood. It was a problem to get them to accept the notion that they could pick whatever matched their situation; they really wanted to be quizzed, interrogated, or tested. I used these assignments to encourage students to study for a reason beyond quizzes. I also wanted them to see that it was not unusual to have trouble with a particular idea. This contributed to a more open class atmosphere. The posts were worth about 10% of the total grade.

Your experience with students not being comfortable with the approach is very common. I call it "pushback" or "blowback." Either way, it's a form of resistance to the idea. It seems like you handled the problem well if the class atmosphere opened up. That is a good sign. Students are used to passive learning, where they wait for you to tell them what to do, or to quiz them or interrogate them. The use of pre-class questions with open-ended assessment goals requires them to be more active in their learning, to take responsibility for it, and they often resist as an instinctual response. Students aren't as progressive as they like to think. This blowback often derails efforts to flip the class because instructors give up, so I am glad you persevered.

Gregorius

Hi Michael,

Thank you for a good report. I'm particularly interested in this teaching method since I will be moving my current classes to some version of the flipped classroom.

With "new" teaching methods, the two big questions seem to be: do students like it/can they work with it, and do students perform equally or better in this environment as compared to traditional methods. I think your data has suitably answered the first question. Do you have data on learning outcomes relative to traditional teaching methods, compared to other students who did not go through the flipped classroom system?

Greg

Michael Seery

Hi Greg,

Thanks for your question - this is starting to put some meat on the bones!

I don't have any data to answer your question with conviction, so I can only offer anecdote and opinion; the following has that caveat:
I think the model has pushed students to use, and begin to understand how to use, the textbook more. In our system, as I mentioned, the textbook is an aside at best. We don't follow any one rigorously, and indeed even for me to "adopt" a text was quite hard; I had to change sequence and rethink some content. This in turn has increased the value of the text to students, and because we interacted with it much more, they used it a lot more. I know this because all of the library copies were out for most of the semester - something which never happened before! (We had to order more at short notice!) I think this is a valuable outcome because students are learning how to work with a text again (index, sequence) - something which I felt had been lost.

Other outcomes may centre around self-direction (approaching an unknown task); reflection (thinking about what they have learned/difficulties); and working through problems with others (in class work) - but they are still in the realm of personal hunch!

Does any of that sound reasonable? It's hard to remove myself from the 'gut-feeling' so I'm trying to be a bit objective!

Michael

Gregorius

Michael,

These certainly sound reasonable. The way I see it, even if all I've done is use a method that has effectively increased time-on-task - either because engagement with content has increased, or because the scaffolding I've provided (in exercises) allowed students to stay in the flow longer - I'm still all for it. I still remember the feeling of triumph on finding that a simple scheduling change - so that my flipped class was scheduled right after a free period for all students - got 90% of my students into the library study rooms working on the class material. Data showing outstanding student performance gains soon followed, but I could never tease out whether that came from the flipped class, the animations, the worksheets, time-on-task, etc. ... I console myself with the thought that I was being a teacher, not an academic.

Greg

Hi Michael,

I am glad that you and many other colleagues have found positive learning outcomes from flipping your classes.
I teach General Chemistry in an engineering university (a course of 9 credits; 72 hours), and perhaps I teach in accordance with the rules, because I can answer 'yes' to almost all 11 characteristics of the flipped classroom.[1] Unfortunately I cannot join the chorus of exciting results.
On the first day of class I ask my students to complete the Friedel-Maloney questionnaire.[2] The results, as in many other courses in the past, were very disappointing, so I used worked examples.[3] Repeating the test showed a significant improvement: correct answers were in the range 70-85%. In the first written exam, one of the three stoichiometric problems was similar although more complex: only 32% of the students solved it correctly.
The results of the first written exam were very disappointing: the average score was 14.42 out of 30 (N = 80 students; marks from 0 to 30, SD 8.13), despite the fact that the students practice fairly systematically at solving problems. For each lesson I collect the problems solved: about 6,500 in total, an average of 97.8 per student (range 22 to 308, N = 66).
The course is now about 85% complete, and a week ago students took a test on atomic theory, the periodic table, bonding, geometry, and gases (12 questions). They could consult their concept maps, summaries, and even their books, but had a maximum of half an hour. Again the results were very disappointing: the average score was 14.00 out of 30 (N = 64 students; marks from 0 to 26, SD 6.08).
Many of my students have a remarkable logical capacity: the average value on the GALT (Group Assessment of Logical Thinking) test is 14.77. I began the course with about 120 students and the number has now almost halved. Why? Perhaps because of insufficient motivation to achieve a good preparation in their studies. Brophy[4] defined motivation to learn as “a student’s tendency to find academic activities meaningful and worthwhile and to try to get the intended learning benefits from them.” If there is no desire to study and reason, no matter the teaching method, the results are unfortunately unsatisfactory.

Thanks, liberato

References
[1] Flipped Learning Network (FLN), The Four Pillars of F-L-I-P™, 2014. Online at: http://fln.schoolwires.net/cms/lib07/VA01923112/Centricity/Domain/46/FLIP_handout_FNL_Web.pdf
[2] A. W. Friedel and D. P. Maloney, Those Baffling Subscripts, Journal of Chemical Education, 1995, 72, 899-905.
[3] J. Sweller, The worked example effect and human cognition, Learning and Instruction, 2006, 16, 165-169.
[4] J. Brophy, Motivating Students to Learn, Lawrence Erlbaum: Mahwah, NJ, 2004, p. 249.

Michael Seery

Dear Liberato,

Thanks for your detailed comment. I have the advantage of teaching students who have selected chemistry from day 1 of university, so there is an inherent motivation I can draw upon. Even though they mightn't like thermodynamics or see its immediate relevance, I would hazard a guess that most of my students see it as part of their broader chemistry education. Perhaps in teaching engineers you have additional difficulties in that regard.

My only suggestion is to use your in-class work to help students develop their ability on the kinds of questions you say they have difficulty with. Others have written about peer teaching and just-in-time teaching coupled with the flipped approach; maybe these could have some value in your case? Or perhaps some form of context-based learning may assist with motivation.

Good luck, it's not an easy task!
Michael

Liberato,

I share your experience. I also teach a class of students who are required to take general chemistry and who have no motivation to learn the material. A few years ago I tried ALEKS, an AI-based electronic homework system that figures out what the students know and then gives them questions on material they are ready to learn. It apparently works extremely well in most places that have tried it. My students are very motivated to pass (which is a D grade), so they spent an incredible amount of time getting the highest score possible on the homework, since we made it 15% of their grade. (ALEKS recommends you make it worth a lot, since it is a lot of work to do.) They did all get extremely high homework grades, but did so by learning the patterns in the homework, not by learning the chemistry. I saw no improvement in exam grades. I talked to the ALEKS people afterward, and they said that the only places they've found it to fail are in required courses where the students don't see the need for chemistry.

So, I take baby steps in encouraging them to learn material on their own, and really sell it as "this will help you pass the course".

Judy

Cary Kilner

We, too, at UNH have used the ALEKS online homework to improve students' basic skills. However, it does not develop the problem-solving skills we seek. In my many discussions with Eric Gates and Christopher Grayce at ALEKS, we have come to agreement that ALEKS is a mastery-learning tutor for basic skills. It is the instructor's role to choose and/or write the appropriate problems and provide the guided practice, preferably in small recitation classes led by a pedagogical content knowledge expert. Some of the other online platforms, such as Quantum or OWL, may provide such tools, but I have found that work in small groups with portable white-boards, on instructor-designed problems with the proper gradient of difficulty, is essential to learning how to apply chemistry. By selecting or writing appropriate problems from the service students' majors, you can also provide context and encourage interest.

A lot of us are creating tools to facilitate the flipping of our classes. One of the problems with doing this without the aid of one of the large academic publishers is that it's hard to get our peers to review what we've done and make suggestions. Have any of you figured out ways to get peer reviews?

I ask because I started a blog that I'm hoping will provide opportunities for us to share our work and get comments and suggestions from others.

http://chemedtools.com/

Thanks

Mark Bishop
http://preparatorychemistry.com/

My first suggestion is to follow #flipclass on Twitter. While it is followed by more HS teachers than college profs, you will find many college participants. I also recommend FlipCon14, to be held in Mars, PA this June (with an online virtual conference available).

I'd expect that you will get several blog followers who are willing to give input if you tweet your blog link & request.
Kristin

Michael Seery

Hi Mark, Kristin,

Can't agree more re Twitter for getting suggestions and input. #flipclass #chemed #edtech are three relevant tags.

Michael (@michaelkls)

Cary Kilner

Mark makes a good point.
I would suggest that the folks who might provide useful feedback are those who take the time to engage in listserv conversations such as this to deepen their practice.
I have a Chemistry Manual -- a workbook that I wrote for student use, that follows these two admonitions Rick listed in Article #2:
1) "-- students are given lecture notes that they can read with comprehension during HW --"
2) "-- during initial study, students need reading materials with a different design than a comprehensive text."

It also includes "exercises" for drill and practice toward mastery learning of fundamental mathematics skills of which Rick speaks, and that are found in my Chem-Math Project materials.
It includes authentic "problems" I have found or written that utilize concepts students are learning, with fundamental mathematics skills properly embedded and scaffolded.
My high school usage of the Manual, followed by five years with life-science majors in gen-chem 80-minute recitations, has shown me the value of these materials for enhancing student learning, particularly among the under-prepared students.
However, I would like additional peer-review (I have shared some of this at various conferences over the years).
Along the lines of Mark's idea, perhaps Bob could create a site where we could post our created materials, and interested readers could go there to critique them and subsequently provide feedback to the authors.

There is no need to reinvent the wheel. A lot of useful material already exists, and there are peer groups online one can consult with.

The Flipped Learning Network (http://flippedlearning.org/site/default.aspx?PageID=1) has many good resources freely available. Mark could post his blog there if he chooses.

Another useful resource is Turn to Your Neighbor (http://blog.peerinstruction.net/author/peerinstruction/), Julie Schell's blog on the peer instruction method. Julie was with the Mazur Group at Harvard before she moved to UT Austin.

Kelly Butzler

Both the Flipped Learning Ning and the Turn to Your Neighbor blog are excellent! I subscribe to both!

Hello,

At the HS level, I don't expect students to write detailed summaries of the videos, although several do. I want to see that they get the gist of the topics and are introduced to the vocabulary and calculations before we do labs and inquiry activities (POGIL). Some students ask very deep questions that show they are thinking beyond the basic concepts introduced in the video, and others ask fairly superficial questions. Either way, the questions provide insight into areas that are troublesome for the students.

I assign a nominal value of 2 points per "day" of videos per chapter. Typically it is 8-12 points entered as a homework grade.
Kristin

Some students find that watching the videos after they have done some activities in class is actually more helpful (although I consider that to be catching up at the last minute).

“On the whole, the entire pre-class work required approximately 45 minutes to 1 hour of work from students”
Was this for each lecture?

If physically possible, have you considered having some of the groups work at blackboards? It would be easier for you to see how each group was approaching the problem.

Do you have any idea how this would work with the average community college student in the US?

Kelly Butzler

Pankuch, I will be presenting a paper next week that looks at flipping classes at an open-enrollment college. As we speak, I am analyzing the data from my dissertation study, which compared lecture and flipped classes at this open-enrollment college, so I can definitely address your question! You may want to visit my blog: http://kellybutzler.wordpress.com/ where I have been reflecting on flipping gen chem at this college.

I agree that blackboards are a good way to facilitate real group work during problem solving, rather than having students work separately and confer with each other. Another approach similar to blackboards is huddle boards, which are portable white boards. Those can be a handy way to keep students working together: the students can work on the huddle boards at their seats and then share their work with other groups during a wrap-up session at the end of class.
Jennifer

Holly Wiegreffe

I bought "beadboard" at a home impovement store years ago. Beadboard is a building material that goes on the inside of showers. I had them cut it down to size and it's pretty much the same as a white board. I have a class set of small ones, one for every student, and a set of large ones for them to do group work on and show the class. I had to pay out of my own pocket but I think it was about 40 bucks, including the price of the "cuts" in order to get them to size. Cut up shop towels for erasers.

Holly Wiegreffe

Technically, I teach at a State College, which is basically a community college that offers limited BS/BA degrees.

As an experiment, I have flipped a few lessons but didn't have enough students do the "away from class" part to make it work. The accountability piece is still missing for me.

Here's what I'm going to try next: as the students walk through the door (I have 24 students), I will collect their "homework" (also known as an entrance slip), which will most likely be some sort of fill-in-the-blank/short-answer "worksheet" that they will use to demonstrate they did their part before coming to the flipped classroom. At the door, I will assign students into two groups: those that did the assignment (As) and those that didn't, or didn't do it with any effort (Bs). Then I'm putting them in groups - the As together and the Bs together. I'm hoping that preventing the Bs from hanging on the coattails of their prepared colleagues will encourage them to take it more seriously. There will be a grade for the entrance slip and for the work done in class (also known as an exit slip). When students are finished, they can leave. The unprepared will have a 0 for the entrance slip, will have a harder time in class doing the assignment, probably won't get to leave early, and may or may not get a lower score on the exit slip.

I might even add a "put your work on the board" portion to make it very uncomfortable for those trying to slack. Generally, the students at my college work an incredible number of hours at their jobs, are full-time students, and in many cases have children. I get that, but if they can't do the home assignments before class, I don't see how "flipping" is going to be effective.

Has anyone else tried the two group approach?

Layne Morsch

The comments here regarding the lack of students viewing the videos concerned me while flipping my organic chemistry class for the first time this Spring. In the end, I decided to give 10 points per chapter for quiz questions attached to each video; these were due before I finished each chapter. The majority of chapters had over 90% viewership by the students (class of 40). Just as a comparison, the exams for the course are worth a total of 500 points (the lecture quizzes were about 14% of the overall grade).

Hi,
I've used pre-lecture assignments for 'graded' work many times in the past. The in-class sessions involved discussion and small-group work similar to the pre-lecture worksheets. I didn't divide the classes based on who did and didn't do the assignment.

I always put the worksheets out at least a week before the session and told students beforehand what was planned for one of these sessions. This way I was sure there was adequate time for students to complete the worksheets. These assignments were a small part (less than 10%) of the course grade.

Students procrastinate no matter what format is used for a class. When I review the daily activity of my students in my online Canvas classrooms, their participation peaks the day before and the day that work is due. Presently 90% of my students are completing 100% of the assignments.

Our college students are adults and free to fail. We can't force them to do the course work; we can put carrots in front of them to encourage participation.

What chemistry class are you teaching?

You are right. "Flipping" is not going to work if students do not attempt or spend time on the pre-lecture (class) assignments. "Flipping" creates a situation where students are expected to comprehend course content before it is discussed.

Cary Kilner

"Flipping creates a situation where students are expected to comprehend course content before it is discussed." Or -- I would say, based upon what we are seeing discussed, where students are expected to make a reasonable attempt at comprehending course content in order to be ready and able to articulate their needs in the "real" classroom for subsequent address by the instructor.

Michael Seery

Hi Pankuch,

The pre-lecture work in my case was for every two lectures, as I had the students twice in one day (on a Wednesday). So the prep was for both those lectures (mostly the first, which in turn prepped the second).

I agree with others - blackboards or similar are a great way to get students writing out chemistry. Thanks for the suggestion. It's hard in my class because of the room design - one other clever idea I saw was little A4-sized whiteboards that students could hold up as they worked through things.

Re community college: sorry, I can't say personally, but it seems from others' experiences that the method is applicable across the spectrum. I think the pre-lecture work has to be worth something to students (a grade).

Michael

Layne Morsch

There are many ways to use technology for this sort of engagement. My entire class works on iPads, using ChemDraw for iPad to draw and exchange reactions, mechanisms, and syntheses. I can have any of these students display their work on the screen in class to share with the rest of the class.

Cary Kilner

I have found these to be key for effective problem-solving lessons in small classes (or larger ones, if sufficient white-boards are available). In my use of them I have evolved the following principles, which the reader may find useful:
1) Three students per white-board are best. Two is all right, but four are too many: one student cannot see clearly and so does not interact.
2) If possible, allocate stronger students among groups. Alternatively disperse them to wander the room and assist other groups when they finish early.
3) Have groups rotate the role of writing on the boards so it does not become institutionalized, unless you see that a group is functioning particularly well.
4) Provide cleaning materials to hand for each group or table. Clean boards promote neat work. (Note that diluted white vinegar is recommended to minimize degrading the surface of the white-boards.)
5) Provide a variety of dry-erase colors for each group. Students often access their visual intelligences and color-code as a useful technique.
6) However, don’t let the board work become a “work of art” and take the place of correct chem-math and problem set-up. “Pretty” work can disguise errors and distract learners from the true purpose of it.
7) Students should make copies of the final result in their notes, but all brainstorming should occur on the boards.
8) Don’t allow the students to erase their results before you can inspect them. You may wish to take digital photos of their work to show other classes or to document it (as your own action-research).

Hello. How frequently do students go through this kind of session? What problems, if any, have come up from dividing the class into the separate groups (As) and (Bs)? How much does this work count toward their grade? Does attendance count in the course grade? How far in advance do students get the worksheets? If they skip the worksheets, can they pass the class?

Holly Wiegreffe

Walt~ I'm having a little trouble following the format of the discussion, but if I read your message correctly, your question about the two groups As and Bs is directed at me.

To answer your question, I haven't tried it yet. The idea came out of a discussion with a small group of motivated students... Students always want more time in class for working problems, and I explained that a flipped classroom would help with that, but I also explained the challenges with student participation prior to class... Out of a 5-minute brainstorming discussion before class, this is what we came up with. Motivated students don't want to work with the ill-prepared. The "fear" I have is that if I can't cajole students into the prework, my "success rates" will fall and students will write on my evaluations that "I didn't teach them the material", which would kinda be true, from their standpoint. I have the support of my administration, but that will change if 50% of my class drops because of failing grades.

By the way, I plan to group my students in the same way for labs... I'm tired of the chronically ill-prepared and late hanging with the "good" group and sliding by. You want to come to chem lab with no idea what's going on? Well, that's your choice, but then your partner will be as clueless as you are, and good luck with that.

I teach Intro to Chemistry and Gen Chem I. I only tried the flipped lesson in Intro to Chem (and in a high school class I taught ages ago). Presently, I have a category called "Other" which includes quizzes, homework, etc. "Other" is worth 15%, but I'm taking that down to 12% in the fall. Basically, "Other" is everything except labs and tests. I award no points for attendance.

Holly Wiegreffe

I wanted to answer two questions: 1. how many students watch the pencasts, and 2. how much student "sharing" of answers was going on with the prework preparation. This is what I did. I gave an assignment (a 30-minute pencast on the scientific method), straightforward and something everyone can understand without help. The "assignment" was to view the pencast, and a due date was given. Embedded in the pencast was an additional assignment (write about an example where someone uses the word "theory" when they should be using the word "hypothesis" instead; they could take an example from their life or find an article, etc. - easy points here if you do it).

But here was my trick. I told them in the pencast that the exercise was worth 24 points for the entire class, but the 24 points would be divided between the students who did the assignment. If only half the class did it, there's extra credit for those who did. But if you blab about the assignment to your neighbor, you are reducing your own grade (insert evil laugh here).

Results: about half the class did the assignment (not enough for a flipped classroom for me), but it seems not many "shared" with their classmates, as otherwise I would guess the completion rate would have been higher.

It was actually quite interesting when students figured out there was an exercise that they had missed (the zero tipped them off) and then asked me what the assignment was. My response? Watch the pencast and then we'll both know.

I'm not sure where this falls in line with the flipped classroom model, but I use YouTube videos as topical supplements. I also use the textbook publisher's internet homework platform to gauge and monitor the level of student comprehension of the topics.
The students are allowed during the lecture session to pose questions pertaining to the homework and/or supplemental videos.