
OpenOChem: An LMS Agnostic Chemistry Quizzing Platform

Author(s): 

Carl LeBlond, Department of Chemistry, Indiana University of Pennsylvania, Indiana, PA

Ehren Bucholtz, Department of Basic Sciences, St. Louis College of Pharmacy, St. Louis, MO

Jennifer Muzyka, Department of Chemistry, Centre College, Danville, KY

Abstract: 

OpenOChem is a freely available online homework system and interactive classroom tool under active development. Like most commercially available electronic homework systems, this program enables instructors to ask questions that students answer by drawing structures (2D/3D), reaction mechanisms, and more.  Unlike commercially available systems, instructors can write their own questions and feedback and access a shared library of questions from other instructors. Furthermore, OpenOChem is open access, so it is cost-effective for students. The OpenOChem system uses Learning Tools Interoperability (LTI), enabling faculty members and their students to access it through the learning management systems (e.g., Moodle and Canvas) used on their individual campuses.  We discuss the general functionality of OpenOChem and its use in organic chemistry instruction.

 

Introduction

Active learning and prompt feedback are two key principles of good practice in undergraduate education (Chickering and Gamson). Active learning requires students to engage with the material and place it in a context connected to prior knowledge. Prompt feedback allows students to quickly assess what they do and do not understand and to determine their level of competence.

Online homework systems combine these two principles, and several studies have indicated the benefits of online homework systems for learning organic chemistry. They can be used to encourage students to engage with the material and to put in study time in a regulated manner through weekly online homework deadlines.  Students are more likely to complete homework if it is online, and they complete online homework more consistently (Richards-Babb, 2011). Feedback is immediate because responses can be automatically evaluated and specific prompts can be provided for both correct answers and common misconceptions. Parker and Loudon (2013) found that studying problems in the textbook prepared students for exams no better than online homework did. They had hypothesized that, even though book problems were more complex, it was the immediate feedback that more effectively reinforced the material. Finally, online homework correlates with student performance. Richards-Babb (2015) found a direct correlation between online homework and final course grade, suggesting that online homework is an important aspect of student learning.  Malik et al. (2014) found that students who used online homework earned scores approximately 15 percentile points higher on ACS exams than those who used only paper-based homework. Some students in the Malik study indicated that feedback was a primary advantage of the online homework system, further supporting Parker and Loudon's hypothesis.
 

History of OpenOChem

In 2012 the organic chemistry instructor/programmer (CL) became interested in developing interactive online teaching and assessment tools for organic chemistry.  At that time his university was using the Moodle learning management system (LMS), and since Moodle is open source and has a well-documented API, it was very amenable to plugin development.  This work culminated in the EasyOChem plugin series for the Moodle LMS, which was presented at the 2014 BCCE (LeBlond 2014).  Concurrently, JM was developing the Reaction Zoo, an interactive web-based application with an online database of over 1200 organic reactions (Muzyka and Williams, 2016). Over the next few years Moodle underwent numerous revisions, so the EasyOChem plugins required continual updates to keep pace. While Moodle was the original platform for this tool, as interest increased, several organic chemistry faculty members who learned about EasyOChem were disappointed that it would not work with the other learning management systems used on their campuses. EB was an early user of EasyOChem questions in Moodle, but when his institution changed its LMS, those questions were no longer available. He was working on a system for optical recognition of hand-drawn structures and other cheminformatics projects, and he contacted CL with suggestions for improving the interface, using line notation in answers, and adding system tags for sharing information.

Chemistry drawing and input in early versions of the EasyOChem question types for Moodle were based on ChemAxon's Marvin Java applet.  Java applets often required users to install a Java Runtime Environment (JRE), were slow to load, and were cumbersome for users. Security flaws and malware issues associated with Java-based browser plugins spurred the development of JavaScript-based drawing applications (MarvinJS, JSME, ChemDoodle, Kekule).  The EasyOChem question types evolved to use MarvinJS, simplifying the user experience. OpenOChem currently uses the ChemDoodle, Kekule.js, and JSmol JavaScript libraries for entering molecular structures and mechanisms.

OpenOChem's LMS agnosticism can be attributed to the Tsugi framework (https://www.tsugi.org/).  Tsugi was developed by Dr. Charles Severance of the University of Michigan, and it provides the low-level implementation of IMS Global's Learning Tools Interoperability (LTI).  The IMS LTI standard provides a framework for integrating a variety of learning tools, making each of them compatible with any LMS. Instead of each individual LMS hosting a resource, LTI provides access to externally hosted learning tools. This approach allows the learning tool provider to focus on maintaining and developing the tool, administrators and faculty to use the tools that best fit their courses, and students to have seamless integration of a learning tool that would otherwise be hosted in another environment. Finally, since the tool is an external application, it is portable by definition and agnostic of the LMS used.
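
Tsugi handles these protocol details for OpenOChem, but the underlying idea of an LTI 1.x launch is straightforward: the LMS sends a signed POST request to the tool, and the tool recomputes the OAuth 1.0a HMAC-SHA1 signature with the shared secret it issued and compares the two. The following minimal Python sketch illustrates that verification step; it is not OpenOChem's actual code, and the URL, key, and parameter values are invented for illustration.

import base64
import hashlib
import hmac
from urllib.parse import quote

def percent_encode(value):
    # OAuth 1.0a requires RFC 3986 percent-encoding ("~" stays unencoded).
    return quote(str(value), safe="~")

def lti_signature(method, url, params, consumer_secret):
    # Build the signature base string from the sorted, encoded launch
    # parameters, excluding the signature itself.
    pairs = sorted(
        (percent_encode(k), percent_encode(v))
        for k, v in params.items()
        if k != "oauth_signature"
    )
    param_string = "&".join(f"{k}={v}" for k, v in pairs)
    base_string = "&".join([method.upper(), percent_encode(url), percent_encode(param_string)])
    # LTI 1.x launches have no token secret, so the signing key ends with "&".
    key = percent_encode(consumer_secret) + "&"
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# Hypothetical launch parameters posted by the LMS.
launch = {
    "oauth_consumer_key": "my-campus-key",
    "oauth_nonce": "abc123",
    "oauth_timestamp": "1556553600",
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
    "roles": "Instructor",
    "user_id": "12345",
    "oauth_signature": "signature-sent-by-the-LMS",
}

expected = lti_signature("POST", "https://tool.example.org/lti/launch", launch, "my-shared-secret")
launch_is_valid = hmac.compare_digest(expected, launch["oauth_signature"])

A production implementation would also check the nonce and timestamp to prevent replayed launches and would map the LTI role (Instructor or Learner) onto the tool's own permissions.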

 

Why OpenOChem?

Commercial systems available today, such as OWLv2, Sapling, and MasteringChemistry, require students and faculty to use additional resources outside their campus LMS and add monetary cost for students.  These "do all" homework systems were designed to be very general and applicable to numerous subjects. OpenOChem is designed specifically for organic chemistry. The goal was to develop question types that are ubiquitous in the organic subdiscipline but not necessarily found in other areas.

Another problem with commercial systems is that students and faculty are required to maintain accounts on many systems: their home institutional LMS as well as an external site.  Additionally, the use of other vulnerable technologies such as Adobe Flash (Sapling, Mastering, OWLv2) and Java (Mastering) makes these commercially available homework systems less attractive. As pointed out by Steve Jobs, Flash has issues with "reliability, security, and performance."

Another issue is that many institutions tend to switch from one LMS to another as cost, delivery, and support structures change.  For example, at one of our institutions (CL) the LMS has switched among Moodle, Blackboard, and D2L in the past few years. At another (EB), hosting switched from on-site to vendor-contracted.  Problematically, as our institutions' LMS offerings changed, our questions and content could be lost in the transitions. Furthermore, the ability to share questions across multiple LMSs was limited to very simple question types such as multiple choice, or to the SCORM format.  OpenOChem solves these issues because it is independent of a specific LMS.  Thus, developed questions are hosted on one system rather than many. One advantage of this is that it lends itself to a system where faculty can act as a community of users to build shared questions, activities, and content. This approach is a strength of OpenOChem, as our combined efforts enable us to develop a larger library rather than duplicating content in individual LMSs.

OpenOChem is designed as an open-access system. While the tool is hosted by the developers and is currently proprietary, the content on the system is intended to be shared and adapted by any user of the system. As such, OpenOChem has adopted the Creative Commons Attribution-NonCommercial 4.0 (CC BY-NC 4.0) license for questions developed in the system. Therefore, when a user creates a question, other users of the system are allowed to copy and adapt the question for use in their courses. CC BY-NC 4.0 is a non-commercial attribution license, so neither OpenOChem nor users of the system can monetize the questions generated by users. Ideally, users will generate questions and let others take advantage of the novel ways that each faculty member teaches their course. Some questions take time to build, and by sharing them you help other faculty members spend their time developing additional interesting questions that will be shared with you.

Finally, OpenOChem gives faculty members more control and freedom, currently at no cost to students. By designing a system from the ground up, the developers and users have greater freedom to build the learning tool that best suits their needs. This project allows for collaboration and is intended to push the boundaries of what can be done within an organic chemistry grading tool without the need to meet profit goals or cut corners.

 

Functionality

After a faculty member completes a request on the OpenOChem website, an LTI key and shared secret pair is provided. A link between the faculty member's LMS and OpenOChem is created in Moodle via an "External Tool."  In Canvas, faculty users can insert either a Module item or an Assignment to create the link to OpenOChem.

 

Workflow

Once the key and secret have been installed, using OpenOChem is rather easy.  OpenOChem’s design allows several possible workflows. One general workflow appropriate for developing and using content is described below.

1. Access the link.  Clicking on the link as an instructor takes you to the OpenOChem Instructor Dashboard.  Students do not see this Dashboard upon accessing the link; instead, they see the Activity you assign from your Dashboard.  Alternatively, you could hide this link from your students and use it as an entry point to OpenOChem for authoring questions and activities.  If a student accesses a link to which no activity has been assigned, a warning message is displayed.

Figure 1.  Screenshot of the OpenOChem Dashboard in Moodle.

 

Figure 2.  Screenshot of the OpenOChem Dashboard in Canvas.

 

2. Assign an Activity.  Under the Activities menu item an instructor can access "My Activities" or the "Activity Bank".  From the My Activities page, an Activity is assigned by simply clicking the assign checkbox for that activity.  Activities can also be easily copied from the Activity Bank and then assigned.

Figure 3. The instructor's "My Activities" page shows all the activities you have created.

 

3.  Author New Questions and Activities (Optional).  You can author new questions and activities from any link into OpenOChem as long as you have an Instructor role in your LMS.  In step 2 above you could also create a new activity and assign it.

 

Activities

Questions are arranged into activities.  There are currently three kinds of activities available on OpenOChem: Quizzes, Adaptive Quizzes, and SlideDecks.  They were designed to address different pedagogical situations in which instructors want to ask questions of students.

A primary design goal of OpenOChem is to make quiz creation and editing efficient and intuitive.  With this in mind, searching for questions and inserting them into activities are done on the same page, using drag-and-drop where possible.   While the question-searching capabilities are constant across all activity types, the way you assemble or arrange your questions differs depending on whether you are preparing a Quiz, Adaptive Quiz, or SlideDeck.

 

Quiz Functionality

The quiz activity was designed for use in formative assessment (homework and quizzes).  Quizzes are highly configurable, with options for controlling navigation (sequential or free), the number of allowed attempts, timing, and grading.  Instructors can also easily control whether students see if their responses were correct, the correct answers, and/or their grades. There is also a "Builds On Last" (BOL) mode, which shows students their own previous responses upon which they can build.  The BOL mode is ideal for homework. Questions are organized into activities through a drag-and-drop interface.

Figure 4.  Screenshot of Quiz editing page.  Simply drag and drop questions from the Question Bank into your quiz.

 

Adaptive Quiz Functionality

With an Adaptive Quiz, the student's response or correctness determines the next question.  Due to the adaptive nature of these assignments, there are fewer options for the instructor when setting up these quizzes.  The instructor can control whether students see their correctness, the correct answers, and/or grades. In an adaptive quiz the questions are arranged into a "Question Flow" diagram by dragging and dropping "Question Operators" onto the flowchart.  The questions are then linked together to control the flow. You can easily create branching, loops, and other adaptive question paths with this activity.
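
OpenOChem's internal representation of a Question Flow is not described here, but conceptually the flowchart is a small directed graph: each question points to the question that should follow a correct response and the one that should follow an incorrect response. The Python sketch below, with invented question IDs, is only an illustration of that idea, not the platform's own data model.

# Hypothetical question-flow graph.  Each entry maps a question ID to the
# question that follows a correct or an incorrect response.  A value of None
# ends the quiz; pointing back to an earlier ID creates a remediation loop.
flow = {
    "identify_nucleophile": {"correct": "draw_mechanism", "incorrect": "review_lewis_bases"},
    "review_lewis_bases":   {"correct": "identify_nucleophile", "incorrect": "identify_nucleophile"},
    "draw_mechanism":       {"correct": None, "incorrect": "identify_nucleophile"},
}

def next_question(current_id, was_correct):
    """Return the ID of the next question, or None when the flow is finished."""
    branch = "correct" if was_correct else "incorrect"
    return flow[current_id][branch]

# A student who misses the first question is routed to a remedial question
# before being sent back to try again.
print(next_question("identify_nucleophile", was_correct=False))  # review_lewis_bases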

Figure 5.  Example of Question Flow in Adaptive Quiz. 
 

SlideDeck Functionality

SlideDecks are HTML-based presentations with embedded questions.  SlideDecks are built on the Reveal.js presentation framework and are similar to PowerPoint presentations, but with interactive questions.  Students self-check their understanding by answering questions and getting feedback as they move through your content.  SlideDecks are ideal for delivering lectures or lessons in a flipped-classroom environment, with students proceeding at their own pace.  There is also a "Live" mode option in which SlideDecks behave like a student response system (SRS): the instructor can pose questions and get responses from students in real time.  Other response systems offer a similar ability to present a series of slides with the opportunity for students to answer a variety of question types (multiple choice, free response, and drawing).  OpenOChem's SlideDecks allow instructors to pose chemistry questions that students answer by drawing structures (2D/3D) and mechanisms.

 

Question Types

A variety of question types are available on OpenOChem.  This section provides an introduction to the organic-chemistry-specific question types.  Standard question types such as MultiChoice, True/False, MultiAnswer, Short Answer, and Formula are also available but will not be presented here.

Figure 6.  Question selection screenshot.

 

All question types except MCQ, T/F, Formula, and MultiAnswer allow both correct and incorrect answers with specific feedback for each answer.  All question types also allow both positive and negative general feedback.  Question editors employ a tabbed interface for easy editing of the question settings, answers, and feedback.  All questions must be given a title, question text, at least one tag, and at least one answer. The title is not seen by the student and is used for the question bank description.  The Question Text is what the student sees and must answer during an attempt. You can easily insert images, embed videos, and add 2D/3D interactive structures to the question text as well.  We have implemented two methods for inserting 2D or 3D structures into a question's text or answer feedback, accessed by clicking one of two toolbar buttons.  These tools use Kekule.js (2D or 3D) and ChemDoodle (2D only).  The Kekule tool inserts a JavaScript object into the page that can be edited.  The ChemDoodle-based tool uploads an image to the server and creates an HTML link to it.  The ChemDoodle image cannot be edited, only deleted.
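
Although OpenOChem's grading code is not shown here, the combination of per-answer feedback and general feedback can be pictured as a simple lookup against the instructor's answer list. The following sketch is purely illustrative; the SMILES strings, credit values, and feedback text are hypothetical, and the comparison of drawn structures is reduced to a string match for brevity.

# Hypothetical answer key for a single question: each entry pairs an
# anticipated response with the credit awarded and its specific feedback.
answer_key = [
    {"answer": "CC(O)C", "credit": 1.0,
     "feedback": "Correct: Markovnikov hydration of propene gives the secondary alcohol."},
    {"answer": "CCCO", "credit": 0.0,
     "feedback": "This is the anti-Markovnikov product; reconsider carbocation stability."},
]

GENERAL_POSITIVE = "Well done."
GENERAL_NEGATIVE = "Review electrophilic addition and try again."

def grade(response):
    """Return (credit, feedback) for a student response."""
    for entry in answer_key:
        if response == entry["answer"]:
            general = GENERAL_POSITIVE if entry["credit"] > 0 else GENERAL_NEGATIVE
            return entry["credit"], entry["feedback"] + " " + general
    # Unanticipated responses earn no credit and receive only the general feedback.
    return 0.0, GENERAL_NEGATIVE

print(grade("CC(O)C"))
print(grade("CCCO"))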

 

Figure 7. Question editing interface

 

We have designed a flexible and powerful tagging system for OpenOChem.  The tags are used for searching questions in the question bank and for specifying random questions.  There are tags for every functional group, every mechanism type, and most organic chemistry concepts. Users can easily request that new tags be added to the system.  A question can have many tags. For example, suppose you create a question in which you expect students to draw the product of the electrophilic addition of H-Br to propene.  You could assign three tags to this question: 1) Alkenes--Reactions of, 2) Alkyl Halides--Formation of, and 3) Mechanisms--Electrophilic Addition. You can also assign your own user-defined tags.  Searching the database for Alkenes--Reactions of would yield this question as well as any other questions that have this tag.
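
Behind the scenes, this kind of tag search is a many-to-many lookup between questions and tags. The short sketch below, with invented question titles, shows the idea; it is not OpenOChem's database code.

# Hypothetical question bank: each question carries a set of tags.
question_bank = [
    {"title": "HBr + propene product",
     "tags": {"Alkenes--Reactions of", "Alkyl Halides--Formation of", "Mechanisms--Electrophilic Addition"}},
    {"title": "Ozonolysis of 2-butene",
     "tags": {"Alkenes--Reactions of"}},
    {"title": "SN2 of bromoethane with cyanide",
     "tags": {"Alkyl Halides--Reactions of", "Mechanisms--Nucleophilic Substitution"}},
]

def search_by_tags(bank, *tags):
    """Return the titles of questions that carry every requested tag."""
    wanted = set(tags)
    return [q["title"] for q in bank if wanted <= q["tags"]]

# A single-tag search returns both alkene questions; adding a second tag
# narrows the results to the electrophilic addition question.
print(search_by_tags(question_bank, "Alkenes--Reactions of"))
print(search_by_tags(question_bank, "Alkenes--Reactions of", "Mechanisms--Electrophilic Addition"))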

 

Figure 8. Selecting tags for a question

 

Instructors can also classify their questions into different categories.  Categories are for personal use only. You can define only one category per question, and categories are not visible in the "Question Bank," only on the instructor's "My Questions" page.

You can easily share your questions, making them available in the Question Bank for other instructors to use.  Selecting the Share checkbox when editing a question will submit your question to the “Sharing Queue.” A reviewer will check that the question works, is chemically correct and has appropriate tags.  After your question is approved, it will appear immediately in the Question Bank. If it is not approved you will receive comments describing what needs to be remedied. Any questions you develop are licensed by you under a Creative Commons Attribution-NonCommercial-ShareAlike license. 

One of the benefits that we see from use of the OpenOChem platform is that questions are peer reviewed, and an argument could be made that your work is making a contribution beyond your institution. Questions in the system can be tracked for how many students answered them and how many faculty used them. We foresee that faculty members seeking promotion or tenure might request analysis from OpenOChem of how many different institutions or faculty members are using their contributions to the system.

 

Figure 9.  Question sharing checkbox

 

ChemDoodle - 2D Structures and Mechanisms

Currently, 2D structural and mechanism questions in OpenOChem use the open-source ChemDoodle Web Components.  The instructor and the student use the ChemDoodle sketcher to author and answer questions, respectively.  There are four question subtypes, which are set from the "Question Specific Options" tab.  Table 1 outlines the different types of questions you can ask. While ChemDoodle is our primary interface, we are also using Kekule.js for structural input in some question interfaces. Each has its benefits, but we recognize that standardizing on one would decrease development time and minimize duplication of effort.

Table 1.  2D Structure and Mechanism Question Subtypes and Example Questions

Structural – Students respond by drawing 2D structure(s).
    Example: Draw (R)-butan-2-ol in the sketcher below.
    Example: Draw the electrophile formed by the following reagents/conditions.

Lewis Structure – Students provide electrons or charges on a molecular template to satisfy valence rules.
    Example: Provide any charges needed in the following electrophilic aromatic substitution intermediate.
    Example: Below are the Lewis structures for the two resonance structures of ethenylamine.  Provide any lone pairs that are missing.

Mechanism (Arrows On Template) – Students provide curved arrows on reaction/resonance templates.
    Example: Provide curved arrows on the following structures showing how the acylium ion can be converted between its two resonance forms.
    Example: Provide curved arrows on the following template for the electrophilic addition of a t-butyl cation to benzene to form a cyclohexadienyl cation.

Mechanism Full – Students must draw a complete reaction mechanism with all intermediates and electron-flow arrows.
    Example: Provide the complete mechanism for the acid-catalyzed dehydration of 2-propanol.


 

Animation 1.  Mechanism (Arrows On Template) question type example.

 

 

Animation 2.  Mechanism (Complete) question type example.

Figure 10.  Setting the answers for a 2D structural question.  The instructor draws the answer in the Answer Editor and then clicks the “Retrieve from Editor” button to register the answer.

 

Figure 11.  Student view of question attempt.

 

3D Structures and Conformers

3D structural questions are based on JSmol.  The instructor and the student use the JSmol model kit to author and answer questions, respectively.  There are two question subtypes, 3D(SMILES) and 3D(RMSD), which are set from the "Question Specific Options" tab.  When the 3D(SMILES) option is selected, the student must draw the correct structure but does not need to be concerned with the conformation.  The student's structure is converted to a unique (canonical) SMILES string and checked against the SMILES string of the correct answer. The 3D(RMSD) subtype requires the student to draw the correct structure in a specific conformation.  A SMILES string comparison is done first, and then the RMSD between the student and reference geometries is compared. The RMSD calculation uses the RDKit cheminformatics library and does not include H atoms.  The 3D structure's geometry can be optimized both when designing the problem and when the student attempts it.  It is good practice to optimize the structure when creating a conformer problem, and you can include a note in the question text asking the student to do the same.
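
The two checks described above can be sketched directly with RDKit. The fragment below is an illustration rather than OpenOChem's production code: the structure check compares canonical SMILES strings, and the conformer check strips hydrogens and computes a best-fit RMSD between the student and reference geometries; the 0.5 Å tolerance is our assumption, not a documented setting.

from rdkit import Chem
from rdkit.Chem import rdMolAlign

def same_structure(student_smiles, answer_smiles):
    """3D(SMILES) check: compare unique (canonical) SMILES strings."""
    student = Chem.MolFromSmiles(student_smiles)
    answer = Chem.MolFromSmiles(answer_smiles)
    if student is None or answer is None:
        return False
    return Chem.MolToSmiles(student) == Chem.MolToSmiles(answer)

def same_conformer(student_mol, answer_mol, tolerance=0.5):
    """3D(RMSD) check: heavy-atom best-fit RMSD (angstroms), applied after the SMILES check passes."""
    # Hydrogens are excluded from the RMSD, as described above.
    probe = Chem.RemoveHs(student_mol)
    reference = Chem.RemoveHs(answer_mol)
    rmsd = rdMolAlign.GetBestRMS(probe, reference)
    return rmsd <= tolerance

# Usage sketch: the 3D coordinates would come from the JSmol editor,
# for example as MOL blocks.
# student_mol = Chem.MolFromMolBlock(student_molblock, removeHs=False)
# answer_mol = Chem.MolFromMolBlock(answer_molblock, removeHs=False)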

 

Animation 3.  JSmol 3D model kit based question type.


 

Table 2.  JSmol 3D Structure and Conformer Question Subtypes and Example Questions

3D(SMILES) – Students must draw a 3D structure; the conformation is not important.
    Example: Construct (R)-2-bromo-3-methylbutane in the editor below.
    Example: Construct the product of the reaction of ethylene glycol with acetaldehyde.

3D(RMSD) – Students must draw a 3D structure in a specific conformation.
    Example: Construct (R)-2-bromo-3-methylbutane in its anti-periplanar geometry as is required for an E2 elimination to occur.  Be sure to optimize your structure once you have drawn it.
    Example: Construct the most stable conformation of methylcyclohexane.  Be sure to optimize your structure once you have drawn it.


 

Localization

OpenOChem currently supports three languages (English, French, and Spanish).  The user's language is automatically set based on their browser locale. In recent months there have been significant revisions to OpenOChem's code, so some language strings have not yet been translated.  We appreciate your patience.

 

Just-in-Time Teaching

Just-in-Time Teaching (JiTT) was co-developed by physicists Gavrin and Novak (Novak et al.) and has been adopted by chemists (Chambers and Blake).  JM has used JiTT in her courses for several years (Muzyka 2015) and has recently incorporated OpenOChem quizzes as valuable tools in warm-up assignments that students complete before class (Muzyka et al., 2018).  Student responses to structural questions help identify common misconceptions about chemical reactions, which can then be addressed during class. Students find the questions helpful in their learning, and some have requested additional OpenOChem practice problems to assist in their preparation for tests.

 

Building Online Organic Chemistry Resources

CL currently uses OpenOChem extensively on his public-facing Moodle server hosted at Indiana University of Pennsylvania (Carl LeBlond's Chemistry Resources - http://penrose.nsm.iup.edu/moodle/).  He has developed a series of online courses using OpenOChem technology that cover pre-organic chemistry and Organic Chemistry I and II.  CL's Chemistry Resources were initially designed around the EasyOChem question types for Moodle and are currently being converted to OpenOChem. His pre-organic chemistry course is available to the public, and he would eventually like to offer his Organic Chemistry I and II courses as well.  His students use the resource regularly for homework and assessment (quizzes).

 

Prerequisite Knowledge and Post Activity Evaluation

EB has incorporated OpenOChem quizzing alongside the activities he uses from his guided-inquiry workbook (Bucholtz 2016). Students are expected to come to class having self-assessed their preparedness for the new activity. Self-assessment is directed by a series of "I am ready for this activity because I can …" statements. For example, prior to an activity on using enolates as nucleophiles, students are to determine whether they can identify formation of kinetically or thermodynamically controlled enolates. Students are expected to bring college-issued computers to class and are prompted to answer quiz questions in the first five minutes of class using OpenOChem. The following example question provides feedback for correct and incorrect answers to ensure students understand the concept before building on it in the activity of the day.

Figure 12.  Collage of sample question and multiple correct answers and incorrect answer with feedback.

 

Students' results are quickly assessed, and the instructor can clarify further if needed. Students have also been able to self-correct later in the class period. In the following example, a student initially got the question wrong but, after working with the material in class, was able to resubmit a correct answer during the class period.

Figure 13. Student revision of attempt during a class period.

 

Testing and Development

CL is the main developer of OpenOChem.  OpenOChem is under active development and constantly evolving.  We do not have a large team of software developers, testers, and question/activity writers.  If you have issues, suggestions, or comments concerning OpenOChem, please feel free to email CL, JM, and/or EB.

 

Getting Involved

We are always looking for contributors to join our effort.  There are numerous ways in which you can help. Some examples include using OpenOChem in your instruction, creating and sharing questions, helping review questions, or helping us develop translations into other languages.

 

Requesting a Key

If you would like to try OpenOChem or learn more about its features please visit www.openochem.org.

 

Usage Statistics

There are currently 1725 shared questions available in the question bank.  A total of 325 activities (quizzes) have been attempted 10,490 times as of April 1, 2019.

 

Acknowledgements

The authors would like to thank Indiana University of Pennsylvania for OpenOChem's current server and networking support.  We would also like to thank Dr. Sandrine Brice-Profeta for the French language translation.

 

References

 


 

Bucholtz, Ehren Foundations of Organic Chemistry, 2016, Pacific Crest, Hampton, NH.

Chambers, K. A.; Blake, B. Enhancing Student Performance in First-Semester General Chemistry Using Active Feedback through the World Wide Web. J. Chem. Educ. 2007, 84, 1130-1135.

Chickering, A. W.; Gamson, Z. F. Seven Principles for Good Practice in Undergraduate Education. Am. Assoc. for Higher Educ. (AAHE) Bull. 1987, 39, 3−7.

LeBlond, Carl; Fair, Justin D.; Frick, Paige M. “Online assessment tools for organic chemistry,” Presented at Biennial Conference on Chemical Education, Grand Valley State, August 5, 2014. P651.

Malik, K.; Martinez, N.; Romero, J.; Schubel, S.; Janowicz, P. "Mixed-Methods Study of Online and Written Organic Chemistry Homework." J. Chem. Educ. 2014, 91 (11), 1804–1809.

Muzyka, Jennifer L.; Williams, Alexander, Reaction Zoo:  Organic Chemistry Practice Problems, Presented at Biennial Conference on Chemical Education, Greeley, CO, August 1, 2016.

Muzyka, Jennifer L.; LeBlond, Carl; Bucholtz, Ehren C. OpenOChem to facilitate active learning.  Presented at Biennial Conference on Chemical Education, South Bend, IN, July 30, 2018.

Muzyka, Jennifer L.  “ConfChem Conference on Flipped Classroom:  Just-in-time Teaching in Chemistry Courses with Moodle,” J. Chem. Educ.  2015, 92 (9), 1580-1581.

Novak, G. M.; Gavrin, A.; Christian, W.; Patterson, E. Just-in-time Teaching:  Blending Active Learning with Web Technology, 1999, Prentice Hall: Upper Saddle River, NJ.

Parker, L.; Loudon, G. "Case Study Using Online Homework in Undergraduate Organic Chemistry: Results and Student Attitudes." J. Chem. Educ. 2013, 90 (1), 37–44.

Richards-Babb, M.; Curtis, R.; Georgieva, Z.; Penn, J.H. “Student Perceptions of Online Homework Use for Formative Assessment of Learning in Organic Chemistry.” J. Chem. Educ. 2015, 92 (11), 1813-1819.

Richards-Babb, M.; Drelick, J.; Henry, Z.; Roberson-Honecker, J. "Online Homework, Help or Hindrance? What Students Think and How They Perform." J. Coll. Sci. Teaching 2011, 40 (4), 81-93.

Date: 
04/29/19 to 05/01/19

Comments

OpenOChem looks like a great pooling of resources to make organic-specific online question tools.  I really like that it is independent (agnostic) of any LMS platform and a non-commercial enterprise run by the instructors who are actually using it.  I really like the drawing tool and the ability to make molecules in 2D and 3D.

 

In full disclosure, we do not currently use an online homework system.  I remain agnostic to the cost-benefit ratio for these tools.  Removing the cost to students via OpenOChem is excellent.  I am still struggling, however, with the time spent versus learning gained aspect of the online homework systems.  I see benefits and drawbacks and I'm left feeling ambivalent.  The rapid feedback response is great for an adaptive quiz style homework system.  Students learn what errors they are making right from the beginning of their study of a particular topic.  Unfortunately, the questions themselves prevent students from displaying their full understandings or full misconceptions.  With multiple choice questions, students' responses are limited and may not allow students to display a misconception they have.  With Chemdraw-style drawing questions, students are prevented from drawing some of their more interesting molecules.  Unfortunately, the technology has not yet advanced to give students meaningful feedback on free-style questions.  I suspect that most of our assessments involve written free-response questions and areas where students are asked to free-hand draw molecules.  These skills are not currently able to be practiced in any online homework system.  But, if you just assign students traditional offline homework assignments and don't hand grade them, their only feedback is an answer key (which they may struggle to interpret correctly).  To me there is no clear solution to this complicated issue.  I'd be interested in hearing other folks' perspectives and thoughts.

Hi Brian, thanks for taking some time to respond. I agree that we have not yet cracked the code on what would be the perfect online system. Adaptive feedback is something that I have started to explore with this system, but I find that it takes an inordinate amount of time to build really good questions that build on each other. I would hope that by building a community of instructors, we could all work to build a true database of strong questions. As it stands right now, if I draw up a question, and Jennifer or Carl draw up the same one, we don't have a mechanism to let each other know that we are doubling up on each other's work. We are trying to work that into the system with tagging, but we are not quite there yet.

I agree that multiple choice is very limiting in that we can only provide a handful of the possible wrong answers. By having students draw structures we can increase the ability to allow students to draw unexpected structures, but we can only preprogram in those most seen wrong answers and provide feedback (as shown in figure 12). The solution would seem to be some sort of artificial intelligence that would be able to give the feedback that I can give to a student when they draw a structure in my office. 

You also mentioned that many of us still use assessments where we have free response. I do believe there is something different where students must put pencil to paper and draw as opposed to clicking on the screen. Another project that I have been working on has been with optical structure recognition and Carl and I have talked about trying to incorporate that into OpenOChem. I put together a demonstration video https://www.youtube.com/watch?v=DRX1m27znD0 that shows how I originally envisioned it. 

Once again, this doesn't really get to the misinterpretation of the correct answer from the key that you mention at the end of the post. It can only tell if a student has drawn the right thing or the wrong thing, but not explain why. In the last paper I suggested crowd-sourcing interpretations of the spectra. This might be another area where crowd-sourcing could be beneficial. I could see where, at the end of the semester, we pull the data for all incorrect answers to a question, look at why each is a wrong answer, enter it into the database as a wrong answer, and give the reason why in the feedback. While there are probably infinite wrong answers that could be given, we would see the 10-20 common wrong answers and give the feedback. I know when students come to my office now, it is rarer that I get a student who has given a truly unique wrong answer to some of the common questions.

One of the next areas that I would like to explore more is mechanism drawing, as those are the open-ended questions on my exams that students struggle with. If they don't understand the correct answer from a key and why it is correct, they will make arrows that are backwards, or add acid when it is clearly a base mechanism. I really like how ChemDoodle does the arrow drawing in this interface, but there is no artificial intelligence built in to indicate if a student drew an arrow backwards. I have done simple acid/base questions with arrows drawn incorrectly in the answer to catch that, but once again it takes too much time.

So, I agree there is no clear solution to the checking-of-homework issue. I think there could be ways to help with OpenOChem or optical structure recognition, but without artificial intelligence this is the best we can do now.

Ehren

Like Brian, I don't use an online homework system.  I have used a couple of them at different times in the past, with two different systems.  One of the experiences was a pain because students would claim they drew the correct answer but the computer didn't give them credit.  In those cases I would encourage students to write down the question and their answer so that I could inspect them to see if they were accurately assessing the situation.  This was the early days with online homework and I couldn't see the answers they were drawing to be able to see if the computer was actually making grading errors.  In most of the cases the student's hand drawn answers were appropriate responses and I granted them credit.  That's a lot of extra work for something that's supposed to be automated.  So we decided to drop the system.

A few years later my students and I gave online homework another shot with a different system.  That time there weren't the problems with the automated grading.  But at the end of the term some of the students told me that they didn't find the online homework helpful for their learning.  So I decided to drop that system as well.

I have small classes (15-32), so I could definitely manage graded homework.  Both of my colleagues who teach organic have students turn in homework for a grade.  Some years I have had assigned homework that was turned in.  But most of my years of teaching I have used weekly quizzes instead.

I sometimes use Nearpod's response system to have students submit hand drawn answers on tablets during class.  That works okay but it's tough for students to draw structures on phones with that system.  At other times I wander around the classroom to see the structures students are drawing in their notebooks so that I can provide individual feedback. 

When my students started using OpenOChem in their pre-class warm-up assignments, a few of them pleaded for more practice problems through the system.  I have over a thousand questions in the system because Carl figured out how to import my database from the Reaction Zoo.  That abundance of questions isn't necessarily a blessing when you need a few on a particular topic and cannot readily locate them.  That's what led us to the current tagging system.  These days I regularly search through existing questions before developing new ones.

Even though I'm heavily involved in this project, I do not use it for homework at this point in time. 

Comment from SDWoodgate:

I have been doing online questions on organic chemistry for over a decade, and, in a teaching environment, it turns out that there is a heap of organic chemistry that you can do without free response questions and without drawing free-hand mechanisms.  AND even better you can discover from these questions that the stuff that you always assumed was straightforward is not as straightforward for students as you thought.

For example, you might have liked to ask a freehand question to draw all of the isomers of C5H10.  Doing this online, I was reduced to showing them the various carbon skeletons and asking for the number of alkenes for each.  The responses showed that the students had no appreciation of the equivalence of various carbons.

Similarly instead of getting them to draw a mechanism, I asked them which is the nucleophile and which is the electrophile.  50% get this right on the first attempt.  That is an important starting point that students clearly do not get until they have practiced.

So the questions do not have to necessarily be the sort that we would ask in an exam, but they are the sort that lead to understanding so that students can be successful in exam questions.  The first step in writing a mechanism is analysing which is the nucleophile and which is the electrophile. 

Online systems can very effectively act as a bridge between hearing about content in a lecture or reading it in a book and doing it yourself without support.  Results like I described above also can inform classroom teaching.

You bring up some good points. Sometimes in my active classroom I do ask the students to identify the nucleophile or electrophile, or whether the reaction is acid catalyzed. These straightforward questions can get the students to focus their thinking. One question type that I used in the MarvinJS version of these questions was to have students identify the most acidic proton in a molecule. That exercise can be very telling. These questions can easily be incorporated in this system. I think asking for the nucleophile would make a nice step in adaptive questions.

Comment from SDWoodgate:

I have the luxury of being able to present several answer styles on the same page, and when I do mechanisms, in addition to asking for the nucleophile and electrophile, the students get asked how many bonds are broken or made (you need an arrow for each one), to choose between arrow directions for the first arrow (all the others are in the same direction), and, once the arrows have been dragged and dropped into place, to reflect on the changes in charge (the first atom in the sequence becomes more positive, the last atom more negative, and in-between atoms do not change).  These are all things that organic chemists do intuitively, but they are not so intuitive for students.  Since I also collect data for all of this, I can say that even if I ask them to reflect on the charge changes with the arrows in place, they struggle.

Maybe you have all seen this clever approach created by Andrei Straumanis and Suzanne Ruder, which was reported in J Chem Ed 10 years ago - https://pubsdc3.acs.org/doi/10.1021/ed086p1392.  They basically have developed a system where students can devise their own arrows that are submitted with clickers that allow alphanumeric responses.  I learned about it around five years ago but haven't used it because I seldom have clickers that enable the alphanumeric responses. 

The most valuable aspect of online homework is when it helps students identify the misconceptions or gaps in their understanding.  My gen chem students deeply value the online homework systems for this reason.  I haven't had the same luck with the organic online homework systems that I have tried.  Perhaps I have not tried enough of the systems that are available.  (I have not tried Sheila's Best Choice but may need to investigate it soon.)  What kinds of experiences do others have?

OpenOChem is a very interesting tool to interact with and engage students in organic chemistry.  I really appreciate the effort that has been made to create a flexible program for use across multiple LMS.  The flexibility has also been incorporated into how each instructor can implement aspects of OpenOChem. 

I also appreciate that there is a form of peer review on the questions submitted to the general pool.  Having the added statistics of question use and student response is great.  I think that this could be well utilized by instructors to develop better questions for assessing topics in organic chemistry.  Adding to the question refinement process, is there a way for the instructors (beyond the quality control editor) to give feedback on questions that are in the general pool? 

As OpenOChem develops further, are there plans to add in different question types?  Short answer or free response would be beneficial for discerning student understanding versus student recognition of the correct answer.  While the immediate feedback via AI recognition of the student’s response would be challenging, for small scale classes answers could be evaluated in a timely fashion by instructors.

I like the utility of asking questions at the beginning of or throughout class time.  This function allows instructors to give more targeted instruction on misconceptions that are elucidated through this process.  I have noticed that there is a transition of students using tablets more and more for note taking.  Would you consider developing an app for mobile or tablet use?  With tablets there would also be the possibility of collecting student free-drawn answers.  There are many structure and bonding misconceptions that become apparent when a student has to provide an answer from scratch without a structure scaffolding program.  Could your optical recognition software be expanded to these tablet responses?

I may have to wait for Carl or Jennifer to jump in on the feedback of questions. Since I am a reviewer, my account allows me to add reviewer comments to questions, but it may be anyone can (this needs to be double checked.) If only reviewers are able, I think it would be straightforward to allow anyone to make comments on questions. I would also think we need to make it possible that anyone gets emailed when a comment has been made to a question.

We are looking at adding question types all the time, but we are limited as Carl is the main programmer. Short answer is currently a question type. I have not used that type as last semester I had 120 students using the system and would not have had time to review free response questions. 

Carl is working to develop other types of questions all the time, so hopefully he will be able to address some of what he is currently working on.

Many of my students have used cell phones and tablets in class, but we are a tablet PC campus where all students have the same equipment and I can require them to bring to class. However, we are revisiting that program and there will be BYO device in the future. 

There are two ways to go with the optical structure recognition program at this point. Jennifer and I were playing around with the idea of using a cell phone picture of a structure drawn on paper and linking that to my software. The issue is that these devices take huge, high-resolution pictures. Carl and I were playing around with the idea of decreasing the resolution before sending to the server, as that is otherwise undue bandwidth. None of us are app developers, but we have been looking at that possibility. When I was doing early tests on optical structure recognition, I just drew skeletal structures in MS Paint and submitted those images to my software, and it worked.

One of the biggest challenges is that experts know how to draw structures the way they are supposed to look. Students don't. I think using the drawing software does help train students that bonds need to be uniform, straight lengths and that bond angles should be clean.

I agree that an initial scaffold can be helpful to students learning to draw structures.  Though it would be advantageous to phase out this scaffold as the course progresses.  We currently assess students on their hand drawn depictions of molecules and reaction mechanisms.  Providing early feedback on student free drawn structures may prevent much more consequential feedback come the exam.

I agree that students should become more adept at drawing structures over time. Here is an image for which, unless I do some image processing, optical structure recognition will return 2,3-dimethylheptane:

It is very easy for us to close the ring and know what a student meant; it is more challenging to program that.

In smaller course sections, I definitely want to assess hand-drawn structures, but I also find that automating feedback can be beneficial when teaching larger sections.

We do have the ability to ask short answer questions.  I haven't used those in OpenOChem yet.  These days I regularly ask essay questions directly from Moodle since that's where I have an existing question bank.  It would probably be good for me to migrate all of my questions over to OpenOChem.  I hadn't thought about doing that until you asked this question. 

I look forward to being able to ask drawing questions with students using their phones to respond.  I also look forward to using optical recognition with structures in a student's handwriting. 

Hi Aubrey - Thanks for reading our article.  Initially the instructor feedback/comments on questions was available to all instructors, but once we added the question reviewer role and review process, I limited the comments to only reviewers and the question author.  I could easily enable it for all instructors.  I felt it should be similar to submissions of scientific articles in that you never get to see reviewer comments of papers you did not author.

I really enjoy coding, so I am always trying new ideas and investigating new question types and new ways to deliver questions, activities, and content.  I have a selection question type much like the older EasyOChem selection qtype Ehren mentioned, which should be out sometime soon.  We also have a Newman projection question type in which students drag and drop groups onto eclipsed/staggered templates (it needs some real-world testing).  I also built a formula question type for mathematical questions you might use in laboratory exercises, etc.   We do have short answer, multiple choice, and multianswer questions that are all auto-graded.

I have thought about building a mobile app interface, but it can be difficult to draw mechanisms on phones... I'm a keyboard and mouse guy, so maybe it's especially difficult for me.

Longer term, I have dreams of building an AI feedback system.  We could analyze student responses for common pitfalls and provide appropriate feedback, or adjust question difficulty to the student's ability.

Let me know what you want and we can build it.

  

Comment from SDWoodgate:

You talk about data collection as far as which questions have been used, but are you collecting data with respect to how students answer the question?  I would be very interested to know how many can construct the complex mechanistic stuff.  Can you see responses for individual parts?  We see that there is sufficient challenge for students even to complete partially drawn structures or simply to put the charge on the right place on a partially-drawn structure.

I can see each of my students' individual responses for each question when I look at the grades for a particular assignment.  There's a panel to view all of the responses to a single question simultaneously (not overwhelming with an average of 17-25 students).  It's also possible to click on an individual answer to inspect it with a larger graphic.  After teaching for 29 years, I thought I had a firm grasp on what kind of misconceptions students might have.  Seeing their answers in OpenOChem opens my eyes to new and different ways they misunderstand concepts.  And that eye opening happens on a regular basis.

I have never tried to look at student responses to a given question from different classes taught in separate terms.  So I'm going to let Carl respond about how that might be accomplished. 

We collect and save the full response to every question attempt, so we could easily analyze or view all these responses.  As Jennifer pointed out you can currently see all your student responses.  We have plans to anonymize the student info but need to work out a few details first.  We could then quickly view and compare responses across all student attempts of a given question.  Our ChemDoodle structural based question type has options to have students assign charges or electrons to partial structures. 

Comment from rpendarvis:

Has anyone had experience with the Cengage online homework system, OWLv1 or OWLv2?  Any comments good or bad are welcome.  

Thanks - Richard

Richard, OWL (version 1) was one of the two systems I used previously with my classes.  We didn't like it well enough to keep using it.  But I know some folks that are using OWL2 who are quite happy with it.

I have also tried OWL on several occasions. I "test drove" the original product with a small lecture section. I liked it and the students thought it helped. Then Cengage took over and I tried it again as part of the Hart Organic Chemistry book (OWL1). I thought it had degraded significantly. There were too many problems with the software and way too many issues with questions being poorly worded and grading errors (incorrect answers for some questions and possible correct answers not scored correctly for others). The students were very frustrated with it and mad about the cost. I have also heard that OWL2 is supposed to be an improvement over version 1, but I would not be willing to use it without a free test drive first.

I have not used OWLv2 for organic, but we are currently using it for gen chem.  Its main advantage is that it can handle high-enrollment classes well and provides immediate feedback for students, which is difficult to do for high-enrollment classes.  There are also interactive explorations that you can assign to the students, which are grouped into simulations, exercises, tutorials, and visualizations.  Students can also obtain subscriptions that allow them to reengage with the material during later semesters.

The bad: The available questions are typically not the more difficult ones from the textbook, and some questions are altered to make them more multiple-choice-like and easier for computer grading.  We always provide non-graded exercises to students that are at a more appropriate difficulty level, and it is made clear to students that doing the OWLv2 homework alone is insufficient.  Another problem is that, sometimes, a particular variation of a question will be unnecessarily complicated or have an almost-correct answer that is deemed incorrect for nuisance reasons.  It's difficult to catch this issue while you are picking the questions, and it can cause frustration for a subset of students.

I don't have a full sense of how many technical issues there are or how frequently they occur, but we generally encounter at most one system-wide issue a semester, and it's resolved fairly quickly.

I have used OWL (first v1 and now v2) for all 6 years I have taught Organic Chemistry. My opinion is that it is a good tool, but not perfect. We use McMurry's Organic Chemistry and I use a semi-flipped model with JITT.

Pros: As our class sizes are getting larger and we are not getting any additional support, online homework is a good option for graded homework. It does give some feedback and has a variety of question types.

Cons: Many of the OWL questions use language that is very different than what is used in the text and students have trouble figuring out what the questions are asking. There is little to no customization for questions, which is frustrating in the mastery questions where students must get 2 out of 3 questions correct to earn mastery, but the 3 questions may be of differing difficulty or topic even. There are questions that have incorrect answers and/or feedback. Editing assignments or changing due dates is cumbersome, requiring 6 to 8 clicks each and the changes are not reflected if any student has opened the assignment.

That being said, I am excited to learn more about OpenOChem and have suggested our organic chemistry committee take a look.

Comment from bmccollum:

A big thanks to the authors of this paper. I really appreciate seeing JSmol in the system!

In regards to OWL, I used OWLv1 and didn't like it. I decided to try OWLv2 in Gen Chem at the same time as moving to LibreTexts. I use the homework system as a pre-class reading check. Early in the term, many students will try to skip the reading assignments and go directly to the HW. They tell me that after getting a few questions wrong they go to the text as I suggested, and then return to OWLv2 after reading. I think most of the questions are relatively easy, which is perfect for a reading competency check. I then follow up with active learning based on the same topics, but harder difficulty, in class.

For organic, I was using OWLv2 as a reading competency check and was happy with it. The questions were more challenging, which definitely promoted additional discussion in class. More recently, my department moved to a Canadian textbook (Ogilvie) and OWLv2 doesn't align with it as well as I need for it to be used for pre-class reading feedback. I'm currently trying out the Mobius system that goes with the textbook but it doesn't yet have the drawing element enabled, which drives me nuts. OpenOChem looks like an interesting alternative.

Comment from ngreeves@liv.ac.uk:

I am enjoying testing out the questions in OpenOChem as Jane Instructor and am particularly interested in the marking of curly (curved) arrows with associated questions about lone pairs and charges.  I have tried a few of the mechanistic examples and found that I had feedback I thought worthy of submission. So I completed the Submit Comment box. Then I thought, am I wasting my time? Are you able to see this feedback and can you tell who submitted it?

An example of the kind of question I have is: when do we draw lone pairs of electrons in mechanisms?

I know the norm in the US is to depict an enolate with both the negative charge AND a lone pair (sometimes even on carbon) and to start the curly arrow on the lone pair. Personally, I would not show the lone pair and would start the curly arrow on the negative charge (implicit lone pair) e.g. http://www.chemtube3d.com/Acylation%20at%20carbon%20-%20Claisen%20ester%20condensation.html this has always been a barrier to my use of US based resources like Sapling.

In the examples I have tried on OpenOChem the lone pairs have indeed been depicted (on LDA and enolates) BUT the curly arrow is constructed by clicking ON the atom not the lone pair e.g. Question Title: acetone + LDA -> enolate. I thought this was OK but I wondered what my US colleagues would think.

Then I tried Question Title: Draw acylium ion resonance from acetyl chloride. (Arrows only) and was very surprised to see that no lone pair was depicted on the carbonyl oxygen. I would always require this.

I know feedback is always the crucial aspect of online testing materials and, as has been said elsewhere, creating that feedback is very time consuming. I would like to offer ChemTube3D.com http://www.chemtube3d.com/Main%20Page.html as another open, free resource that you could link to in the feedback if you would like. We have literally hundreds of organic mechanisms and much more depicted in 3D that would be of interest to your students and instructors.

Thanks,  Nick, for pointing us to ChemTube3D and encouraging us to use it as we develop feedback in OpenOChem.  I am okay with the curly arrows starting on atoms the way they are with ChemDoodle's drawing interface.  I think when we were using Marvin the arrows did start on the lone pairs as you were expecting.  I don't find the difference a stumbling block.

Like you, I always require students to show lone pairs when they draw resonance structures.  But when I write the prompt for a question I tend to leave out the lone pairs and require students to add those.  Perhaps that was the thought process for whoever wrote the question about acylium ion resonance?  Maybe we should adapt the questions so that students have to add both the lone pairs and the arrows?

ngreeves@liv.ac.uk's picture

I agree that we can work out where to click, but students may end up being confused. The trouble is, we draw lone pairs when we need them and leave them out when we don't. I have seen a bromide ion depicted with four lone pairs, but why would you do that unless you were talking about a complete octet? 

My suggestion would be: yes, ask students to add lone pairs and arrows, but don't require them to add too many lone pairs. For example, a carbonyl does not need two (one, if required); hydroxide does not need any (though you would want it to have one, and others might prefer three, which I hope not); and I would like an enolate to be correct with none.

This makes the coding more difficult but would be more widely accepted. Whatever convention is adopted, it needs to be applied consistently; otherwise confusion may result.

You bring up some good points about how mechanisms can and should be drawn to be marked correct. In the case of acetone + LDA, I looked at what Carl has as the "correct" answer, and he has only one pair of arrows.

I am guessing that you drew a correct mechanism that would add a third arrow to move the pi electrons to oxygen as well. For my classes, I have students draw only two arrows at a time until they are "savvy" chemists. I suppose if we went with the letter of the law of the question, it does ask how the carbon becomes negatively charged. 

This is probably some of the same frustration that Jennifer indicated in an earlier post about students having correct answers but not getting credit in other online systems. 

Drawing lone pairs of electrons has always been a pain in these drawing systems, and I think we tend to assume that students should know they are there. I try to remember to draw them in questions, but am guilty of not always doing it. For this question, the lone pair should be there, as it is critical to the mechanism. The system does have a flaw in that one must click on atoms and not on the electron pair when drawing arrows. From a programming perspective that can be challenging: if there are multiple lone pairs on an atom and any of them can be involved in resonance, then we would have to program all possible lone pairs as correct in the mechanism. Once again, as experts we have algorithms for seeing what is correct, but we need to build all of that into a system. I think the compromise is clicking on the atom to indicate where the electrons move from.
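
To make that compromise concrete, here is a minimal sketch, in TypeScript, of how a grader might treat an arrow tail drawn on an atom and a tail drawn on any of that atom's lone pairs as equivalent. It is only an illustration under assumed data structures (the interface and function names are hypothetical), not OpenOChem's actual code.

// Hypothetical data model for the sketch; OpenOChem's real representation may differ.
interface LonePair { id: string; atomId: string; }          // a drawn lone pair and its parent atom
interface Arrow { tailTarget: string; headTarget: string; } // ids of atoms or lone pairs
interface KeyArrow { tailAtomId: string; headAtomId: string; }

// Resolve a clicked target to an atom: a lone pair counts as its parent atom.
function resolveToAtom(targetId: string, lonePairs: LonePair[]): string {
  const lp = lonePairs.find(p => p.id === targetId);
  return lp ? lp.atomId : targetId;
}

// Accept a drawn arrow if, after resolving lone pairs to atoms, its tail and
// head land on the atoms the answer key expects. The question author then only
// has to specify one correct tail atom instead of enumerating every lone pair.
function arrowMatches(drawn: Arrow, key: KeyArrow, lonePairs: LonePair[]): boolean {
  return resolveToAtom(drawn.tailTarget, lonePairs) === key.tailAtomId &&
         resolveToAtom(drawn.headTarget, lonePairs) === key.headAtomId;
}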

ChemTube3D is a great resource, and I would like to think about how to provide links in answers; thank you. I was just looking at the nitration of benzene http://www.chemtube3d.com/Electrophilic%20aromatic%20substitution%20-%20Nitration%20of%20benzene.html and wondered whether students really focus on the animation of the bonds breaking or forming. In a paper test situation, or even an automated mechanism question, students will need to draw arrows, and the animation doesn't really do that. As experts we know what should be happening and that arrows don't exist, but they are our way of showing that areas of high electron density react with areas of low electron density. In the animation of nitric acid being protonated, the bond breaks between the proton and the oxygen of sulfuric acid before the oxygen of nitric acid accepts the proton. When I draw the mechanism for students, I have the first arrow come from the nitric acid oxygen. I just wonder out loud whether students would then confuse which arrow to draw first, and why. I like to teach it from the perspective that forming new bonds is beneficial and releases energy, while breaking bonds requires energy. The animation shows the bond breaking first, then formation of a new one. I think back 30 years, when my first organic instructor emphasized that we are electron pushers in these mechanisms, and wonder if I would have been confused as a new student by the animation. 

Finally, my take-home point from your post is that as I put questions together, I need to be sure to include all relevant information. I tell my students that organic is hard because we have to unpack so much hidden information (lone pairs that may not be drawn, protons that are not explicit) before we start to answer a question. Maybe I need to be sure I provide more explicit information.

(As a side note, I did see your comment on the question, but we need to add an automatic emailer to let the author know when a comment is added. Since I am not the author of the question, I would not have known you made a comment unless I looked at the question.)

ngreeves@liv.ac.uk

Yes, I instinctively drew the full mechanism leading to the enolate with a negative charge on oxygen, only to be told I was incorrect.

Jennifer's concern about students being told correct answers are wrong, never mind losing credit, is also mine. Over the years I have used online systems and have always been able to supply completely correct answers that were "wrong" because the question setter had not considered all the options (I did this with a question about enolates too; cis was "correct", trans was "incorrect"). Our subject is difficult in that regard. I have to say your approach to detecting organic structures seems to work very well. I tried adding explicit hydrogens, but it was not fooled. I recall an earlier fight with another system that required me to draw ALL the hydrogens - no way!

As I said, I think clicking on the atom is OK if not strictly correct. It just strikes me as inconsistent to require/display lone pairs sometimes but not others. I too am inconsistent as I use negative charges as the tail if they are present and lone pairs if they are not.

The animation does have 3D curly arrows - click on the first structure. Did you use the Larger View button and look at the animation frame by frame? I think your perception of the timing of bond breaking may have been altered by watching at full speed. Our intention, although we may have missed the odd one, was to have a transition state phase in which all bonds being made and broken are shown as dotted at once. I agree that the animation contains much more information than a traditional paper mechanism and so could be confusing. I use them in lectures to explain what to learn from them. Our aim is to allow greater understanding of the processes, which could be useful at higher levels of study, rather than just providing the mechanism, although that is there too of course in 2D. If you want to risk being confused even more, we have animated molecular orbitals for http://www.chemtube3d.com/Electrophilic%20aromatic%20substitution%20-%20Friedel-Crafts%20alkylation.html and these show the results of the curly arrows.

I'll continue to comment on the questions if I find anything else worth mentioning now that I know someone might be listening.

One of the things I really like about OpenOChem is the ability to set up more than one correct answer that gets full credit.  And it's easy to edit questions after they are initially created.  So when a student draws a correct answer that I did not initially anticipate, I can go back and add that as another correct answer.  That way the question works more effectively than it did the first time.
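
For readers curious how "more than one correct answer" can be handled without much bookkeeping, a common approach is to canonicalize the submitted structure and compare it against a list of accepted forms; this also explains why adding explicit hydrogens need not fool the checker. The TypeScript sketch below is only illustrative, with a stubbed canonicalizer standing in for a real cheminformatics toolkit, and is not a description of OpenOChem's implementation.

// Stand-in canonicalizer for the sketch. A real system would call a
// cheminformatics toolkit to produce canonical SMILES; that API is an
// assumption here, so this stub only trims whitespace to stay self-contained.
function canonicalForm(structure: string): string {
  return structure.trim();
}

// Accept the submission if its canonical form matches any stored correct answer.
function isCorrect(submission: string, acceptedAnswers: string[]): boolean {
  const submitted = canonicalForm(submission);
  return acceptedAnswers.some(answer => canonicalForm(answer) === submitted);
}

// Adding a newly discovered correct answer is just appending to the list,
// e.g., accepting either resonance form of the acetone enolate.
const acceptedEnolates: string[] = ["[CH2-]C(C)=O", "C=C(C)[O-]"];
console.log(isCorrect("C=C(C)[O-]", acceptedEnolates)); // true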

In my class students get full credit for attempting the question even if they got the answer incorrect.  That's a handy policy if you're assigning questions before the material is discussed in class and you do not expect students to have mastered the reactions yet.  It's also a handy policy when you're a beta tester like me, so nobody gets stressed over whether the computer marked the answers as incorrect.  So the system improves as we continue to tweak things in response to comments like yours.  We appreciate your observations, Nick.  They will help us to improve the system and make it more useful for other educators and their students.

I see the arrows in the animation now. I had not clicked on the structures with arrows, only on the reaction scheme step arrows, which seem not to have the curved arrows in the animation. If I click on the mechanism step, it starts with the arrows and then shows the animation. 

The system does fairly well with cis/trans, but I have found that I have to put in multiple answers for E/Z imines.  

Sometimes I forget to draw the lone pairs too!  I went back and added the lone pairs to the acylium ion question.  I like to have my students draw enolates with the charge on carbon, so they know it is the nucleophilic atom.  Of course I do discuss the resonance structure with the charge on O, but O-alkylations are reserved for my advanced ochem course.  ChemDoodle does not support arrows starting on lone pairs; however, Kekule.js should be able to do this in the future, so we may explore using it.

I also tend to draw the curved arrow from the negative charge, assuming that there is an implied lone pair, but our current online homework system (OWLv2) requires that the students always start the curved arrow at a lone pair. Upon reflection, I think this helps novice students follow through with the idea that it is the electrons that are moving and making the bonds. However, once you have mastered the concept it is easy to start with the charge, knowing that there is a lone pair associated. Especially in Organic 1 I make sure to draw in the lone pairs and always start the arrows at a pair, but toward the end of Organic 2 (and in advanced courses) I will abbreviate and just use the charge, after having the discussion of what that means and why we can abbreviate it that way.

Layne Morsch

Thank you all for the work you have done on this important addition to organic chemistry Open Educational Resources. The examples show Moodle and Canvas, and I am wondering whether this will work with Blackboard as well.

Layne

When Carl switched from the Moodle plug-ins to the current approach, it was with the intent of being adaptable to all learning management systems.  So the answer is yes, it should work with Blackboard.  But I'm not sure whether anyone who uses Blackboard has adopted OpenOChem yet.  I suspect the logistics of making the connection between OpenOChem and your LMS would be slightly different for Blackboard, just as there are slight differences between Moodle and Canvas.

My university has not used Blackboard, so I don't have any experience with it.  Searching for "Blackboard LTI external tool" gives me this:

https://help.blackboard.com/Learn/Administrator/SaaS/Integrations/Learning_Tools_Interoperability

Seems like it should work.  Contact me via email and I can help you get going.
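
For anyone attempting the Blackboard connection, LTI 1.x registration in most LMSs comes down to an administrator entering a launch URL, a consumer key, and a shared secret for the external tool. The snippet below just shows the shape of that information; every value in it is a placeholder, and the real launch URL and credentials would come from the OpenOChem maintainers, as offered above.

// Placeholder registration details; every value here is hypothetical.
const openOChemLtiRegistration = {
  toolName: "OpenOChem",
  launchUrl: "https://openochem.example.edu/lti/launch", // hypothetical endpoint
  consumerKey: "your-institution-key",                   // issued by the tool provider
  sharedSecret: "your-shared-secret",                    // keep private; used for OAuth signing
  sendUserNameAndEmail: true,                            // typical privacy setting for gradebook sync
};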

Clearly OpenOChem is a valuable tool for instructors and very well put together.  But as with any work for students to complete, or any pedagogical approach, the details and implementation matter most.  If this platform is LMS agnostic, I assume that it is also textbook agnostic.  How are the differences in texts addressed by the developers or question suppliers?  Additionally, I was curious about the process for instructors to comment on the questions in the database, and about what the process for editing/proofreading questions was before the recent discussion of that.

One advantage of textbooks or commercial software is the very large number of users, who generally correct errors/misconceptions/mistakes/inconsistencies over time.  If I were to become a user, I would really want to see the feedback on each question from other instructors.  I think seeing that feedback would help others evaluate whether or not to use a particular question.  A major advantage of our modern communication technology is that our teaching practices no longer need to happen in relative isolation at each institution.  We can share resources like OpenOChem and in the process expose our own curricula to meaningful scrutiny and expert critique.  If we take these opportunities seriously, we will all become better instructors over time.

[Full disclosure: Acylium cation is my favorite of all the cations.  Obviously, it is important in EI-MS and as an intermediate in Friedel-Crafts acylation, but it is almost certainly a yet-to-be-discovered space molecule.  The only thing that could make organic more awesome is to do organic in space!  It is the methyl analog of the known space molecule HCO+ and simply a protonated form of ketene.  I also have extra nostalgia for the acylium cation, as I have tried and failed on multiple occasions to generate enough of it in a discharge to detect its rotational spectrum.  We're still working on this endeavor.  It’s important to hold on to one's failures…]  Toward the goal of having a detailed discussion about curricula, the structural depiction of the acylium cation resonance is less than ideal (Table 1, Figure 10).  Like Nick in his previous comment, I found the lack of lone pairs unusual.  With lone pairs added it would be depicted in a manner that is consistent with some textbooks, but not with our best available experimental and theoretical information.  The C–O bond in acylium cations is really a triple bond with very little double bond character; computational estimates (MO or NBO approaches) indicate more than 5.99 shared electrons.  This is consistent with known gas phase and X-ray structural data for RCO+-containing molecules, IR spectroscopy, and the highest level calculations.  The resonance weights are predominantly the triple bond structure, with less than 0.2% for the double bond structure, using some very high level calculations (CCSD(T)/ANO2).  Some newer textbooks are starting to avoid the double bond structure altogether.  I think the double-bond structure confuses students, since acylium-type cations are isoelectronic with nitriles and there is not much reason to have an empty p-orbital on carbon perfectly aligned with a filled p-orbital on an oxygen atom.  The resonance structure with the positive formal charge on the C-atom is not required to communicate the electrophilicity or charge of that carbon atom.  The electronegativity of the O-atom and the location of the pi*(C–O) orbitals take care of rationalizing the reactivity.  A colleague and I have published on this (https://pubs.acs.org/doi/10.1021/ed5002152), including a student-level activity that demonstrates all of this, which we use with our FC acylation lab experiment.  If anyone doesn’t have access and wants the article, feel free to contact me directly.

I would agree that you would expect commercial software that is "made" for a certain text to align well with that text, and that the publisher would respond to feedback about incorrect questions, but I have not personally experienced that. We have sent lots of feedback to the publisher for our online homework system and still see some of the same mistakes year to year. I mentioned in a previous comment that OWLv2 was marketed to us as designed for McMurry's text but definitely uses language and asks questions that do not align with the book. If I'm going to have these types of issues anyway, at least with OpenOChem it could be cheaper?

As of now our students are using the system for free, and IUP is providing the servers and bandwidth. We have discussed a nominal fee structure to support the maintenance, but have not moved forward with that discussion. If we move to that type of structure, a goal is to be sure that the cost is an order of magnitude less than what students are currently paying. Can you share what your students are currently paying for access to OWL?

Currently we use Cengage Unlimited, which comes with the lecture text, lab text, and the OWLv2 online homework for ~$120 (the exact cost depends on whether they have other courses that use Cengage and therefore on how long they need the Unlimited subscription).

Tanya Gupta

Great work on OpenOChem. It seems to be a very useful system for assessment for both students and instructors. Can you please share the aspects that students find intriguing and the aspects they find challenging? Also, am I correct in assuming it can be used for teaching inorganic chemistry too?

Thanks, Tanya. I think one of the technical challenges has been that sometimes a student's browser will not allow them to draw using the ChemDoodle interface; I usually have them clear their browser cache to fix this. My students have used it on phones, where the screen is small, but it is not impossible. As I recall, when Jennifer and I were trying iPads, we didn't always have great luck with drawing, but it was doable. 

One of the challenges that Carl and I have discussed is whether automatic grading and import back into the LMS is possible. At the moment I have to remember to export the students' grades back to Moodle after each quiz.
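
On the automatic grade return question: LTI 1.1 includes a Basic Outcomes service that lets a tool push a score back to the LMS gradebook by POSTing a small XML message to the outcome service URL the LMS supplies at launch. The sketch below, in TypeScript, only builds that message; the signAndPost helper is hypothetical (the request must be signed with OAuth 1.0, including a body hash), and none of this describes OpenOChem's current code. If passback like this were wired in, the manual export after each quiz could in principle go away.

// Build an LTI 1.1 Basic Outcomes "replaceResult" message. The result sourcedid
// and the outcome service URL are provided by the LMS in the original launch;
// the score must be normalized to the 0.0-1.0 range.
function buildReplaceResultXml(resultSourcedId: string, score: number): string {
  const normalized = Math.min(Math.max(score, 0), 1);
  return `<?xml version="1.0" encoding="UTF-8"?>
<imsx_POXEnvelopeRequest xmlns="http://www.imsglobal.org/services/ltiv1p1/xsd/imsoms_v1p0">
  <imsx_POXHeader>
    <imsx_POXRequestHeaderInfo>
      <imsx_version>V1.0</imsx_version>
      <imsx_messageIdentifier>${Date.now()}</imsx_messageIdentifier>
    </imsx_POXRequestHeaderInfo>
  </imsx_POXHeader>
  <imsx_POXBody>
    <replaceResultRequest>
      <resultRecord>
        <sourcedGUID><sourcedId>${resultSourcedId}</sourcedId></sourcedGUID>
        <result><resultScore><language>en</language><textString>${normalized}</textString></resultScore></result>
      </resultRecord>
    </replaceResultRequest>
  </imsx_POXBody>
</imsx_POXEnvelopeRequest>`;
}

// Hypothetical helper: sign the body with OAuth 1.0 (HMAC-SHA1 plus oauth_body_hash)
// using the consumer key/secret, then POST it to the LMS outcome service URL.
declare function signAndPost(outcomeServiceUrl: string, xmlBody: string): Promise<void>;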

SDWoodgate

When you see the results that I have seen with respect to student understanding of organic mechanisms, the question of whether the arrow is drawn from the lone pair, the atom, or the charge pales into insignificance. 

I have a question in which users are expected to eventually complete the mechanism for acid-catalysed hydration of a double bond, given the overall reactants and products.  Of 3000 users, 50% got that the intermediate in the first step (alkene + H+) is a carbocation (given the structures of the carbocation, radical, and carbanion); 61% recognised that this step required one arrow; 65% recognised on the first attempt that the tail of the arrow had to be at the double bond; 63% recognised on the first attempt that the head of the arrow should be at the H+; AND even with those two answers in place, only 71% got the direction of the arrow correct.  BTW, we drag and drop arrows after all of the questions are answered.  None of this is the type of thing you would ask in an exam, but it is important for potential mechanism drawers to learn to think about these things, and it seems to me that this is all of much more consequence than whether the arrow is drawn from the lone pair, the charge, or the atom.