
Tracking student use of web-based resources for chemical education


Robert Bodily and Steven Wood

11/10/16 to 11/12/16

We have developed an analytics system for general chemistry students that tracks video and quiz interaction data and reports it back to students in real time via a learning analytics dashboard. When students access the dashboard, they can easily identify gaps in their knowledge and click on a flagged concept to access video resources, practice questions, or web resources that address those gaps. We tracked student use of our initial dashboard version in the first semester of a two-semester general chemistry sequence. We report on student use of web resources in this course and, based on our results, provide recommendations for researchers and practitioners in chemistry education.



As online and blended learning in chemistry education continues to grow (Allen & Seaman, 2014), it is increasingly important to understand how students interact with resources online. Once we begin to understand how students use web-based resources, that data can be used to improve their design, access, and content, with the aim of improving the learning potential of these resources. This is the focus and objective of learning analytics. The learning analytics process includes selecting data, capturing data, using data, and acting on data (Lias & Elias, 2011). We set out to develop a learning analytics system, in the form of a class dashboard, to capture student use data as students interact in real time with course content such as videos and quizzes. We had three major research questions that we set out to answer.

Research questions

R1: How can student use of web-based resources be tracked in open learning environments?

R2: How are students using the videos, quizzes, and dashboard provided to them in the course?

R3: What is the relationship between student use of web-based resources and overall performance in the class?

Design and Development of the Learning Analytics System or Class Dashboard

Technical Infrastructure

The following technical term definitions are provided to help the reader understand the analytics system that we constructed:

  • Learning Record Store (LRS): a database to store student interaction events
  • Learning Management System (LMS): an online platform to host content and keep track of student grades
  • Experience API (xAPI): a standardized data format that is consistent across applications. It uses an “actor”, “verb”, and “object” framework to identify what has happened in what context.
  • Learning Tools Interoperability (LTI): a standard that lets an LMS launch external learning applications while passing along the student’s identity
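As an illustration of the actor/verb/object framework, an xAPI statement recording that a student paused a video might look like the following. This is a minimal sketch: the email address, video URL, and activity names are hypothetical, not the exact identifiers used in our system, though the verb URI follows the public xAPI video profile.

```python
# A minimal xAPI statement: "actor" says who, "verb" says what happened,
# and "object" says what was acted on; "result" is optional detail.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",                       # hypothetical
        "mbox": "mailto:student@example.edu",            # hypothetical
    },
    "verb": {
        "id": "https://w3id.org/xapi/video/verbs/paused",
        "display": {"en-US": "paused"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.edu/chem105/videos/stoichiometry-1",  # hypothetical
        "definition": {"name": {"en-US": "Stoichiometry, Part 1"}},
    },
    "result": {
        # Playback position (seconds) at which the pause occurred.
        "extensions": {"https://w3id.org/xapi/video/extensions/time": 73.5}
    },
}
```

Because every application emits events in this shared shape, the LRS can store video, quiz, and dashboard events in a single stream and query them uniformly.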

A graphical description of our analytics system can be seen in Figure 1.

Using LTI, students can launch external learning applications from the LMS. This allows a student to be tracked across systems and provides a streamlined experience, because students do not have to sign in again when they move to another learning application. The Experience API is a data format standard that allows multiple applications to pass data to our LRS in the same format. The dashboard can then pull data out of the LRS in real time to provide students with a visual report of their performance. This system enables us to collect interaction data as students work with quizzes, videos, and the dashboard in the course. Table 1 provides a description of the types of data we collect in our system.
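The application-to-LRS leg of this flow reduces to a single HTTP POST per event. The sketch below shows the general shape, assuming a standard xAPI LRS; the LRS URL and credentials are placeholders, but the /statements resource and the X-Experience-API-Version header come from the xAPI specification.

```python
import json
import urllib.request

def build_statement_request(lrs_url, auth_token, statement):
    """Build the HTTP POST that delivers one xAPI statement to the LRS."""
    return urllib.request.Request(
        lrs_url.rstrip("/") + "/statements",
        data=json.dumps(statement).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.3",   # required by the xAPI spec
            "Authorization": "Basic " + auth_token,
        },
        method="POST",
    )

def send_statement(lrs_url, auth_token, statement):
    """Send the statement; the LRS responds with the stored statement IDs."""
    req = build_statement_request(lrs_url, auth_token, statement)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The dashboard side is the mirror image: a GET against the same /statements resource, filtered by actor, pulls a student's events back out in real time.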

Table 1.

Data points collected or calculated in our analytics system.

Video Analytics           Quiz Analytics             Dashboard Analytics
# of plays                # of question attempts     # of dashboard views
# of pauses               Time spent on a quiz       Time spent in dashboard
# of skips forward        # of quizzes attempted     # of video suggestion clicks
# of skips backward       Average confidence level   # of quiz suggestion clicks
# of play-rate changes    Max number of attempts     # of unique visits to dashboard
# of volume changes       Max time on a quiz


In order to provide a better understanding of our system, we have included screenshots of the dashboard tool (Figure 2), quiz application (Figure 3), and video application (Figure 4). The videos used in the class can also be accessed at

Figure 2. The content recommender dashboard displaying student mastery scores.

Figure 3. A demo quiz question illustrating our assessment system.

Figure 4. An example of one of the course videos in our video player.


During the Fall semester of 2015, we implemented our web-resource tracking system in a blended general chemistry course at a large western US university; 200 students consented to participate in the study and allowed us to collect their data. The course content available to these students through our system consisted of (1) high-quality videos with animations and audio, (2) short formative quizzes tied to the course videos, and (3) a dashboard that reported students' knowledge gaps and metacognitive skills based on their use and performance data.

Relationship between Online Resource Use and Overall Performance in the Course

In order to identify which use elements were predictive of student grade in the class, we ran a linear regression. The independent variables were drawn from student use of web-based resources. However, several variables were removed from the model because they were too similar to one another (multicollinearity). The final list is defined in Table 2 below.
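A simple pairwise-correlation screen of the kind used to prune near-duplicate predictors can be sketched as follows. This is a toy illustration with made-up column names, not our exact procedure; in practice one might also inspect variance inflation factors.

```python
import math
from itertools import combinations

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def drop_collinear(columns, threshold=0.9):
    """columns: dict of name -> list of values. For any pair whose |r|
    exceeds the threshold, drop the later-listed member, mimicking a
    manual multicollinearity screen."""
    dropped = set()
    names = list(columns)
    for a, b in combinations(names, 2):
        if a in dropped or b in dropped:
            continue
        if abs(pearson(columns[a], columns[b])) > threshold:
            dropped.add(b)
    return [n for n in names if n not in dropped]
```

For example, if "# of pauses" were an exact multiple of "# of plays", one of the two would be removed before fitting the regression.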

Table 2
Descriptive statistics for variables used in the analysis

Variable                     Description
total (Overall) Percent      Grade in the class
                             # of times navigated away from the quiz
                             # of times paused a video
                             # of times clicked to go to another place in a video
changed play-rate            # of times changed the video play-rate
                             # of times clicked show hint in a quiz
confidence level             Average of self-reported confidence in answers
                             Average play rate in videos
first Attempt Quiz Score     Score on quizzes based on the first attempt only
follow Feedback              # of times followed feedback in dashboard
                             Time it takes to take a quiz
late night                   Percent of events after 11pm and before 5am
                             Percent of days a student accesses online material
knowledge awareness          Percent of high-confidence correct answers
                             # of attempts without clicking to see answer
deep learning                # of questions without clicking show answer

We first discuss some general trends in average video, quiz, and dashboard use, and then discuss the results of a multiple linear regression predicting students' final grades from these resource use variables.

There were 215 videos available to students in the course; however, use of these optional videos was low (see Figure 5).

Figure 5. The number of videos watched by each student.

Although more than 200 videos were available, the median number of videos accessed per student was about 40. These were high-quality videos, developed under an NSF grant, with animations synchronized to concise audio narration. It was disappointing to see that even when excellent resources were provided, many students chose not to use the videos. Our initial thought in creating short quizzes aligned with the videos was that the quizzes would encourage students to view the videos before attempting the quizzes. Even with quizzes related to the videos, video use remained low. We suspect the quizzes were too easy, so students could do well on them without having to watch the videos; students also had a number of other resources available that they could have used instead.

Student use of the dashboard paralleled video viewing: only about 40% of students accessed the dashboard, and even those students typically accessed it only a few times.

These video and dashboard use data highlight the importance of identifying how best to support student engagement with web-based resources and feedback. If time and effort are invested in creating resources that students do not use, that effort is wasted. Tracking student use is an important means not only for evaluating the effectiveness of web-based materials but also for determining how best to incorporate those materials as an integral part of the course so that students benefit.

We used the data collected by the analytics system to examine whether student use of the resources was correlated with overall grade in the course, running a linear regression to determine which elements of resource use were predictive of final grade. The results of our regression can be seen in Table 3 below.
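The fit itself is ordinary least squares. A self-contained sketch via the normal equations is shown below with hypothetical variable values; in practice one would use a statistics package that also reports the standard errors and p-values shown in Table 3.

```python
def ols_fit(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.
    X: list of predictor rows; an intercept column of 1s is prepended."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    # Build X'X and X'y.
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Solve by Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, k):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, k):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):
        beta[i] = (xty[i] - sum(xtx[i][j] * beta[j]
                                for j in range(i + 1, k))) / xtx[i][i]
    return beta  # [intercept, b1, b2, ...]
```

Each coefficient b estimates the change in (transformed) final grade per unit change in that use variable, holding the others fixed.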

Table 3
Linear regression results predicting final class grade

[Table values not reproduced here. Predictors in the model included: changed play-rate, confidence level, first Attempt Quiz Score, follow Feedback, late night, knowledge awareness, and deep learning; columns reported the coefficient and Std Error for each.]

Note: Coefficients are large because the dependent variable had to be transformed to satisfy the assumptions of linear regression.

As can be seen in the table, pausing, changing tabs frequently during homework, and working during the middle of the night were negative predictors of student achievement. Jumping around in the video and being confident in responses were positive predictors of student achievement. Although these results are not causal, they do highlight indicators that may be useful in identifying students struggling in the course.

Implications for Practice and Research

Based on the results of this limited study, we offer recommendations for chemistry education researchers and practitioners. First, when investing time and money in educational resources, take the time to collect data on how students actually access and use the resources created. This information gives instructors and designers actual, rather than self-reported, feedback for redesigning aspects of their online materials and courses. Second, tracking student use of web-based resources yields unobtrusively collected information that can be used in prediction algorithms to help identify who is struggling in the course. We envision these variables being included in an instructor dashboard to support the student remediation process. Third, additional research is needed on how to support students in metacognition and in taking advantage of feedback from systems like the dashboard of our analytics system. Perhaps the major question raised by our findings is best stated as: why are students choosing not to take advantage of the resources we provide for them?


This article has described the technical infrastructure needed to track student use of web-based resources, the trends and issues revealed by analyzing that use, and a predictive model of student course grade built from a learning analytics system that records actual student use of online resources. Implications for practitioners and researchers based on these results have been provided.


Allen, I. E., & Seaman, J. (2014). Tracking Online Education in the United States, 1–45. Retrieved from

Lias, T. E., & Elias, T. (2011). Learning Analytics: The Definitions, the Processes, and the Potential.



Bob Belford

Hi Robert and Steven,

Thank you for sharing your work with us. I would like to start off with an obvious question not related to the real topic of your paper, and then get down to the paper itself. You have created over 200 videos and made them available to the public. Are faculty from other schools allowed to integrate them into their curricular materials, and if so, what protocols should they follow?

If I am reading your paper right, these resources are not being utilized to the maximum extent possible, and this can lead to a lot of questions. Since you have a lot of analytics data, I'd like to ask a couple of questions on student usage. Do you have a bearing on time spent on all videos compared to time spent in class, and time spent on other tracked out-of-class activities (I assume the quiz app is tracked)? Do they navigate to the quiz and then the video lecture, or vice versa? Are there videos that get used a lot and others that don't, and if so, do you have thoughts on that? (Is usage correlated to topic? If so, which are high hitters and which are low hitters, and what are your thoughts on that?)

Cheers, and thanks for sharing your work,


Thanks for the questions!

1. Protocol for using the videos.
The videos are publicly accessible right now. There have been discussions of using them in a commercial textbook system, but none of these have panned out as of yet. The videos are not copyrighted, but they do not have a Creative Commons license either. We are thinking about licensing them under Creative Commons, which would grant other instructors the right to reuse and redistribute them in their courses. So I think the videos can be used by others, but you should attribute Dr. Steven Wood at Brigham Young University and say the videos are accessible at

2. Time spent in videos compared with time in class
We have tried a couple of different classroom structures for videos and quizzes, with varying levels of success. First, we tried making the videos more important by having students watch them before lecture and then take a short quiz on the concepts in the videos. Second, we tried making videos optional (watch if you need them), with quizzes more like homework assignments with unlimited attempts. Third, we tried making videos optional and limiting quiz attempts (to 3) to create more of a high-stakes assessment environment. When videos were optional, students did not use them very much; however, the first semester, when we encouraged video watching before lecture, we saw much more video use than in the next two semesters. For quizzes, the third structure (limiting attempts to 3) seemed best for students; otherwise, if the questions were too easy or students had unlimited attempts, they didn't take the quizzes seriously. In general, students spent 2.5 hours per week in class but only about 30 minutes per week on online homework and video watching (not a calculated average, just what we seemed to see across semesters).

3. Video usage trends across videos
There were definite trends across videos. Some videos were used much more than other videos. However, we haven't analyzed the video data from a design perspective yet to see why certain videos were watched more than others. We did look at student behaviors in videos and found some students watched straight through, some paused and went back frequently, and some skipped forward through the video. These behaviors were fairly consistent across videos.

Thanks for your questions! I hope I have sufficiently addressed them. Feel free to respond with additional questions if you have them.


Tanya Gupta

Hello Robert and Steve,

This is quite interesting and an eye-opener. I agree that there is a lot of investment in creating these resources, but if only 40% of students use them, it becomes a challenge to determine the effectiveness of web-based resources. I am curious about students' use of the text alongside these resources. My assumption is that there is an assigned text, and if students also used the text, that might have been a factor in the lower use of videos (or in the pausing, skipping, or changing of play rate).

My second question is about student attitudes and student motivation. If you have data or plan to collect data for student attitude/ motivation, it might help see the reason behind lack of use of resources.

Another aspect is what percent of students are majoring in chemistry or in a science area - that might also help separate students who viewed the videos from those who did not.

Overall this is an interesting paper - last question :-) - how did you create the learning analytics system? What tools and resources would one need to create such a system to assess students in their own classrooms?

Thanks for sharing your paper - it is very informative.


Hello Robert and Steve,

A very important paper.  The opportunities that this offers for formative assessment of a course in real time are astounding.  Your tool offers instructors easy access to the sort of information that they need while teaching and IMHO I WANT IT.


I'm excited to hear you appreciate the work we are doing!

We are currently developing a scalable version of our system that we will deploy online for other instructors to use in their classes. As you can imagine, this is quite a big task. However, we have been making good progress and we are planning to do a beta launch within the next few weeks. This would allow you, or any instructor, to take advantage of the analytics infrastructure we have developed.

Thanks again,


Hi Tanya,

1. We were not able to track student use of the text explicitly because it wasn't online. However, we did administer a survey to students to better understand their use of various resources in the course. We found that there were some students that just used the videos, some that just used the textbook, and some that used both the textbook and videos. This resource preference is an interesting variable that we should include in our future analyses.

2. Student attitude and motivation data would have been excellent to compare with our analytics data. Unfortunately, we did not collect any kind of learner characteristic data. Future analyses we conduct will definitely move into this area though.

3. Student demographic data was not collected for this study because we were more interested in activity patterns and student use of resources. However, future analyses will include student demographics.

4. The learning analytics system was pretty difficult to create. We basically took existing open source applications (quiz application from open assessments, video application from and put them on our own server. Then we implemented an analytics back end, created a NoSQL database, and stored all of our data in the database. Then we created the dashboard system to pull from our database. This process is quite a technical challenge. However, we are currently developing a platform that would make it really easy for another instructor to do something like this. It would allow other instructors to create content (quizzes, resource pages, videos), upload them to our platform, and use them in courses. The platform is about to do a beta launch, but it is still under development.

Thank you for the questions.


Hi Robert & Steven,

Thanks for an interesting article. I also routinely track student use of resources as much as possible and have struggled with getting students to use video resources on a regular basis.

Your data collection has focused on student usage of the analytics system. Have you considered collecting data on student opinions about how helpful they thought the various components were? For example, did they find the videos more helpful for learning the material than the quizzes, or vice versa?

Have you shared these results with students? Perhaps if they could see how "A" students are studying, they would be more willing to change their study habits.

My last question is about the learning management system (LMS).  How does this system compare to Blackboard, one of the more commonly used systems in higher education?

Thanks again for a great article!



1. It's good to hear (but at the same time not good to hear and frustrating) that you have also struggled with getting students to use video resources. This is a significant issue that needs to be addressed in the online or hybrid learning space.

2. We started investigating student help-seeking behavior in an attempt to understand why students used the resources they did. It seems that students have preferences regarding their resource use choices. For example, some students are independent and can find answers to their own questions quickly; they use textbooks, videos, and the internet more often than others. However, some students who want quick answers go not to those resources (maybe they are too hard, or the students want the easy way out) but to people, such as teaching assistants, friends, tutors, or study groups.

3. In our dashboard we thought about making class comparisons, but we were afraid that people too far below the "A" students would feel discouraged, and students above the "A" students would stop trying. In future implementations of our system we will try out some class comparison features, but we will be mindful of how it affects certain types of students.

4. The system that we use implements interoperability standards (LTI, xAPI), which means it can attach to any major LMS. At BYU right now we are using a proprietary LMS called Learning Suite, but Canvas, Moodle, Blackboard, Sakai, or D2L Brightspace would all work with our system (any LMS with LTI implemented). Our system improves on current LMS dashboards and analytics functionality because a lot of learning occurs outside the LMS, LMSs don't give researchers real-time access to data, and researchers can't control the data reporting aspects of the LMS (it's proprietary).

Thanks for the questions,


Can I ask a question about the instructional philosophy behind the videos?

In chemistry education, there seem to be two prevalent instructional philosophies. Since the 1970s, many have advocated encouraging students to solve problems by "thinking like a scientist," and to solve based on conceptual understanding rather than by applying memorized facts and algorithms.

The second school, which I think includes the "traditionalists," assumes that students must move a sizeable number of facts and algorithms on a topic into long-term memory before they can solve the kinds of problems assigned at the end of the chapter in General Chemistry texts. Among cognitive scientists (who study how the brain works), this also seems to be the consensus recommendation in 2016, due to measured characteristics of the "working memory" where the brain solves problems.

Did the approach in your videos favor a particular instructional philosophy or scientific approach to learning?

-- rick nelson


You pose an interesting question. We developed the videos to help students develop a conceptual understanding of the material. We have heard from a number of students that after watching the videos they have understood certain concepts much better than before. My undergraduate degree was in Neuroscience so I took general chemistry, organic chemistry, and biochemistry; I was also a chemistry teaching assistant for almost 2 years. But despite this, sometimes I would sit down to review a concept, watch a video, and would learn something or see something in a new light that I hadn't realized before. This seems to favor the "thinking like a scientist" approach.

However, the formative quizzes and homework problems that students have to do gives them repeated practice on facts and algorithms necessary for them to be able to solve more complex problems. This seems to favor the "traditionalists" or cognitive science approach.

In conclusion, I don't think I have a good answer to your question. There seem to be elements of both instructional philosophies in our current implementation.



Have you thought about putting a comment section below the videos so students can comment, make suggestions, and get answers as they go? Probably a lot more work on anti-trolling, but maybe a good thing would be to allow anonymous comments.

Also, have you thought of or tried ways to add more production value? Have students do the narration, etc.?


Yes, we've thought about adding a comment section below videos. We have also thought about implementing a question-and-answer application that students could use to ask questions and answer other students' questions (similar to Stack Exchange or Quora). Yes, there would need to be some moderation to protect against trolling, bullying, or cheating, but I think this would be extremely useful to students. This is something we would like to do in the future.

In terms of production, there were students involved in the video creation, image creation, script generation, and filming process. However, it was a very structured process. It seems like there should be a way to involve a greater number of students in a crowdsourced way to create instructional materials at lower cost. We are exploring ways to do crowdsourced assessment creation and content creation, so simple short student created videos would fit well with this idea.

Thanks for the comments,


Last year, I used Zaption (which closed its services) and now use PlayPosit.  Their subscription program includes LTI and I can integrate questions with the lessons.  These are scored and the grades posted in Canvas.  All of them together are only worth 5% of the grade (we've had ~60-70 so far this semester) but I have very high compliance rates since they are worth some points.  I review the results the morning before class and see where students struggled.  If there are any, I give a 5-10 minute review of that content only.  I never re-teach the entire content of the videos.  We spend most of our class time doing problems in small groups.




It sounds like you are doing some really good things. It uses LTI (good), posts a grade back to Canvas (good), and is worth points (good). These are all things our system does, but PlayPosit looks like a good tool as well. It's encouraging to see that students are completing the videos/quizzes. I have a few questions for you:

1. Are you able to track any student data within PlayPosit?
2. Do students like the tool? Is there anything they do not like about it?
3. What kind of student or instructor reporting tools do they have (dashboards, feedback, etc)?



1) The instructor dashboard is pretty minimal: it has a summary and average on each question for all students, and you can see how each student answered each question. Students also have to rate the video and leave a comment at the end for the score to transfer to Canvas, which appears on the dashboard. Most of the comments are "n/a" or "..." which I told them was fine, since reviewing the comments requires an extra click for each of them (I haven't yet found a way to export these, which would be great since I try to spot check some to see where students are struggling before class).

There are a few analytics on each lesson based on Bloom's verbs in the questions and the types of questions, but not much.

2) I think for the most part they like it. A student told me last year that they love it (because it's helpful) but hate it (because they sometimes forgot to do the lessons). At the beginning of the semester, we were running into issues with some grades not transferring, so they had to be recorded manually. Part of it was students not accessing through the link in Canvas (going through links on PlayPosit's site instead), and some was on PlayPosit's side, but that seems to be greatly improved.

The one thing some of them don't like is that they can't redo the questions (which Zaption allowed if they rewatched the video). Once my repeaters are through the course, that won't be an issue since they won't know anything different. The other comments are more about the flipped-class model than about the tool.

3) Students can go back in and see their scores on PlayPosit and review the questions and how they answered, but I'm not sure how many do. I make a YouTube playlist for the week, obviously without questions, to make the videos easier to rewatch after the due date. I have to hide the actual links to the assignment after the due date and time so I don't have to deal with late grades in Canvas, since it will let them submit late and just mark them in pink.

Here is an example of one of my lessons.  This is for my GOB class (pre-nursing students) and they've already had two other lessons (one on assigning oxidation numbers and one worked example of identifying reduction and oxidation in a reaction).

Dear Robert and Steve,

I think your system is intriguing! I was wondering if you have ever considered using a commercial product to develop your videos, your videos with questions, or your analytics software? Articulate Storyline, I think, does something similar to what you are trying to do, although it has a steep learning curve. I also developed videos like the ones you use in your class using Camtasia software, and I added questions with the same software. Then I uploaded the questions directly to Canvas (our learning management system), but they can be uploaded to any LMS with SCORM capabilities. The quiz looks like the ones Alison created using PlayPosit. The grades for the quiz appear in the gradebook, and you can also have an Excel spreadsheet of everyone's grades sent to you once a day.

Thanks for your paper, Caroline


Thanks for the comments. There are many commercial products available to develop videos, questions, and analytics. We considered using a commercially available system, but for our research we wanted (1) real-time access to student data, (2) to be able to track student activities across videos, quizzes, and our dashboard system, and (3) to be able to change the student dashboard system as needed. Because of this, we opted for a more modular and interoperable system. I think that Articulate Storyline is a great program that will work in most use cases. Thanks again.


Bob Belford

Hi All, if possible, please do not use email reply, but reply by commenting on the web site; that way we can keep a record of the discussion below each paper. Please contact me in private if you have problems logging in, and I do understand that sometimes things just slip by. But I am going to take the liberty of reposting Sheila's comment to the web site, and ask that we reply through the website comment/reply feature, and not via direct emails to the ConfChem list.


I would like to applaud your emphasis on collecting data to find out how your resources are being used. This is something I have advocated for years on this forum, because I had similarly revealing results from data collection on BestChoice.

Adding a field where students can enter comments would be a great first start to what Josh has suggested because students are very good at offering constructive suggestions about how things are taught (again this is from my own personal experience collecting user comments).  They are a source that we should tap more often AND listen to what they say.

I find the comments about videos also interesting because, in my opinion, while videos can be a means of transferring information, the information transfer process is not really any different from what it would be in a classroom, EXCEPT that the teacher in the video has less contact with the audience than the teacher in the classroom. However, in days of diminishing student attendance at lectures, they are useful.

We have, for some time, been delivering activities that are a mixture of information pages and question pages. These are designed to act as a bridge between classroom participation and independent study. I have included a link to one of these. If the link doesn't survive the email, go to  Click on Demo (bottom of the blue box). Near the bottom of the left-hand menu there is a section called redox. The activity is the one on oxidation numbers.

Videos are easy to incorporate in these activities, and it seems to me that just like you needed data to see how your students were using the videos, the students need questions that they can do immediately after viewing the video so that they can see whether they have grasped the concepts.  You also need data from those questions to inform you.  We have a system to do all of that and a teacher dashboard.



Thanks for the feedback. Student comments on specific course resources or teaching methods in general would be interesting to solicit and examine. Thanks for the link. Best Choice seems like a good system to do activities like these. And I like that they have an instructor reporting tool (teacher dashboard). Thanks again for the comments!



This is a very interesting paper. Have you considered making the quiz questions sufficiently challenging that you could actually use similar ones on the test in the course?

Do you have correlations between the final grade in the course with time spent using the videos? With the average grade on the video quizzes? With the estimated total time spent studying?




Good questions. We have iterated through a number of different course structures. In our final iteration we used quiz questions that were at the exam level and we limited student attempts to 3 per question. Some questions were multiple-choice and others were short answer. This change made the homework more meaningful for students (although they probably didn't like it as much) and they cared more about the data collected from the homework (shown in the student dashboard). 

Regarding correlations, video viewing in the course had no correlation with final grade in the course. There could be a few explanations for this: (1) only students that needed additional help were watching the videos or (2) there were so many other resources in the course that some students remediated their lack of knowledge in other ways and didn't need to watch videos. In terms of what was most predictive of final grade, we found that students that read the textbook generally did better on the final exam (not very surprising...). Unfortunately, we don't have a metric for total time spent studying, although this would be a good self-report variable to get from students. It seems like there could be some bias in just asking students at the end of the semester, but maybe each week students could report the amount of time spent in the course that week.

Thanks for the ideas.