Jose Barreto, John Reilly, David Brown, Laura Frost, Sulekha Rao Coticone, Terry Ann Dubetz, Zanna Beharry, C. Michele Davis-McGibony, Ria Ramoutar, Gillian Rudd
2014-08-12
New technological developments have minimized the training, hardware expense, and distribution problems that once limited the production and use of instructional videos, and any science instructor can now make instructional videos for his or her classes. We created short “Khan style” videos for the topic of buffers in biochemistry and assigned them as homework, followed by group problem-solving sessions in class. We tested the hypothesis that “inverting the classroom” (a popular term for the new format) could replace traditional live lectures, which are typically followed by homework problems that students have traditionally solved mostly alone. Using the inverted classroom method, we found that most of our students achieved mastery in solving buffer problems on an exam, without any live lecture (class averages were ~80%). Our survey data showed that both students and faculty reviewers considered the new format an effective teaching tool. To validate our results, we included six survey questions concerning rigor and fairness; the responses were positive, with means of ~4 on a 5-point scale. Our study included three separate classes, with grade data from 67 students and survey data from 42 students.

For well-trained specialists using expensive hardware, the possibilities for video instruction have existed for many decades. Very recently, a combination of inexpensive hardware (a ~$60 drawing tablet), free software, and free universal access to video storage and retrieval sites on the internet has emerged. These developments have put video production into the hands of any interested party with minimal training (Gannod, Burge, & Helmick, 2007). Salman Khan (director of the Khan Academy website) has created hundreds of science instructional videos that share certain characteristics: They are short (~15 minutes or less) and consist of a black background with extensive use of color for formulas and drawings.
The teacher’s voice is synchronized to the activity appearing on the computer screen; at no time does the teacher’s face or hands appear in the video. The end result is somewhat mesmerizing (a Khan video example on the topic of buffers can be found at http://www.youtube.com/watch?v=HzmI7A578ss). As university biochemistry professors, we were particularly interested in scientific videos that present complex quantitative topics (Eick & King, 2012). Buffers are one example of a problematic quantitative topic, and experience has taught us that this subject area is particularly difficult for our students to master. Other faculty members agree (Orgill & Sutherland, 2008). Our working hypothesis in this study was that video lectures, assigned as homework, could replace live classroom lectures in the presentation of buffer theory and problem solving (He, Swenson, & Lent, 2012; Prober & Heath, 2012), particularly when combined with a collaborative learning environment (Case, Stevens, & Cooper, 2007; Johnson & Johnson, 1974). The preceding assertions need some “case study” documentation if they are to be adopted on a wide scale by understandably skeptical, and traditionally cautious, academics (Bell, 2012).

“Inversion of the classroom” has some inherently appealing characteristics. We have noted that students have less and less appetite for working difficult homework problems individually, and their response to a long “live lecture” is to tune it out (Deslauriers, Schelew, & Wieman, 2011). We have also noted that lecturing has a long, well-documented (and frequently ignored) history of failure as a problem-teaching tool (Powell, 2003). Group problem solving in the classroom makes both the group and the teacher available to help, but it has been criticized as too time consuming, leaving little time for lecture.
Our video lecture homework can be viewed from anywhere, at any time students prefer, ideally when they feel attentive and focused. The videos can be repeatedly rewound and reviewed, and they consume no class time, creating a window for group problem-solving sessions during class. Khan videos are less available for upper-division science classes, and we became interested in converting a junior-level biochemistry survey class (enrolling mostly biology majors) to the “inverted classroom” format. In the past, the topic of buffers often seemed to produce test anxiety (and failure) in our student population. For most of our students (>90%), buffer lectures in General Chemistry II were their most recent preparation, and very few had taken analytical chemistry prior to taking biochemistry.

Methods

A number of methods were used. First, over a 3-week period, students watched seven buffer videos, receiving e-mail links to YouTube videos with a specific class day set for completion. Second, on the due date and without further explanation, they arrived in class and were asked to go to the board in groups of four or five and work a buffer problem to a final solution; collaborative problem solving is a very important part of our method (Klionsky, 2001; Paulson, 1999). Three different buffer problems were given to six groups of four or five randomly assigned students, so that each problem was worked by two groups. After 15 minutes, the students were asked to “shift” to one of the other two problems and either complete the problem or certify a correct solution. Note that after two shifts, every student had seen every problem, and there were two solutions to each problem on the board for comparison. The process was time-consuming, requiring approximately 45 minutes, but it led to completely correct solutions (with the instructor’s assistance). Students were encouraged to ask for help whenever their group stopped making progress.
Third, the above method was repeated on three other class days. On test day, students were given buffer problems that were similar, but not identical, to the video problems and classroom work. Fourth, the tests were graded by the principal investigator (PI) and scanned into a PDF file with the students’ names hidden. The PDF file was sent for review to the biochemistry faculty listed as coauthors on this article. After watching the six relevant videos and reviewing the graded results, the biochemistry faculty completed a six-question survey. After students performed group work to proof a detailed test key for errors, their graded exams were returned to them. After receiving their grades, students were asked to fill out an anonymous survey containing the same questions that were asked of the faculty reviewers. The survey questions began with “the usefulness of the videos” and continued on to a series of “validation questions” about the rigor and fairness of the grading. The class labels are not chronological; Class 3 preceded the surveyed classes. We added Class 3 to the “majors” and “grades” data (Figures 1 and 2) at the request of reviewers. The class labels and chronology are not important; they are essentially three separate experiments using three separate classes.

The survey questions, abbreviated here, were answered on a Likert-type scale (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, 5 = strongly agree), with 5 always the most positive response. The questions were as follows:

1. The videos were useful in preparing students for the test.
2. The test problems were difficult.
3. The test grades reflected mastery.
4. Other faculty would have graded the problems similarly.
5. The grading was consistent.
6. The grading was fair.

The most important question is Question 1, because it alone will validate or refute our hypothesis.
Questions 2–6 were intended to establish that the problems were reasonably difficult and that competent, fair grading ensued, so that any conclusions drawn from Question 1 would be valid. We used the following two buffer problems for the Class 1 test (Class 2 answered very similar problems), and the reader can judge the level of rigor and difficulty posed by the test:

Buffer Problem 1: We need 600 ml of phosphate buffer that is 600 millimolar in total phosphate species at pH 7.00. We have available phosphoric acid (H3PO4) and all three sodium salts: NaH2PO4, Na2HPO4, and Na3PO4. Choose the correct A (acid) and B (base) pair in order to make a buffer at pH 7.00. The phosphate pKas are pKa1 = 2.12, pKa2 = 7.21, and pKa3 = 12.67. Identify and label A and B by their formulas, and calculate the grams of A and B that must be dissolved in 600 ml of water to make the buffer. (Note: A periodic chart of atomic weights was available.)

Buffer Problem 2: We need 300 ml of HEPES buffer that is 300 millimolar in total HEPES at pH 7.11. We have available only the crystalline zwitterionic form of HEPES (which is an onium sulfonate), and we have 1.00 M HCl and 1.00 M NaOH solutions available, if needed. The pKa of HEPES is 7.56; FW = 238.1. (A) Calculate the grams of HEPES that are needed. (B) Calculate the milliliters of HCl or NaOH that need to be added. (C) Calculate the milliliters of water that must be added to achieve a total volume of 300 ml.

It was thought that the first problem would be somewhat easier. It is a straightforward matter of identifying the two components in the buffer equation, pH = pKa + log([B]/[A]), recognizing that B and A can be construed as mole ratios (the students practiced this type of problem), and then performing an algebraic substitution so that the individual moles of A and B can be calculated from the total moles (A + B = total moles of buffer).
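For readers who want to check the arithmetic, the sketch below (Python) walks through both problems as they were described. The formula weights for anhydrous NaH2PO4 (119.98 g/mol) and Na2HPO4 (141.96 g/mol) are our assumptions, since the original exam supplied a periodic chart rather than formula weights; the code is an illustration of the algebra, not the graded solution key.

```python
import math

# --- Buffer Problem 1: 600 ml of 600 mM total phosphate at pH 7.00 ---
# pH 7.00 is closest to pKa2 = 7.21, so the pair is A = NaH2PO4, B = Na2HPO4.
pH1, pKa2 = 7.00, 7.21
total_moles_1 = 0.600 * 0.600                  # liters x molarity = 0.36 mol

# Henderson-Hasselbalch: pH = pKa + log([B]/[A])  =>  B/A = 10**(pH - pKa)
ratio_1 = 10 ** (pH1 - pKa2)                   # mole ratio B/A, ~0.62
moles_A = total_moles_1 / (1 + ratio_1)        # from A + B = total, B = ratio * A
moles_B = total_moles_1 - moles_A

# Assumed formula weights (g/mol) for the anhydrous sodium salts:
FW_A, FW_B = 119.98, 141.96                    # NaH2PO4, Na2HPO4
grams_A, grams_B = moles_A * FW_A, moles_B * FW_B

# --- Buffer Problem 2: 300 ml of 300 mM HEPES at pH 7.11, pKa 7.56 ---
pH2, pKa, FW_hepes = 7.11, 7.56, 238.1
total_moles_2 = 0.300 * 0.300                  # 0.09 mol HEPES
grams_hepes = total_moles_2 * FW_hepes         # part A

# Initial pH of the dissolved zwitterion (a pure weak acid): [H+] ~ sqrt(Ka * C)
Ka = 10 ** (-pKa)
pH_i = -math.log10(math.sqrt(Ka * 0.300))
assert pH_i < pKa - 1                          # the ~100% A approximation holds

# NaOH converts A to B until B/A matches the target pH
ratio_2 = 10 ** (pH2 - pKa)                    # ~0.35
moles_NaOH = total_moles_2 * ratio_2 / (1 + ratio_2)
ml_NaOH = moles_NaOH * 1000 / 1.00             # part B, using 1.00 M NaOH
ml_water = 300 - ml_NaOH                       # part C (solid volume ignored)

print(f"Problem 1: {grams_A:.1f} g NaH2PO4 and {grams_B:.1f} g Na2HPO4")
print(f"Problem 2: {grams_hepes:.1f} g HEPES, {ml_NaOH:.1f} ml NaOH, "
      f"{ml_water:.1f} ml water")
```

Under these assumptions, Problem 1 works out to roughly 26.7 g of NaH2PO4 and 19.5 g of Na2HPO4, and Problem 2 to roughly 21.4 g of HEPES plus ~23.6 ml of 1.00 M NaOH.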
Buffer Problem 2 seemed more difficult because the students had been taught that when only one buffer component is available, they must identify the initial pH (pHi) of the solution; students therefore had to perform a weak-acid calculation, obtain pHi, and then add enough base to convert some A to B. The students were allowed to make an approximation: If pHi was more than one pH unit below the pKa of the relevant A/B pair, then the initial solution could be considered ~100% A.

Data and results

Figure 1 shows the detailed composition of the three classes with regard to declared majors. This composition is typical of our junior- and senior-level biochemistry survey classes. The biochemistry class is intended to serve BA and BS students who are biology, chemistry, and health science majors. The first 7 weeks of the class were devoted to a foundational investigation of biomolecules (including 1 week on enzyme activity), and the last 7 weeks were devoted to metabolism. General Chemistry II was the last class in which most of the students had encountered buffer calculations. Students often wait several years before taking biochemistry, so remediation concerning pH and acid/base concepts is needed.

Figure 2 shows the grades achieved by the students using videos as their lecture preparation, followed by collaborative problem solving in class (as described in the Methods section). Buffer Problems 1 and 2 yielded approximately the same grades and almost exactly the same average grade, and there was very little difference between the two classes, which were two semesters apart. Figure 3 shows the responses to the student and faculty surveys. All survey questions were asked with 5 as the most positive response; for example, on the question about difficulty of the problems, a 5 would have indicated that the problems were “extremely difficult.” The questions are listed in the Methods section; students and faculty answered the same six questions.
In both classes, students and faculty thought the videos were at least “useful” as preparation for the buffer test. Questions 2–6 were essentially validations of the test itself: They were intended to establish that the buffer problems were reasonably difficult and that the grading was consistent and fair, so that a successful outcome could be interpreted as a true success rather than the result of an overly easy exam or overly generous grading.

Discussion

Not all the students viewed the videos “on time.” On two separate occasions, one third of the class had not viewed the video by the homework deadline. We counted on peer pressure to help alleviate this situation and created a rule whereby a group could eject a member who could not help solve the problem because he or she had not seen the video (no ejections actually occurred). By test day, there were over 30 viewings of every buffer video in a class where 27 students took the test; we surmise that at some point most students viewed all the videos. We could observe the total number of viewings of each video, but we were not able to associate a particular student with a viewing. In our study, no points were awarded for video watching. Compelling upper-division university students to do homework by offering points is controversial, but awarding a few points for watching might be a useful carrot.

Both students and faculty agreed that the videos were (at least) “useful” in preparing students for the test, with survey scores always greater than 3.6 on this question (see Figure 3, Question 1). We confirmed our hypothesis: It is indeed possible for most students to master complex quantitative problems by learning from videos (without any live lecture), followed by group problem work in class (see also the grade averages shown in Figure 2).
Readers are encouraged to judge for themselves whether the level of difficulty of the problems (shown in the Methods section) is sufficiently rigorous for a biochemistry course. It was somewhat surprising that the students did not consider the problems to be extremely difficult (Question 2 was scored near neutral; see Figure 3); we conclude that our method was successful enough to make the problems appear simple. Our success is real: Validation of the grading by the faculty reviewers and students confirmed that the tests were scored fairly and consistently and that handing out “easy grades” was not the reason for overall student success. The PI (Barreto) saw the faculty survey results only in aggregate, and the faculty respondents knew that anonymity from the PI was part of the design, so objective answers were more likely than not. Note also that there were very few declared chemistry majors (Figure 1), so an advanced background in buffers cannot explain the overall student success.

Interestingly, both the standard deviations and the magnitudes of the means were remarkably consistent when comparing student and faculty responses (see Figure 3). We expected more divergence in the standard deviations because some students did not do well, and we thought this might polarize the responses, particularly on validation questions about problem difficulty or fairness. Faculty members have no such potential “built-in” bias because they did not take the test for a grade. Excellent agreement was achieved in two separate classes (Figures 2 and 3), showing that the results are reproducible. The true trial number for grades (Figure 2) is 67 students, and for survey data (Figure 3) it is 42 students; overall compliance was >95% for the survey data. Note that all of our data were based on a combination of in-class group problem solving and video viewing.
We believe that these activities are synergistic, but we did not test the variables separately (doing so would raise serious ethical considerations by creating a treatment group that we believe would cripple student success). As an example, we predicted that video viewing alone would lead to catastrophically low problem scores, because students initially made many mistakes at the board (after viewing the videos without practice). Over a 3-week period, with weekly assignments of two to three different videos (a total of seven), each showing different buffers and buffer types and leading to the buffer problem exam in Week 4 of the semester, problem solving did improve with practice. Finally, consider the student survey responses to Question 1 (Figure 3, Bar 1: “The videos were useful”), where a minimum score of 2.0 would have indicated a student perception of “somewhat useful” and higher scores would indicate an increasing level of usefulness. With this scale in mind, we note that for Question 1 (the most important survey question), we achieved high positive responses from the students: Class 1 = 4.1 and Class 2 = 3.6.

Conclusion

Inverting the classroom by combining video homework and group problem solving in class leads to overall student success, with relatively low attrition, for a difficult, quantitative buffer exam. Problem-based laboratory instruction might also benefit from this method (Barreto et al., 2007).

Jose Barreto (email@example.com) is a professor, John Reilly is an associate professor, David Brown is a professor, Sulekha Rao Coticone is an associate professor, Terry Ann Dubetz is an associate professor, and Zanna Beharry is an assistant professor, all in the Department of Chemistry and Physics at Florida Gulf Coast University in Fort Myers. Laura Frost is the director of the Whitaker Center and professor of chemistry at Florida Gulf Coast University. C.
Michele Davis McGibony is a professor and Ria Ramoutar is a lecturer, both in the Chemistry Department at Georgia Southern University in Statesboro. Gillian Rudd is an associate professor in the Chemistry Department at Georgia Gwinnett College in Lawrenceville.
Published by NSTA.
This page can be found at http://digital.nsta.org/article/A+Case+Study+for+Teaching+Quantitative+Biochemical+Buffer+Problems+Using+Group+Work+and+%E2%80%9CKhan+Style%E2%80%9D+Videos/1783523/221015/article.html.