Almost half a century after the invention of the Internet and three decades after the development of the World Wide Web, online learning is no longer a field reserved for early adopters. Of the chief academic leaders surveyed by the Babson Survey Research Group, 63.3% “agreed that online education was critical to the long-term strategy of the institution” (Allen & Seaman, 2016). Further, the number of students taking online courses has been growing for the last 13 years. However, only 29.1% of academic leaders agreed that their faculty accept the “value and legitimacy of online education.” The instructional quality of online courses is therefore a concern. Researchers agree that the quality of online content may have a significant impact on student satisfaction and success (Palloff & Pratt, 2011; Voigt & Hundrieser, 2008).
Another factor related to student success outcomes, such as graduation rates, is course engagement (Price & Tovar, 2014). Many pedagogical approaches aim to increase engagement; one of them is gamification. Gamification is the adoption of game elements in non-game contexts (Deterding, Dixon, Khaled, & Nacke, 2011; Huotari & Hamari, 2012; Fitz-Walter, Wyeth, Tjondronegoro, & Scott-Parker, 2013). Games can create an immersive feedback environment and, while encouraging players to make choices, they also ease the impact of failure (Gee, 2003). While gamification is a subject of much business and educational research, few studies evaluate the instructional design quality of gamified courses through peer review. Online components of gamefully designed courses may increase student engagement, but are they sound from the instructional perspective?
A peer review process is of great value in evaluating academic courses because faculty views on the instructional quality of courses can be subjective. For example, in the STEM disciplines, while faculty were often focused on producing high-quality graduates, they also contributed to a high level of attrition among academically weaker students (Christe, 2013). Faculty sometimes viewed student withdrawal from STEM majors as a sign of successful instruction: introductory STEM courses were viewed as a gatekeeping process to spare unfit students the rigors of scientific work. Problems such as low achievement, student boredom, and alienation, along with high dropout rates, have been linked to engagement (Fredricks, Blumenfeld, & Paris, 2004; Swap & Walter, 2015).
Indiana University Professor Emeritus of Biology Dr. Craig Nelson is the author of “Dysfunctional Illusions of Rigor” (Nelson, 2010), which illustrates subjective and often erroneous faculty views about their own instruction. 1) Illusion: Hard courses weed out weak students; when students fail, it is primarily due to inability, weak preparation, or lack of effort. Finding: When students fail, it is often due to inappropriate pedagogy. 2) Illusion: Traditional methods of instruction offer effective ways of teaching content to undergraduates, and modes that pamper students teach less. Finding: While lectures teach something, alternative methods teach on average twice as much as traditional lectures.
The instructional effectiveness of courses may also depend on the course delivery mode. Among chief academic officers, 42.3% were more favorable about courses that combine elements of online instruction with those of traditional face-to-face teaching (Allen & Seaman, 2016). Academic leaders rate the promise of blended or hybrid courses as superior to that of fully online courses.
Online courses are often categorized into fully online and hybrid courses. Allen & Seaman (2016) propose four categories: traditional courses, web facilitated courses, blended/hybrid courses, and fully online courses. Traditional courses deliver 0% of content online; the content is delivered in writing or orally. Web facilitated courses deliver 1 to 29% of content online. Blended/hybrid courses deliver 30 to 79% of content online, a substantial proportion. Finally, courses with 80% or more of content delivered online are considered fully online; such courses deliver most or all content online without face-to-face meetings.
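The percentage thresholds above amount to a simple classification rule. As an illustrative sketch only (the function name is ours; the categories and cutoffs follow Allen & Seaman, 2016), the scheme could be expressed as:

```python
def delivery_mode(pct_online: float) -> str:
    """Classify a course by the percentage of its content delivered
    online, following the Allen & Seaman (2016) categories."""
    if pct_online == 0:
        return "traditional"        # 0% online
    elif pct_online < 30:
        return "web facilitated"    # 1-29% online
    elif pct_online < 80:
        return "blended/hybrid"     # 30-79% online
    else:
        return "fully online"       # 80-100% online
```

For example, a course delivering half of its content online would fall into the blended/hybrid category.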
Todd et al. (2017) support the notion that hybrid courses deliver better results than face-to-face or fully online courses, especially in teaching complex content. As an example of such content, the hybrid modality was recommended for teaching ethical decision-making skills, a process-based content. The researchers admitted that the success of the hybrid delivery might have been due to the more extensive development effort invested in the hybrid course compared with the traditional or fully online versions.
Another reason why hybrid courses may be more effective than fully online courses is students’ computer skills. As the academic world and the workplace slowly come to terms with the myth of the digital natives, studies warn about the negative consequences of assuming students have digital skills simply because they are of a younger generation (Kirschner & Bruyckere, 2017). A global study across 33 developed countries reported that only 5% of the general population possess high computer-related skills and only 30% can address medium-complexity tasks (OECD, 2016). In another study, while 83% of millennials report sleeping with their smartphones, 58% have poor skills in solving problems with technology, and out of the 19 countries examined, U.S. millennials ranked last.
The findings about advantages of hybrid courses are supported in the literature by considering the strengths and weaknesses of each modality. Fully online delivery features a self-paced nature, which may lead to rapid progression through key content without sufficient learning taking place (Daymont & Blau, 2011; Kirschner, Sweller, & Clark, 2006). The advantage of such delivery, especially over face-to-face courses, is in the student’s ability to pause, rewind, and otherwise consume content at their own rate (Osguthorpe & Graham, 2003). Hybrid courses tend to capitalize on the benefits of both modes of delivery in keeping students more accountable for their knowledge of the online content, leading to increased learning and course effectiveness (Sapp & Simon, 2005).
In addition to the course delivery mode, other factors contribute to course quality. Peer review of the online course content and the evidence of classroom activities tends to be a primary tool in promoting course quality (Chao, Saj, & Tessier, 2006; Feldman, McElroy, & LaCour, 2000; Little, 2009; McGahan, Jackson, & Premer, 2015). In a paper dedicated to the review of national and statewide evaluation instruments of online courses, Baldwin, Ching, and Hsu (2017) list the following course quality improvement programs:
- Blackboard’s Exemplary Course Program Rubric (2017a)
- California Community Colleges’ Online Education Initiative (OEI) Course Design Rubric (2017)
- The Open SUNY Course Quality Review Rubric (OSCQR) (State University of New York, 2016)
- Quality Matters (QM) Higher Education Rubric (2017)
- Illinois Online Network’s Quality Online Course Initiative (QOCI) (2017)
- California State University Quality Online Learning and Teaching (QOLT) (2017)
Blackboard Exemplary Course Rubric
The Blackboard Exemplary Course Program was designed to recognize courses that “demonstrate best practices in four major areas: Course Design, Interaction & Collaboration, Assessment, and Learner Support” (Blackboard, 2017b). Each category was evaluated quantitatively within five levels of mastery: exemplary (5-6), accomplished (3-4), promising (2), incomplete (1), not evident (0). Reviewers were instructed to apply the lower rating when, within a category, some sub-categories were accomplished in an exemplary manner but others fell below that mastery level.
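The score bands above map a numeric rating to a mastery label. As a minimal sketch (the function name is ours; the bands are those of the Blackboard rubric as described above), the mapping could be written as:

```python
def mastery_level(score: int) -> str:
    """Map a Blackboard ECP category score (0-6) to its mastery level:
    exemplary (5-6), accomplished (3-4), promising (2),
    incomplete (1), not evident (0)."""
    bands = [
        (5, 6, "exemplary"),
        (3, 4, "accomplished"),
        (2, 2, "promising"),
        (1, 1, "incomplete"),
        (0, 0, "not evident"),
    ]
    for low, high, label in bands:
        if low <= score <= high:
            return label
    raise ValueError("score must be an integer between 0 and 6")
```

A category scored 4 would thus be rated “accomplished,” even if one of its sub-categories was exemplary.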
The rubric was distributed publicly under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. This meant that course builders were allowed to use the rubric privately or within their organizations. They could also modify the rubric and institutionalize it, as long as it was for non-commercial purposes and credited Blackboard’s authorship.
(…) please read the full journal article for this redacted content
A complex game requires clear instructions. Effective instructional design is a foundation for gamification of academic courses. When applying gamification to academic courses, it is natural to focus on student engagement and intrinsic motivation. However, it is also important to validate the instructional design approaches to maintain high quality of instruction.
The Blackboard Exemplary Course Program allows faculty to incrementally improve the instructional quality of their online or hybrid courses. Faculty receive confidential, quantitative, and qualitative feedback from anonymous reviewers. The course content and evidence of student activity can be resubmitted to the program multiple times for additional feedback.
The design of the course “Introduction to Computing” at Grand Valley State University demonstrates an application of the short and long game theory for academic courses (Machajewski, 2017d). The use of technology and gamification methods during lectures provides a short-term engagement mechanism. At the same time, the long-term methods of XP tracking and XP trading allow for creating a player journey and an experience-shaping mechanism.
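The long-term XP mechanics mentioned above can be sketched as a small ledger: students accumulate XP across the course and trade it for perks. This is an illustrative model only; the class, method names, and perk costs are hypothetical and not the author’s implementation.

```python
class XPLedger:
    """Illustrative sketch of long-term XP tracking and trading.
    Students earn XP for course activities and may spend accumulated
    XP on perks (all names and values here are hypothetical)."""

    def __init__(self):
        self.balances = {}  # student id -> current XP balance

    def award(self, student: str, points: int) -> int:
        """Record earned XP and return the student's new balance."""
        self.balances[student] = self.balances.get(student, 0) + points
        return self.balances[student]

    def trade(self, student: str, cost: int) -> bool:
        """Spend XP on a perk if the balance covers the cost;
        return whether the trade succeeded."""
        if self.balances.get(student, 0) >= cost:
            self.balances[student] -= cost
            return True
        return False
```

Tracking a running balance across the whole term is what makes this a long-game mechanism: the ledger persists between class sessions, shaping the player journey, while in-lecture gamification provides the short-term engagement.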
Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. Wellesley, MA: Babson Survey Research Group.
Baldwin, S., Ching, Y. H., & Hsu, Y. C. (2017). Online Course Design in Higher Education: A Review of National and Statewide Evaluation Instruments. TechTrends, 1-12.
Bartle, R. (1996). Hearts, clubs, diamonds, spades: Players who suit MUDs. Retrieved from http://www.mud.co.uk/richard/hcds.htm
Blackboard. (2017a). Exemplary Course Program. Blackboard Inc. Retrieved from http://www.blackboard.com/consulting-training/training-technical-services/exemplary-course-program.aspx
Blackboard. (2017b). Blackboard Exemplary Course Rubric. Blackboard Inc. Retrieved from http://www.blackboard.com/resources/catalyst-awards/bb_exemplary_course_rubric_apr2017.pdf
Blackboard. (2017c). Blackboard Announces Winners of 2017 Catalyst Awards. Press Release. Blackboard Inc. Retrieved from http://press.blackboard.com/Blackboard-Catalyst-Awards-2017
California Community Colleges Chancellor’s Office. (2017). About the OEI. Retrieved from http://ccconlineed.org/about-the-oei/
California State University. (2017). QOLT evaluation instruments. Retrieved from http://courseredesign.csuprojects.org/wp/qualityassurance/qolt-instruments/
Cengage. (2014). SAM Helps Intimidated Students Gain Confidence and Proficiency with Computer Technology. Retrieved from http://www.machajewski.org/szymon/files/ss_sam_machajewski.pdf
Chao, T., Saj, T., & Tessier, F. (2006). Establishing a quality review for online courses. Educause Quarterly, 29(3), 32–40.
Chen, X., & Soldner, M. (2013). STEM attrition: College students’ paths into and out of STEM fields: Statistical analysis report. (NCES 2014-001). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC. Retrieved from http://nces.ed.gov/pubs2014/2014001rev.pdf
Chou, Y. K. (2015). Actionable Gamification: Beyond Points, Badges, and Leaderboards. Octalysis Media.
Christe, B. (2013). The Importance of Faculty-Student Connections in STEM Disciplines: A Literature Review. Journal Of STEM Education: Innovations And Research, 14(3), 22‑26.
Daymont, T., & Blau, G. (2011). Deciding between traditional and online formats: Exploring the role of learning advantages, flexibility, and compensatory adaptation. Journal of Behavioral & Applied Management, 12, 156–175.
Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining “gamification”. In Proceedings of the 15th international academic MindTrek conference: Envisioning future media environments (pp. 9-15). Tampere, Finland: ACM.
Feldman, S., McElroy, E. J., & LaCour, N. (2000). Distance education: Guidelines for good practice. Washington, DC: American Federation of Teachers. Retrieved from http://www.umsl.edu/technology/frc/pdfs/guidlines_for_good_practice_DL.pdf
Fitz-Walter, Z., Wyeth, P., Tjondronegoro, D., & Scott-Parker, B. (2013). Driven to drive: Designing gamification for a learner logbook smartphone application. In Proceedings of the 2013 Symposium on Gameful Design, Research, and Applications, Gamification 2013, Stratford, ON, Canada (pp. 42-49).
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74, 59-119.
Gee, J. P. (2003). What video games have to teach us about learning and literacy. ACM Computers in Entertainment, 1(1), 20-20.
Huotari, K., & Hamari, J. (2012). Defining gamification: A service marketing perspective. In Proceeding of the 16th international academic MindTrek conference (pp. 17-22). ACM.
Illinois Online Network. (2017). Quality online course initiative. Retrieved from http://www.ion.uillinois.edu/initiatives/qoci/index.asp
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41, 75–86.
Kirschner, P., & Bruyckere, P. (2017). The myths of the digital native and the multitasker. Teaching and Teacher Education, 67, 135-142. http://dx.doi.org/10.1016/j.tate.2017.06.001
Lander, E. S., & Gates, S. J. (2010). Prepare and inspire. Science (New York, N.Y.), 330(October), 151. http://doi.org/10.1126/science.1198062
Little, B. B. (2009). Quality assurance for online nursing courses. Journal of Nursing Education, 48(7), 381–387.
Machajewski, S. (2013). CIS150. [Mobile application software]. Retrieved from https://itunes.apple.com/us/app/cis150/id944850769?mt=8
Machajewski, S. (2015). Educational Gamification System and Gameful Teaching Process, US Patent Application No. 14/922,321
Machajewski, S. (2016). Computer Software Release: Open Photo Roster. Open Teaching Tools. Book 1. Retrieved from http://scholarworks.gvsu.edu/oer_teaching/1
Machajewski, S. (2017a). Application of Gamification in a College STEM Introductory Course. (Doctoral dissertation). Retrieved from Education Resources Information Center https://eric.ed.gov/?id=ED574876
Machajewski, S. (2017b), Gamification in Blackboard Learn. Conference Proceedings. Retrieved from Education Resources Information Center https://eric.ed.gov/?id=ED575007
Machajewski, S. (2017c). Digital audio assistants in teaching and learning. Blackboard Inc. Retrieved from http://blog.blackboard.com/digital-audio-assistants/
Machajewski, S. (2017d). The short and long game theory for academic courses. Blackboard Inc. Retrieved from http://blog.blackboard.com/the-short-and-long-game-theory-for-academic-courses/
McGahan, S. J., Jackson, C. M., & Premer, K. (2015). Online course quality assurance: Development of a quality checklist. InSight: A Journal of Scholarly Teaching, 10, 126–140.
National Research Council NRC. (2012). Monitoring Progress Toward Successful K−12 STEM Education: A Nation Advancing? Washington, DC: National Academies Press.
Nelson, C. E. (2010). Dysfunctional illusions of rigor: Part 1 – basic illusions. In R. Reis (Ed.), Tomorrow’s Professor: Msg. #1058. Retrieved from http://cgi.stanford.edu/~dept-ctl/cgi-bin/tomprof/posting.php?ID=1058
Neuhauser, A., & Cook, L. (2016). U.S. News/Raytheon Annual STEM Index. U.S. News and World Report.
OECD (2016), Skills Matter: Further Results from the Survey of Adult Skills, OECD Publishing, Paris. DOI: http://dx.doi.org/10.1787/9789264258051-en
Osguthorpe, R. T., & Graham, C. R. (2003). Blended learning environments: Definitions and directions. Quarterly Review of Distance Education, 4, 227–234.
Palloff, R. M., & Pratt, K. (2011). The excellent online instructor: Strategies for professional development. John Wiley & Sons.
Price, D. V., & Tovar, E. (2014). Student engagement and institutional graduation rates: Identifying high-impact educational practices for community colleges. Community College Journal of Research and Practice, 38(9), 766-782.
Quality Matters. (2016). Course Design Rubric Standards. Retrieved from https://www.qualitymatters.org/qa-resources/rubric-standards/higher-ed-rubric
Rank, O., Raglan, L., Dundes, A., & Segal, R. (1990). In Quest of the Hero. Princeton University Press.
Risen, T. (2016). “Coding Isn’t Just for Coders Anymore.” U.S. News. Retrieved from https://www.usnews.com/news/articles/2016-06-08/coding-isnt-just-for-coders-anymore
Ryder, R., & Machajewski, S. (2017). The “UIC German” game app for the enhancement of foreign language learning – Case study. International Journal Of Educational Technology, 4(1), 1-16. Retrieved from http://educationaltechnology.net/ijet/index.php/ijet/article/view/13
Sapp, D. A., & Simon, J. (2005). Comparing grades in online and face-to-face writing courses: Interpersonal accountability and institutional commitment. Computers and Composition, 22, 471–489.
Soergel, A. (2015). “Want a Better Job? Master Microsoft Excel.” U.S. News. Retrieved from https://www.usnews.com/news/blogs/data-mine/2015/03/05/want-a-better-job-master-microsoft-word-excel
State University of New York. (2016). The open SUNY COTE quality review (OSCQR) process and rubric. Retrieved from https://bbsupport.sln.suny.edu/bbcswebdav/institution/OSCQR/OSCQR-Links-BKP-2016-08-09.html
Swap, R. J., & Walter, J. A. (2015). An Approach to Engaging Students in a Large-Enrollment, Introductory STEM College Course. Journal Of The Scholarship Of Teaching And Learning, 15(5), 1–21.
Todd, E. M., Watts, L. L., Mulhearn, T. J., Torrence, B. S., Turner, M. R., Connelly, S., & Mumford, M. D. (2017). A meta-analytic comparison of face-to-face and online delivery in ethics instruction: the case for a hybrid approach. Science and Engineering Ethics, 1-36.
Voigt, L., & Hundrieser, J. (2008). Student success, retention, and graduation: Definitions, theories, practices, patterns, and trends. Noel-Levitz Retention Codifications. November, 1-22.
Yeager, D. S., & Dweck, C. S. (2012). Mindsets That Promote Resilience: When Students Believe That Personal Characteristics Can Be Developed. Educational Psychologist, 47(4), 302–314. http://doi.org/10.1080/00461520.2012.722805