Guest Interview Series by Dr. Danny Glick

Hear From the Experts on How AI Can Enhance Online Education in a Pre- and Post-COVID Era

By IGI Global on Mar 22, 2021

In response to the ongoing shift to remote education, Dr. Danny Glick, University of California (UC), Irvine, USA, and editor of Early Warning Systems and Targeted Interventions for Student Success in Online Courses, has been conducting a series of interviews with leading industry experts, research scientists, and university professors. In this interview series, he seeks to explore research-based principles, emerging trends, and initiatives for driving student engagement and success in online courses.

It is his hope that this interview series will be an important step towards helping the education community navigate successfully the “new normal”. View the latest interview below featuring Dr. Kara McWilliams, General Manager of the AI Research Laboratories at ETS.

Introduction from Dr. Glick

Today, I am delighted to be speaking with Dr. Kara McWilliams, General Manager of the AI Research Laboratories at ETS. She is an expert in AI in the workplace and educational assessment, language learning, teaching and assessment technologies, learning analytics, and personalized learning. She contributed the chapter "Capturing Student Affect: Designing and Deploying Effective Microsurveys" to my new book Early Warning Systems and Targeted Interventions for Student Success in Online Courses (IGI Global).

Early Warning Systems and Targeted Interventions for Student Success in Online Courses
Prof. Danny Glick (University of California, Irvine, USA) et al.
©2020 | 374 pgs. | EISBN: 9781799850755
  • Editor Panel Discussion
  • 15 Chapters
  • Perspectives from 5 Continents
  • Covers Game-Based Learning, Learning Environment
    & Learning Support

The Interview Featuring Dr. McWilliams


DR. GLICK: Good afternoon, Kara. Thank you so much for making time for this interview. COVID-19 has resulted in schools being shut down across the world. Globally, more than 1.2 billion students in 186 countries are out of the classroom. As a result, education has changed dramatically, as schools and universities have moved to fully distance-learning models. Concomitant with the boom in online learning are escalating concerns about academic accountability, specifically with respect to student outcomes as measured by persistence (i.e., retention) and success (i.e., final course grade).

Given the many challenges associated with distance learning, what new considerations should teachers and instructional designers be keeping in mind?

DR. MCWILLIAMS: Hi, Danny. Thank you so much for the opportunity to discuss these very important topics. As you noted, the pandemic has been a deep, global disrupter of education. While the unfortunate circumstances associated with COVID-19 have created opportunity, introducing some to new and different ways of teaching and learning, they have simultaneously highlighted many shortcomings, like some of the inequities of educational technology. Through the disruption, one constant is emerging: understanding administrator, educator, and student needs, in the context of their values, beliefs, backgrounds, and experiences, and developing learning experiences that meet those needs is how we can support success in remote learning now and in the future.

Four core needs that should be considered when developing learning experiences include efficiency, insights, efficacy, and equity.

The first is efficiency. With their many competing demands, students seek efficiencies in how they access, engage with, and complete instructional content and assessments. Capabilities that promote learning efficiencies, such as personalization, adaptation, targeted feedback, and actionable interventions, should be considered. Personalization can range from simply providing students choice among a set of required materials to digital learning tools with robust machine learning algorithms that become ‘smarter’ as a student progresses through the program and target instruction accordingly.
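As a minimal illustration of the rule-based end of that personalization spectrum, the sketch below (an editor's example with hypothetical objective names and mastery scores, not a description of any specific product) steers a student toward the objective where they currently need the most support:

```python
# Illustrative sketch: pick a student's next learning objective based on
# running mastery estimates. All objective names and scores are hypothetical.

def next_objective(mastery, threshold=0.8):
    """Return the least-mastered objective still below the threshold,
    or None if the student has demonstrated proficiency in all of them."""
    remaining = {obj: score for obj, score in mastery.items() if score < threshold}
    if not remaining:
        return None  # nothing left to target; move the student forward
    return min(remaining, key=remaining.get)

mastery = {"fractions": 0.9, "ratios": 0.55, "percentages": 0.7}
print(next_objective(mastery))  # ratios
```

A real adaptive system would update these mastery estimates continuously from assessment data; the selection rule itself stays this simple in spirit.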

The second is insights. Though there are many advantages of remote learning, the separation of time and space can create a breakdown in communication between an educator and learner. Non-verbal cues that educators often use to gauge understanding are removed, and informal interactions that might occur in the classroom disappear as well. When designing learning experiences, enabling actionable insights and instant communication is critical. Here, a consideration for instructional designers is creating learning experiences that capture the data needed to give educators insight into access, engagement, progression, and performance, as well as creating targeted communication opportunities like personalized messaging templates.

The third is efficacy. As they shift to remote learning, educators and instructional designers have been adopting new and different educational technology to support student success. As they consider which technology-enhanced learning solutions to implement, decision-makers should seek valid and reliable information about how a solution was developed and evidence of whether that solution achieves outcomes, among whom, and in what context. Additionally, teachers and administrators should rely less on evidence of traditional relationships between use and high-stakes outcomes, like summative assessment scores, and rely more on reliable evidence that a learning solution is meaningfully supporting engagement, persistence, and learning progressions.

Last, but most important, is equity. The digital divide has certainly been highlighted during the pandemic, but so have equity issues like non-uniform data standards, responsible use of AI, and differential efficacy of learning tools among sub-populations of students. Administrators and educators should consider these issues as they turn to remote learning. And, if adopting digital learning tools, they should expect vendors to provide information about how their solutions are addressing these issues.


DR. GLICK: Engagement is one key to ensuring that students are successful as they pursue a college degree. Research has shown that achieving student engagement in online courses may be more important than it is in on-campus courses because online students have fewer ways to be engaged with the institution and perhaps greater demands on their time and attention as well. In other words, engagement may be the critical key to making online learning an essential component of higher education and an indispensable part of an institution’s future.

Which design principles, or elements, support the design and implementation of student-centered learning environments that engage and support students for greater success?

DR. MCWILLIAMS: The importance of engagement in online learning environments cannot be overstated, and your question about design principles is an important one.  There are five approaches that I have found to support student engagement in remote learning environments.

The first is designing and developing a course with students rather than for them. As I learned from my colleagues Jeff Bergin and Erin Scully, who have done a lot of work in this area, the co-design process is really the first step in supporting the development of truly engaging and effective learning experiences. As higher education evolves to meet the realities of learning during the pandemic, building empathy for the students we serve and pairing those insights with what we know about how people learn most effectively is how we will create engaging experiences.

Through the co-design work that I have done, I have learned that there are three “no regrets” elements that will support student engagement: setting personal learning goals, tracking progress toward those goals, and reflecting on one’s progress. These allow personalization, targeted feedback, and opportunities for engagement with remote learning peers.

Students need to know what educators want them to learn, so educators should provide a goal for them to work toward; students also need to take ownership of their own learning by identifying personal goals. A learning-objective strategy meets the first need, while opportunities for goal setting, metacognition, and self-regulated learning satisfy the second.

When progressing toward one’s goals, students may disengage if they are being presented with content that is addressing objectives they have already demonstrated proficiency in. Personalizing a student’s learning journey to target their tasks, activities, and assessment in areas where they need the most support will help them remain engaged. 

Like the importance of creating communication channels between educators and students, peer-to-peer interactions will support student engagement as well. Educators and instructional designers should create opportunities for students to engage with each other in the learning process. Peer-to-peer interaction will not only help learners feel more connected but will immerse them in conversations with peers of diverse backgrounds and perspectives.

A final consideration to support engagement in an ongoing way is to monitor the effectiveness of instructional content, activities, and assessments and remain flexible.  Tracking key indicators like access, engagement, progression, and performance can help designers pivot from experiences that are less engaging and implement more engaging approaches.

DR. GLICK: As schools, colleges, and universities in the US and elsewhere embrace online learning, higher attrition will have a detrimental impact on students and institutions alike. Despite this, research on early warning systems and targeted interventions is scarce. What research there is focuses almost exclusively on behavioral risk signals, such as course completion, persistence through the first year, retention through a program of study, and completion within a preferred period. In your and Dr. Jeff Bergin’s chapter, you note that monitoring and measuring these broader outcomes, while important in terms of institutional initiatives, may not enable early enough intervention to successfully improve an outcome. Rather, these broader outcomes are dependent upon students remaining engaged within courses throughout their academic programs. How can online higher education harness learning analytics and AI to improve student retention?

DR. MCWILLIAMS: When used responsibly, learning analytics and AI have the potential to substantially impact how we think about engagement and improving student retention in online higher education. AI is still very much in its infancy, but there are many applications of machine learning that can support institution-wide efforts to support student success.

There is sometimes a disconnect between inferences made from the data typically used to monitor student engagement, like attendance and assignment completion, and a student’s actual level of engagement. Typical measures may suggest a student is engaged while the student may feel they are not meaningfully progressing toward their goals. However, aggregating data captured by course-specific capabilities that measure non-cognitive constructs, like confidence and reflection on one’s own learning, can help institutions identify students who may require extra support or alternative learning experiences.
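To make that aggregation concrete, here is a minimal sketch (an editor's example; the signal names and weights are assumptions for illustration, not ETS practice) of combining behavioral signals with self-reported non-cognitive measures into a single early-warning flag:

```python
# Illustrative early-warning sketch: blend behavioral signals with
# non-cognitive measures (e.g., from microsurveys) into one risk score.
# All signal names and weights are hypothetical; inputs are normalized to [0, 1].

def risk_score(signals):
    """Weighted risk in [0, 1]; higher means more likely to need support."""
    weights = {
        "low_logins": 0.25,        # infrequent course access
        "missed_deadlines": 0.25,  # late or incomplete assignments
        "low_confidence": 0.30,    # microsurvey: self-reported confidence
        "low_reflection": 0.20,    # microsurvey: engagement with reflection prompts
    }
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

# A student whose behavioral measures look fine but whose microsurvey
# responses signal trouble still crosses the intervention threshold:
student = {"low_logins": 0.2, "missed_deadlines": 0.1,
           "low_confidence": 0.9, "low_reflection": 0.8}
if risk_score(student) > 0.5:
    print("flag for advisor outreach")
```

The example illustrates the point above: attendance-style measures alone would miss this student, while the aggregated score surfaces them for support.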

Additionally, research shows that students who are engaged in non-academic groups or programs are more likely to persist in higher education – both online and face-to-face.  AI has the potential to support student engagement through virtual-community recommendations.  Institutions typically collect data that could provide insight into a student’s area of interest or expertise.  Smart systems could learn about a student’s interest through that data and provide recommendations for community programs that might interest them. 
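A virtual-community recommender of the kind described could, in its simplest form, rank programs by overlap between a student's interest tags and each program's tags. The sketch below is a hypothetical editor's example (real systems would learn interests from much richer signals than explicit tags):

```python
# Illustrative sketch: recommend community programs by tag overlap
# (Jaccard similarity). All program names and tags are hypothetical.

def jaccard(a, b):
    """Similarity of two tag sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(interests, programs, top_n=2):
    """Return the top_n program names most similar to the student's interests."""
    ranked = sorted(programs, key=lambda p: jaccard(interests, programs[p]),
                    reverse=True)
    return ranked[:top_n]

programs = {
    "Robotics Club": {"engineering", "coding"},
    "Debate Society": {"public speaking", "writing"},
    "Data Science Circle": {"coding", "statistics"},
}
print(recommend({"coding", "statistics"}, programs))
# ['Data Science Circle', 'Robotics Club']
```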

Lastly, actionable insights provided through shared dynamic dashboards can help a student’s support community – advisors or counselors – gain an understanding of the ecosystem of a student’s experience and provide targeted supports at the appropriate times. Data like course-taking and performance could be paired with self-perception data and community engagement data to provide a more holistic understanding of a student’s needs and provide alerts when appropriate.

Importantly, each of these possibilities should be considered within the context of the responsible use of AI. It should be transparent to students what data are being captured, with whom they are being shared, and how they are being used.

DR. GLICK: A survey of 1,967 faculty members and 178 administrative leaders of digital learning done for Inside Higher Ed found that “Sixty percent of faculty members believe that academic fraud is more common in online courses than in face-to-face courses.” With many schools still uncertain about when their doors will fully reopen, the need to assess student learning is becoming more pressing. Many teachers express concerns around cheating. How should teachers approach assessment during COVID? Should they approach assessment differently?

DR. MCWILLIAMS: You are touching on two important topics with this question – academic integrity in remote environments and next generation assessment – and the two are very related.

The AI community has been developing capabilities that have the potential to support monitoring and identifying academic fraud in education, though they exist in varying stages of maturity.  Advances in AI through natural language processing and automated speech recognition as well as computer vision, eye tracking, and keystroke logging have already helped reduce instances of cheating, particularly on high stakes assessments, and will continue to do so as more research and development is conducted.

The broader question about assessment is even more salient, though. In short, yes, I believe educators should be approaching assessment differently – but not only because of the pandemic.  Everything that recent advances in the learning sciences have taught us points to the need for the educational community to start to view assessment as an opportunity to support students on their learning journey, rather than a summative measure of proficiency. 

An aligned, interconnected assessment, feedback, and targeted intervention system can help students achieve proficiency while also offering reliable measures of student performance.  Well-designed learning experiences can provide insight into whether students are meaningfully engaging with assigned content, whether and how often they are accessing recommended targeted content to fill skills gaps, the extent to which they are progressing in their understanding, as well as provide indicators of motivation and engagement.

There is certainly opportunity for academic dishonesty in the approach to assessment that I propose. But, if educators focus more on formative indicators of performance so that they can provide meaningful feedback and guidance, rather than summative indicators of proficiency, students may engage in the learning process in a more honest, transparent way.

DR. GLICK: Kara, thank you so much for sharing your valuable insights and experiences.


For more information regarding this research and to review Dr. Glick and Dr. McWilliams’s research, view the IGI Global publication, Early Warning Systems and Targeted Interventions for Student Success in Online Courses.

Available in print and electronic formats, this title can be purchased at a 40% discount* when you utilize the coupon code GLICK40 through IGI Global’s Online Bookstore. Additionally, this publication is available across preferred providers such as GOBI Library Solutions, EBSCOhost, OASIS, and Ebook Central (discounts may vary), as well as IGI Global’s InfoSci-Books (6,000+ e-books) database.

Visit the publication’s webpage to order, or contact Customer Service at 717-533-8845 ext. 100 with questions. For researchers, be sure to recommend this publication or the InfoSci-Books database to your library to have access to this critical content.

About Dr. Danny Glick: Danny Glick is a Research Affiliate at the University of California, Irvine’s Online Learning Research Center, where he explores ways to improve student persistence and performance in online courses using early warning systems and light-touch interventions. He is a former visiting scholar at the University of California, Irvine’s School of Education, where he investigated the effects of blended learning on the achievement of low-income students. Dr. Glick is also the Director of Pedagogical Implementation at Edusoft, a subsidiary of ETS, where he leads a team of EdTech implementation specialists. For the past 20 years, he has helped ministries of education and higher education institutions in 35 countries shift from traditional instruction to online learning. Dr. Glick holds a PhD in Learning Technologies and a Master’s degree in Curriculum & Instruction and has presented and published on topics including early warning systems, targeted interventions, student persistence, and learning design.

About Dr. Kara McWilliams: Kara McWilliams is a leader in educational research, measurement, and evaluation. Currently, Dr. McWilliams is the Vice President of Learning Science at Macmillan Learning where she leads a team of research scientists who measure the efficacy of educational technology. Dr. McWilliams is passionate about researching the impact of digital technologies in higher education, and how insights can inform teaching and learning. She has twelve years of experience conducting qualitative and quantitative investigations of how course and classroom interventions can improve learner outcomes and influence learning gains. Dr. McWilliams holds a doctorate in Educational Research, Measurement and Evaluation and a Master’s degree in Curriculum & Instruction from Boston College and has presented and published on topics including methods for researching the efficacy of edtech, hierarchical linear modeling, and other designs for nested classroom data, and college and career readiness and success.

For your reference, a sample of related titles is also featured in IGI Global’s InfoSci-Books database and available for purchase in print and electronic formats. Be sure to recommend these titles to your librarian to ensure your institution can acquire the most emerging research.


Disclaimer: The opinions expressed in this article are the author’s own and do not reflect the views of IGI Global.

About IGI Global: Founded in 1988, IGI Global, an international academic publisher, is committed to producing the highest quality research (as an active full member of the Committee on Publication Ethics, COPE) and ensuring the timely dissemination of innovative research findings through an expeditious and technologically advanced publishing process. Through its commitment to supporting the research community ahead of profitability, and taking a chance on virtually untapped topic coverage, IGI Global has been able to collaborate with 100,000+ researchers from some of the most prominent research institutions around the world to publish the most emerging, peer-reviewed research across 350+ topics in 11 subject areas, including business, computer science, education, engineering, social sciences, and more.

Newsroom Contact
Caroline Campbell
Assistant Director of Marketing and Sales
(717) 533-8845, ext. 144

*40% discount is only valid on purchases made directly through IGI Global's Online Bookstore. It is not intended for use by book distributors or wholesalers.
