The Quality Matters™ Program (www.qualitymatters.org) is a set of standards (or rubric) for the design of online college-level courses and the online components of hybrid/blended courses, and a peer review process for applying these standards. The Quality Matters Rubric is based on recognized best practices, built on the expertise of instructional designers and experienced online teachers, and supported by distance education literature and research. The goals of the program are to increase student retention, learning, and satisfaction in online courses by implementing better course design.
The Quality Matters project was initiated by the MarylandOnline (MOL) consortium, a voluntary, not-for-profit, educational association of two and four-year institutions in Maryland. MOL was established in 1999 to leverage the efforts of individual campuses that were committed to the expansion of online educational opportunities in Maryland through collaborative activities. MOL and its members cooperate to support and maintain a portal for online programs and courses in Maryland, engage in joint faculty training initiatives, develop joint online programs, share online courses through a seat bank arrangement, and pursue federal, state and foundation support for a variety of distance learning initiatives. One of these initiatives is the Quality Matters project.
In the spring of 2003, MOL submitted a proposal to the Fund for the Improvement of Postsecondary Education (FIPSE) for the creation of a rubric for the design of online courses and a peer review process for evaluating and improving existing online courses.
The title of the proposal was: “Quality Matters: Inter-Institutional Quality Assurance in Online Learning.” FIPSE awarded MOL $509,177 over three years (September 2003 – August 2006) to carry out the project. The agency was interested in this proposal among many that involved quality assurance in online education because of the prospect of developing standards that would be inter-institutional and inter-segmental and the peer-to-peer structure of the proposed course review process. This proposal held the promise of a quality assurance tool that was both scalable and replicable, criteria that are fundamental to the FIPSE grant program.
The collaborative nature of the project operated at several different levels. The co-principal investigators, Mary Wells, Director of Distance Learning at Prince George’s Community College and Christine Sax, Assistant Dean of Social, Behavioral, Natural, and Mathematical Sciences at the University of Maryland University College, personified the inter-segmental character of the initiative. Experienced faculty and support staff from throughout the MOL institutions served on the various committees scanning the research and best practices literature, developing the rubric standards and a training program for peer reviewers, testing and refining preliminary versions of the rubric, etc. External institutional and organizational partners across the U.S., including the Kentucky Virtual University (now the Kentucky Virtual Campus), the Michigan Virtual Community College Consortium, the Sloan Consortium, the Southern Regional Education Board (SREB), and the Western Cooperative for Educational Telecommunications (WCET), advised the co-directors as the project moved through its various phases.
During the second year of the grant, the co-principal investigators began making presentations at state, regional, and national conferences. These presentations generated widespread interest in the Quality Matters Rubric and evaluation process. In 2005, MOL received several awards, including the WCET Outstanding Work (WOW) Award and the USDLA 21st Century Best Practice Award. In the second and third years of the grant, peer reviewer training to develop a cadre of reviewers attracted participants from 158 different institutions spanning 28 states. More than 700 faculty and instructional development staff were trained during this period. Trained peer reviewers served on the first rounds of course reviews, but also brought their experience back to their home campuses. Several MOL institutions made formal commitments to review and enhance their online courses using the Quality Matters Rubric, and a variety of institutions across the country began to adapt the QM Rubric and review process to serve their own agendas for online course development and quality assurance.
Key Terms in this Chapter
Learning Outcomes: The accomplishments of students in a course, as measured through various forms of assessment.
Scalability: The potential of a process or function to handle a larger volume of activity without degrading.
Rubric: A set of criteria to benchmark or evaluate a product, activity, or process.
Hybrid Course: A course with both online and face-to-face components. Generally speaking, at least 25% of the course must be online for a course to be treated as a hybrid in the Quality Matters course evaluation process.
Quality Assurance: A systematic program for determining whether a product or process is performing according to established standards.
Alignment: Critical course elements working together to ensure that students achieve the desired learning outcomes.
Student Retention Rate: In the context of a course, the student retention rate is the percentage of initially enrolled students who complete the course.
Course Design: The forethought and planning that an instructor or course development team puts into a course, i.e., the elements of a course that are built online and the planning for how a course should unfold over time.
Learning Objectives: Course learning objectives describe what students are to gain from instruction in a course. The Quality Matters Rubric expects learning objectives to be measurable.
Course Delivery: The actual teaching of a course, i.e., the implementation of the course design.