Evaluating the Geography E-Learning Materials and Activities: Student and Staff Perspectives

Karen Fill, Louise Mackay
DOI: 10.4018/978-1-59904-980-9.ch013

Abstract

This chapter is concerned with the evaluation of the learning materials and activities developed as part of the DialogPLUS project. A range of evaluation activities was undertaken, focusing on the experiences of students, teaching staff, and the project team as a whole. Student evaluations combined quantitative and qualitative approaches, in particular a questionnaire design that drew on a specific methodology and generic quality criteria, enabling comparative analysis of results. The discussion of student evaluations centres on specific taught modules from both human and physical geography. Results of these evaluations were discussed with teaching staff and contributed to improvements in the various online resources. Internal and external evaluators interviewed key project staff, and their different perspectives are presented. The chapter concludes by reflecting on the effectiveness and impact of the different DialogPLUS activities, highlighting the principal impacts of the project as perceived by the students and staff involved.
Chapter Preview

Background

The education team at Southampton developed an overall evaluation strategy for DialogPLUS based on the principles of utilization-focused evaluation (Patton, 1986).

... in the real world of trade-offs and negotiations, too often what can be measured determines what is evaluated, rather than deciding first what is worth evaluating and then doing the best one can with methods. Relevance and utility are the driving force in utilization-focused evaluation; methods are employed in the service of relevance and use, not as their master. (Patton, 1986, p. 221)

The approach involves identifying key stakeholders and working with them to understand how they intend to use the outcomes of evaluation and which major questions are useful to answer. This information then informs the design of evaluation approaches and instruments. The full range of the DialogPLUS stakeholders was identified early in the project (see Table 1).

Table 1. DialogPLUS stakeholder groups, their interests and concerns

Geographers at partner institutions
- Relevance and value of the project to their local context.
- Access to digital library resources.
- Barriers and enablers to nugget development, usage and sharing.
- Effectiveness of nuggets in their local learning & teaching context.
- Collaborative nature of the project and how it has worked.
- Usability and effectiveness of the toolkit.
- Embedding of outputs / outcomes.
- Changes to professional practice resulting from involvement in the project.

Learners at partner institutions
- The kind of skill or conceptual understanding needed to use the nuggets.
- Accessibility of resources / nuggets.
- Effectiveness of resources / nuggets.
- Impact on their learning processes and outcomes.

Computer scientists and educational technologists at partner institutions
- Barriers and enablers to development of the toolkit.
- Usability and effectiveness of the toolkit.
- Barriers and enablers to developing systems for nugget sharing.
- Usability and effectiveness of solutions for nugget sharing.
- Convergence with emerging standards in learning design, interoperability, resource discovery and reuse.

Educationists and the evaluation team
- Innovation in teaching and learning.
- Pedagogical soundness of the toolkit.
- Barriers and enablers to changing practice.
- Effectiveness of the evaluation methodology adopted.
- Evaluation findings and their relevance.

Project team
- Ensuring that the project is successfully completed on time and to budget and meets the original aims and objectives of the proposal.
- Facilitating communication between partners.
- Monitoring of project activities against the project plan.
- Collaborative nature of the project and how it has worked.

Institutional managers
- Successful project completion.
- Usability and effectiveness of project outcomes.

Funding bodies
- Value of the collaboration.
- Synergies between related JISC/NSF projects and programmes.
- Applicability and transferability of the outcomes to the wider community.

The Higher Education community
- Project contribution in the areas of digital resources / repositories, distributed learning design, development and implementation, international collaboration, and teaching and learning in Geography at tertiary level.
