Professional Skills Assessment: Is a Model of Domain Learning Framework Appropriate?

Sadan Kulturel-Konak (Penn State Berks, Reading, PA, USA), Abdullah Konak (Penn State Berks, Reading, PA, USA), Gul Okudan Kremer (Penn State University Park, State College, PA, USA) and Ivan E. Esparagozza (Penn State Brandywine, Media, PA, USA)
DOI: 10.4018/IJQAETE.2015010104

Today's global economy demands that new graduates excel not only in technical knowledge but also in professional skills. In fact, the lack of professional skills in project teams has been identified as one of the most important factors contributing to the high failure rate of complex engineering projects. In response, academic programs have incorporated professional skills into their curricula. However, assessing learning outcomes related to professional skills remains challenging. This paper presents a novel assessment framework, based on the Model of Domain Learning, for assessing students' development of professional skills across different disciplines. The proposed assessment model can be tailored to various learning objectives and student levels, facilitating integration of professional skills assessment into an overall program assessment plan. An empirical study assessing teamwork communication skills is presented to demonstrate the applicability of the proposed framework and its advantages compared to traditional assessment rubrics in engineering and technology education.

1. Introduction

To be successful in their careers, new technology and engineering graduates must excel in professional skills in addition to possessing valuable technical skills. Professional skills are the habitual and judicious use of certain skills that complement technical skills in the practice of a profession (Epstein and Hundert, 2002). The professional skills expected of science, technology, engineering, and mathematics (STEM) graduates are well aligned with the broad learning outcomes defined in Criterion 3 of the Accreditation Board for Engineering and Technology (ABET) and the knowledge and skill areas defined in the Assurance of Learning Standards of the Association to Advance Collegiate Schools of Business (AACSB) (e.g., ethics, teamwork, global awareness, and creative problem solving). ABET and AACSB are the accreditation boards recognized for their authority to review and certify the adequate content and delivery of curricula imparting necessary skills in engineering and business, respectively. ABET-sanctioned professional skills for engineering are also described as (1) process skills (i.e., communication, teamwork, and the ability to recognize and resolve ethical dilemmas) and (2) awareness skills (i.e., understanding the impact of global and social factors, knowledge of contemporary issues, and the ability to engage in lifelong learning).

Unlike technical skills, which can be acquired and assessed discretely, students' intellectual and social abilities mature slowly throughout their education (Perry 1970; Alexander et al. 1997). Because the above-mentioned process and awareness skills involve intellectual and social abilities, assessing students' development of professional skills is challenging (McNamara 2013). While students' development within their disciplines can be assessed against program-specific goals and objectives, an assessment model for students' professional skill development is still needed. Although educators expend significant effort to enhance curricula for the simultaneous acquisition of professional skills alongside technical content, the absence of a robust assessment framework limits the effectiveness of these efforts. Another challenge is that existing assessment tools were developed based on different frameworks or models; therefore, integrating tools that rest on various frameworks into an overall program assessment is difficult. Even as the acquisition of professional skills by graduates becomes increasingly crucial due to global competition and intensifying pressures on companies (i.e., companies have fewer resources and less time to train employees in these skills), the absence of a robust assessment framework inhibits the propagation of pedagogical initiatives by faculty.

Beard et al. (2008) suggest that an assessment plan for evaluating curricular efforts to integrate professional skills into programs should include standardized rubrics for targeted courses in addition to comprehensive exit surveys, internship assessments, and student self-assessments. In this paper, we argue that if assessment tools for professional skills are designed, and the collected assessment data analyzed, based on the same theoretical framework, we can gain deeper insights into why students perform in certain ways (e.g., limited knowledge or limited interest). These insights, in turn, can inform the improvement of curricular interventions. With this in mind, we present an assessment framework for professional skills based on Alexander's Model of Domain Learning (MDL) (Alexander et al. 1997).

The objective of this paper is to demonstrate how a theoretical learning model can be utilized to gain better insights into students' development of professional skills. The MDL is selected as the theoretical framework for the assessment model because of its validity in predicting the stages of student development. The MDL has been tested in many different technical domains, but not in the assessment of professional skills. The MDL also has the potential to provide insight into what might undermine the effectiveness of programs attempting to improve students' professional skills (i.e., content knowledge versus student interest).
