Doctoral Faculty 2020: Preparing for the Future in Organizational Leadership

Peter E. Williams, L. Hyatt
DOI: 10.4018/978-1-61520-869-2.ch006

Abstract

Leaders are in demand, and leadership institutions are necessary to prepare students for important roles in our society (Green, Alexander, & Boryczka, 2001). Faculty are a central factor in graduate and postgraduate programs, as they are responsible for guiding student learning (Bryant, 2003). This chapter reports on an exploratory case study of 21st-century competencies for faculty in an organizational leadership doctoral program. Esteemed researchers have studied the transition of students into new faculty roles (e.g., Austin, 2002; Austin, Connolly, & Colbeck, 2008; Gardner, 2005; Purcell, 2007); however, little research exists on the new-century skills necessary for doctoral faculty in leadership programs. As the surge in technology continues to bring about global changes, preparing potential leaders is our best hope going forward. It follows, then, that it is important to discover the competencies doctoral faculty will need in order to address the challenges of the future.
Chapter Preview

Background

U.S. Doctoral Education

By the mid-1800s, higher education institutions in America had begun to recognize a need for further intellectual preparation (Storr, 1969). Rather than representing the “completion of formal study,” as was its original intent, the undergraduate degree became known as a “first degree of academic life” (p. 1). Even though these degrees provided students with sufficient general knowledge of certain fields, e.g., philosophy, religion, mathematics, and the social and physical sciences, Storr states that students “…did not necessarily love good literature, recognize their own special talents, or command any profound knowledge of things outside the limits of conventional learning” (p. 2).

Although undergraduate-level institutions introduced students to new material, they rarely treated subjects thoroughly. Storr (1969) notes that professors did not necessarily leave lasting impressions on students, and that coverage of material offered only occasional opportunities for student dialogue and debate. The need for universities to offer more than undergraduate studies and preparation for specific professions, such as medicine and law, inevitably gave rise to the graduate school.

In 1861, Yale University awarded the first U.S. doctorates. It took another decade for the University of Pennsylvania and Harvard University to begin conferring doctorates. Beyond the acknowledged need for graduate and postgraduate education, other events drove the expansion of doctoral education in America. The two Morrill Land Grant Acts, the first in 1862, provided land for states to establish colleges to meet the new industrial needs of society. The second Morrill Land Grant Act, in 1890, led to the development of colleges in response to segregation, resulting in historically black colleges and universities (HBCUs). While the Morrill Land Grant Acts were growing the number of public higher education institutions, private colleges and universities were also forming (e.g., Johns Hopkins University, 1876; Stanford University, 1891; University of Chicago, 1892) or reorganizing (e.g., Columbia and Harvard) to offer graduate education. Although the U.S. model of doctoral education was established by the late 1800s and early 1900s, large numbers of American students were still drawn to European institutions, especially those in Germany. This was attributed to a lack of regulations and accountability and a perceived inconsistency in the quality of U.S. colleges and universities (Speicher, 2000).

To stem the tide of the student exodus, 14 leading U.S. doctoral-granting institutions sent representatives to a conference at the University of Chicago in January 1900. This resulted in the formation of the Association of American Universities (AAU). The founding AAU member institutions are listed in Table 1.

Table 1.
Colleges and universities at the 1900 AAU meeting (AAU, 2003)
Catholic University of America
Clark University
Columbia University
Cornell University
Harvard University
Johns Hopkins University
Princeton University
Stanford University
University of California-Berkeley
University of Chicago
University of Michigan
University of Pennsylvania
University of Wisconsin-Madison
Yale University

During the 1930s, 1940s, and 1950s, U.S. graduate and postgraduate education grew steadily. The federal government increasingly looked to universities for research in technology and policy during the New Deal era of the 1930s. The emphasis on research continued, producing alliances with federal government agencies such as the National Institutes of Health (NIH) in 1946, the Office of Naval Research (ONR) in 1949, and the National Science Foundation (NSF) in 1950 (Geiger, 1993). Another significant driver of U.S. graduate and postgraduate education came from abroad: the Soviet Union’s launch of the Sputnik satellite in 1957 resulted in another growth spurt in institutions granting doctorates. Research and development spending tripled between 1957 and 1970, and higher education enrollment more than doubled, rising from three million to seven million (Thurgood, Golladay, & Hill, 2006).

Steady but slowed growth characterized the 1980s and most of the 1990s. However, at the beginning of the 21st century, the successful mapping of the human genome in 2003 fostered renewed interest in universities and research. Currently, university research attention is focused on new technologies that increase mobility, stem-cell research, organ farming, cognitive and neurological processes, understanding the complexities of human behaviors such as leadership, and robotics. By 1900, fewer than 20 universities conferred about 250 doctorates (Thurgood, Golladay, & Hill, 2006); by the start of the 21st century, 426 higher education institutions in the U.S. had awarded approximately 1.4 million doctorates.

The U.S. doctorate is conceptualized as education arranged around rigorous, real-world research experiences that prepare students to “discover, integrate, and apply knowledge, as well as communicate and disseminate it” (CGS, 1990, p. 1).

Graduate education began to fill this void in U.S. universities and was aimed at “bringing the student to an understanding of the conceptual structure of his field at the frontier, and research that is aimed to push the frontier a little further” (Rees, 1972, p. 144). The mastery of certain disciplines allowed the expansion of research within those areas in an organized manner. Explaining the necessity of graduate schools, Rees posits:

The thrill of intellectual discovery, of insight into the mysteries of antiquity, of understanding of the motives and lives of those who made our civilization; the excitement that human beings can find from looking into the physical and biological universe and into the world of human creations are still a part of significant human experience. The graduate schools are needed to sustain and enhance their significance. But so also are the applications of our competencies to better social solutions, to the betterment of lives in our own country and abroad. To find a better way is the task that lies ahead. (p. 144)

The creation of specific graduate programs would then increase students’ competencies and appropriately prepare them for a society awaiting leaders who will guide it toward a better world.
