Using Evaluation to Support Organizational Learning in E-Government System: A Case of Malaysia Government


Hasmiah Kasimin (School of Economics, Faculty of Economics and Management, Universiti Kebangsaan Malaysia, Bangi, Selangor, Malaysia), Aini Aman (School of Accounting, Faculty of Economics and Management, Universiti Kebangsaan Malaysia, Bangi, Selangor, Malaysia) and Zulridah Mohd Noor (School of Economics, Faculty of Economics and Management, Universiti Kebangsaan Malaysia, Bangi, Selangor, Malaysia)
Copyright: © 2013 |Pages: 20
DOI: 10.4018/jegr.2013010103


This paper discusses the need for a framework to understand the use of evaluation to support learning in the process of implementing an e-Government system in a developing country, namely Malaysia. A conceptual interpretive framework focusing on evaluation use and evaluation practices to support organizational learning is proposed. The framework draws on organizational learning theory and integrates the focus of previous interpretive frameworks. Application of the framework in a case study of an e-Government system is also elaborated. The framework is useful for exploring and understanding the use of evaluation to support learning in improving the e-Government system, and it helps in identifying the need for learning about evaluation practices in order to extend the benefits gained from evaluation. The analysis reveals the need for a Knowledge Management System capable of providing feedback and feed-forward of shared information to support learning about the e-Government system and its evaluation practices.
Article Preview


In Malaysia, on-going evaluation is required in order to realize the benefits derived from e-Government systems (EGS). Evaluation is necessary to improve a system's performance because its development and implementation processes are complex. The transition to an electronic government will take time and will not be perfectly orderly (Karim, 2003). At the same time, e-Government concepts, application requirements and the technology that supports e-Government implementation are continuously evolving, so an EGS needs to evolve continuously to adapt to new requirements. It is thus not surprising that e-Government applications are subject to a high risk of failure (Dada, 2006) and prove unsustainable in terms of citizen and business up-take (Aichholzer, 2009). Achieving an EGS's objectives is not a straightforward process. Many authors (Gupta, 2007; Phang et al., 2008; Skinner, 2004) believe that learning through experience is an important process for dealing with the complexity of EGS implementation, and one of the best ways to learn is through the systematic conduct of evaluation (Forss et al., 2002; Engel & Carlsson, 2002; Skinner, 2004).

Conventionally, evaluations serve mainly accountability and assessment purposes. For evaluation to support learning, evaluation practitioners need to facilitate the evolutionary adaptation of both the system and the stakeholders involved in its implementation, and must also be prepared to evolve their evaluation practice (Fidock & Carroll, 2004). Previous literature on information systems (IS) evaluation also shows that only effective evaluation leads to successful information systems (Remenyi, 1999; Beynon-Davies et al., 2004; McDonald & Kay, 2010; Thomas et al., 2007). Poor evaluation procedures make it difficult to select projects for investment, to control development and to measure business return after investment, and they lead to a high rate of IT project failure (Farbey et al., 1999; Thomas et al., 2007). Previous evaluation research argues that evaluation provides information that can improve management decision making (Calder, 1994; Love, 1991); create new insights and understanding (Preskill & Torres, 1999; Weiss, 1998); lead to wider acceptance of and commitment to change initiatives (Carnall, 1995; Kirkpatrick, 1985); foster innovation (Forss et al., 2002); provide opportunities for reflection before undertaking further change (Patrickson et al., 1995); and provide the feedback required for learning through experience (Engel & Carlsson, 2002). In practice, however, using evaluation to support learning is not easy (Torres & Preskill, 2001).

The Malaysian public sector believes that e-Government evaluation is important (Mohd Zahri, 2009). Evaluation processes involve many agencies, each conducting evaluations to support different objectives and to serve different points in a system's life cycle. In most cases, evaluation is not on-going but conducted only when needed. Some evaluation activities are carried out internally on an ad-hoc basis and involve informal processes. In other cases, notably impact evaluations, evaluations are conducted by external consultants. The outputs of these evaluations are not coordinated into an integrated framework that fosters effective learning throughout the system's life cycle, and how the agencies use evaluation to support the learning required for EGS development and implementation is unclear. Evaluation practice is based on standard procedures, guidelines and sets of criteria laid down by the relevant government agencies. Formal methods and techniques are used mainly when the evaluation is done by external consultants. Learning to use new methods, techniques and measurement criteria is regarded as necessary, but it remains limited and lacks priority.
