Frameworks for Integration of Future-Oriented Computational Thinking in K-12 Schools

Scott R. Garrigan
DOI: 10.4018/978-1-7998-1479-5.ch003

Abstract

Computational thinking (CT) K-12 curricula and professional development should prepare students for their future, but historically, such curricula have had limited success. This chapter offers historical analogies and ways that CT curricula may have a stronger and more lasting impact. Two frameworks are central to the chapter's arguments. The first recalls Seymour Papert's original description of CT as a pedagogy in which computing plays a formative role in young children's thinking; the computer was a tool to think with (1980, 1996). This “thinking development” framework emphasized child-centered, creative problem solving to foster deep engagement and understanding. Current CT seems to include creativity only tangentially. The second framework encompasses emergent machine learning and data concepts that will become pervasive. This chapter, more prescriptive than empirical, suggests ways that CT and the requisite professional development could be more future-focused and more successful. It could be titled “Seymour Papert meets Machine Learning.”
Chapter Preview

Background

Computational Thinking ideas arose from founders of the internet, artificial intelligence, and educational technology. These CT ideas gestated half a century ago in 1968 discussions between MIT professor Seymour Papert and the Bolt, Beranek, and Newman inventors of the eastern half of what would become the internet. CT ideas were disseminated in Papert’s 1970 paper, Teaching Children Thinking: Artificial intelligence memo number 247, in which he stated the ideas were “deeply influenced by AI pioneer Marvin Minsky …” Papert first named “Computational Thinking” in a 1996 paper, and he researched and elaborated CT ideas through MIT’s Artificial Intelligence lab (which he co-founded with Minsky). He created the LOGO computer language to develop children’s mathematical thinking, and LOGO was an early educational technology used in schools for decades. LOGO was derived from LISP, the artificial intelligence language of the day, and modern children’s languages like Scratch are LOGO’s direct descendants. CT had an honorable beginning.

Three of Papert’s central CT ideas form much of the framework discussed below: 1) CT is a way of thinking that needs to begin in elementary school, 2) essential core elements of CT are curiosity and creativity, and 3) CT helps us understand human thinking. These ideas evolved from Papert’s 1960s work investigating how young children learn mathematics, carried out in collaboration with his mentor, child psychologist Jean Piaget. Through Papert’s lens, CT in education could well be called Computational Learning. But for reasons to be discussed in the closing, Papert’s work had minimal impact on mainstream classroom teaching and learning. It was a decade after Papert coined the term Computational Thinking that Jeannette Wing, head of Carnegie Mellon’s computer science department, brought CT into the education mainstream.

Wing’s 2006 ACM Viewpoint, Computational Thinking, explained six elements of CT that she characterized as necessary for everyone to understand today’s technical world. While Wing wrote for a university community, her meme resonated strongly in the K-12 community where, by then, computers had become common in classrooms. Prior to the general use of computers in schools, teachers and administrators were not prepared to understand the context and importance of CT. As a computer science professor, later vice president of Microsoft Research, and now director of Columbia’s Data Science Institute, Wing developed a deep understanding of the network of connections between computer science concepts and a broad swath of university study and adult life. In CT she embraced comparisons of human and machine learning, randomness, heuristics, recursion, and, of course, abstraction. She connected CT to fields from biology and chemistry to physics and economics. But her only references to children learning CT outside of college were, “To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability” and “We should expose pre-college students to computational methods and models.”

Key Terms in this Chapter

Artificial Intelligence (AI): A branch of computer science. Traditionally, AI was defined as a computer program that does something normally done by humans. More recently, as machine learning has given AI the capability to learn, the definition of Artificial Intelligence has evolved to be often synonymous with, or to include, machine learning.

Generative Adversarial Network (GAN): A powerful machine learning technique made up of two learning systems that compete with each other in a game-like fashion: a generator that produces candidate outputs and a discriminator that judges whether each output is real or generated. Each system improves by learning from the other’s responses, so GANs effectively teach themselves through this adversarial competition. They “generate” increasingly realistic solutions over many, often millions, of training rounds.
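
The adversarial idea can be sketched in a few dozen lines of Python. The example below is an illustrative sketch only, written with the TensorFlow/Keras library (a toolkit assumption made here, not one prescribed by the chapter): a generator learns to produce numbers resembling a target distribution while a discriminator learns to tell real samples from generated ones.

```python
# Minimal GAN sketch (illustrative): learn to generate numbers near a target mean of 4.0.
import tensorflow as tf

NOISE_DIM = 8

# Generator: turns random noise into a single "fake" number.
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(NOISE_DIM,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Discriminator: outputs a logit judging "real" vs. "generated".
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-3)
d_opt = tf.keras.optimizers.Adam(1e-3)

for step in range(2000):
    real = tf.random.normal((64, 1), mean=4.0, stddev=1.0)   # the "real" data
    noise = tf.random.normal((64, NOISE_DIM))

    with tf.GradientTape() as d_tape, tf.GradientTape() as g_tape:
        fake = generator(noise)
        real_logits = discriminator(real)
        fake_logits = discriminator(fake)
        # Discriminator: label real samples 1 and generated samples 0 ...
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        # ... while the generator tries to make the discriminator call its fakes "real".
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)

    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))

# After training, generated samples should cluster near the real mean of 4.0.
print(float(tf.reduce_mean(generator(tf.random.normal((1000, NOISE_DIM))))))
```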

Bayesian Statistics: A branch of statistics based on Bayes’ Rule, conceptualized 250 years ago by Reverend Thomas Bayes, formalized by Laplace, but only growing in acceptance over the past century. Bayesian models depart from traditional “frequentist” statistics in three ways. First, input data is expected to change as new information arrives (the posteriors become the new priors). Second, the outcome of a Bayesian analysis is not decisive, but rather is a probability or likelihood. Finally, the outcome is actually a distribution of probabilities, recognizing that there are several possible future outcomes differentiated by their likelihood. Bayesian models are extensively used in future-oriented AI and machine learning, where they have great predictive value.
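
The “posteriors become the new priors” cycle can be illustrated with a few lines of Python (an illustrative sketch, not code from the chapter), using a Beta distribution over a coin’s probability of heads:

```python
# Conjugate Beta-Binomial updating: each posterior becomes the prior for the next batch.
def update(prior_heads, prior_tails, new_heads, new_tails):
    """Add observed counts to the prior's pseudo-counts to get the posterior."""
    return prior_heads + new_heads, prior_tails + new_tails

a, b = 1, 1                                       # Beta(1, 1): a flat prior over p(heads)
for heads, tails in [(7, 3), (6, 4), (8, 2)]:     # three batches of 10 coin flips
    a, b = update(a, b, heads, tails)             # posterior becomes the new prior
    mean = a / (a + b)                            # the result is a distribution; report its mean
    print(f"after {a + b - 2} flips: Beta({a}, {b}), mean p(heads) ≈ {mean:.2f}")
```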

Scikit-Learn: A popular free library of machine learning algorithms for the Python programming language. Scikit-Learn is often used in machine learning courses as well as in research, prototyping, and real-world applications.
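
For example (a sketch of typical Scikit-Learn usage, not code from the chapter), a classifier can be trained and evaluated on the library’s bundled iris data set in a handful of lines:

```python
# Train a small decision tree on the iris data set and report held-out accuracy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = DecisionTreeClassifier(max_depth=3, random_state=0)   # a small, inspectable model
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```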

Machine Learning: A large branch of artificial intelligence that studies, develops, and implements computer systems capable of learning beyond their explicit programming. Such systems “learn” through their experience with numeric, language, visual, auditory, and/or tactile data.

NetLogo: An agent-based modeling/simulation system developed by Uri Wilensky, a student of Seymour Papert. NetLogo’s graphical output makes its simulations accessible even to elementary students, while its powerful underlying programming language makes the system valuable in formal academic research and development. NetLogo’s visual demonstrations of emergent phenomena make it a powerful tool for learning Computational Thinking across many grade levels and subjects.

Models: “formal structures represented in mathematics and diagrams that help us to understand the world” (Page, 2018, p. 1). Models can simplify complex relationships, they can use mathematics as analogies to natural processes, and they can be artificial-but-helpful constructs. Many models, sometimes called simulations, use visual representations of relationships, such as those in NetLogo. The algorithms used in a machine learning task form the “learning model” of the system.

ISTE (International Society for Technology in Education): The largest and most influential organization for the advancement and dissemination of technology in schools. Its large membership is composed mostly of K-12 educators.

TensorFlow: A popular free Python library created by Google for research, teaching, and production. It includes algorithms for machine learning systems such as neural networks. TensorFlow models can perform high-level machine learning in a few lines of code, making ML concepts accessible to those with limited programming skills.
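
As an illustration of “a few lines of code” (a sketch of standard Keras usage, not code from the chapter), the following trains a small neural network to recognize handwritten digits from the bundled MNIST data set:

```python
# Train a small digit-recognition network with TensorFlow/Keras.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0            # scale pixel values to 0-1

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),                           # each image is 28x28 pixels
    tf.keras.layers.Flatten(),                                # flatten to 784 numbers
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),          # one output per digit 0-9
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2)
print(model.evaluate(x_test, y_test, verbose=0))              # [test loss, test accuracy]
```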

Python: An easy-to-learn, general-purpose computing language. An extensive body of add-on code libraries enables Python to be customized for many applications. Scikit-Learn and TensorFlow are two such libraries that have made Python the language of choice for exploring machine learning.

R: A computer language designed for data manipulation and analysis. It is common to explore and visualize data with R, then to use the data in Python for AI and machine learning applications.

Monte Carlo: An AI/machine learning approach that uses random chance as a key element in optimization and other algorithms.
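
A classic illustration of the idea (an illustrative sketch, not from the chapter) is estimating π by scattering random points and counting how many land inside a quarter-circle:

```python
# Monte Carlo estimate of pi: the fraction of random points inside the unit
# quarter-circle approximates its area, pi/4.
import random

def estimate_pi(samples: int = 1_000_000) -> float:
    inside = sum(1 for _ in range(samples)
                 if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4 * inside / samples

print(estimate_pi())          # ≈ 3.14, varying slightly from run to run
```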

Computational Thinking (CT): The creative and disciplined thought processes involved in imagining and anticipating, identifying, and formulating problems such that their solutions are in a form compatible with today’s and tomorrow’s information processing (computing) technology. This departs from most CT definitions in its emphasis on creativity, imagination, problem identification, and future anticipation.
