Expanding Technology and Human Readiness Levels: Measuring the Maturity of New Learning Technology and Content
Open Access
Article
Conference Proceedings
Authors: Kevin Owens, Lisa Townsend, Joan Johnston
Abstract: According to the most recent research on the effectiveness of the US Army's institutional training and education, only fifty percent of what is taught actually transfers to learners' later real-work activities and competence. This finding is consistent with independent research conducted for the US Navy, and it raises the question of whether the methods used in our institutions are effective, and whether the technology and content designed to support learning are a root cause of this low transfer. Today, the means of measuring and accepting military training simulations, stimulations, and learning systems, as well as other forms of learning content, is purely functional: during current learning-system and content verification and validation, no focus or time is spent on testing whether the target learning outcomes actually occur. What is needed is a measurement standard informed by instrumented, objective, and inspectable data that Army acquisition professionals, training and education developers, and decision-makers can use to discern and measure a system's learning impact before making final procurement, design, or implementation milestone decisions. This standard and its associated instruments could also help learning researchers and engineering teams gauge the maturity of their results during formative testing. This paper describes what we term Learning Readiness Measures (LRM) that could inform such a standard, along with the minimum associated data instruments needed to provide the supporting evidence. The LRMs are designed to expand on the Technology Readiness Levels already ubiquitous across DoD and government, and on the recently published Human Readiness Levels standard, to assess acquisition and implementation risk.
In addition, this paper discusses the larger learning engineering process within which this standard and its instruments could be used to make data-informed investigations part of creating learning technology and content, enabling better-informed training and education implementation decisions. The learning engineering approach uses a systematic process that iterates through three distinct phases: (1) creating learning stimulation engines, content, and/or feedback; (2) implementing them to produce inspectable data evidence via naturalistic measures of the human psychomotor and/or cognitive indicators associated with a competency being learned; and (3) investigating and analyzing the collected data to determine not only how well target users can intuitively use or understand the learning technology or content, with minimal preliminary instruction, but whether they actually develop the targeted long-term behavioral or affective changes the products are intended to produce. We also briefly discuss the minimal instruments recommended for collecting data on users' naturalistic and involuntary responses to learning stimuli. These include capabilities for measuring factors such as: cognitive recognition speed and accuracy; cognitive attention processes; cognitive load (too many sensory inputs to track or correlate); stress from not understanding a stimulus or from hunting for the navigation path to a feature or function; gross and fine motor activity while performing a task; the user's internal temperature within the performance environment; verbal responses and/or inquiries during technology operation; and emotional state or intensity. Our premise is that the LRMs and associated instruments will not only improve learning-system evaluation and quality and help remedy the low transfer of institutional training and education, but also reduce DoD and government project risks when investing in researching and/or procuring technology-based learning systems and content.
Such capability will also help teams engaged in a learning engineering process gauge their progress toward an effective learning solution, and address the challenges DoD and government face in producing more ready forces and a more productive future workforce.
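As one way to picture how data from the instruments above might roll up into a single readiness indicator, here is a minimal Python sketch. The class, field names, equal weighting, and scoring formula are our own illustrative assumptions; the paper does not define a specific aggregation.

```python
from dataclasses import dataclass


@dataclass
class InstrumentSample:
    """One hypothetical observation from a minimal instrument set.

    All fields are assumed to be pre-normalized by the instrument pipeline.
    """
    recognition_accuracy: float  # 0.0-1.0 proportion of correct recognitions
    cognitive_load: float        # 0.0-1.0 estimated load (higher = worse)
    stress_index: float          # 0.0-1.0 physiological stress (higher = worse)
    task_completion: float       # 0.0-1.0 psychomotor task performance


def illustrative_lrm_score(samples: list[InstrumentSample]) -> float:
    """Average the evidence channels into a single 0.0-1.0 maturity score.

    Equal weighting is purely illustrative; a real LRM standard would
    calibrate weights against validated learning-outcome data.
    """
    if not samples:
        raise ValueError("no instrument data collected")
    per_sample = [
        (s.recognition_accuracy
         + (1.0 - s.cognitive_load)   # invert: lower load is better
         + (1.0 - s.stress_index)     # invert: lower stress is better
         + s.task_completion) / 4.0
        for s in samples
    ]
    return sum(per_sample) / len(per_sample)


demo = [
    InstrumentSample(0.90, 0.30, 0.20, 0.80),
    InstrumentSample(0.95, 0.25, 0.15, 0.85),
]
print(round(illustrative_lrm_score(demo), 3))  # prints 0.825
```

A score like this could then be mapped onto discrete readiness levels, mirroring how Technology Readiness Levels bin continuous evidence into milestone-relevant categories.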
Keywords: Learning Readiness, Data Strategy, Learning Instruments, Learning Engineering, Acquisition
DOI: 10.54941/ahfe1006271