The SIMILE suite of performance assessment tools enables in-depth assessment of student performance against a defined set of learning objectives, identifying student deficiencies and providing feedback to both students and instructors. SIMILE includes a set of standards-based data models and protocols that formally represent the complex interactions between the events raised within a simulation and the associated learning objectives, task conditions, and standards for each series of lessons. SIMILE streamlines how performance assessment data is collected and distributed to other systems. These tools allow your team to map simulation content to the learning objectives a game or simulation is designed to achieve.

The SIMILE workbench application provides a drag-and-drop user interface that lets training developers easily create “assessment models” accounting for the relationships between the student, other students, and the simulation. The SIMILE assessment engine listens for the events defined within each assessment model and uses them to track a learner’s progress toward completing tasks and objectives. The workbench allows developers to rapidly create and modify assessment models, so training developers can tailor how a student is evaluated in each simulation scenario without writing code.

To create assessment models for SIMILE, we combine tasks in a hierarchical manner to form more complex tasks, which align to one or more learning objectives. Using this hierarchy, tasks and learning objectives are translated into “events” based on the operation or procedure to be performed and its associated performance criteria. Events may be triggered by simple user actions, such as “student pushed button,” or by complex conditions that capture the interaction between objects and other system components, such as “time” and “state” variables within the learning environment. The detail and complexity of what constitutes an event are determined by instructional designers, subject matter experts, and simulation developers according to instructional needs and simulation architecture.
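The hierarchy described above can be sketched as a simple data model. This is a minimal illustration, not SIMILE’s actual schema: the class names, fields, and sample event names (`student_pushed_button`, `system_state_nominal`, `LO-1`) are all hypothetical, chosen only to show how atomic events roll up through subtasks into a complex task aligned to a learning objective.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Event:
    """A discrete occurrence the assessment engine can listen for."""
    name: str                     # e.g. "student_pushed_button"
    criteria: str = ""            # performance criterion tied to this event

@dataclass
class Task:
    """A task node; complex tasks are composed of simpler subtasks."""
    name: str
    objective: str                # learning objective this task aligns to
    events: List[Event] = field(default_factory=list)
    subtasks: List["Task"] = field(default_factory=list)

    def all_events(self) -> List[Event]:
        """Flatten the hierarchy into the events the engine must track."""
        found = list(self.events)
        for sub in self.subtasks:
            found.extend(sub.all_events())
        return found

# A two-level hierarchy: a startup procedure built from atomic actions.
press = Task("press_start", "LO-1", events=[Event("student_pushed_button")])
verify = Task("verify_state", "LO-1", events=[Event("system_state_nominal")])
startup = Task("startup_procedure", "LO-1", subtasks=[press, verify])

print([e.name for e in startup.all_events()])
# ['student_pushed_button', 'system_state_nominal']
```

Walking the tree with `all_events` yields the full set of observable events for a complex task, which is the information an assessment engine needs in order to monitor progress against that task’s performance criteria.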

The SIMILE Assessment Engine listens for these events and uses them to track a student’s progress toward completing defined tasks. When the student’s actions are compared to the model, the system responds accordingly (inserting faults, providing feedback, alerting the instructor, shutting down the system, etc.). Because the SIMILE Assessment Engine reports when an assessment object changes state, when tasks are completed, and when task scores are updated, these assessment events map readily to the Tasks, Conditions, and Standards of a competency. The system records the student’s actions and outputs the recorded data, which is used to evaluate and report progress to instructors as well as to a Learning Management System, After Action Review System, or Intelligent Tutoring System.
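The listen-match-report loop described above can be illustrated with a short sketch. This is an assumption-laden toy, not the SIMILE engine’s implementation: the `AssessmentEngine` class, its model format (task name mapped to required event names), and the report tuples (`state_changed`, `task_completed`) are all invented here to show how matching incoming events against a model produces the assessment events that downstream systems (an LMS, AAR, or ITS) would consume.

```python
from collections import defaultdict

class AssessmentEngine:
    """Tracks progress by matching incoming simulation events against a model."""

    def __init__(self, model):
        # model: task name -> set of event names required to complete it
        self.model = {task: set(events) for task, events in model.items()}
        self.seen = defaultdict(set)   # events observed so far, per task
        self.reports = []              # assessment events for LMS / AAR / ITS

    def on_event(self, task, event):
        """Handle one simulation event and report any resulting state changes."""
        if event in self.model.get(task, ()):
            self.seen[task].add(event)
            self.reports.append(("state_changed", task, event))
            if self.seen[task] == self.model[task]:
                self.reports.append(("task_completed", task))

engine = AssessmentEngine({"startup_procedure":
                           ["student_pushed_button", "system_state_nominal"]})
engine.on_event("startup_procedure", "student_pushed_button")
engine.on_event("startup_procedure", "system_state_nominal")
print(engine.reports[-1])
# ('task_completed', 'startup_procedure')
```

Because the engine emits a discrete report for every state change and completion, an external consumer only has to subscribe to the report stream; it never needs to inspect the assessment model itself.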

Copyright © 2014 Engineering & Computer Simulations, Inc.