Measurement tools for collecting learning outputs: short-term effects
What is this about?
This module gives an overview of measurement tools and evaluates the possible use of the identified measurement methods for short-term effects.
We have divided the tools according to Kirkpatrick’s (1959) framework for training effectiveness. The framework has been used for training evaluation in the REI context (Steele et al., 2016; Stoesz & Yudintseva, 2018) as well as in the HE context (Praslova, 2010), and includes the following levels (different kinds of tools may provide information about the achievement of each level):
- reactions (participants’ self-assessment) – different kinds of instruments may be used to collect learners’ affective and utility judgements;
- learning process (knowledge, content) – content tests, performance tasks, other coursework that is graded/evaluated, pre-post tests;
- behaviour and practices (acting in the research community) – end-of-programme/course integration paper/project, learning diaries/journals (kept over a longer period), documentation of integrative work, tasks completed as part of other courses;
- results (e.g. institutional outcomes) – results can be monitored via alumni and employer surveys, media coverage, awards or recognition. In addition, nation-wide surveys may indicate the ‘health’ of RE/RI.
MMLA – multi-modal learning analytics tools
Multi-modal learning analytics (MMLA) tools allow teachers to set up inquiry sessions and collect student responses (anonymously). The tools analyse the collected data instantly and provide teachers with an overview of the impact of the training. In addition to collecting learner reactions, the topics highlighted by the app bring those aspects of learning into focus, and learners start paying greater attention to them; this may have a longer-term impact on their behaviour.
ProLearning collects learner responses to teacher-generated questions (yes/no or a 0-100 scale), asks teachers to predict the learner responses, and asks them to write a short description of the learning situation. Only then can the teacher see the graph outlining learner responses (by group) in relation to their own predictions. If the teacher’s prediction of the performance of a group of learners is taken as the expected learning outcome, then the tool provides a measure of the alignment or discrepancy between the expected effectiveness and the learners’ performance.
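For illustration only, the following Python sketch shows the kind of comparison described above: the gap between a teacher’s predicted group response and the observed group mean on a 0-100 scale. The function names and data are invented assumptions, not ProLearning’s actual implementation.

```python
# Hypothetical sketch: comparing a teacher's predicted group response with the
# learner responses collected on a 0-100 scale (names and data are invented).
from statistics import mean

def prediction_gap(teacher_prediction: float, learner_responses: list[float]) -> float:
    """Difference between the observed group mean and the teacher's prediction.

    A positive value means learners responded higher than the teacher expected;
    a large absolute value points to a discrepancy between expected and observed effectiveness.
    """
    return mean(learner_responses) - teacher_prediction

# Example: the teacher predicted 70/100; six learners answered individually.
responses = [55, 62, 80, 71, 66, 58]
print(f"Group mean: {mean(responses):.1f}, gap: {prediction_gap(70, responses):+.1f}")
```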
In addition, engagement in activities may affect how well people acquire competencies. For measuring engagement, the ForgetNot application (by EduLog) is available. In the application, learners provide feedback on three aspects of the training: how engaged they were in the activities (behavioural aspect of learning), how they felt (emotional aspect of learning) and how relevant the knowledge was for them (cognitive aspect of learning). All of these aspects are relevant to successful learning. The responses accumulate by group and give the facilitator information on how the training format was perceived by learners; a minimal example of such aggregation is sketched below.
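The short Python sketch below illustrates how responses on the three aspects could be averaged per group so a facilitator gets an overview. The data structure, rating scale (assumed 1–5) and function names are assumptions for illustration, not the app’s actual code.

```python
# Hypothetical sketch of aggregating learner feedback on the three aspects
# (behavioural, emotional, cognitive); the rating scale and data are invented.
from collections import defaultdict
from statistics import mean

# Each response: (group, aspect, rating) with ratings on an assumed 1-5 scale.
responses = [
    ("Group A", "behavioural", 4), ("Group A", "emotional", 3), ("Group A", "cognitive", 5),
    ("Group B", "behavioural", 2), ("Group B", "emotional", 4), ("Group B", "cognitive", 3),
    ("Group A", "behavioural", 5), ("Group A", "emotional", 4), ("Group A", "cognitive", 4),
]

def aggregate_by_group(rows):
    """Average each aspect within each group so the facilitator sees how
    the training format was perceived."""
    buckets = defaultdict(list)
    for group, aspect, rating in rows:
        buckets[(group, aspect)].append(rating)
    return {key: mean(vals) for key, vals in sorted(buckets.items())}

for (group, aspect), avg in aggregate_by_group(responses).items():
    print(f"{group} – {aspect}: {avg:.1f}")
```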
Both applications are suitable for any educational context, including HE. The data collected by the tools provide an overview of group progress and are most suitable for evaluating short-term effects of training at Kirkpatrick level 1 (learner reactions). The tools are best used to evaluate short-term trainings or specific activities during training sessions. Both tools are freely available online; the teacher needs to create an account to set up inquiry sessions and see the results. Students do not need an account and can join with the session code provided by the teacher.
MMLA tools use statistics to analyse collected information and create graphs to illustrate the results. Teachers need to be able to read the graphs and draw conclusions based on the data. Modifications can be made to the training format and implementation based on the results.
MMLA tools may be most suitable for beginners and for short trainings where collecting reactions quickly is relevant, but they can also be used with more advanced learners.
The tools are available at:
ProLearning: www.prolearning.realto.ch
ForgetNot (by EduLog): https://web.htk.tlu.ee/forgetnot
Self-reflection form/compass (SRF/SRC)
The self-reflection tool (available as a form or an app) helps learners and teachers monitor the learning process and provides facilitators with important insights into the uptake of REI course content.
The tool helps teachers gain insight into whether the content of the training has been understood, how learners progress towards and achieve their learning outcomes, and whether the training has been effective. The tool also helps integrate reflection into training, which is a crucial part of ethics competencies. Results from testing iterations show that most learners can evaluate their level of understanding of research ethics and integrity quite accurately, and repeated reflection appears to improve the accuracy of self-reflection.
The self-reflection tool asks the learner to assess their level of understanding of a teacher-assigned or self-assigned topic (activity or content) and then to write a short reflective paragraph on what has been learned and how they perceive it. After submission, the tool provides pre-written feedback based on the student-selected level and advice on how to improve understanding. In the app version the teacher can also provide feedback on the texts written by learners. Repeated use of the tool shows the progress of learners and pinpoints topics that may need further revision (e.g. if they have not been understood well enough).
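To make the feedback mechanism concrete, here is a minimal Python sketch of the idea of pre-written feedback keyed to the self-assessed level. The levels and messages are invented placeholders, not the actual SRF/SRC content.

```python
# Hypothetical sketch: pre-written feedback keyed to the learner's self-assessed
# level of understanding; the levels and texts below are invented placeholders.
FEEDBACK_BY_LEVEL = {
    1: "You are identifying single ideas; revisit the core concepts of the topic.",
    2: "You can list several relevant ideas; try to explain how they relate to each other.",
    3: "You can connect ideas into a whole; practise applying them to a new case.",
    4: "You can generalise beyond the material; consider discussing open questions with peers.",
}

def give_feedback(level: int, reflection: str) -> str:
    """Return pre-written advice for the chosen level; the reflection text itself
    would be stored for the teacher (and for later SOLO/reflection-level analysis)."""
    return FEEDBACK_BY_LEVEL.get(level, "Please select a level between 1 and 4.")

print(give_feedback(2, "I learned that authorship criteria differ between fields."))
```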
The tool is more suitable for evaluating short-term outcomes of training (such as specific tasks or topics); the results can provide information on Kirkpatrick levels 1 and 2. Conclusions about the impact of training on researcher behaviour cannot be drawn from self-reflection alone, but perhaps in combination with other tools. The teacher should introduce the usefulness of the tool to learners and encourage them to use it repeatedly. It works best when combined with other measurements to provide a holistic picture of the learning process. The tool is suitable for the HE context and is not field-specific.
The tool is based on the SOLO taxonomy, and the reflective texts can be analysed according to both the SOLO taxonomy and reflection levels.
The tool may be used with all target groups in HE and it is most suitable for short trainings.
The MS Forms version of SRF is available here (learner’s view): https://forms.office.com/e/YTzAzJSAz7
The Google Forms and MS Forms copiable links are here:
The SRC app is under development and expected to be launched by February 2025.
Eye-tracking
Focus and attention are important in acquiring new skills and competencies, and the same applies to ethics and integrity. Monitoring physiological markers in the context of ethics training has not been researched much. In education, there are examples of studying engagement and heart rate, but eye-tracking is novel.
Eye-tracking/gaze-tracking may have the potential to reveal new insights into how learners process information for moral judgement in learning situations. Gaze-tracking collects a very large number of data points and thus allows within-person comparisons by focusing on events that are similar and frequent. This enables analysis of how an individual (re)acts across these situations (Kirkpatrick’s level 2). The data can be viewed, for example, through heat maps.
To facilitate the analysis, visual markers (QR codes) can be used to locate and synchronise the data. The latest SeeTrue eye-tracking device model we have used has a software component that recognises code markers. The gaze-tracking device recognises markers as they appear, depending on where the person wearing the device is looking. With an additional software component developed by researchers at the University of Helsinki, it has been possible to consolidate the gaze-point coordinates relative to the various markers into a unified pair of coordinates, allowing the gaze of the person wearing the device to be tracked on an idealised static version of the scene containing the areas of interest, in our case posters on ethical principles and ethics theories (Figure 13a).
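As a rough illustration of this mapping step (not the University of Helsinki software), the Python sketch below uses a marker detected in the camera frame to estimate a homography to the idealised static scene and then projects a gaze point into scene coordinates. All coordinate values are invented examples, and OpenCV is assumed to be available.

```python
# Hypothetical sketch: map a gaze point from camera-frame coordinates to an
# idealised static scene using a marker-based homography (example data only).
import numpy as np
import cv2

# Marker corner positions detected in the current camera frame (pixels).
detected_corners = np.array([[210, 120], [420, 118], [423, 330], [208, 332]], dtype=np.float32)

# The same corners in the idealised static image of the poster wall (pixels).
scene_corners = np.array([[100, 100], [300, 100], [300, 300], [100, 300]], dtype=np.float32)

# Homography from camera-frame coordinates to static-scene coordinates.
H, _ = cv2.findHomography(detected_corners, scene_corners)

# Gaze point reported by the eye tracker in camera-frame coordinates.
gaze_camera = np.array([[[305.0, 225.0]]], dtype=np.float32)
gaze_scene = cv2.perspectiveTransform(gaze_camera, H)
print("Gaze point in static-scene coordinates:", gaze_scene.ravel())
```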
Eye-tracking data are analysed statistically and often displayed as density plots or heat maps (Figure 13b). These plots show which areas of interest were salient for a subject during a given time interval of the discussion taking place in an episode. Once this representation is available, a sensible temporal analysis of gaze patterns and scan paths is also possible. An initial analysis suggests that the participant whose gaze is visualised in Figure 13b focused especially on the ethical principle of beneficence and on virtue ethics theory. This information can then be considered in light of the case the participants were discussing, as well as the prior and subsequent arguments presented by the participant. If understanding ethical principles and ethics theories is an intended learning outcome of the training, then the way in which learners apply the theories and principles can indicate how effectively the training conveys its content. Note that, for this method to be used in assessing training effectiveness, the information on the poster boards should correspond to content related to the intended learning outcomes of the training.
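The following Python sketch shows one simple way such a density plot could be produced from consolidated gaze points, in the spirit of Figure 13b. The gaze data are randomly simulated and the scene dimensions are assumptions for illustration only.

```python
# Hypothetical sketch: a density plot (heat map) of gaze points in static-scene
# coordinates; the gaze data here are simulated, not real eye-tracking data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Simulated gaze points (pixels), clustered around one assumed poster location.
gaze = rng.normal(loc=(350, 220), scale=(40, 30), size=(2000, 2))

plt.hist2d(gaze[:, 0], gaze[:, 1], bins=60, range=[[0, 800], [0, 600]], cmap="hot")
plt.gca().invert_yaxis()  # image coordinates: origin at the top-left
plt.colorbar(label="gaze samples per bin")
plt.title("Density of gaze points over the poster wall (simulated)")
plt.show()
```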
The analysis is time-consuming and would not be used for large-scale measurement of learning. Specific equipment and software are necessary for collecting and analysing the information. If the facilitator has access to these tools, this measurement provides interesting insights into the learning process. Ethical aspects must also be considered: while the equipment is easy to remove, it nevertheless attaches to the body (Hannula et al., 2022).
Figure 13. a) Left: code-marker recognition (with fisheye distortion correction using our own software); b) right: density plot (heat map) of gaze-point locations during a short interval.
The tool may be most suitable for students and early-career researchers as the target group of the training, and for small groups.
Remarks
Authors: Erika Löfström, Anu Tammeleht, Simo Kyllönen,
This course was produced on behalf of the BEYOND project. The BEYOND project was funded by the European Union under grant agreement no. 101094714.