Expanding the Analytical Horizon: Additional Possibilities of Multimodal Dashboards in ISILA

The development of multimodal dashboards within the ISILA project represents not only a technical enhancement of learning analytics infrastructures, but also a conceptual shift in how learning processes can be observed, interpreted, and acted upon. Rather than treating dashboards as passive visualization tools, the ISILA approach positions them as epistemic instruments: interfaces through which complex, multidimensional learning phenomena can be rendered interpretable for pedagogical decision-making.

In WP5, we explored further possibilities of multimodal dashboards and data reports beyond the piloting phase:

From Data Aggregation to Analytical Integration

A central contribution of the multimodal dashboards developed in ISILA lies in their ability to integrate heterogeneous data sources within a unified analytical environment. The underlying architecture—based on xAPI and Learning Record Stores—enables the aggregation of data originating from learning management systems, external communication tools, surveys, and game-based environments into a common representational format.
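As an illustration, the common representational format can be as simple as wrapping each raw event in an xAPI actor-verb-object statement. The sketch below is a hypothetical mapping (the helper name, e-mail, and object URLs are illustrative, not ISILA's actual pipeline); the verb URIs are standard ADL xAPI verbs.

```python
import json
from datetime import datetime, timezone

def to_xapi_statement(actor_email, verb_id, object_id, timestamp=None):
    """Wrap a raw event as a minimal xAPI statement (actor-verb-object)."""
    return {
        "actor": {"mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id},
        "object": {"id": object_id, "objectType": "Activity"},
        "timestamp": (timestamp or datetime.now(timezone.utc)).isoformat(),
    }

# Events from two different tools mapped onto the same schema.
lms_event = to_xapi_statement(
    "student1@example.org",
    "http://adlnet.gov/expapi/verbs/completed",
    "https://lms.example.org/quiz/42",
)
chat_event = to_xapi_statement(
    "student1@example.org",
    "http://adlnet.gov/expapi/verbs/commented",
    "https://chat.example.org/channel/project-group-3",
)
print(json.dumps(lms_event, indent=2))
```

Once both events share this schema, a Learning Record Store can index and query them uniformly, regardless of the originating tool.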

However, the significance of this integration is not merely technical. By aligning disparate data streams within a shared temporal and analytical framework, dashboards enable the juxtaposition of indicators that would otherwise remain disconnected. For instance, behavioral activity patterns can be interpreted alongside self-reported motivation, or collaborative interaction intensity can be examined in relation to task progression in game-based environments. This co-presence of modalities supports a form of analytical triangulation, where interpretations emerge from the convergence—or divergence—of multiple indicators.
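A minimal sketch of this juxtaposition, assuming both streams have already been reduced to (learner, week, value) tuples after extraction; the learner IDs and values are hypothetical:

```python
# Hypothetical records from two sources, already expressed as
# (learner_id, ISO_week, value) tuples after xAPI extraction.
lms_logins = [("s1", 10, 14), ("s1", 11, 3), ("s2", 10, 9), ("s2", 11, 8)]
survey_motivation = [("s1", 10, 4.2), ("s1", 11, 2.1), ("s2", 10, 3.9), ("s2", 11, 3.8)]

def index_by_key(records):
    """Index (learner, week) -> value so streams can be juxtaposed."""
    return {(lid, week): val for lid, week, val in records}

def triangulate(logins, motivation):
    """Pair behavioral and self-report indicators on a shared time axis."""
    logins_ix, motiv_ix = index_by_key(logins), index_by_key(motivation)
    rows = []
    for learner, week in sorted(logins_ix.keys() & motiv_ix.keys()):
        rows.append((learner, week, logins_ix[(learner, week)], motiv_ix[(learner, week)]))
    return rows

for learner, week, n_logins, motiv in triangulate(lms_logins, survey_motivation):
    print(learner, week, n_logins, motiv)
```

In this toy data, learner s1's logins and motivation drop together in week 11: convergence the dashboard can surface, whereas either stream alone would be ambiguous.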

Multimodal Dashboards as Tools for Construct Validity

One of the most promising possibilities emerging from the ISILA work is the potential of dashboards to enhance construct validity in learning analytics. Traditional dashboards often rely on proxy indicators—such as login frequency or time-on-task—to infer constructs like engagement. Multimodal dashboards, by contrast, allow these constructs to be approximated through multiple, theoretically aligned signals.

For example, engagement can be examined not only through activity logs but also through self-regulated learning (SRL) indicators (e.g., effort, motivation), interaction patterns in collaborative tools, and persistence within game-based tasks. The combination of these modalities enables a more nuanced interpretation that reduces the risk of misclassification inherent in single-source analytics.

This does not eliminate ambiguity, but it reframes it: instead of producing definitive classifications, multimodal dashboards support informed uncertainty, where instructors can weigh multiple pieces of evidence in context.
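This idea of informed uncertainty can be sketched as reporting each normalized indicator alongside a convergence flag, rather than collapsing everything into one label. The signal names, maxima, and the 0.3 spread threshold below are illustrative assumptions, not ISILA's actual scoring:

```python
from statistics import mean

def engagement_evidence(signals):
    """Return each indicator's normalized value plus an agreement flag,
    instead of a single 'engaged / at-risk' classification."""
    normalized = {name: max(0.0, min(1.0, v / hi)) for name, (v, hi) in signals.items()}
    spread = max(normalized.values()) - min(normalized.values())
    return {
        "indicators": normalized,
        "mean": round(mean(normalized.values()), 2),
        "convergent": spread < 0.3,  # threshold is an illustrative choice
    }

# Hypothetical signals as (observed value, plausible maximum) pairs.
evidence = engagement_evidence({
    "activity_log_events": (120, 200),
    "self_reported_effort": (4.5, 5.0),
    "game_task_persistence": (0.2, 1.0),  # low persistence diverges here
})
print(evidence)
```

When the indicators diverge, as in this toy case, the dashboard can flag the disagreement itself as the finding, prompting the instructor to investigate rather than accept an averaged score.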

Extending Analytical Scope: Beyond Individual-Level Monitoring

Another key possibility lies in extending analytics beyond the individual learner. The integration of collaborative data—such as Discord-based interaction traces—enables the representation of group-level dynamics within dashboards. Indicators such as participation balance, communication frequency, and temporal coordination patterns can be visualized and analyzed alongside individual performance and engagement metrics.

This opens the door to forms of analysis that are largely inaccessible in traditional dashboards, including:

  • the identification of participation asymmetries within groups,
  • the detection of emergent leadership or coordination roles,
  • the monitoring of social cohesion and interaction density.
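Participation balance, for instance, can be operationalized as the normalized entropy of per-member message shares; the metric choice and sample counts below are an illustrative sketch, not the project's actual indicator:

```python
from math import log

def participation_balance(message_counts):
    """Normalized Shannon entropy of per-member message shares:
    1.0 means perfectly even participation; values near 0 mean
    one member dominates the conversation."""
    if len(message_counts) < 2:
        return 1.0
    total = sum(message_counts.values())
    if total == 0:
        return 1.0
    shares = [c / total for c in message_counts.values() if c > 0]
    entropy = -sum(p * log(p) for p in shares)
    return entropy / log(len(message_counts))

# Hypothetical per-member message counts for two project groups.
balanced = participation_balance({"ana": 10, "ben": 11, "cleo": 9})
skewed = participation_balance({"ana": 28, "ben": 1, "cleo": 1})
print(round(balanced, 2), round(skewed, 2))
```

Entropy is one of several reasonable choices here (a Gini coefficient would serve similarly); its advantage is a fixed 0-1 scale that is comparable across groups of different sizes.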

In this sense, multimodal dashboards enable a shift from learner analytics to learning system analytics, where the unit of analysis can flexibly move between individuals, groups, and activities.

Temporalization of Learning Processes

A further analytical possibility concerns the temporal representation of learning. Multimodal dashboards can incorporate time-based visualizations that capture how learning unfolds as a process rather than as a series of static outcomes.

This is particularly evident in game-based learning environments such as educational escape rooms, where dashboards can display sequences of actions, time spent on tasks, and patterns of help-seeking. These temporal traces allow instructors to distinguish between qualitatively different learning behaviors—for example, productive persistence versus inefficient trial-and-error—based on the structure and timing of actions rather than on aggregate metrics.
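One way such timing structure can separate behaviors is a simple heuristic over inter-attempt gaps; the threshold, labels, and timestamps below are illustrative, not ISILA's actual classification logic:

```python
def classify_attempts(timestamps, rapid_threshold=5.0):
    """Label a retry sequence by its timing structure: many near-instant
    retries suggest trial-and-error; spaced retries suggest deliberation.
    Thresholds are illustrative choices, not validated cut-offs."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if not gaps:
        return "single attempt"
    rapid = sum(1 for g in gaps if g < rapid_threshold)
    return "trial-and-error" if rapid / len(gaps) > 0.5 else "productive persistence"

# Seconds since task start for each submitted attempt (hypothetical).
print(classify_attempts([0, 2, 3, 5, 6, 8]))   # rapid retries
print(classify_attempts([0, 40, 95, 180]))     # spaced, deliberate retries
```

Both learners here make multiple attempts, so an aggregate metric such as attempt count would treat them identically; only the temporal structure distinguishes them.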

More broadly, the inclusion of longitudinal SRL data enables the tracking of changes in motivation, anxiety, or self-regulation over time, providing insight into how learners’ internal states evolve in relation to course events and interventions.
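A minimal sketch of such longitudinal tracking is a least-squares trend over equally spaced survey waves; the motivation scores below are hypothetical:

```python
def trend(values):
    """Least-squares slope over equally spaced survey waves:
    the sign indicates whether a self-reported state is rising or falling."""
    n = len(values)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(values) / n
    denom = sum((x - mx) ** 2 for x in xs)
    return sum((x - mx) * (y - my) for x, y in zip(xs, values)) / denom

# Weekly self-reported motivation on a 1-5 scale (hypothetical).
motivation = [4.1, 3.8, 3.6, 3.1, 2.9]
print(round(trend(motivation), 3))  # negative slope: motivation declining
```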

Analytical Extensibility: Incorporating Advanced Methods

The multimodal nature of the ISILA data ecosystem also enables the application of a wider range of analytical methods, many of which extend beyond descriptive visualization.

For survey-based data, clustering techniques can be used to identify recurring learner profiles based on combinations of SRL indicators. These profiles are not fixed categories but dynamic groupings that can change over time, reflecting shifts in learning conditions and student behavior.
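The profiling step can be sketched with a plain k-means pass over SRL indicator vectors; in practice a library implementation with principled seeding and a data-driven choice of k would be used, and the scores below are hypothetical:

```python
from math import dist
from statistics import mean

def kmeans(points, k, iters=20):
    """Plain k-means (Lloyd's algorithm) on SRL indicator vectors.
    A stdlib sketch of the profiling idea; naive seeding, fixed iterations."""
    centroids = points[:k]  # naive seeding, for illustration only
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: dist(p, centroids[i]))].append(p)
        centroids = [
            tuple(mean(c) for c in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical (effort, motivation) survey scores on a 1-5 scale.
students = [(4.5, 4.2), (4.1, 4.4), (4.3, 4.0), (1.8, 2.1), (2.2, 1.9), (2.0, 2.4)]
centroids, clusters = kmeans(students, k=2)
print(centroids)
```

Re-running the clustering on each survey wave is what makes the profiles dynamic: a learner can move between groupings as their reported effort and motivation shift.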

For collaborative data, methods such as sentiment analysis, discourse analysis, and social network analysis can be applied to interaction traces. These approaches allow dashboards to move beyond quantifying participation toward analyzing the quality of interaction, including affective tone, communicative functions, and structural patterns of collaboration.
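Of these methods, the structural one is the easiest to sketch without external tooling: the snippet below computes normalized degree centrality over hypothetical reply ties (sentiment and discourse analysis would additionally require NLP libraries):

```python
from collections import Counter

# Hypothetical reply edges mined from chat traces: (sender, addressee).
replies = [
    ("ana", "ben"), ("ben", "ana"), ("cleo", "ana"),
    ("dan", "ana"), ("cleo", "ben"), ("ana", "cleo"),
]

def degree_centrality(edges):
    """Normalized degree centrality, treating replies as undirected ties:
    a structural view of who is central in the collaboration."""
    ties = {frozenset(e) for e in edges}          # dedupe reciprocal replies
    nodes = {n for tie in ties for n in tie}
    degree = Counter(n for tie in ties for n in tie)
    return {n: degree[n] / (len(nodes) - 1) for n in sorted(nodes)}

print(degree_centrality(replies))
```

In this toy network, ana is tied to every other member (centrality 1.0) while dan has a single tie, exactly the kind of structural pattern a dashboard can surface alongside raw participation counts.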

In game-based environments, process mining and sequence analysis can reconstruct learners’ pathways through tasks, revealing common trajectories, bottlenecks, and divergence points. Such analyses provide insight into how learning strategies unfold in practice and how they relate to outcomes.
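A first step toward such process views is a first-order transition count over action sequences; the escape-room actions below are hypothetical:

```python
from collections import Counter

# Hypothetical escape-room action sequences, one list per learner.
traces = [
    ["read_clue", "try_lock", "hint", "try_lock", "solve"],
    ["read_clue", "try_lock", "try_lock", "hint", "solve"],
    ["read_clue", "hint", "try_lock", "solve"],
]

def transition_counts(traces):
    """First-order transition frequencies: the building block of simple
    process-mining views (common paths, bottlenecks, divergence points)."""
    return Counter((a, b) for t in traces for a, b in zip(t, t[1:]))

for (src, dst), n in transition_counts(traces).most_common(3):
    print(f"{src} -> {dst}: {n}")
```

Frequent transitions indicate common trajectories, while transitions that appear in only one trace mark divergence points; dedicated process-mining tooling builds richer models, but on the same underlying counts.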

Together, these methods illustrate that multimodal dashboards are not limited to visualization—they can serve as entry points into more sophisticated analytical workflows, while still maintaining interpretability for educators.

Supporting Reflective, Not Automated, Decision-Making

A notable characteristic of the ISILA approach is the explicit avoidance of fully automated intervention logic. Multimodal dashboards are designed to support reflective teaching practices, rather than to prescribe actions.

This design choice acknowledges the inherent ambiguity and contextual dependency of learning data. While multimodal integration increases the richness of available evidence, it does not eliminate the need for professional judgment. Instead, dashboards function as decision-support systems that augment instructors’ situational awareness.

This positioning is particularly important given the risk of over-reliance on algorithmic outputs in learning analytics. By keeping the interpretive loop centered on the instructor, ISILA maintains a balance between data-driven insight and pedagogical autonomy.

Transferability and Scalability

Finally, the ISILA experience suggests that multimodal dashboards can be designed in ways that are both scalable and transferable across contexts. By relying on interoperable standards (such as xAPI) and focusing on generic, interpretable indicators, the dashboards can be adapted to different courses, tools, and institutional settings without requiring highly specialized infrastructure.

Crucially, multimodality in this context does not depend on advanced sensing technologies. Meaningful multimodal insights can be derived from relatively accessible data sources—such as surveys, communication platforms, and structured activity logs—provided that these are integrated within a coherent analytical framework.

Concluding Remarks

The additional possibilities of multimodal dashboards lie less in their capacity to produce more data, and more in their ability to support richer interpretations of learning. By integrating multiple modalities, enabling temporal and multi-level analysis, and supporting advanced analytical methods, these dashboards extend the scope of what can be known about learning processes in authentic educational settings.

At the same time, their value depends critically on how they are used. Multimodal dashboards do not resolve the fundamental challenges of learning analytics—such as construct validity, contextual sensitivity, or interpretive ambiguity—but they provide more robust tools for engaging with these challenges in a systematic and pedagogically meaningful way.