In previous posts, we discussed the current and future pressures and imperatives that will shape the evolution of assessment reporting. Today, we take that discussion further and examine the following intriguing question:
- What would happen if best-practice data visualisation principles were applied to how assessment results are reported?
It may seem a superfluous question. Surely assessment results have always been reported using such principles? We would argue that, within the world of assessment reporting, best-practice data visualisation has only sometimes been applied, and at times not at all.
Certainly, those of us who remember internships filled with narrative report writing will know that data visualisation was not always part of the IO Psychology vocabulary. But times have changed, and it is now not uncommon for international forums like the annual SIOP Conference to feature workgroups on data visualisation practices.
What best-practice data visualisation looks like
The field of data visualisation is dynamic and still evolving, but despite its youth, there are some consensus views among its experts on what best practice looks like:
- Be audience-specific. Good data visualisation takes its audience into account. In assessment, it is critical to know who will consume the results: candidates have different requirements and expectations from line managers, and expert consumers often need deeper, drill-down analysis within their reports and dashboards. Good visualisation accounts for these differences.
- Data should answer questions. Best-practice data visualisation presents results as answers to the specific questions the audience is likely to pose to the data, in a concise, pointed manner. For instance, a typical question a line manager would ask of assessment data is: does the candidate fit the role? If that answer appears only deep inside the report instead of in the immediate, first-view visualisation, the visualisation has failed its audience.
- The type of data should be matched to the right visualisation. This is a cornerstone principle of good data visualisation, and identifying the type of data to be represented is the first step. Assessment reports often contain mixed data: demographic data is categorical, while ability results are often in interval form (e.g. Stens). The same visualisation method cannot serve both. Categorical data is often best represented with simple icons or single words, while interval data suits bar graphs, scatter plots, and similar charts (see the sketch after this list).
- Iteration and UAT are key. The Agile and DevOps methodologies that have taken many industries by storm apply to data visualisation as well: the best data representations are iterative, released to potential consumers early and often. User Acceptance Testing (UAT) of this kind provides invaluable (and otherwise unattainable) feedback that informs each new iteration, ensuring and encouraging constant improvement of methods.
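To make the data-type principle concrete, here is a minimal sketch in Python with matplotlib. It is not TTS’s reporting code, and the competency names and Sten values are hypothetical; it simply shows interval data (Sten scores) as a bar graph with a norm-mean reference line, while categorical information is reduced to plain words.

```python
# Minimal sketch of matching data type to visualisation (matplotlib).
# Competency names and Sten scores below are hypothetical examples.
import matplotlib.pyplot as plt

# Interval data (Sten scores, 1-10) suits a bar graph.
competencies = ["Numerical reasoning", "Verbal reasoning", "Abstract reasoning"]
sten_scores = [7, 5, 8]

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(competencies, sten_scores, color="steelblue")
ax.axvline(x=5.5, color="grey", linestyle="--", label="Norm mean (5.5)")
ax.set_xlim(0, 10)
ax.set_xlabel("Sten score")
ax.legend(loc="lower right")

# Categorical data (e.g. candidate demographics) is best shown as a
# simple word or label, not a chart: here, plain text in the title.
ax.set_title("Candidate: J. Smith  |  Role family: Analyst")

fig.tight_layout()
plt.show()
```

Note how the dashed norm-mean line also serves the second principle: a line manager’s first-view question, whether the candidate scores above or below the norm, is answered at a glance.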
In contrast to these principles of continuous deployment and improvement, assessment reporting has often been static and slow to change. Of course, future systems and methods could enable a far quicker, more agile turnaround in presenting assessment results in enhanced and innovative ways.
Data visualisation applied: The case for assessment results
While business and management intelligence reporting was an early adopter of best-practice data visualisation, assessment reporting has, we would argue, lagged behind. Perhaps this lag is a necessary function of the work we do as IO Practitioners: the data gathered by assessments is sensitive and highly confidential, and it requires expert interpretation and guidance.
But it would do the discipline of IO Psychology a disservice not to heed the growing field of data visualisation. At TTS, we believe that old ways of reporting assessment results may, at best, confuse or frustrate potential consumers and, at worst, subvert the scientific rigour of the discipline (for instance, by prioritising clinical judgement over mechanical decision-making).
We look forward to a future where best-practice data visualisation is integrated into assessment reporting. This will require new technologies that allow consumers of assessment data to interact with results in new ways and to decide how they journey through their data.
Next steps
We’re excited to continue this work and will be partnering with clients to test our new interactive reporting systems with potential users. If you’re interested in taking part in this ground-breaking work, why not drop us a line at info@tts-talent.com?