As the discipline of IO Psychology evolves, practitioners have had to adapt to changes brought about by the fourth industrial revolution. Although such changes take on many forms, common themes across different spheres include:
- Rapid technological innovation (e.g. the influence of AI on work)
- Increased integration between digital systems and people-mediated processes (e.g. Big Data)
- Emphasis on data and evidence-based decision making
Given these factors, it is more important than ever to examine how the fourth industrial revolution might impact the reporting and consumption of assessment results.
We have little doubt that the use of objective assessments in talent management and selection will continue to grow in popularity. Especially given a renewed emphasis on data-driven decisions and scalable efficiency in a post-COVID business environment, using assessments to inform talent decision-making is a clear imperative for talent practitioners.
However, what is bound to change is the modality and method of reporting on assessment results.
Assessment reporting: Time for a change?
Considering how the consumption of data in the modern business environment has changed over the last decade, it is perhaps surprising to note that the reporting of assessment data has not followed suit.
For instance, while most line managers now consume management data and business intelligence using data dashboards and similar visualization technologies, the average assessment report still takes the form of a PDF attachment or “flat file.”
What is potentially problematic about this is:
- Assessment data is not accorded the same standard of presentation as business-related data, and may therefore be viewed as outmoded or less relevant
- The assessment end-user cannot choose their own path through the data and must proceed in a linear fashion, page-by-page
- Flat-file, PDF-type reports are limited in their capacity to clarify assessment data, provide definitions of key concepts, or provide context for scores
In addition, PDF-type reports are vulnerable to security breaches: they are either transmitted via email (which can be sent to the wrong recipient or intercepted) or printed out as hard copies that can be misplaced or lost.
Online, interactive reporting: A new alternative
In designing our online, interactive reporting platform, we wanted to address the shortcomings and potential problems found in flat-file reporting discussed above. To achieve a true sea-change in assessment reporting, our goal was to bring together the following key concepts of the future reporting landscape:
- Data security. Assessment data is highly sensitive, and in a post-GDPR / POPI world, there will be few excuses for not adequately securing personal data.
- Data visualization. Given that assessment data can sometimes be quite complex, especially for non-professional audiences, it becomes vital to use the best practices in data visualization available.
- Design thinking and user experience. Employing principles from design thinking, we saw the opportunity for assessment reporting to be far more user-centric in its presentation. In other words, we wanted the user experience of interacting with assessment data to be friendlier, less mystifying, and ultimately, highly effective at conveying the correct message.
- Actuarial decision-making. Since talent decisions made using actuarial, mechanical principles consistently outperform subjective judgments, we wanted the reporting platform to embody this insight and promote the use of more objective, scientific talent decisions.
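To make the actuarial principle concrete, here is a minimal sketch of mechanical score combination. The predictor names, weights, and cutoff are purely illustrative assumptions for this example, not TTS's actual scoring model; the point is simply that every candidate is evaluated by the same fixed, transparent rule.

```python
# Minimal sketch of mechanical (actuarial) decision-making.
# All weights and the cutoff below are illustrative assumptions.

def composite_score(scores: dict, weights: dict) -> float:
    """Combine predictor scores into one weighted composite."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in weights) / total_weight

def mechanical_decision(scores: dict, weights: dict, cutoff: float) -> bool:
    """Apply the same fixed rule to every candidate: composite >= cutoff."""
    return composite_score(scores, weights) >= cutoff

# Example: sten-scaled (1-10) predictor scores for one candidate.
candidate = {"cognitive": 7.0, "conscientiousness": 6.0, "interview": 5.0}
weights = {"cognitive": 0.5, "conscientiousness": 0.3, "interview": 0.2}

print(round(composite_score(candidate, weights), 2))        # 6.3
print(mechanical_decision(candidate, weights, cutoff=5.5))  # True
```

Because the rule is explicit, it can be audited, validated against outcomes, and applied consistently, which is precisely what subjective, case-by-case judgment cannot guarantee.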
The Interactive Talent Viewer: Development and testing
In our journey of developing the interactive TMR and Merit List platform we utilized several iterative stages of design and user-mediated testing.
The information gained from this process, along with data visualization best practices, informed our eventual design and allowed us to focus on the following themes:
- Be audience-specific. Good data visualization takes its audience into account. In assessments, it is critical to know the potential audience of assessment results. Assessment candidates have different requirements and expectations compared to line-managers. Expert consumers of data often need more drill-down, detailed analysis of data inside their data reports and dashboards. Good visualization should account for such differences.
- Data should answer questions. Best practice data visualization presents results as answers to specific, likely questions the audience would pose to the data. What that means is that visualizations ought to address questions by presenting data in a concise, pointed manner. For instance, a typical and common question a line manager would ask of assessment data is: Does the candidate fit the role based on the assessment results?
If that question is not answered prominently in the immediate data visualization but is instead buried elsewhere in the report, the visualization fails to answer the audience’s questions adequately.
- The type of data should be matched to the right visualization. This is a cornerstone principle of good data visualization. Knowing the type of data that needs to be represented is the first step. In assessment reports, we often have mixed data. Demographic data is categorical, while ability results are often in interval form (e.g. Stens).
One cannot use the same methods of visualization for both types of data. Categorical data can often be best represented with simple icons or single words, while interval data is best represented with bar graphs, scatter diagrams, and similar graphs.
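As an illustration of matching data type to presentation, the sketch below converts standardized z-scores to the sten scale (the standard-ten convention: mean 5.5, SD 2, clamped to 1-10) and picks a presentation style based on the level of measurement. The type-to-chart mapping and function names are our own illustrative assumptions, not the platform's actual logic.

```python
# Sten conversion (standard-ten scale: sten = 2z + 5.5, clamped to 1..10)
# and an illustrative mapping from data type to presentation choice.

def z_to_sten(z: float) -> int:
    """Convert a z-score to a sten score on the 1-10 scale."""
    return max(1, min(10, round(2 * z + 5.5)))

def chart_for(data_kind: str) -> str:
    """Pick a visualization suited to the level of measurement
    (illustrative mapping only)."""
    mapping = {
        "categorical": "icon or label",  # e.g. demographic data
        "interval": "bar graph",         # e.g. sten-scaled ability results
    }
    return mapping.get(data_kind, "table")

print(z_to_sten(0.3))          # 6
print(z_to_sten(-2.8))         # 1 (clamped to the bottom of the scale)
print(chart_for("interval"))   # bar graph
```

Keeping the conversion and the presentation choice as separate steps mirrors the principle above: first identify the data type, then select the visualization that fits it.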
The Interactive Talent Viewer: Implementation
After several phases of extensive interface and user acceptance testing, TTS is proud to announce the roll-out of our interactive, online Talent Viewer platform.
As discussed above, it was paramount for us to create a seamless, user-friendly experience for assessment data end-users that would accommodate the unique needs of assessment experts (e.g. IOPs) as well as non-experts such as line managers.
An important factor in this regard was to enable users to “drill-down” into data, thus deciding on their own course through the assessment results.
In addition, when examining individual data, it was important for us to emphasize the holistic nature of assessment data without sacrificing actuarial judgment. As a result, our interactive platform emphasizes talent match data along with other considerations that may influence a talent decision.
We believe that, with the interactive Talent Viewer platform, our clients have a future-proof reporting solution that matches the flexibility and data visualization capabilities of best-of-breed management intelligence dashboards, one that will appeal to line managers and talent professionals alike!
While business and management intelligence reporting has been an early adopter of best-practice data visualization, assessment reporting has, we would argue, lagged behind. Perhaps this is a natural consequence of the work we do as IO Practitioners. The data gathered by assessments is sensitive and highly confidential. It also requires expert interpretation and guidance.
But it would do the discipline of IO Psychology a disservice not to heed the growing field of data visualization.
At TTS, we believe that old ways of reporting assessment results may, at best, confuse or frustrate potential consumers and, at worst, subvert the scientific rigor of the discipline (by, for instance, prioritizing clinical judgment over mechanical decision-making).
In the roll-out of our online, interactive reporting platforms, we foresee a future where best-practice data visualization is integrated into assessment reporting. This new technology will allow consumers of assessment data to interact with results in new ways, and to decide how they journey through their data.
If you’re interested in using our new online, interactive reporting solutions, why not drop us a line at firstname.lastname@example.org?