Improving user experience of assessment results with interactive reporting

How can industrial and organizational psychologists (IOPs) improve the user experience of assessment results? In this article, we build on previous posts that argued for applying best-practice data visualization principles to assessment reporting. One of the conclusions we drew from that work was that user acceptance testing is a key feature of successful data visualization implementations.

Why is that so important?

Because no matter which data visualization is ultimately chosen, it will not succeed unless user experience is taken into consideration and treated seriously as part of the reporting solution.

Let’s take a closer look at what is meant by user experience, and at how online, interactive reporting can significantly improve the way users of assessment results experience reporting.

User experience defined

While user experience is often associated with information technology design and development, IO Practitioners may be surprised to learn that it is actually a subdiscipline of ergonomics, which has traditionally formed part of many IO Psychology curricula taught around the world.

In fact, the international standard for the ergonomics of human-system interaction, ISO 9241-210, defines user experience in terms of distinctly psychological constructs, such as a user’s perceptions and responses. Therefore, it might be argued that the IOP profession is ideally suited to investigate (and improve) user experience.

A closely aligned concept is usability, i.e. the ease with which a particular product or interface can be used, and the degree to which it helps the user accomplish their goals and objectives.

Although IOPs are well placed to contribute to debates surrounding usability and user experience, in practice they seldom do. This is unfortunate, because when it comes to assessment reporting, user experience and usability are vital.

The usability of assessment reporting

Through frequent consultations with our clients, combined with our research on best practices in data visualization, we have identified several key components of what ideally usable reporting looks like.

For instance, best-practice data visualization principles suggest that good assessment reporting should have the following attributes:

  • Audience-specificity: showing the data most appropriate to the current end-user (e.g. expert-level data for experts, easily consumable data for non-experts).
  • Data that answers questions: reporting should answer burning questions first and in easy-to-understand ways, rather than burying vital data deep inside a report or presenting less important data too prominently.
  • Data presented with the correct visualization: this is a cornerstone principle of good data visualization. The same visualization cannot be used for all types of data. For instance, categorical data is often best represented with simple icons or single words, while interval data is best represented with bar graphs (see the sketch below).
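
To make that last principle concrete, here is a minimal, illustrative sketch in TypeScript of how a reporting layer might choose a visual based on the measurement scale of a result. It is not drawn from any TTS implementation; the names, scale categories, and mappings are assumptions made purely for the example.

```typescript
// Minimal sketch: pick a visualization based on the measurement scale
// of an assessment result. Names and mappings are illustrative only.

type Scale = "categorical" | "ordinal" | "interval";

interface AssessmentResult {
  label: string;          // e.g. a competency or scale name
  scale: Scale;           // measurement scale of the underlying data
  value: string | number; // the score or category to display
}

// One visualization per measurement scale, following the principle above:
// categorical data as icons or labels, interval data as bar charts, etc.
const visualForScale: Record<Scale, string> = {
  categorical: "icon-or-label",
  ordinal: "ranked-dot-plot",
  interval: "bar-chart",
};

function recommendVisual(result: AssessmentResult): string {
  return visualForScale[result.scale];
}

// Hypothetical example data
const results: AssessmentResult[] = [
  { label: "Preferred work style", scale: "categorical", value: "Analytical" },
  { label: "Numerical reasoning", scale: "interval", value: 72 },
];

for (const r of results) {
  console.log(`${r.label}: render as ${recommendVisual(r)}`);
}
```

The point of the sketch is simply that the rendering decision is driven by the nature of the data, not by a one-size-fits-all report template.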

Based on the above, we can already see that conventional, offline reporting practices often fall short of best-practice data visualization.

Client needs and user experience

A few years ago, we asked our clients what they saw as the future requirements for talent assessment reporting. We also wanted to know how we could improve their user experience of assessment results. Their responses were instructive. Clients wanted reporting that:

  • Allowed for user-specific data journeys
  • Was instantly available
  • Was digitally secure
  • Required little to no prior knowledge of psychometrics to understand and use
  • Was interactive

When taken together with best-practice principles of data visualization, these needs point to a set of user experiences that the reporting of the future will have to accommodate.

For instance, future reporting will necessarily be online and digital in order to meet the requirements of being secure and interactive. Another requirement is ease of use, which will demand extensive user testing and acceptance of any new reporting solution.

The TTS Interactive Talent Viewer

Based on the principles outlined above, TTS has developed the iTalentViewer, a platform that delivers online, interactive Merit Lists and Talent Match Reports to assessment end-users.

Our aim in developing this platform was to bring together four key requirements for the future of assessment reporting:

  • Robust data security
  • Enhanced user experience and usability
  • Reporting that promotes actuarial talent decisions
  • Best-practice data visualization

Using principles founded in agile and DevOps methodologies, we developed a solution informed by extensive user testing and usability studies, something that is difficult to achieve with conventional reporting.

We believe that our interactive reporting platform is a future-proof solution for clients. The way it renders data also respects the unique nature of talent assessment data and the needs of assessment end-users such as line managers and HR professionals.

The interactive Talent Viewer represents a fundamental change in the way our clients will consume and utilize assessment data in the future.

Not only is it fully secure and GDPR compliant, but the platform also allows for continuous advances in assessment reporting through constant enhancements and upgrades: a familiar concept in software as a service (SaaS) domains.

As an example of such upgrades, group data analytics as well as video and audio integrations are currently in development for deployment in the near future.

Final thoughts

In previous articles, we questioned whether assessment reports have kept pace with contemporary trends in data visualization.

For instance, line managers routinely consume business metrics by way of data dashboards and similar tools. Yet, assessment results relevant to potential hires or highly talented team members routinely find their way to a manager’s desk via an emailed or printed PDF report (or spreadsheet in the case of a Merit List).

TTS’s interactive Talent Viewer platform will change that.

Now, assessment professionals, IOPs, and talent managers can be assured that vital talent data will be presented in a secure, best-practice, and technologically advanced manner. What’s more, line managers will recognize the familiar interactive conventions for consuming data that they already use in other dashboard systems and data domains.

This innovation might well be the death of PDF!

Here at TTS, we believe that this new technology will significantly enhance the experience of consuming assessment results. For more information on our interactive reporting solutions, or to take part in this exciting project, drop us a line at info@tts-talent.com.