Friday, July 5, 2013

Should job applicants be allowed to access online selection tests using mobile devices?

Written by Neil Morelli and A. James Illingworth

Online selection testing has grown tremendously in recent years due to its ability to process heavy applicant volume, efficiently score assessments, and increase test availability. As a result, how organizations deploy talent assessments, and the ways in which applicants access those assessments, have changed dramatically. Whereas in the past applicants reported to a central location to complete assessments on a desktop computer, the proliferation of wireless technology now allows applicants to access and complete assessments remotely over the Internet on a variety of mobile devices (e.g., smartphones, hand-held tablet computers).
Despite the added benefits of this new accessibility, such as lower delivery costs and greater contact with larger, more diverse applicant pools, organizations face a dilemma: Should the administration of online assessments be extended to mobile devices? Before a clear recommendation can be made, two questions must be answered. Do the hardware features of mobile devices (e.g., smaller screens, virtual keyboards) change how a test works, and thereby change the interpretation of its results? And does completing an assessment on a mobile device produce meaningful differences in applicant performance?
To address these questions, APTMetrics investigated the test equivalence, performance differences, and differential adverse impact of a text-based, non-cognitive test administered to a large sample of customer service job applicants (N = 937,243) using both desktop and mobile formats. The assessment could be interpreted the same way across devices, and job applicants performed equally well regardless of the administration format. This finding is encouraging because it provides evidence that non-cognitive assessments can be just as reliable and valid when delivered on a smartphone or tablet computer as they are on a desktop computer.
Since this initial examination, two other studies have been conducted to replicate and extend these findings. The first was designed to replicate the original results using a different non-cognitive selection assessment and a different job family (retail sales). Using a similar study design and analysis methodology, it was found that the assessment operated similarly across devices and that there were no meaningful performance differences between device groups. The second study extended these findings by focusing on potential differences between devices based on their operating system (e.g., Windows 7, OS X) and Internet browser (e.g., Internet Explorer, Mobile Safari). An examination of three different non-cognitive performance predictors showed that operating systems and Internet browsers native to mobile devices neither changed how the assessments functioned nor produced meaningful performance differences between applicants.
So, should job applicants be allowed to access online selection tests using mobile devices? The evidence gathered thus far is extremely encouraging. Based on this research, text-based, non-cognitive assessments of performance predictors appear to function similarly across devices without affecting applicant performance or resulting in adverse impact.
While these findings suggest organizations can give the green light to job applicants using mobile devices for text-based, non-cognitive tests, many questions remain about the broader use of mobile devices for administering assessments. For instance, what happens as organizations and test developers design assessments specifically intended for mobile devices? How will mobile devices handle more technologically advanced assessments, such as those that incorporate images or video? It will also be important to know whether these results extend to cognitively oriented tests. Research is underway to address these questions as mobile devices become more pervasive and assessments rely more heavily on audio and video multimedia.