Publisher
University of Tennessee at Chattanooga
Place of Publication
Chattanooga (Tenn.)
Abstract
Applicant Reactions to Automated Assessments: Moderation by Applicant Quality

Caleb Pollard, Dr. Yalcin Acikgoz

Organizations are beginning to implement technologies such as artificial intelligence with the intent to quickly identify qualified candidates (Wesche & Sonderegger, 2021). Artificial intelligence has recently been employed to automate selection decision-making within screening and job interviews (Jaser et al., 2022). These automated selection processes allow employers to narrow the applicant pool to only qualified candidates significantly more quickly (Noble et al., 2021). Because the use of automation is prevalent, it is important to consider how applicants may react. Links between applicant reactions and both job acceptance intentions and job pursuit intentions have been established in the literature (Chapman et al., 2005, as cited in McCarthy et al., 2017). Overall, the current literature reports that applicants tend to react negatively to the use of automation in selection processes (Acikgoz et al., 2020). However, minimal research has examined moderating variables in this relationship. The current study therefore examines how applicant quality may affect reactions to automation in the selection process. It is expected that higher-quality applicants (e.g., in conscientiousness, cognitive ability, and education) will be more confident in their selection and will therefore react more positively to the use of automation.

H1: Higher applicant quality will result in more positive applicant reactions.

RQ1: Will demographic characteristics have any interaction effect on these relationships?

The data for this study were collected recently for another manuscript examining applicant reactions. A total of 635 Prolific users were randomly assigned to receive a brief description of one of several selection procedures (automated and traditional) and were asked to imagine an organization using that procedure. Reaction data were then collected using the Selection Procedural Justice Scale (SPJS; Bauer et al., 2001). Additionally, measures of "invasion of privacy" and litigation intentions were collected on Likert-type items. Participant characteristics such as self-reported IQ, education level, personality variables, experience, and demographics (race, age, gender) were also collected. A correlational analysis will be conducted to examine whether these variables are related to reactions to automation (an illustrative analysis sketch follows the reference list).

It is expected that applicants of higher quality will react more positively to automation. If so, organizations looking to hire high-quality applicants may not need to be as concerned about the possibility of negative reactions to automation, which may add further support to the use of automation in the selection process. Future research should continue to search for moderating variables.

References

Acikgoz, Y., Davison, K. H., Compagnone, M., & Laske, M. (2020). Justice perceptions of artificial intelligence in selection. International Journal of Selection and Assessment, 28(4), 399–416. https://doi.org/10.1111/ijsa.12306

Bauer, T. N., Truxillo, D. M., Sanchez, R. J., Craig, J. M., Ferrara, P., & Campion, M. A. (2001). Applicant reactions to selection: Development of the Selection Procedural Justice Scale (SPJS). Personnel Psychology, 54(2), 388–420. https://doi.org/10.1111/j.1744-6570.2001.tb00097.x

Jaser, Z., Petrakaki, D., Starr, R., & Oyarbide-Magaña, E. (2022, January 27). Where automated job interviews fall short. Harvard Business Review. https://hbr.org/2022/01/where-automated-job-interviews-fall-short

McCarthy, J. M., Bauer, T. N., Truxillo, D. M., Anderson, N. R., Costa, A. C., & Ahmed, S. M. (2017). Applicant perspectives during selection: A review addressing "So what?," "What's new?," and "Where to next?" Journal of Management, 43(6), 1693–1725. https://doi.org/10.1177/0149206316681846

Noble, S. M., Foster, L. L., & Craig, S. B. (2021). The procedural and interpersonal justice of automated application and resume screening. International Journal of Selection and Assessment, 29(2), 139–153. https://doi.org/10.1111/ijsa.12320

Wesche, J. S., & Sonderegger, A. (2021). Repelled at first sight? Expectations and intentions of job-seekers reading about AI selection in job advertisements. Computers in Human Behavior, 125. https://doi.org/10.1016/j.chb.2021.106931
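Analysis sketch. The following is a minimal illustration, not the authors' actual code, of the kind of correlational and moderated-regression analysis described in the abstract. It is written in Python with pandas and statsmodels; the data file name and the column names (an automated-condition dummy, an applicant-quality composite, and an SPJS reaction score) are hypothetical placeholders.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical participant-level data; all names below are placeholders.
# automated: 1 = automated selection condition, 0 = traditional condition
# quality:   standardized composite of applicant-quality indicators
#            (conscientiousness, self-reported IQ, education level)
# spjs:      mean reaction score on the Selection Procedural Justice Scale
df = pd.read_csv("applicant_reactions.csv")

# Zero-order correlation between applicant quality and reactions
print(df[["quality", "spjs"]].corr())

# Moderated regression: the automated x quality interaction term tests
# whether higher-quality applicants react more positively to automation (H1).
model = smf.ols("spjs ~ automated * quality", data=df).fit()
print(model.summary())

# RQ1 could be explored by adding demographic interaction terms, e.g.:
# smf.ols("spjs ~ automated * quality + automated * C(gender)", data=df).fit()

A significant positive interaction coefficient in a model of this form would be consistent with H1, indicating that the gap in reactions between the automated and traditional conditions narrows as applicant quality increases.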
Date
11-9-2024
Subject
Industrial and organizational psychology
Document Type
posters
Language
English
Rights
http://rightsstatements.org/vocab/InC/1.0/
License
http://creativecommons.org/licenses/by/4.0/
Department
University of Tennessee at Chattanooga. Dept. of Psychology