Department

University of Tennessee at Chattanooga. Dept. of Psychology

Publisher

University of Tennessee at Chattanooga

Place of Publication

Chattanooga (Tenn.)

Abstract

When I/O psychologists collect self-report data, they hope that every participant will carefully reflect on every questionnaire item. Unfortunately, this is not always the case: datasets often include some participants who have responded carelessly to part, or all, of the study questionnaire (Meade & Craig, 2012). In this presentation I will discuss the detection and prevention of careless responding and identify the conditions that are most likely to produce it. The following subsections provide an overview of my presentation.

Careless responding occurs when research participants provide inaccurate data because they have failed to carefully read or comply with questionnaire instructions and item content (Huang, Curran, Keeney, Poposki, & DeShon, 2012). Because of its breadth, a variety of measures are needed to adequately capture careless responding. I will discuss several of these measures, including (a) infrequency indices, (b) inconsistency indices, (c) long string indices, (d) page time, and (e) self-reported carelessness (see Huang et al., 2012; Maniaci & Rogge, 2014; Meade & Craig, 2012).

Research using undergraduate samples has found that roughly 12% of participants engage in egregious careless responding (Meade & Craig, 2012). Unfortunately, such levels of carelessness are sufficient to bias one's research findings. An important question remains: are unacceptable levels of careless responding present within applied datasets? As I will discuss in my presentation, applied datasets can contain high levels of careless responding. Meade and Craig (2012) discussed four potential causes of careless responding: (a) questionnaire length, (b) minimal researcher-participant social contact, (c) environmental distractions, and (d) lack of participant interest in the questionnaire content. I will discuss the degree to which these qualities are present within various applied situations.
Participants are less likely to respond carelessly when careful responding is incentivized (Huang et al., 2012). I will discuss applied situations in which careful responding is inherently rewarded, and I will discuss how practitioners can use extrinsic rewards to encourage careful responding.

This presentation will also address issues related to Millennials entering the workplace. Numerous articles have been written about the challenges that this generation faces and presents, but many misconceptions about these individuals persist. This presentation will address the myths and truths about the Millennial generation and the accompanying implications for work performance. This will be an interactive presentation, and audience questions are welcome.
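Of the detection measures named in the abstract, the long string index is the most mechanical: it flags participants whose longest run of identical consecutive answers is implausibly long for the scale being used. A minimal sketch of that computation follows; the function name, example vectors, and any flagging threshold are illustrative assumptions, not taken from the presentation itself.

```python
def long_string_index(responses):
    """Return the length of the longest run of identical consecutive
    answers in a single participant's response vector."""
    if not responses:
        return 0
    longest = current = 1
    for prev, curr in zip(responses, responses[1:]):
        # Extend the current run if the answer repeats; otherwise reset it.
        current = current + 1 if curr == prev else 1
        longest = max(longest, current)
    return longest

# Hypothetical Likert-type (1-5) response vectors for two participants:
varied   = [4, 2, 5, 3, 1, 4, 2, 5, 3, 2, 4]
straight = [4, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3]

print(long_string_index(varied))    # 1
print(long_string_index(straight))  # 10
```

A participant whose index approaches the length of the questionnaire (as in the second vector) is a stronger carelessness candidate than one with varied answers, though what counts as "too long" depends on the scale and item content.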

Date

10-22-2016

Subject

Industrial and organizational psychology

Document Type

Presentations

Language

English

Rights

Under copyright.

Oct 22nd, 11:00 AM to 11:50 AM

Your attention please! Careless responding as a threat to data quality