Applied Measurement: Industrial Psychology in Human Resources Management, 1st Edition by Deborah L. Whetzel and George R. Wheaton
Product details:
ISBN 10: 1138875961
ISBN 13: 9781138875968
Authors: Deborah L. Whetzel, George R. Wheaton
An updated version of Deborah Whetzel and George Wheaton’s earlier volume, this text is a well-organized sourcebook for fundamental practices in industrial psychology and human resources management. Applied Measurement describes the process of job analysis and test development with practical examples and discusses various methods for measuring job performance. Its primary purpose is to provide practical, systematic guidance on how to develop the various kinds of measurement instruments frequently used in industrial psychology and human resources management to assess personnel. Written in straightforward language with easy-to-follow guidance, Applied Measurement:
contains three new chapters on training and experience measures, assessment centers, and methods for defending the content validity of tests;
includes contributions from many prominent researchers in the field, all of whom have had a great deal of applied experience;
begins each chapter with an overview describing the job analysis or measurement method; and
uses one job, that of an electrician, as an example throughout the book so that readers can easily understand how to apply job analysis data to test development and job performance measurement.
This practical, concise book is recommended for students and entry-level practitioners in the fields of industrial psychology and human resources.
Applied Measurement: Industrial Psychology in Human Resources Management, 1st Edition – Table of Contents:
1 Contexts for Developing Applied Measurement Instruments
Overview
Conducting Job Analyses
Developing a Measurement Plan
Developing Measures to Predict Job Performance
Developing Measures of Job Performance
Assessing the Quality and Legal Defensibility of a Testing Program
Adopting a Common Theme
References
2 Job Analysis: Overview and Description of Deductive Methods
Overview
The Analysis of Work
Methods of Job Analysis
Evaluating the Quality of Job Analysis Information
Theoretical Models of Reliability
Classical Reliability Estimates
Validity
Summary
Other Strategies for Evaluating the Quality of Job Analysis Information
Examples of Deductive Job Analysis Methods
The Occupational Information Network (O*NET)
The Position Analysis Questionnaire (PAQ)
Other Deductive Job Analysis Instruments
Issues in Choosing Job Analysis Methods
Summary
References
3 Job Analysis: Gathering Job-Specific Information
Overview
Background
A Seven-Step Process
Gather Background Information
Identify SMEs
Develop Work Behavior, Task, and KSAO Statements
Develop and Administer Job Analysis Questionnaire
Analyze JAQ Data
Gather Critical Incidents
Analyze Critical Incidents
Summary
References
Appendix A: Job Performance Dimensions (from Williams, Peterson, & Bell, 1994)
4 Measurement Plans and Specifications
Overview
The Measurement Plan
Identify Worker Characteristics
Identify Methods for Measuring Worker Characteristics
Complete the Measurement Plan Matrix
Test Specifications
Describe External Contextual Factors
Describe Internal Attributes
Summary
References
5 Tests of Cognitive Ability
Overview
Historical Foundations
General Cognitive Ability
Specific Ability, Knowledge, and Noncognitive Characteristics
Psychometric Characteristics of Cognitive Ability Measures
Reliability
Validity
Subgroup Differences
How to Select or Develop a Cognitive Abilities Test
Selecting an Existing Cognitive Ability Test
Developing a Cognitive Ability Test
Test Development Procedures
Summary
References
6 Measures of Training and Experience
Overview
Background
Measures of Training and Experience
Data Collection Procedures
Scoring Procedures
Accomplishment Records
Psychometric Characteristics of Measures of Training and Experience
Reliability
Validity
Subgroup Differences
Response Distortion
How to Develop Measures of Training and Experience
Select the Type of T&E
Develop T&E Forms and Scoring System
KSA-Based Questionnaire
Accomplishment Record
Summary
References
7 Employment Interviews
Overview
Background
Types of Structured Interviews
Definition of Structure
Common Structured Interview Formats
Psychometric Characteristics of Employment Interviews
Reliability
Validity
Incremental Validity
Moderators of SI and BDI Validity
Subgroup Differences
How to Create Structured Interview Questions
Basic Methodology
Situational Interviews
Behavior Description Interviews
Summary
References
8 Background Data
Overview
Personnel Selection Issues
Item Relevance
Faking
Item Content
Recall of Life Experiences
Alternative Formats
Scaling Methods
Psychometric Characteristics of Background Data Items
Reliability
Validity
Generalizability of Biodata
Subgroup Differences
How to Generate Background Data Items
Construct-Based Item Generation
Behavioral Consistency Item Generation
Career History Item Generation
Archival Item Generation
Item Response Formats
Assembling a Questionnaire
How to Scale and Validate Background Items
Empirical Keying
Rational Scaling
Factorial Scaling
Subgrouping
Summary
References
9 Situational Judgment Tests
Overview
What is a Situational Judgment Test?
A Brief History of Situational Judgment Tests
The Structure and Format of Situational Judgment Tests
Test Fidelity
Cognitive Complexity
Response Instructions
What do Situational Judgment Tests Measure?
Psychometric Characteristics of Situational Judgment Tests
Reliability
Construct Validity
Criterion-Related Validity
Incremental Validity
Subgroup Differences
How to Develop a Situational Judgment Test
Develop Situational Item Stems
Develop Response Alternatives
Develop a Scoring Key
Summary
References
10 Assessment Centers
Overview
What is an Assessment Center?
Assessment Centers Today
Consensual Scoring Procedures
Types and Amounts of Feedback
Modes of Simulation
Types of Exercises
Psychometric Characteristics of Assessment Centers
Reliability
Validity
Subgroup Differences
How to Develop an Assessment Center
Developing Assessment Center Exercises
Developing Rating Scales
Pilot Testing Exercises and Scales
How to Ensure Quality Assessors
Selecting Assessors
Training Assessors
How to Evaluate the Simulations
Summary
References
11 Performance Measurement
Overview
How to Develop an Effective Performance Management Process
Performance Planning
Ongoing Feedback
Performance Evaluation
Performance Review and Feedback
How to Develop Effective Evaluation Tools
Graphic Rating Scales
Behaviorally Anchored Rating Scales
Behavioral Summary Scales
Behavioral Observation Scales
Mixed Standard Rating Scales
Selecting a Rating Format
How to Evaluate and Improve Performance Rating Quality
Rating Distributions
Rating Reliability
Rater Training to Improve Rating Quality
How to Implement Effective Performance Management Systems
Getting Organizational Members on Board
Communicating
Automating
Testing
Training
Evaluating and Improving
Summary
References
12 Tests of Job Performance
Overview
Characteristics and Uses of Job Performance Tests
Background
Applications of Job Performance Tests
Limitations of Job Performance Tests
Performance Tests: Measuring Job Processes and Job Products
Psychometric Characteristics of Job Performance Tests
Reliability
Validity
Subgroup Differences
How to Select Test Content
Specifying the Job Performance Domain
Sampling Strategies for Selecting Tasks
Selecting Test Content: Two Examples
How to Construct Job Performance Tests
Constructing Work Sample Tests
Scoring Work Sample Tests
Constructing Performance-Based Job Knowledge Tests
Summary
References
13 Strategies for Test Validation and Refinement
Overview
How to Develop a Validation Research Plan
Specify Objectives
Describe Validation Strategy
Develop the Sampling Plan
Select the Criterion Measurement Instruments
How to Collect Data
Collect Personnel Data
Collect Predictor Data
Collect Criterion Data
Prepare Data to Ensure Quality
How to Analyze Validity Data
Compute Basic Descriptive Statistics
Compute Standard Scores from Raw Scores
Compute Correlation Coefficients
Correct for Factors Affecting the Correlation Coefficient
Conduct Regression Analysis
Assess Predictive Bias
Assess Item Characteristics: Classical Item Statistics
Assess Item Characteristics: Item Response Theory (IRT)
Document the Research Design and Study Results
Summary
References
14 Developing Legally Defensible Content Valid Selection Procedures
Overview
Introduction to Litigation: Challenging Content Validity
The Problem of Minimum Qualifications and the Advent of Internet Recruiting
Develop Procedures before Developing an MQ or Other Selection Process
Staffing the Development Project
Selecting the Right Expert
Documenting the Development Process
Additional Considerations in Developing and Documenting the Process
People also search for Applied Measurement: Industrial Psychology in Human Resources Management, 1st Edition:
applied measurement methods in industrial psychology
what is industrial psychology
applications of industrial psychology
what is an industrial psychologist