Bibliographic Details

Practice evaluation for the 21st century / William R. Nugent, Jackie D. Sieppert, Walter W. Hudson.

Author / Creator: Nugent, William R.
Imprint: Belmont, Calif. : Brooks/Cole-Thomson Learning, c2001.
Description: xvii, 478 p. : ill. ; 24 cm.
Language: English
Subjects: Social service -- Evaluation.
Social service -- Data processing.
Evaluation research (Social action programs)
Format: Print, Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/4450212
Other authors / contributors: Sieppert, Jackie D. (Jackie Dale); Hudson, Walter W.
ISBN: 053434867X
Notes: Includes bibliographical references (p. 455-468) and index.
Table of Contents:
  • Preface
  • 1. Introduction
  • Overcoming Bad Vibes
  • Accountability and Effectiveness
  • Two Definitions of Effectiveness
  • Three Definitions of Accountability
  • The Fundamental Equation of the Empirical Practice Model
  • Cost Benefit Analysis
  • Empirical Practice Model
  • A Simple Monitoring Design
  • Scientist and Practitioner
  • Practitioner Role
  • Scientist Role
  • Role Dependency in Science and Practice
  • The Role of Feedback
  • Feedback: An Illustration
  • The Promise of the EPM
  • The EPM for Everyone
  • A Human Problem Theory
  • The Role of Measurement
  • Computers and the Empirical Practice Model
  • Part I. Using Single-Case Time-Series Designs
  • 2. Designing for Effectiveness-Oriented Practice
  • A Basic Evaluation Typology
  • The Role of Goals, Objectives, and Activities
  • Program Goals
  • Program Objectives
  • Practice Objectives
  • Practice Activities
  • Relationship Between Goals, Objectives, and Activities
  • The Role of Information in Human Services
  • Final Thoughts About Collecting Practice Evaluation Information
  • Summary
  • 3. N = 1 Designs in Science and Practice
  • Social Work as Science and Practice
  • N = 1 Designs as Tools to Do and Enhance Practice
  • The Nature of Science Versus the Nature of Practice
  • Using N = 1 Designs to Conduct and Enhance Social Work Practice
  • Intake and Engagement
  • Data Collection and Assessment
  • Planning and Contracting
  • Intervention and Monitoring
  • Termination and Evaluation
  • Summary
  • 4. Monitoring and Evaluating Clinical Practice
  • The Nature of Monitoring Clinical Practice
  • The Single-System Framework
  • Deciding Upon Measurable Client Objectives
  • Selecting Appropriate Measures
  • Collecting Data Over Time
  • Graphing a Single-System Design
  • Phases and Baselining
  • The Concept of Phases
  • The Concept of Baselining
  • Basic Monitoring Designs
  • The B Design
  • The BC and BCD Designs
  • The AB Design
  • Summary
  • 5. Designs to Evaluate Practice
  • The Nature of Evaluating Practice
  • Establishing Causality Using Single-System Designs
  • Threats to Internal Validity
  • Threats to External Validity
  • Basic Evaluation Designs
  • The ABA Design
  • The ABAB Design
  • The BAB Design
  • The BCBC Design
  • The Multiple Baseline Design
  • Summary
  • 6. The Analysis of Single-Case Design Data
  • Visual Analysis and Statistical Analysis Procedures
  • Characteristics of Single-Case Design Data: Definitions and Representations
  • Level
  • Mean
  • Trend
  • Variability
  • Background Variability
  • Overlap
  • Between Phase Change
  • Change in Level and Mean
  • Change in Trend
  • Change in Variability
  • Assessing Between Phase Change
  • Latency of Change
  • Permanent Versus Temporary Change
  • Illustrative Visual Analyses
  • Illustrative Analysis Number One
  • Illustrative Analysis Number Two
  • Illustrative Analysis Number Three
  • Illustrative Analysis Number Four
  • Illustrative Analysis Number Five
  • Illustrative Analysis Number Six
  • Illustrative Analysis Number Seven
  • Illustrative Analysis Number Eight
  • Illustrative Analysis Number Nine
  • Illustrative Analysis Number Ten
  • Examples from the Literature
  • 7. Integrating Single-Case and Group-Comparison Designs for Evaluating Practice
  • Advantages of Integrated Designs
  • A Conceptual Link Between Single-Case and Group Designs
  • Direct Replication
  • Systematic Replication
  • Clinical Replication
  • Linking Single-Case and Group Designs
  • General Procedures for Integrating Single-Case and Group Methods
  • Hypothetical Examples of Integrated Designs
  • Hypothetical Example Number One
  • Hypothetical Example Number Two
  • Examples from the Literature
  • Illustrative Example Number One
  • Illustrative Example Number Two
  • Illustrative Example Number Three
  • 8. Analyzing Aggregated Single-Case Data
  • Graphical Methods
  • The Graphic Representation of Aggregated Data
  • Making Distinctions Between Client Groups
  • Comparing Different Context Variables
  • Statistical Methods
  • Additional Statistical Aids
  • More Advanced Statistics
  • A Simple Example
  • Summary
  • 9. Quality Management: Process and Outcome
  • The Emergence of Managed Care
  • The Spread of Managed Care to Mental Health Services
  • The Role of Quality Management in Managed Care
  • Utilization Management
  • The Role of EPM in Quality and Utilization Management
  • Example of Empirical Practice Model in Quality Management Process
  • Part II. Measurement Tools
  • 10. The Role of Measurement in Practice
  • The Nature of Measurement
  • Measurement Defined
  • Levels of Measurement
  • The Functions of Measurement
  • The Axioms of Treatment
  • Redefining Description
  • Enhancing Precision
  • Facilitating Practice Decisions
  • Improving Information Feedback
  • Ensuring Legal Accountability
  • Attitudes About Measurement
  • Characteristics of Good Measures
  • Reliability
  • Validity
  • Summary
  • 11. A Sampler of Short-Form Measures for Use in Practice
  • The CASS Scales
  • Personal Adjustment Issues
  • Clinical Anxiety Scale
  • Generalized Contentment Scale
  • Index of Alcohol Involvement
  • Index of Clinical Stress
  • Index of Drug Involvement
  • Index of Homophobia
  • Index of Peer Relations
  • Index of Self-Esteem
  • Sexual Attitude Scale
  • Dyadic Relationship Issues
  • Index of Marital Satisfaction
  • Index of Sexual Satisfaction
  • Non-physical Abuse of Partner Scale
  • Partner Abuse Scale: Non-physical
  • Physical Abuse of Partner Scale
  • Partner Abuse Scale: Physical
  • Family Adjustment Issues
  • Child's Attitude Toward Father
  • Child's Attitude Toward Mother
  • Index of Parental Attitudes
  • Index of Brother Relations
  • Index of Sister Relations
  • Index of Family Relations
  • Organizational Assessment Issues
  • Client Satisfaction Inventory
  • Index of Job Satisfaction
  • Index of Managerial Effectiveness
  • Index of Sexual Harassment
  • Summary
  • 12. Administering, Scoring, and Interpreting the Short-Form Scales
  • Structure and Scoring
  • Interpreting the Short-Form Scale Scores
  • Administering the Scales
  • Present the Scales with Confidence
  • Be Familiar with Scale Performance
  • Explain the Purpose of the Scales
  • Avoid Excessive Administration
  • Check Out Inconsistent Responses
  • Consider Social Desirability, Demand Characteristics, and Misleading Responses
  • 13. Multidimensional Assessment Tools
  • The Multi-Problem Screening Inventory (MPSI)
  • The MPSI Items
  • The MPSI Subscale Scores
  • The MPSI Subscales
  • Psychometric Characteristics of Subscale Scores
  • The Family Assessment Screening Inventory (FASI)
  • The FASI Items and Subscale Scores
  • Psychometric Characteristics of FASI Subscale Scores
  • 14. Administering, Scoring, and Interpreting the MPSI and FASI
  • Administering the MPSI and FASI
  • General Guidelines
  • The Nonapplicability of Some Subscales
  • Two Administration Models
  • Scoring the MPSI and the FASI
  • Interpreting the MPSI Subscale and FASI Subscale Scores
  • Interpreting the MPSI and FASI Score Profiles
  • 15. Questions About Using Assessment Scales
  • How to Select and Use the Scales
  • Screening and Diagnostic Applications
  • Using Scales with Children and Special Populations
  • Interpreting and Using Scale Scores
  • Interpreting Changes in Client Scores
  • Client Reactions and Responses
  • Dependability of Client Responses
  • Using Scales for Training and Supervision
  • Administrative and Logistical Questions
  • Research and Technical Questions
  • Summary
  • 16. Developing One's Own Measurement Scales
  • Conceptual Fundamentals
  • Ways of Measuring Client's Problems
  • Classifying Measurement Tools
  • Standardized Scales
  • Items
  • The Development of Standardized Summated Scales
  • The Scale Blueprint
  • Item Development
  • Initial Review of Items
  • Content Validity
  • Pilot Test and Item Analysis
  • Reliability and Validity Studies
  • Client-Specific Measures: Self-Anchored Scales
  • Summary
  • Part III. The CASS Software
  • 17. Acquiring and Installing the CASS Software on a PC or LAN
  • The Role of CASS in Practice
  • Acquiring a Copy of CASS
  • Hardware and Software Requirements for CASS
  • Installing CASS on a Personal Computer
  • Installing CASS from Diskettes
  • Installing CASS from the Web
  • Where Things Are Stored
  • Changing the Default Installation Parameters
  • Loading the Share.exe Program
  • Installing CASS on a Network
  • Calling Up and Using the Software
  • Free CASS Updates
  • Next Steps
  • 18. Using CASS in Practice
  • Setting Up the CASS Software
  • Change the UserID and Password
  • Install the Service Codes
  • Client Records and Service Periods
  • A First Tour of CASS
  • Conducting a First Case Review
  • Reviewing a Caseload
  • Learning About the Tool Bar
  • Ending the First Quick Tour of CASS
  • Working with Case Notes
  • Creating a Case Note Template
  • Naming Each Case Note
  • Dating Case Notes
  • Keeping Case Notes Brief
  • Writing "To Do" Notes
  • Case Note Accountability
  • Editing Case Notes
  • Case Note Limitations
  • Task Management
  • Quality Assurance Monitoring and Managed Care
  • Goal Attainment and Task Completion Scaling
  • Making the Most of Task Management
  • Defining Treatment/Service Delivery
  • Developing New Conceptual Tools
  • Understanding and Reducing Risks
  • Client Assessment and Progress Monitoring
  • Initial Assessment
  • Final Assessment
  • Interim Assessment
  • Problem Monitoring with Short-Form Scales
  • Seeing Real Change
  • Measurement as a Central Practice Task
  • Summary
  • 19. Making CAAP Available to Clients
  • Calling Up CAAP
  • Learning the CAAP Ropes
  • Training Clients
  • Training Receptionists
  • Using CAAP on a Personal Computer
  • Installing a Network
  • Using Two Independent Computers
  • Summary
  • 20. The Manager's Use of CASS
  • Standardizing Practice
  • Managing Service Codes
  • Helping with Program Evaluation
  • The CASS Utility Kit
  • The CASS Program Evaluator
  • Managing the CASS Software
  • Managing Passwords
  • 21. Concluding Remarks
  • Practice Evaluation Themes and Future Directions
  • The Integration of Practice and Evaluation
  • Managed Care and Quality Management
  • The Role of Technology
  • The Impact of Consumerism
  • New Models of Professional Practice
  • Summary
  • Appendix A. Some Technical Details
  • Appendix B. Sample Copies of the CASS Scales
  • Appendix C. Training and Practice Exercises
  • References and Suggested Readings
  • Index