SIS Journal of Projective Psychology & Mental Health

Published: March 19, 2026

On the Declining Use of Projective Techniques in School Psychology: A Response to Piotrowski (2019)

Nicholas F. Benson, R. G. Floyd, J. H. Kranzler, T. L. Eckert, S. A. Fefer & G. B. Morgan

        This article is a response to the recent editorial by Piotrowski (2019). In our article (Benson et al., 2019), we reported results from a national survey of the tests and assessment practices used by contemporary school psychologists in the United States. We appreciate Piotrowski's (2019) acknowledgement of our results as offering "a keen perspective on the emerging status and breadth of assessment practices, specific to the field of school psychology" (p. 75). Our article focused on a wide range of assessment practices, and Piotrowski's concerns pertained largely to our presentation and discussion of results regarding projective techniques.

        First, Piotrowski (2019) lamented that we downplayed the use of projective techniques and suggested that we cast them as moribund. Specifically, he objected to our statement that, "today, few projective tests of any kind are used" (Benson et al., 2019, p. 43) in school psychology. The word few means not many persons or things. Although it is true that 32% of participants in our study reported using at least one projective technique during the past year, 93% of those who used them did so less than once per month, on average (M = 0.39, SD = 1.36). Thus, Piotrowski's criticism appears to conflate the percentage of practitioners using these techniques with the average frequency of their use. According to our results, projective techniques are administered only about once every three months by school psychologists, on average. This is certainly not many. Moreover, although we did not ask respondents to indicate how they interpreted the tests they administered, other surveys of projective test usage found that these instruments were often used for alternative purposes, such as establishing rapport (Holaday, Smith, & Sherry, 2000; Kennedy, Faust, Willis, & Piotrowski, 1994), and not exclusively to test projective hypotheses. In any case, our results showed a dramatic decline in the use of projective techniques in school psychology relative to results from the previous comprehensive assessment surveys we cited that were conducted more than 20 years ago.1 As our results showed, the administration of projective techniques is no longer standard practice in school psychology.

        Second, Piotrowski (2019) expressed concern that we "singled-out" (p. 75) projective techniques as rarely used instruments while minimizing the infrequent use of narrow-band, symptom-specific measures like the Children's Depression Inventory (CDI; Kovacs, 2010) and Revised Children's Manifest Anxiety Scale, Second Edition (RCMAS-2; Reynolds & Richmond, 2008). Although we did not explicitly label narrow-band instruments as rarely used, we noted that they "are used relatively sparingly" and that the CDI and RCMAS-2 are used "less than once per month, on average" (Benson et al., 2019, p. 45). Our results can reasonably be interpreted as indicating that narrow-band self-report instruments are sparingly used by practitioners in school settings. Furthermore, we also noted the decline in use of tests of perceptual and motor functioning in the same paragraph in question, as our intent was to note trends evident across surveys conducted over time and not to single out specific classes of instruments for criticism.

        Third, Piotrowski (2019) took issue with our claim that the results presented in our article are consistent with the evidence-based practice movement (i.e., a movement toward use of assessment and intervention practices with demonstrated research support; Gross, Farmer, & Ochs, 2018; Kratochwill, 2007). Although there are certainly examples of empirically supported instruments that are rarely used by school psychologists, we believe that the less frequent use of projective techniques by school psychologists can be attributed largely to a mismatch between the constructs measured by these instruments and the purposes of assessments conducted in school settings. Most assessments in schools are conducted as part of the eligibility determination process for special education under the Individuals with Disabilities Education Act (IDEA, 2004), and most referrals occur during childhood rather than during adolescence. Thus, some of Piotrowski’s examples of instruments that have solid empirical support but did not rank highly based on frequency of use, such as the Millon Adolescent Clinical Inventory (Millon, 1993) and the Minnesota Multiphasic Personality Inventory-Adolescent-Restructured Form (Archer, Handel, Ben-Porath, & Tellegen, 2016), are not surprising. Moreover, the top 50 tests in our list consist mostly of those with extensive evidence of reliability and validity to support their use for school-based assessment purposes.

        With respect to projective techniques, our data are consistent with Piotrowski's conclusion that their use "for assessment may be somewhat limited" (p. 75), at least in the field of school psychology. Evidence-based practice involves using procedures supported by empirical research demonstrating their utility in guiding decisions about when intervention is warranted, what the targets of intervention should be, and what specific interventions should be used (Youngstrom & Van Meter, 2016). At the current time, there is a paucity of evidence supporting the utility of scores derived from projective techniques for the determination of eligibility for special education and the design of interventions in school settings (Nelson-Gray, 2003; Whitcomb, 2017).

        In our view, the critiques levied by Lilienfeld, Wood, and Garb (2000), which Piotrowski (2019) cited, were fair, objective, and constructive. In fact, Lilienfeld et al. reported that they found empirical support for some of the Rorschach Inkblot Test and Thematic Apperception Test indexes. They suggested that suboptimal design and construction of projective techniques likely contributed to the empirical shortcomings of other indexes and provided constructive recommendations for addressing these limitations. Specifically, these authors recommended the following principles: (a) aggregation across multiple items, (b) relevance of stimuli to the construct of interest, and (c) iterative, self-correcting test construction. These recommendations fit well with the current definition of validity in the Standards for Educational and Psychological Testing: "Validity refers to the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests" (American Educational Research Association [AERA], American Psychological Association [APA], & National Council on Measurement in Education [NCME], 2014, p. 11). Per this definition, reliability and validity are not properties of tests. Rather, evidence needs to be collected and marshalled for each score that is proffered for interpretation and use. To facilitate this process, it is important that projective techniques be designed to measure clearly defined attributes, that evidence support the derived scores as accurate representations of the intended attributes, and that the number of scores derived from the test not exceed the number of attributes the test is designed to measure (Mari, Carbone, & Petri, 2015).

        Nonetheless, the now decades-old cautions against using projective techniques for clinical and forensic purposes offered by Lilienfeld et al. (2000) still hold true today. Although we noted the marked decline in the use of projective techniques by school psychologists in our article, those who use projective techniques can take solace in the fact that scholarship focusing on their use is alive and well, as evidenced by data presented by Piotrowski (2019; see Table 2). Multiple projective techniques may come to be viewed as evidence-based if they are designed and constructed appropriately and subjected to strong programs of construct validation consistent with the Standards for Educational and Psychological Testing (AERA, APA, & NCME, 2014). Certainly, we welcome assessment innovations based on scientific methods, focusing on children and adolescents, for the determination of special education eligibility, and linked to the design of interventions in school settings.

        Last, Piotrowski (2019) hypothesized that the sharp decline in the use of projective tests in the field of school psychology over the past 20 years is due to the lack of faculty interest. This is quite plausible, although the apparent lack of faculty interest is most likely related to limited research supporting the validity and utility of projective techniques in the schools. Because our study focused on the test use and assessment practices of practitioners in the field, we did not survey faculty in school psychology training programs. Results of our survey therefore did not shed light on whether school psychology programs are currently meeting the training needs of future practitioners with regard to assessment. Moreover, results of our survey do not explain why the observed changes in the field occurred, particularly regarding the use of projective techniques. Further research on the content and scope of assessment curricula in school psychology training programs is obviously needed. We suspect that projective tests are currently given little attention in most training programs, consistent with findings from a summary review of survey results regarding coverage of drawing techniques in graduate-level clinical, counseling, and school psychology programs (Piotrowski, 2016).

Note 1: Piotrowski (2019) discussed the results of two surveys that were not included in our literature review: Kennedy, Faust, Willis, and Piotrowski (1994) and Whitney (2011). Although it was not possible to review and cite every survey relevant to our study, we generally limited our review to national surveys that were conducted on the comprehensive assessment practices of school psychologists and published in peer-reviewed journals. The study by Kennedy et al. focused solely on socio-emotional assessment practices, and the study by Whitney is an unpublished doctoral dissertation.

References:

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing (4th ed.). Washington, DC: Authors.

Archer, R. P., Handel, R. W., Ben-Porath, Y. S., & Tellegen, A. (2016). MMPI-A-RF interpretive manual. Minneapolis, MN: University of Minnesota Press.

Benson, N. F., Floyd, R. G., Kranzler, J. H., Eckert, T. L., Fefer, S. A., & Morgan, G. B. (2019). Test use and assessment practices of school psychologists in the United States: Findings from the 2017 National Survey. Journal of School Psychology, 72, 29–48. doi:10.1016/j.jsp.2018.12.004

Gross, T. J., Farmer, R. L., & Ochs, S. E. (2018). Evidence-based assessment: Best practices, customary practices, and recommendations for field-based assessment. Contemporary School Psychology. https://doi.org/10.1007/s40688-018-0186-x

Holaday, M., Smith, D. A., & Sherry, A. (2000). Sentence completion tests: A review of the literature and results of a survey of members of the Society for Personality Assessment. Journal of Personality Assessment, 74, 371–383. doi:10.1207/s15327752jpa7403_3

Individuals with Disabilities Education Act (2004). Pub. L. No. 108–446.

Kennedy, M. L., Faust, D., Willis, W. G., & Piotrowski, C. (1994). Social-emotional assessment practices in school psychology. Journal of Psychoeducational Assessment, 12, 228–240. doi:10.1177/073428299401200302

Kovacs, M. (2010). The Children's Depression Inventory (2nd ed.). Toronto, ON: Multi-Health Systems.

Kratochwill, T. (2007). Preparing psychologists for evidence-based school practice: Lessons learned and challenges ahead. American Psychologist, 62, 829–843. doi:10.1037/0003-066X.62.8.829

Lilienfeld, S. O., Wood, J. M., & Garb, H. N. (2000). The scientific status of projective techniques. Psychological Science in the Public Interest, 1(2), 27–66. doi:10.1111/1529-1006.002.

Mari, L., Carbone, P., & Petri, D. (2015). Fundamentals of hard and soft measurement. In A. Ferrero, D. Petri, P. Carbone, & M. Catelani (Eds.), Modern measurements: Fundamentals and applications (pp. 203–262). Hoboken, NJ: Wiley-IEEE Press.

Millon, T. (with Millon, C. & Davis, R. D.). (1993). Millon Adolescent Clinical Inventory manual. Minneapolis, MN: National Computer Systems.

Nelson-Gray, R. O. (2003). Treatment utility of psychological assessment. Psychological Assessment, 15, 521–531. doi:10.1037/1040-3590.15.4.521

Piotrowski, C. (2016). Drawing techniques in assessment: A summary review of 60 survey-based studies of training and professional settings. Journal of the Indian Academy of Applied Psychology, 42, 220–236.

Piotrowski, C. (2019). Projective techniques are not moribund: Comment on the Benson et al. (2019) assessment practices article. SIS Journal of Projective Psychology and Mental Health, 26, 73–76.

Reynolds, C. R., & Richmond, B. O. (2008). Revised Children's Manifest Anxiety Scale (2nd ed.). Los Angeles, CA: Western Psychological Services.

Whitcomb, S. A. (2017). Behavioral, social, and emotional assessment of children and adolescents (5th ed.). New York, NY: Routledge.

Whitney, S. R. (2011). Assessment practices of school psychologists (Unpublished doctoral dissertation), City University of New York, New York, NY.

Youngstrom, E. A., & Van Meter, A. (2016). Empirically supported assessment of children and adolescents. Clinical Psychology: Science & Practice, 23, 327–347. doi:10.1111/cpsp.12172.
