Date of Graduation
1997
Document Type
Dissertation/Thesis
Abstract
The purpose of this study is to determine whether two independent raters can reliably score the Competent Learner Repertoire Assessment Level 2 (CLRA), whether the CLRA is sensitive to student growth over time, and whether the CLRA concurs with another adaptive behavior assessment. To answer these questions, the behavior of approximately 159 preschool students in a large Southern California school district will be assessed with the CLRA. To evaluate inter-observer reliability, the researcher will compare scores on each CLRA item recorded by each of four teacher/associate-teacher (paraprofessional aide) pairs for a systematically selected sub-sample of preschool students drawn from the eight classes they teach. Inter-observer reliability will be analyzed using Cohen's kappa statistic (Kazdin, 1982; Suen & Ary, 1989). To answer the second question, the CLRA will be completed in the fall and again in the spring of the 1996-1997 school year for all students in the larger sample described above, and the fall and spring scores will be compared using a t-test for repeated measures (Glass & Hopkins, 1984). To answer the third question, a systematically selected sub-sample of these students will also be assessed with the Vineland Adaptive Behavior Scale, Classroom Edition, in addition to the CLRA; the relationship between the sub-sample's Vineland results and their CLRA results will be analyzed using a Spearman rank correlation coefficient (McCall, 1994; Siegel, 1956).
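The three analyses named in the abstract (Cohen's kappa for inter-observer agreement, a repeated-measures t-test for fall-to-spring growth, and a Spearman rank correlation for concurrent validity) can be sketched in plain Python. This is an illustrative sketch only; the function names and the toy scores below are hypothetical and are not drawn from the study's data.

```python
import math

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters on nominal item scores."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of items both raters scored identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal category frequencies.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

def paired_t(pre, post):
    """t statistic for repeated measures (e.g. fall vs. spring CLRA scores)."""
    d = [b - a for a, b in zip(pre, post)]   # per-student change scores
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

def _ranks(values):
    """Average ranks (tied values share the mean of their rank positions)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank for the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson r computed on the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = math.sqrt(sum((a - mx) ** 2 for a in rx))
    sy = math.sqrt(sum((b - my) ** 2 for b in ry))
    return cov / (sx * sy)

# Hypothetical toy data, for illustration only.
kappa = cohens_kappa([0, 0, 1, 1], [0, 1, 1, 1])
t_stat = paired_t([1, 2, 3, 4], [2, 4, 4, 6])
rho = spearman_rho([1, 2, 3, 4], [10, 20, 30, 40])
```

In practice these statistics would be computed with standard statistical software; the sketch only makes the abstract's analysis plan concrete, and the t statistic would still be referred to the t distribution with n-1 degrees of freedom to obtain a p-value.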
Recommended Citation
Deem, Joseph Wesley, "Inter-rater reliability, sensitivity to student growth, and concurrent validity of the Competent Learner Repertoire Assessment Level Two." (1997). Graduate Theses, Dissertations, and Problem Reports. 8734.
https://researchrepository.wvu.edu/etd/8734