CRDG faculty and staff do more than conduct evaluation studies; they also engage in research on evaluation. This focus answers the repeated call in the professional literature to establish a stronger theoretical and empirical foundation for evaluators' conceptions of the profession and of evaluation practice, methods, and theory. Examples of research on the practice of evaluation include studies of the degree to which the participation of program personnel in evaluations affects evaluation methods and results, and of the degree to which programs are implemented as intended. Research on the methods of evaluation has examined the development, validation, and use of data collection instruments such as classroom observations and teacher logs; as a result of this work, CRDG faculty and staff have published several evaluation instruments in the professional evaluation literature. Research on the profession of evaluation has included reviews of the extent to which research on evaluation has been reported in the professional evaluation literature. Research on the theory of evaluation has addressed fundamental issues, revisited regularly in new contexts over the years, that undergird how and why evaluations are conducted; theory topics have included the extent to which evaluations should respect indigenous populations, the use of evaluation findings by program personnel, and others.

The results of CRDG faculty and staff's work on these four broad evaluation topics have been presented at national evaluation conferences, published in several national refereed journals, and discussed in books. This work provided part of the background necessary to be awarded grants from the National Science Foundation and the U.S. Department of Education, and it helped serve as the foundation for the selection of a CRDG faculty member as the 2013–2015 editor-in-chief of the American Evaluation Association journal, New Directions for Evaluation. Resources permitting, CRDG faculty and staff intend to broaden their body of research on evaluation practice, methods, the profession, and theory and thereby continue to enhance CRDG's stature and contributions nationally and internationally.
- SAS Code for Calculating Intraclass Correlation Coefficients and Effect Size Benchmarks for Site-Randomized Education Experiments
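The resource above refers to SAS code, which is not reproduced here. As a language-neutral illustration of the core quantity involved, the following Python sketch computes the one-way random-effects intraclass correlation coefficient, ICC(1), for a balanced site-randomized design using the standard ANOVA estimator ICC(1) = (MSB − MSW) / (MSB + (n − 1)·MSW). The function name and example data are hypothetical and are not taken from the published SAS code.

```python
def icc1(sites):
    """Estimate ICC(1) from a balanced design.

    sites: list of equal-length lists of outcome scores, one list per site.
    Uses the one-way random-effects ANOVA estimator:
        ICC(1) = (MSB - MSW) / (MSB + (n - 1) * MSW)
    where n is the number of observations per site.
    """
    k = len(sites)                      # number of sites
    n = len(sites[0])                   # observations per site (balanced)
    grand_mean = sum(sum(s) for s in sites) / (k * n)
    site_means = [sum(s) / n for s in sites]
    # Between-site mean square
    msb = n * sum((m - grand_mean) ** 2 for m in site_means) / (k - 1)
    # Within-site mean square
    msw = sum((x - m) ** 2
              for s, m in zip(sites, site_means)
              for x in s) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)


# Hypothetical example: three sites, four observations each
sites = [
    [10.0, 12.0, 11.0, 13.0],
    [20.0, 22.0, 21.0, 23.0],
    [15.0, 17.0, 16.0, 18.0],
]
print(round(icc1(sites), 3))  # → 0.937
```

In site-randomized education experiments, an ICC near zero means sites are interchangeable, while a large ICC inflates the design effect and reduces the effective sample size, which is why empirical ICC benchmarks of this kind matter for power analysis.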