This paper describes the development of rubrics that help evaluate student performance and relate that performance directly to the educational objectives of the program. Issues in accounting for different constituencies, selecting items for evaluation, and minimizing the time required for data analysis are discussed. Aspects of testing the rubrics for consistency between different faculty raters are presented, along with a specific example of how inconsistencies were addressed. Finally, the difference between course-level and programmatic assessment, and the applicability of rubric development to each, is discussed.
Original language: English (US)
Number of pages: 8
Journal: ASEE Annual Conference Proceedings
State: Published - Dec 1 2002
Event: 2002 ASEE Annual Conference and Exposition: Vive L'ingenieur - Montreal, Que., Canada (Jun 16 2002 - Jun 19 2002)