Collecting programmatic assessment data with no "extra" effort: Consolidated evaluation rubrics for chemical plant design

Research output: Contribution to journal › Article › peer-review

Abstract

In order to gain accreditation, engineering programs must define goals and objectives, assess whether their graduates are meeting these objectives, and "close the loop" by using the assessment data to inform continuous improvement of the program. In ABET's jargon, program "objectives" describe capabilities that graduates are expected to possess, e.g., "Graduates of the Chemical Engineering program at Rowan University will be able to." Thus, the true success of the program in meeting its objectives is reflected in the first few years of graduates' careers. Practically speaking, a program cannot be expected to assess directly the performance of graduates with respect to these objectives, at least not in a comprehensive way. Consequently, programs are expected to define and assess measurable "outcomes" which fit within the undergraduate curriculum and which ensure, to the best degree possible, that graduates will meet the program objectives. A variety of assessment instruments are in common use, and the merits and shortcomings of each have been discussed in the open literature. For example, surveys and exit interviews are commonly used, but they are subjective, rely on self-assessment, and may oversimplify the questions under examination. This paper focuses on tools for direct measurement of student performance through objective evaluation of work product. Numerous authors have outlined the assessment strategy of constructing rubrics for measuring student achievement of learning outcomes and applying them to portfolios of student work. Other authors have outlined the use of rubrics for evaluation and grading of individual assignments and projects. This paper will describe the use of a consolidated rubric for evaluating final reports in the capstone Chemical Plant Design course.
Instead of grading each report and then having some or all of the reports evaluated through a separate process for programmatic assessment purposes, the instructor evaluates the report once using the rubric, and the same raw data is used both for grading and for programmatic assessment.
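The "evaluate once, use twice" workflow described above can be sketched in code. In this minimal sketch, each rubric item carries a grade weight and a tag linking it to a program outcome, so a single scoring pass yields both a course grade and per-outcome assessment data. All item names, weights, scales, and outcome tags below are illustrative assumptions, not taken from the paper's actual rubric.

```python
# Hypothetical consolidated rubric: one scoring pass feeds both grading
# and programmatic assessment. Items, weights, and outcome tags are
# illustrative only.
from collections import defaultdict

# (rubric item, grade weight, outcome tag) - assumed for illustration
RUBRIC = [
    ("process flow diagram",  0.25, "design"),
    ("economic analysis",     0.25, "design"),
    ("safety review",         0.25, "prof-responsibility"),
    ("written communication", 0.25, "communication"),
]

def evaluate(scores):
    """scores: dict mapping rubric item -> score on a 0-4 scale.

    Returns (course grade out of 100, mean 0-4 score per outcome).
    The same raw scores serve both purposes, so no separate
    assessment pass is needed.
    """
    # Grade: weighted sum of item scores rescaled to 100 points.
    grade = sum(w * (scores[item] / 4) * 100 for item, w, _ in RUBRIC)

    # Assessment: average item scores within each tagged outcome.
    by_outcome = defaultdict(list)
    for item, _, outcome in RUBRIC:
        by_outcome[outcome].append(scores[item])
    outcome_means = {o: sum(v) / len(v) for o, v in by_outcome.items()}
    return grade, outcome_means

# One scoring pass for a single report:
grade, outcomes = evaluate({
    "process flow diagram": 4,
    "economic analysis": 3,
    "safety review": 4,
    "written communication": 2,
})
```

Aggregating the `outcomes` dictionaries across all reports in a cohort would then give the program-level outcome data without any extra evaluation effort beyond normal grading.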

Original language: English (US)
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
State: Published - Jan 1 2011

All Science Journal Classification (ASJC) codes

  • Engineering (all)
