Abstract
The process of seeking and gaining accreditation for an engineering program changed substantially ten years ago when the EC2000 criteria were implemented. (The moniker EC2000 is no longer in use; they are now simply the ABET criteria.) Programs must now define goals and objectives, provide evidence that graduates are meeting these objectives, and demonstrate evidence of continuous improvement. These accreditation criteria present programs with significant challenges: departments must determine what data are needed and collect them regularly, and, to be sustainable, assessment plans must make efficient use of faculty time. This paper will present strategies for collecting assessment data that serve multiple purposes beyond accreditation, using the Rowan University Junior/Senior Engineering Clinic as an example. The Rowan University Junior/Senior Engineering Clinic is a multidisciplinary, project-based course required for engineering students in all disciplines. Students solve real engineering research and design problems, many of which are sponsored by local industry. Because each Clinic project is unique, grading student work and maintaining approximately uniform expectations across all projects is a significant challenge. At the same time, the Clinic is the course within the Rowan Engineering curriculum that best reflects professional engineering practice. Consequently, the Junior/Senior Clinic provides an excellent forum for assessing whether students have indeed achieved the desired pedagogical outcomes of the curriculum. This paper will present a set of assessment rubrics currently being used by the Rowan Chemical Engineering department. The data collected serve two purposes: they are used both to grade individual student projects and for program-level assessment. The assessment strategies presented are of potential utility to any engineering faculty member, but may be of particular interest to new faculty members, for whom research productivity and generation of publications are essential. This paper will present evidence that the implementation of the assessment process led directly to improved student performance in the Junior/Senior Clinic, and thus improved the overall research productivity of the entire department. Further, new faculty members often have innovative ideas for classroom teaching. This paper will demonstrate how the assessment rubrics have been used as a tool for turning pedagogical innovations into publishable pedagogical scholarship.
| Original language | English (US) |
| --- | --- |
| Journal | ASEE Annual Conference and Exposition, Conference Proceedings |
| State | Published - 2010 |
| Event | 2010 ASEE Annual Conference and Exposition - Louisville, KY, United States. Duration: Jun 20 2010 → Jun 23 2010 |
All Science Journal Classification (ASJC) codes
- General Engineering