{"id":17363,"date":"2014-01-07t08:43:53","date_gmt":"2014-01-07t13:43:53","guid":{"rendered":"\/\/www.imrbdigital.com\/?p=17363"},"modified":"2015-01-09t06:07:45","modified_gmt":"2015-01-09t11:07:45","slug":"whats-in-your-syllabus","status":"publish","type":"post","link":"\/\/www.imrbdigital.com\/2014\/01\/whats-in-your-syllabus\/","title":{"rendered":"what’s in your<\/i> syllabus?"},"content":{"rendered":"
by nancy chick, cft assistant director<\/em><\/p>\n i just posted my syllabus on yes (your enrollment services, where students register online) as part of vanderbilt’s effort to help students make informed choices about their courses, and i had a moment of anxiety.\u00a0 like anne bradstreet in “the author to her book<\/a>,” despite my worries that it’s “unfit for light,” it’s now “exposed to public view.” how will prospective students read and understand my syllabus?\u00a0 what does it tell them about my teaching? what do i think<\/em> it tells them about my teaching? <\/span><\/p>\n a recent study points to the challenge of relying on a syllabus to make judgments about a course or an instructor.<\/strong> in “the impact of the course development institute on faculty practice,” metzler, rehrey, and kurz (2013) analyzed syllabi, major assignments, and their rubrics to measure the impact of a summer course design workshop on participants’ actual classes. they also surveyed the participants about their intentions and perceptions of this impact. the researchers used this rubric<\/a> to analyze the syllabi as direct evidence of participants’ teaching, specifically looking for seven factors that align with the workshop’s curriculum: course goals, student learning outcomes, alignment of assessment and activities with course goals, expectations of student performance, authenticity of assessments, scaffolding of learning activities, and integrative learning.<\/p>\n