The Pinnacle grade book was implemented in high schools in 2010-11 and will be expanded to middle schools during 2011-12 and to elementary schools in 2012-13. This grade book can link standards to assignments and assessments. The goal is to shift to a standards-based grading system with an aligned formative and summative assessment system (RTTT Final Scope of Work, Volusia County Schools, retrieved from http://myvolusiaschools.org/modules/_320_1/RTTT_Workplan.pdf).
One major flaw of the 100-point scale is the subjectivity in assigning points. Some teachers give more weight to items representing simpler content, while others give more weight to items that address more complex content that was not directly addressed in class. As a result, each teacher may grade the same assessment differently (Marzano, p. 141).
A student can be assigned a score of 0 only if he or she demonstrates no knowledge of a goal even with help from the teacher. In this case, a 0 would represent the student’s true status. Missing or incomplete assignments are not evidence of knowledge or skill (or lack thereof), and so should not be treated as such (Marzano, p. 146).
When zeros are entered into a student’s academic record for missing evidence or as punishment for missing deadlines, the resulting grade does not accurately reflect student achievement. There are several alternative solutions for these situations that indicate incomplete or insufficient evidence.
• M = no evidence presented/absent
• Z = original assessment not passed to ____% proficiency
• I = re-take not passed
• M’s, Z’s, and I’s calculate the same as a zero grade. However, these alternative symbols communicate student achievement more effectively until sufficient evidence is provided.
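The arithmetic behind the symbols above can be sketched as follows. This is a minimal, hypothetical illustration (not Pinnacle's actual logic; the names SYMBOL_MEANINGS, numeric_value, and average are invented here): each symbol counts as 0 when the average is computed, but the recorded symbol preserves the reason for reporting purposes.

```python
# Hypothetical gradebook sketch: M, Z, and I calculate as zero,
# but the symbol itself still communicates why the grade is missing.

SYMBOL_MEANINGS = {
    "M": "no evidence presented/absent",
    "Z": "original assessment not passed to proficiency",
    "I": "re-take not passed",
}

def numeric_value(entry):
    """Symbols count as 0 in the calculation; numeric scores pass through."""
    return 0.0 if entry in SYMBOL_MEANINGS else float(entry)

def average(entries):
    """Mean of all entries, with M/Z/I treated as zeros."""
    return sum(numeric_value(e) for e in entries) / len(entries)

grades = [95, 88, "M", 92]
print(average(grades))        # identical to the average with a literal 0
print(SYMBOL_MEANINGS["M"])   # but the report can still explain the gap
```

The point of the sketch is that the choice of symbol changes the communication, not the computation: the average is the same as with a zero, while the report retains the reason.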
Grades are broken when parents and students do not understand how grades have been determined, and when they have been excluded from assessment, record keeping, and communication. The fix is to ensure that students and their parents understand how grades have been determined and to involve them as much as possible in all phases of learning and assessment. Grades should always communicate achievement status, and both assessment and grading need to help students achieve at higher levels and develop positive attitudes about learning. This only happens when students and parents are involved as active participants in ongoing assessment and grading, so they see the entire process as something that is done with them, not to them. Teachers benefit when they share with students from the beginning how they will determine grades. It is also important that students (and parents) receive short, clear written statements about grading policy/procedures (O’Connor, 2007).
Consistent grading practices are encouraged when school leadership develops an action plan to align school practices and procedures with the new mission of proficiency (VCSB, 2011):
• The school develops team/grade-level or department agreement on the essential learning targets that should be learned to proficiency.
• High-quality instruction is delivered to all students.
• Grading policies are implemented with fidelity across content areas.
• Curriculum maps are utilized to clarify essential learning targets, to organize pacing of instruction, and to facilitate consistent delivery of instruction.
• The school builds a school-wide intervention plan for the students who have not yet learned to proficiency.
Any assessment that asks students to construct a response longer than a few words, rather than select an answer from a list (multiple choice, matching, or true-false), requires a rubric or scoring guide. A rubric lets students know what is going to count, how much it will count, and what you are looking for in the response. Rubrics are appropriate for the following types of assessments:
• Performance assessment- Assessment based on observation and judgment. You watch students do something (a performance), or create something (a product) and you judge its quality.
• Extended written response- Students write out their response to questions such as why they solved a problem as they did, how something works, why something happened (or did not happen) as it did, how things are alike and different, and so on.
• Extended oral response- Students answer questions orally such as why they solved a problem as they did, how something works, why something happened (or did not happen) as it did, how things are alike and different, and so on. (Arter, 2006)
Yes, rubrics come in several different forms. The varieties depend on what you are trying to assess and the purpose for giving the assessment. There are two major things to think about when choosing a rubric type: the kind of learning target you are assessing, and how you will use the rubric, whether as an assessment of or for learning (Arter, 2006).
It is important to distinguish between analytic and holistic rubrics as well as between task-specific and general rubrics. For example, consider a car engine. How well the engine runs overall can be examined several ways.
• Analytic Rubrics look at student work as having a list of attributes, called criteria or traits, that work together to create a product or performance of various levels of quality (analyzing the engine by examining its component parts).
• Holistic Rubrics consider the product or performance as a whole. The parts are considered all together to come up with a single judgment of how good the product or performance is (a car engine is examined as a whole).
• Task-specific rubrics should be considered when asking students for an extended written or oral response to see how well they understand a specific body of information and how it works together.
• General rubrics work best when you want to see how well students understand a body of information, but the selection of information might vary among students (Arter, 2006).
You should not use rubrics if you want to assess independent pieces of knowledge. Assess these with multiple-choice, true-false, matching, or short-answer items (Arter, 2006).