KDE responded saying it was all a mistake, that they would follow up and get back to KSN&C with a more formal response. KDE did that today.
This from KDE's Office of Assessment and Accountability, Division of Assessment Support:
POWERPOINT WITH MISINFORMATION CIRCULATING
A PowerPoint has been circulating that concerns formative assessment and answering Kentucky Core Content Test (KCCT) open response questions. This PowerPoint did not come from Measured Progress or KDE. It includes incorrect information.
Specifically, two slides in the PowerPoint state:
· The question does not always mean what it says. There is hidden meaning in some of the questions. Students must go a step beyond what the prompt requires (Like in the old days of assessment). We must coach students up on this. As a matter of routine, students must do more than what is asked of them by the question prompt.
· List does not mean list. The hidden meaning is list and explain or list and describe.
There are no hidden scoring features in the KCCT items or scoring guides. The question means exactly what it says -- students do not have to go beyond what the prompt requires. List means exactly that -- “list.” If there is a reason to explain or describe, the question will ask students to explain or describe. Answers should always be thorough, complete and full enough to relay knowledge and abilities about the question.
Scoring has not changed under the new contractor, Measured Progress. West Ed (the item/scoring guide contractor since the 1990s) still develops all the items and follows the same guidelines as used in the previous contract. Measured Progress applies the scoring guides created by West Ed to the scoring session.
Please see the recently released annotated open response questions for more details. They provide insight into how items are scored.
For more facts about KCCT scoring ... who ya gonna call?
KCCT MYTH: Restating the question is mandatory.
KCCT MYTH BUSTER says: No, it is not required, but it is acceptable to do.
MYTH: Responses restating the question without further information will earn at least one point.
MYTH BUSTER: No; new or additional information must be included in order to receive any credit.
MYTH: A graphic organizer should be done on the response page.
MYTH BUSTER: Depending on the type of question being asked, a graphic organizer may not be the best way to record the answer. Best practice would be to use the organizer on scrap paper to plan the response.
MYTH: Answers must be in paragraph form.
MYTH BUSTER: Scorers are trained to focus on content and not address the format of the response. A response in any format -- bulleted, labeled diagram or graphic organizer -- will be scored. However, the nature of graphic organizers is to outline or abbreviate rather than to give supporting information and/or explanations that are usually required by the questions.
MYTH: Doing more than required by the prompt will score a 4.
MYTH BUSTER: A score of 4 requires the response to completely and accurately reflect the correct answer according to the scoring guide. Extra information is not required to score a 4. If the extra information is inaccurate, it can prevent the response from scoring a 4.
MYTH: Content-specific vocabulary must be used in order to score a 4.
MYTH BUSTER: Not necessarily; if the content can be adequately expressed without the use of specific vocabulary, appropriate credit will be given to the response. Content vocabulary is only required if the question specifically asks for its use.
MYTH: Three or more examples always must be given.
MYTH BUSTER: No; the question will specify the number of examples required. Giving more will not increase the score.
MYTH: Scorers only have 30 seconds to score each piece.
MYTH BUSTER: No; scorers can take as much time as needed to score each response.
MYTH: Released items are “bad” items that have been thrown out of the test.
MYTH BUSTER: Not true; the items are good to use in instruction and are representative of the test. Release of items is not based on how the items performed.
MYTH: “Strike-throughs” on an open response are better than erasing.
MYTH BUSTER: Not necessarily; scorers will score what is written on the page. Strike-throughs may take up more space than erasing, leaving less space to complete a proper response.
MYTH: Open response answers will be scored using the new analytical writing rubric.
MYTH BUSTER: Each open response item has a specific scoring guide/rubric. The questions are scored for content, not writing.
MYTH: Reading and mathematics scores at the new grades added in 2007 will be counted for only NCLB.
MYTH BUSTER: For grades 3-8, all reading and mathematics scores will be used for NCLB and CATS calculations.
KDE says it has, as yet, been unable to pinpoint the original source of the PowerPoint.
SOURCE: KDE communications