Comments from an IL Teacher on PARCC # NextGeneration
[Note: PARCC is an assessment based on the Common Core.]
Dear ICTM colleagues:
With the recent, extensive field test of PARCC test items now completed and some sample items posted on the PARCC web site, teachers now have a better sense of the kinds of test items that PARCC will be asking. (Practice items from the Grades 3-8 performance-based tests will be released in Fall 2014.)
I raise here one concern about plans for the performance (open-response) items in the hope of generating some discussion among ICTM members regarding the limited ways that students are allowed to input their solution strategies for open-response test items.
Many PARCC items, but especially the open-response items that will be part of the performance-based test, require that students show or explain how they solved the problems. That is a good thing. However, to the best of my knowledge, the only options students have for showing how they solved the problems are a simple equation editor or a keyboard for typing in a written explanation. That means that some of the most powerful tools that students typically use in mathematics to solve problems and to explain or justify a solution — graphs, tables, drawings, and other representations — are not available for students to use on the PARCC exams.
Restricting students to verbal explanations (typed in on a keyboard) or equations (typed in using an equation editor) robs students of some of the most common ways they solve problems. If the intent of the test is to assess what students know about the mathematics in a task, the ability to use drawings, rough sketches, calculations (even partial), tables, lists, charts, counting schemes, graphs, and expressions, as well as equations, functions, and written explanations, is essential. Exam scorers also need to see that information in order to accurately assess students' understanding.
Some teachers whose students participated in the PARCC field test reported that students generally solved the open-response (and other) problems using paper and pencil, then attempted to recreate their problem-solving process in a step-by-step, typed-in explanation, sometimes giving up before being able to fully transfer their answer. In some cases, hand-written solutions were submitted to PARCC along with the answers submitted online, and the teachers noted significant differences between the online answers and the hand-written solutions.
The potential undesired implications of not being able to use graphs, tables, and drawings to demonstrate mathematical knowledge are significant. First, students' understanding is likely to appear weaker than their actual proficiency. This is true for students who struggle, as well as for advanced students. Schools may begin favoring only verbal or numerical explanations while downplaying other mathematically rich ways of solving problems and justifying solutions. One could also imagine a strong push, even in the earliest grades, to spend time in mathematics classes on keyboarding skills. Furthermore, PARCC is likely to be offering both online and paper-and-pencil options for administering the test this year, in recognition that some schools may not have the requisite technology for a computer-based-only administration. Will students taking the computer-based test be at a disadvantage compared with those taking the paper-and-pencil version, where graphs, charts, and drawings might be used with the open-response items? If we want to assess what students know (as opposed to what they don't know), students need access to the full range of tools that can be used to provide evidence of their understanding.
That said, I have seen no public discussion of this issue among teachers whose students participated in the PARCC field test. Perhaps such concerns were expressed in direct feedback to PARCC, or perhaps most do not share this concern. Especially if your students participated in the PARCC field test, I encourage you to share any thoughts you may have about this on the ICTM listserv (without discussing any specifics related to individual test items). Do you view this as a problem? Was it a problem for your students in the field test? Note that while PARCC has not yet indicated how much the open-response items will be weighted in a student's test score, earlier PARCC documents indicated that the performance-based exam could account for up to 40% of a student's score. (Potentially, that is a good thing.)
If you prefer not to share your thoughts publicly, you can send them to me and I can submit them to the listserv without attribution.
I understand that developing a high-quality mathematics assessment for millions of students is an extremely complex and formidable task. But PARCC promised us “a next generation assessment system” and we should expect nothing less. Illinois is an important state for PARCC. If this is an issue that merits concern, there may still be time to influence the test development. If this concerns ICTM members, ICTM could advocate on behalf of mathematics students and teachers — to make sure that the results on next year's exam accurately reflect the mathematical understanding and proficiency of Illinois students.
Sincerely,
Marty Gartzman
----------------------------
Martin Gartzman, Executive Director
Center for Elementary Mathematics and Science Education
The University of Chicago
1225 E. 60th Street
Chicago, IL 60637