ATEG Archives

January 2011

ATEG@LISTSERV.MIAMIOH.EDU

From: Craig Hancock <[log in to unmask]>
Reply To: Assembly for the Teaching of English Grammar <[log in to unmask]>
Date: Sun, 16 Jan 2011 10:34:19 -0500
Karl,
    You make some fine, thoughtful points.
    I think the main objection to these tests is not just that they don't
predict success as well as they could, but that they are high-stakes
tests and therefore have great influence in driving the curriculum. Not
only is portfolio assessment a much better way to assess writing
ability, but by its very nature, it encourages a range of writing
assignments, including many that can't happen under those short term
constraints: the opportunity to deepen understanding of a topic over
time, the opportunity to weigh and assess competing perspectives, the
opportunity to explore one's own values as they relate to public
issues, the opportunity to revise one's work after feedback--the list
could go on for some time. In short, the nature of the test influences
curriculum in a negative way by rewarding a very narrow kind of
writing. Some studies have shown that length matters in scoring, even
when that length is largely filler or "bullshit," so the advice to
students becomes: pad and fill if you need to make it long. It's
not a healthy influence. (That said, I do teach my students how to
write under time pressure, though I do it with essays on course
material. Some writers do well in part because of high course
involvement and thoughtful preparation.)
    If the SAT test required explicit knowledge about language, then the
curriculum would shift to accommodate that. In our current
environment, very few students are exposed to that, so explicit
questions about language might work unfairly as a predictor. On the
other hand, the nature of these questions reinforces the view that
language can be handled unconsciously. It preserves the status quo.
Instead of asking "What are the most useful things for students to
know about language?" we ask "What is the best way to prepare for the
SAT?" The test, which was never designed with that in mind, runs the
show.
   My experience has been that timed tests are especially poor predictors
for second language students. They do not predict how well those
students will do if given assignments that value meaning over a longer
period of time. I also seem to get better predictions from reading
comprehension scores.

Craig


> There are two issues here: the brief and unrepresentative nature of the
> writing sample and the remoteness of the multiple-choice questions from
> what we're really interested in.
>
> Those are both valid complaints, but there's one significant problem:
> once you accept the need for some sort of standardized assessment
> independent of the evaluation by the students' teachers, you are
> essentially forced into both choices by considerations of fairness and
> of the reality of assessments.
>
> Any writing done for a high-stakes evaluation must be done under
> controlled circumstances to prevent cheating. That means no carefully
> revised portfolios. So we're stuck with a timed essay. For the same
> reason, you can't announce a topic ahead of time, as some students will
> memorize a prepared essay and reproduce it. That also
> means the topic has to be broad, because we can't presuppose particular
> content knowledge for a general writing test.
>
> There's also been significant research showing that while
> shorter times result in poorer essays, the time constraint
> affects students more or less equally. That is, when you extend or
> reduce the time, the rank order of students stays pretty much consistent.
>
> The upshot is that if you're going to evaluate an essay, a short, timed
> piece on an unannounced topic is the only fair way to do it.
>
> As for the multiple-choice portion, the problem with the essay is that
> it's the part of the test that shows the most variability. To really
> know how good a writer the student is, we ought to have them write a
> number of essays over a longer period of time. That, obviously, is
> impractical outside the classroom, and the whole point of a standardized
> test is to get information that is independent of classroom assessment.
> Multiple-choice questions are significantly more reliable, but the
> question is whether they measure anything that we care about.
>
> The real question is not whether multiple-choice questions such as those
> that appear on the SAT directly measure a student's writing ability.
> Clearly they don't, and I doubt even the test makers would try to defend
> that. The relevant question is whether they are a valid proxy for good
> writing. In other words, is there a correlation between the ability to
> get a high score on those questions and actual good writing? The
> validity studies that ETS has done claim that there is, or at least that
> scores correlate fairly well with grades in freshman composition classes.
>
> Could you design a better test than the one that appears on the SAT?
> Almost certainly. I'd be willing to bet, however, that any test that
> could show better evidence of validity than what the SAT does would wind
> up making the same basic decisions about format.
>
> On 1/15/2011 1:43 PM, Craig Hancock wrote:
>>      If I missed John's point, I hope he will point that out.
>>      Certainly, the best way to test writing is to look at actual
>> writing.
>> There has been lengthy discussion about whether a timed writing sample
>> on a topic a student has never seen before is a good test of that.
>> (The NCTE task force was highly critical.) Those of us who teach
>> grammar as a subject have various ways to test whether students are
>> learning the material. I am not a big fan of multiple choice tests.
>> Knowledge about language can be tested directly. If you believe it has
>> no value, then it makes no sense to test it.
>>
>> Craig
>>
>>
>>>   Craig, you miss John's point; students who do well on an SAT grammar
>>> question aren't necessarily better writers, but your question is just
>>> as guilty--if not more so. His point reveals your hypocrisy in
>>> criticizing SAT questions. Be honest: it is hard--perhaps
>>> impossible--to write a multiple-choice grammar question that reveals
>>> writing eloquence. At least most SAT questions can expose errors in
>>> skill.
>>>
>>>
>>> On Jan 15, 2011, at 2:15 PM, Craig Hancock wrote:
>>>
>>>> John,
>>>>     You seem to be making very thoughtful decisions about how to help
>>>> your
>>>> students. If there's an important hoop, you have to take it seriously,
>>>> but you don't need to be subservient to it.
>>>>    When I look at SAT sample questions, I find about one in four
>>>> perplexing. I think they want students to "intuit" a better sentence,
>>>> but they also want those intuitions to match the intuitions of the
>>>> test
>>>> makers, which, I have to admit, are not always the same as mine. I see
>>>> the same dynamic at work with writing teachers who rewrite students'
>>>> sentences in ways that sometimes seem arbitrary or idiosyncratic. In
>>>> effect, they want the students to share their intuitions, to write
>>>> like
>>>> them, without reflection on how difficult (impossible?) that might be.
>>>> I think this is one symptom of an overall loss of knowledge about
>>>> language, and you're right: the students affected the most will be
>>>> those whose background is different from the testmakers. Testing
>>>> explicit knowledge would level the playing field. All students would
>>>> be
>>>> on equal footing.
>>>>    As much as possible, I want students to own their own writing. They
>>>> should be free to break conventions if they can do so knowing what
>>>> those conventions are. They should make choices that they feel best
>>>> convey their own evolving intentions. Knowledge of conventions and
>>>> knowledge of rhetorical options are more important than behavior.
>>>>
>>>> Craig
>>>>
>>>>>
>>>>> Craig,
>>>>>
>>>>> I see another side of this issue every day: students who do very
>>>>> well on SAT questions aren't necessarily better writers for all
>>>>> their awareness (I hesitate to use "knowledge") of SAT error
>>>>> patterns. And then the insinuation is made, yet again, that grammar
>>>>> instruction doesn't improve student writing. Your point about
>>>>> conceptualization is well taken; so, while the SAT test doesn't
>>>>> necessarily approach that aspect, it falls more upon me as a teacher
>>>>> of that test-taking population to approach essential skill sets with
>>>>> a larger picture in mind: the rhetorical function of grammar in a
>>>>> particular phrase, paragraph, etc. We should always hope for
>>>>> transfer of knowledge within and across disciplines, and since, as
>>>>> we all know, the SAT isn't constructed to demonstrate that type of
>>>>> thinking, I find it a matter of classroom practice.
>>>>>
>>>>> The SAT and related test-prep methodology and practices manage to
>>>>> keep students tracked and stratified, which to me is of even
>>>>> greater concern (and a matter for a different thread). Those who do
>>>>> well have paid to learn how to do well, or at least better than the
>>>>> average test taker. It's that student who also can afford to pay to
>>>>> actually attend the school he or she was "smart enough" to get
>>>>> into. Let's be courageous enough to admit that "school" is a
>>>>> glorified class system of haves and have-nots and that Education
>>>>> has done a fine job in keeping those distinctions in proper working
>>>>> order.
>>>>>
>>>>> But back to the change in curriculum addressed by the article: we
>>>>> can hope it moves beyond correctness and into the dynamics of
>>>>> language. I was pleasantly surprised to watch my young nephew, in
>>>>> the 4th grade, learn about predicates and adverb phrases explicitly
>>>>> (while I have 11th grade students who arrive unable to explain
>>>>> nouns, verbs, prepositions, fragments, etc.) ... but again, it's a
>>>>> step on a greater staircase. If we want others (let's start with
>>>>> NCTE, yes?) to believe that this particular, specific type of
>>>>> knowledge is valuable and can in fact improve student writing,
>>>>> which I fully believe it can, then the rest is up to good teaching.
>>>>> Purposeful, explicit, critical, rigorous, and ongoing instruction
>>>>> at that.
>>>>>
>>>>> Thanks...
>>>>>
>>>>> John
>>>>>
>>>>>
>>>>>> Karl,
>>>>>> It's interesting that they still equate grammar with
>>>>>> "conventions" and
>>>>>> with error, though they open up with more sophisticated terminology.
>>>>>> The SAT test doesn't measure explicit knowledge about language; it
>>>>>> simply asks you to find (or intuit) the best choice among options.
>>>>>> There isn't, for example, a need to identify a structure as a
>>>>>> prepositional phrase or modal auxiliary. No need to handle the "in
>>>>>> early morning dawn" type of question we have been discussing other
>>>>>> than to choose it as an alternative. It's interesting that they also
>>>>>> separate proofreading and grammar from "more conceptual" skills,
>>>>>> clearly not even aware that other views of grammar are possible.
>>>>>> (One core concept of cognitive grammar--grammar is
>>>>>> conceptualization.) They still don't seem to be making the
>>>>>> judgment that knowledge about language is valuable in itself.
>>>>>> This is pretty much true of the National Governors' Standards as
>>>>>> well. They are a bit better, but still old school in their
>>>>>> construal of grammar. Whether you are for them or against them,
>>>>>> they still seem to be focused on correctness.
>>>>>> Craig
>>>>>>
>>>>>>> I'm surprised that no one has brought this up. It appears Texas
>>>>>>> schools are going to get a lot more explicit grammatical
>>>>>>> instruction.
>>>>>>>
>>>>>>> http://www.dallasnews.com/sharedcontent/dws/news/localnews/stories/DN-grammar_15met.ART.State.Edition1.14a5f2e.html
>>>>>>>
>>>>>>> When Texas was arguing about new curriculum standards, I heard a
>>>>>>> lot about the fight over the science standards, but nothing at
>>>>>>> all about the English standards.
>>>>>>>
>>>>>>> Are there any Texas educators on the list who would care to comment
>>>>>>> about what difference these changes are making in the trenches?
>>>>>>>
>>>>>>> To join or leave this LISTSERV list, please visit the list's
>>>>>>> web interface at:
>>>>>>> http://listserv.muohio.edu/archives/ateg.html
>>>>>>> and select "Join or leave the list"
>>>>>>>
>>>>>>> Visit ATEG's web site at http://ateg.org/
>>>>>>>
>>>>>>
>>>>>
>>>>> John Chorazy
>>>>> English III Academy, Honors, and Academic
>>>>> Pequannock Township High School
>>>>>
>>>>> Nulla dies sine linea.
>>>>>
>>>>
>>>
>>
>

