ATEG Archives

January 2011

ATEG@LISTSERV.MIAMIOH.EDU

Subject:
From: Craig Hancock <[log in to unmask]>
Reply To: Assembly for the Teaching of English Grammar <[log in to unmask]>
Date: Thu, 20 Jan 2011 10:27:40 -0500
Content-Type: text/plain
Bill,
     Among writing professionals, the prevailing recent view is 
that writing assessment is inherently local. That may seem like an 
attempt to avoid accountability, except that the accountability itself 
is local.
    Local can mean institution-wide standards, but it can also mean varying 
standards even within an institution--hence "writing across the 
disciplines," which acknowledges that writing in the sciences may be 
very different from writing in art history or English. In a two-year 
school, a Liberal Studies (transfer) diploma might require more writing 
than a two-year technical program in a trade. Different departments are 
free to fail students who can't write at an acceptable level. In most 
colleges, those same students can find their way to a Writing Center for 
help. The Writing Center and English Comp (or other entry level courses) 
are often conceived of as service courses to bring students up to speed 
in ways appropriate to their institution. When they fail to do that, 
they face internal pressure.
     I have mixed feelings about that because, for the most part, 
English teachers are not overly interested in writing in other 
disciplines and the disciplines themselves are often very unreflective 
about their own standards and practices. It's a dialogue that hasn't 
proceeded very far.
    Our current attempt at standardization asks students to do something 
that most will not have to do in any other circumstance: write under 
time pressure on a topic they haven't seen before.  Writing teachers 
will  object to that partly because it distorts the idea of writing by 
subtracting it from context. A good writing teacher will help  students 
learn to add substance to their writing, and that's hard to do in a 
closed classroom with no way to add to what they know.
     If a high school has a rigorous writing program, a grade from that 
program will tell far more about how a student will fare in college.

Craig



On 1/19/2011 5:44 PM, Spruiell, William C wrote:
> I've been trying to work through some of the issues that keep popping up in debates like this (partly because I'm also trying to figure out what I really think about standardized testing, which I keep perceiving as a good idea that is implemented in very bad ways, due to too much influence by people who are more interested in its utility for creating appearances than in its actual results). Whatever else is going on, there are at least three totally reasonable desires that people are trying to fulfill; the testing issue just brings them into conflict repeatedly. (I'm just trying to organize things in my head; this is going to be sadly lacking in originality, but for what it's worth...)
>
> (1) We need some way of measuring progress that isn't game-able by students, teachers, or administrators. ("Game-able" here just means that the apparatus behind designing the measurement, using it, and reporting the results can be manipulated by stakeholders in ways that have nothing to do with the educational goals we say we're after.)
>
> (2) We need to measure stuff that means something.
>
> (3) The measurement system can't require more resources than are available.
>
> It's quite possible to have a non-game-able measurement that's meaningless, of course, and far too possible to have an otherwise-meaningful measurement that's game-able. To the extent something gets gamed, it's (educationally) meaningless, but it can be meaningless for other reasons too.
>
> With writing, it *may* turn out to be impossible to satisfy both (1) and (2) while satisfying (3), given current resources. It would become possible if resources were increased, but until that happens, there's no real solution. It's easy to do (1) and (3) if you ignore (2), and it's *really* easy to do a partial version of (1) that involves making a test non-game-able by whichever stakeholder groups you're not in, a tactic that ends up eroding (3) quickly, but in a way that isn't too obvious too fast. Saying that you're ranking any of those three desires as being less important than the others will buy you a ton of trouble (for good reason!), so the public discourse ends up in a rhetorical equivalent of an infinite loop.
>
> With "grammar," there are a number of additional twists. For one thing, there's even less agreement about what students should know about it or be able to do with it than there is for writing, and this can be reflected in some startlingly ambiguous curriculum standards ("Er... does this mean students have to be able to *use* pronouns, which they've been doing since well before first grade, or be able to *identify* pronouns?"). This makes fulfilling (2) difficult, and creates a playground for people who want to dodge (1) while appearing to value it. When I see an ambiguous standard, I'm stuck trying to figure out whether someone just didn't think about it much, or did think about it and is gaming things.
>
> And the kind of language that gives you an unambiguous standard is frequently the same kind that leads to students just memorizing stuff and coughing it up on cue. The immense variability across school systems when it comes to grammar means that if you do anything much on (2), a bunch of the students are unfairly clobbered by whatever test you give.
>
> One of the biggest problems I have with my own classes is that I can't figure out how to address the needs of students who know pretty much nothing about basic terminology without boring the hell out of the ones who already do (I mean, boring them more than a grammar class might do otherwise; one of the other problems we face, of course, is the notion that each student is to be interested in every subject all the time...).
>
> --- Bill Spruiell
>
> -----Original Message-----
> From: Assembly for the Teaching of English Grammar on behalf of Craig Hancock
> Sent: Sun 1/16/2011 10:34 AM
> To: [log in to unmask]
> Subject: Re: Standardized Testing (was Re: Texas is getting explicit)
>
> Karl,
>      You make some fine, thoughtful points.
>      I think the main objection to these tests is not just that they don't
> predict success as well as they could, but that they are high-stakes
> tests and therefore have great influence in driving the curriculum. Not
> only is portfolio assessment a much better way to assess writing
> ability, but by its very nature, it encourages a range of writing
> assignments, including many that can't happen under those short term
> constraints: the opportunity to deepen understanding of a topic over
> time, the opportunity to weigh and assess competing perspectives, the
> opportunity to explore one's own values as they relate to public
> issues, the opportunity to revise one's work after feedback--the list
> could go on for some time. In short, the nature of the test influences
> curriculum in a negative way by rewarding a very narrow kind of
> writing. Some studies have shown that length matters in scoring, even
> when that length is largely filler or "bullshit," so the advice to
> students is to bullshit and fill if you need to make it long. It's
> not a healthy influence. (That said, I do teach my students how to
> write under  time pressure, though I do it with essays on course
> material. Some writers do well in part because of high course
> involvement and thoughtful preparation.)
>      If the SAT test required explicit knowledge about language, then the
> curriculum would shift to accommodate that. In our current
> environment, very few students are exposed to that, so explicit
> questions about language might work unfairly as a predictor. On the
> other hand, the nature of these questions reinforces the view that
> language can be handled unconsciously. It preserves the status quo.
> Instead of asking "What are the most useful things for students to
> know about language?" we ask "What is the best way to prepare for the
> SATs?" The test, which was never designed with that in mind, runs the
> show.
>     My experience has been that timed tests are especially poor predictors
> for second language students. They do not predict how well those
> students will do if given assignments that value meaning over a longer
> period of time. I also seem to get better predictions from reading
> comprehension.
>
> Craig
>
>
>> There are two issues here: the brief and unrepresentative nature of the
>> writing sample and the remoteness of the multiple-choice questions from
>> what we're really interested in.
>>
>> Those are both valid complaints, but there's one significant problem:
>> once you accept the need for some sort of standardized assessment
>> independent of the evaluation by the students' teachers, you are
>> essentially forced into both choices by considerations of fairness and
>> of the reality of assessments.
>>
>> Any writing done for a high-stakes evaluation must be done under
>> controlled circumstances to prevent cheating. That means no carefully
>> revised portfolios. So we're stuck with a timed essay. For the same
>> reason, you can't announce a topic ahead of time, as some students will
>> simply memorize a prepared essay and reproduce it. That also
>> means the topic has to be broad, because we can't presuppose particular
>> content knowledge for a general writing test.
>>
>> There's also been significant research showing that while
>> shorter times result in poorer essays, the time constraint
>> affects students more or less equally. That is, when you extend or
>> reduce the time, the rank order of students stays pretty much consistent.
>>
>> The upshot is that if you're going to evaluate an essay, a short, timed
>> piece on an unannounced topic is the only fair way to do it.
>>
>> As for the multiple-choice portion, the problem with the essay is that
>> it's the part of the test that shows the most variability. To really
>> know how good a writer the student is, we ought to have them write a
>> number of essays over a longer period of time. That, obviously, is
>> impractical outside the classroom, and the whole point of a standardized
>> test is to get information that is independent of classroom assessment.
>> Multiple-choice questions are significantly more reliable, but the
>> question is whether they measure anything that we care about.
>>
>> The real question is not whether multiple-choice questions such as those
>> that appear on the SAT directly measure a student's writing ability.
>> Clearly they don't, and I doubt even the test makers would try to defend
>> that. The relevant question is whether they are a valid proxy for good
>> writing. In other words, is there a correlation between the ability to
>> get a high score on those questions and actually writing well? The
>> validity studies that ETS has done claim that there is, or at least that
>> scores correlate fairly well with grades in freshman composition classes.
>>
>> Could you design a better test than the one that appears on the SAT?
>> Almost certainly. I'd be willing to bet, however, that any test that
>> could show better evidence of validity than what the SAT does would wind
>> up making the same basic decisions about format.
>>
>> On 1/15/2011 1:43 PM, Craig Hancock wrote:
>>>       If I missed John's point, I hope he will point that out.
>>>       Certainly, the best way to test writing is to look at actual
>>> writing.
>>> There has been lengthy discussion about whether a timed writing sample
>>> on a topic a student has never seen before is a good test of that.
>>> (The NCTE task force was highly critical.) Those of us who teach
>>> grammar as a subject have various ways to test whether students are
>>> learning the material. I am not a big fan of multiple choice tests.
>>> Knowledge about language can be tested directly. If you believe it has
>>> no value, then it makes no sense to test it.
>>>
>>> Craig
>>>
>>>
>>>> Craig, you miss John's point; students who do well on an SAT grammar
>>>> question aren't necessarily better writers, but your question is just
>>>> as guilty--if not more so. His point reveals your hypocrisy in
>>>> criticizing SAT questions. Be honest: it is hard--perhaps
>>>> impossible--to write a multiple-choice grammar question that reveals
>>>> writing eloquence. At least most SAT questions can expose errors in
>>>> skill.
>>>>
>>>>
>>>> On Jan 15, 2011, at 2:15 PM, Craig Hancock wrote:
>>>>
>>>>> John,
>>>>>      You seem to be making very thoughtful decisions about how to help
>>>>> your
>>>>> students. If there's an important hoop, you have to take it seriously,
>>>>> but you don't need to be subservient to it.
>>>>>     When I look at SAT sample questions, I find about one in four
>>>>> perplexing. I think they want students to "intuit" a better sentence,
>>>>> but they also want those intuitions to match the intuitions of the
>>>>> test
>>>>> makers, which, I have to admit, are not always the same as mine. I see
>>>>> the same dynamic at work with writing teachers who rewrite students'
>>>>> sentences in ways that sometimes seem arbitrary or idiosyncratic. In
>>>>> effect, they want the students to share their intuitions, to write
>>>>> like
>>>>> them, without reflection on how difficult (impossible?) that might be.
>>>>> I think this is one symptom of an overall loss of knowledge about
>>>>> language, and you're right: the students affected the most will be
>>>>> those whose background is different from the test makers'. Testing
>>>>> explicit knowledge would level the playing field. All students would
>>>>> be
>>>>> on equal footing.
>>>>>     As much as possible, I want students to own their own writing. They
>>>>> should be free to break conventions if they can do so knowing what
>>>>> those conventions are. They should make choices that they feel best
>>>>> convey their own evolving intentions. Knowledge of conventions and
>>>>> knowledge of rhetorical options are more important than behavior.
>>>>>
>>>>> Craig
>>>>>
>>>>>> Craig,
>>>>>> I see another side of this issue every day - students who do very
>>>>>> well on SAT questions aren't necessarily better writers for all
>>>>>> their awareness (I hesitate to use "knowledge") of SAT error
>>>>>> patterns. And then the insinuation is made, yet again, that grammar
>>>>>> instruction doesn't improve student writing. Your point about
>>>>>> conceptualization is well taken; so, while the SAT test doesn't
>>>>>> necessarily approach that aspect, it falls more upon me as a teacher
>>>>>> of that test-taking population to approach essential skill sets with
>>>>>> a larger picture in mind - the rhetorical function of grammar in a
>>>>>> particular phrase, paragraph, etc. We should always hope for
>>>>>> transfer of knowledge within and across disciplines, and since we
>>>>>> all know the SAT isn't constructed to demonstrate that type of
>>>>>> thinking, I treat it as a matter of classroom practice.
>>>>>>
>>>>>> The SAT and related test-prep methodology and practices manage to
>>>>>> keep students tracked and stratified, which to me is of even greater
>>>>>> concern (and a matter for a different thread). Those who do well
>>>>>> have paid to learn how to do well, or at least better than the
>>>>>> average test taker. It's that student who also can afford to pay to
>>>>>> actually attend the school he or she was "smart enough" to get into.
>>>>>> Let's be courageous enough to admit that "school" is a glorified
>>>>>> class system of haves and have-nots and that Education has done a
>>>>>> fine job in keeping those distinctions in proper working order.
>>>>>>
>>>>>> But back to the change in curriculum addressed by the article: we
>>>>>> can hope it moves beyond correctness and into the dynamics of
>>>>>> language. I was pleasantly surprised to watch my young nephew, in
>>>>>> the 4th grade, learn about predicates and adverb phrases explicitly
>>>>>> (while I have 11th grade students who arrive unable to explain
>>>>>> nouns, verbs, prepositions, fragments, etc.) ... but again, it's a
>>>>>> step on a greater staircase. If we want others (let's start with
>>>>>> NCTE, yes?) to believe that this particular, specific type of
>>>>>> knowledge is valuable and can in fact improve student writing -
>>>>>> which I fully believe it can - then the rest is up to good teaching.
>>>>>> Purposeful, explicit, critical, rigorous, and ongoing instruction at
>>>>>> that.
>>>>>>
>>>>>> Thanks...
>>>>>>
>>>>>> John
>>>>>>
>>>>>>
>>>>>>> Karl,
>>>>>>> It's interesting that they still equate grammar with
>>>>>>> "conventions" and
>>>>>>> with error, though they open up with more sophisticated terminology.
>>>>>>> The SAT test doesn't measure explicit knowledge about language; it
>>>>>>> simply asks you to find (or intuit) the best choice among options.
>>>>>>> There isn't, for example, a need to identify a structure as a
>>>>>>> prepositional phrase or modal auxiliary. No need to handle the "in
>>>>>>> early morning dawn" type of question we have been discussing other
>>>>>>> than to choose it as an alternative. It's interesting that they also
>>>>>>> separate proofreading and grammar from "more conceptual" skills,
>>>>>>> clearly not even aware that other views of grammar are possible.
>>>>>>> (One core concept of cognitive grammar--grammar is
>>>>>>> conceptualization.) They
>>>>>>> still don't seem to be making the judgment that knowledge about
>>>>>>> language is valuable in itself.
>>>>>>> This is pretty much true of the National Governors' Standards
>>>>>>> as well.
>>>>>>> They are a bit better, but still old school in their construal of
>>>>>>> grammar. Whether you are for that or against it, the focus still
>>>>>>> seems to be on correctness.
>>>>>>> Craig
>>>>>>>
>>>>>>>> I'm surprised that no one has brought this up. It appears Texas
>>>>>>>> schools are going to get a lot more explicit grammatical
>>>>>>>> instruction.
>>>>>>>>
>>>>>>>> http://www.dallasnews.com/sharedcontent/dws/news/localnews/stories/DN-grammar_15met.ART.State.Edition1.14a5f2e.html
>>>>>>>>
>>>>>>>> When Texas was arguing about new curriculum standards, I heard a lot
>>>>>>>> about the fight over the science standards, but nothing at all
>>>>>>>> about English standards.
>>>>>>>>
>>>>>>>> Are there any Texas educators on the list who would care to comment
>>>>>>>> about what difference these changes are making in the trenches?
>>>>>>>>
>>>>>>>> To join or leave this LISTSERV list, please visit the list's
>>>>>>>> web interface at:
>>>>>>>> http://listserv.muohio.edu/archives/ateg.html
>>>>>>>> and select "Join or leave the list"
>>>>>>>>
>>>>>>>> Visit ATEG's web site at http://ateg.org/
>>>>>>>>
>>>>>> John Chorazy
>>>>>> English III Academy, Honors, and Academic
>>>>>> Pequannock Township High School
>>>>>>
>>>>>> Nulla dies sine linea.
>>>>>>
