ATEG Archives

January 2015

ATEG@LISTSERV.MIAMIOH.EDU

Subject: Re: SAT question
From: Linda Comerford <[log in to unmask]>
Reply To: Assembly for the Teaching of English Grammar <[log in to unmask]>
Date: Mon, 5 Jan 2015 18:09:56 -0500
Content-Type: text/plain
Parts/Attachments: text/plain (110 lines)
These are interesting perspectives all around.  Regarding tricky questions, my concern is with the "No error" option that seems to be part of every question.  I'm wondering whether a student who cannot label a particular error, as we've been discussing, would end up choosing "No error."  If "No error" were not provided, then selecting "by them" as the definitive error seems more likely.

Regarding standardized tests, I am in favor of them and glad to see the reasons some of you provided to support their use.  My issue has always been with including questions containing such subtle errors that they lead to more head-scratching than definitive answers for students.  

This is true for grammar exercises in textbooks too.  I've always been a fan of questions that reinforce the rules being taught rather than veer into confusing exceptions.  Agreed?

Linda



Linda Comerford
317.786.6404
[log in to unmask]
www.comerfordconsulting.com


-----Original Message-----
From: Karl Hagen [mailto:[log in to unmask]] 
Sent: Monday, January 05, 2015 3:57 PM
To: [log in to unmask]
Subject: Re: SAT question

I have my own issues with the SAT, but I'm going to be the advocatus diaboli here, because the contrary position is not as flimsy as you perhaps believe.

First, no one who knows anything about standardized testing is claiming that the answer to a single question like this, in isolation, can indicate whether someone is an adequate college-level writer. The real claim (although I suspect you'll object to this one too) is that a series of questions sampling a wide variety of usage problems, taken in aggregate and combined with the score from an essay, is an adequate stand-in for the thing we're really interested in: how ready the student is for college-level writing. In other words, the claim is that the ability to edit someone else's writing to conform to the norms of standard written English can serve as a reasonable proxy for the ability to produce one's own college-level writing.

That claim is subject to empirical testing, and there is published literature providing evidence of a reasonably good correlation. How persuasive those studies are can be the subject for a later discussion, but College Board has assembled a non-trivial body of evidence to support the use of the SAT Writing test, correlating its scores with grades in freshman composition classes (which, I trust, are not graded with multiple-choice tests). That evidence isn't unassailable, but this isn't just a test cobbled together by uninformed amateurs, and if you're going to attack it reasonably, you need to get into the psychometric weeds.

But why should we use a proxy measure at all? Why not just directly assess students' writing ability, as Craig asks? The question is how we can do that in a way that is as fair as possible to as many people as possible and provides useful information to allow test users (i.e., admissions officers) to make informed decisions. What are the alternatives?

(1) No standardized test of writing at all, in which case, we rely on high school grades alone. If we go this route, not only do we lose the ability to compare students, since grading standards vary wildly from school to school, and even from teacher to teacher within a school, but we also lose the ability to determine whether a particular grade average means that the student is college ready or not (because of grade inflation).

(2) Use a portfolio to assess writing ability. This method allows for more authentic writing, but there is no protection against cheating, as a portfolio cannot be created under proctored, secure conditions. In any high-stakes assessment, some students will cheat.

(3) Administer a standardized writing test that is a pure essay test. Intuitively, this seems like the best thing to do, but in fact, the reliability of essay-only tests is very bad. Writers are much less consistent from session to session, and graders have their problems too. There was a study a few decades back that showed AP tests would be significantly more reliable and make better decisions at the score cut points if the essays were removed. Of course essays are still on the APs. This is really a matter of "face" validity, that is, of making the test conform to what test users think it ought to contain, rather than really providing any useful information.

To me, a bigger weakness of the SAT Writing test is the essay, as it's currently structured. In addition to being less reliable than the other components, it's not an authentic task. And College Board hasn't, to my knowledge, provided very good evidence to suggest that the essay actually adds anything to the overall score. This situation may change with the new SAT, where the essay will be radically different, but I remain skeptical.

You dismiss this particular question as "tricky," but I'm not sure I agree. The universal sentiment among those who answered the question seemed to be that it felt wrong, even if they couldn't advance a theory as to why. If a student wrote this, I suspect that most of us would note it as an infelicity (mentally, at least, even if we're not marking errors explicitly). I'd further venture that a student who had an accumulation of this sort of phrasing in a paper would not receive the highest grade.

A tricky question would be one that depended narrowly on some particular rule that is found only in some usage books, or enforced only by a few editors. For example, if students were expected to mark "which" in restrictive clauses as an error, I would call that a fussy trick. To my knowledge, the SAT steers away from such points of disputed usage.


> On Jan 5, 2015, at 10:33 AM, Hancock, Craig G <[log in to unmask]> wrote:
> 
> I can follow Karl's logic, but what it seems to come down to is that it's an error because people whose use of language matters generally say it that way.  My own inclination is to think the pattern of using "those" derives from shortening a noun phrase in which "those" functions as determiner.  "By those people who do not approve of it" becomes "by those who do not approve of it." You can't use "them" as determiner (at least in standard English).  It's interesting that there are three passives in the sentence, none problematic. 
> Does anyone believe for a moment that someone who sees this as an error is better prepared for college than someone who doesn't? Every time I look at these tests, I wonder whether they are doing much more harm than good. Whoever designs them seems to be looking for tricky little ways to catch people. Is our primary purpose for studying language to avoid error? What exactly makes an error an error, especially if usage differs? Why don't we test knowledge about language directly? Shouldn't we, as proponents of TEACHING grammar, be arguing constantly for that? Catching people on some obscure (and questionable) error diminishes the subject. 
> 
> Craig
> 
> 
> -----Original Message-----
> From: Assembly for the Teaching of English Grammar [mailto:[log in to unmask]] On Behalf Of Karl Hagen
> Sent: Monday, January 05, 2015 12:51 PM
> To: [log in to unmask]
> Subject: Re: SAT question
> 
> This problem has nothing directly to do with who/whom (a distinction that the SAT does not test).
> 
> You can't just look at the single word following "by." The object of the preposition is "them/those who did not approve it," and the required word has to do with how it functions in this unit, which is a noun phrase headed by them/those.
> 
> The relative clause "who did not approve it" modifies them/those. But "them," as a personal pronoun, virtually always stands alone in the noun phrase. It doesn't take modifiers like the relative clause. I won't get into a detailed analysis of "those," as modern accounts differ from a traditional analysis and the differences aren't to the point here. Suffice it to say that "those" isn't a personal pronoun and doesn't have the same restriction.
> 
>> On Jan 5, 2015, at 9:34 AM, Jane Saral <[log in to unmask]> wrote:
>> 
>> A recent SAT "ID the error" question reads:
>> 
>> Although it is widely regarded as a masterpiece now, when it was built
>>                            A                                                  B
>> 
>> the Eiffel Tower was compared to a "ridiculous smokestack" by them who did
>>                                                                                            C
>> 
>> not approve of it.        No error
>>            D                       E
>> 
>> 
>> C just sounds wrong.  I would say "by those who did not approve of it."  But isn't the "them/those" word the stand-alone object of the preposition "by," unaffected by the relative clause that follows?  This does not seem to be dealing with the who/whom question; "who" is correctly the subject of "did not approve."
>> 
>> So why is this an error? 
>> 
>> Jane Saral
> 

To join or leave this LISTSERV list, please visit the list's web interface at:
     http://listserv.muohio.edu/archives/ateg.html
and select "Join or leave the list"

Visit ATEG's web site at http://ateg.org/
