The phrase "Fourth of July oratory" has long been used to deride robust expressions of patriotism on Independence Day. "Has the oratory that is peculiar to the Fourth of July," asked Senator Stanley Matthews on the floor of the Senate in 1879, "come to be ... a scorn and a reproach? Is it enough to smother opposition and put down argument to say that that is merely the sentimentality of a Fourth of July oratory?"
Four years later, James Russell Lowell, a poet who served as ambassador to the Court of St. James's, told an Independence Day audience, "Now the Fourth of July has several times been alluded to, and I believe it is generally thought that on that anniversary the spirit of a certain bird known to heraldic ornithologists -- and I believe to them alone -- as the spread eagle, enters into every American's breast, and compels him, whether he will or no, to pour forth a flood of national self-laudation." (He added that it took place only one day a year.)
Mark Twain was one of those who spoofed the Fourth, especially its commercialization by fireworks manufacturers. The master of the tall tale told an Independence Day audience that one of his uncles out West "opened his mouth to hurrah, and a rocket went down his throat ... blew up and scattered him all over the 45 states, and -- really, now, this is true -- I know about it myself -- 24 hours after that it was raining buttons, recognizable as his, on the Atlantic Seaboard."
Tall-tale telling is an old American art form, but elitist nose-wrinkling by those too easily discomfited by displays of love of country can trigger a go-too-far. When the orator Rufus Choate derided "the glittering and sounding generalities of natural right" that made up the Declaration of Independence, Ralph Waldo Emerson made Choate's phrase pithier and then demolished it: "`Glittering generalities!' They are blazing ubiquities." (A ubiquity has the capacity of being present everywhere all the time. It's a great word, but not as familiar as "omnipresent"; I wouldn't use it in a Fourth of July speech.)
Some controversialists worry about recusal: When should judges recuse themselves from deciding cases, based on a conflict of interest or desire to avoid criticism, and when do they have an obligation to sit in judgment, as they are trained and paid to do?
In the quiet haven of grammatical, etymological and semantic scholarship that is this space, I worry about whether the verb recuse is still transitive -- that is, if it transmits an action from subject to object and requires that object to achieve its meaning. For example, accuse is transitive; I [subject] accuse [transitive verb] you [object]. But if you just say "I accuse," the sentence just hangs there, looking forlornly for an object to receive the action; you can't get away with that unless you're Emile Zola charging ubiquitously. You can, however, say, "I accuse myself" -- turning the action inward and using a selfish reflexive pronoun as the object.
So when a judge uses the transitive verb "recuse," shouldn't he (or she, as the case may be, and nobody says "as the case may be" anymore) use a reflexive pronoun as the object? Shouldn't it be "I recuse myself" or "She recused herself"?
That's not what Supreme Court Justice Antonin Scalia wrote in an explanation of why he decided not to disqualify himself in some case about sitting ducks. He wrote instead, "I do not think it would be proper for me to recuse." No object. As the learned counsel say: Hunh?
"It is probably correct that recuse is a transitive verb," he half-concedes as he argues his case in Scalia versus Safire (with me presiding in this, my courtroom, and not recusing myself), "but it seems to me common and proper usage, with some transitive verbs, simply to omit the object that is clearly implied from the context. 'Accept,' for example, is a transitive verb; but when a friend invites you home to dinner, surely it is proper to say 'I accept,' rather than, 'I accept the invitation.' So also with 'recuse.' Whom or what else would a judge recuse, other than himself?"
Hold on -- what's with this "probably correct" business? (I enjoy interrupting; it rattles the poor guy down at the bar in front of the bench.)
"I said that it is probably correct that recuse is a transitive verb because I suspect that some verbs (perhaps recuse among them) have evolved from purely transitive to transitive/intransitive by reason of the fact that the reflexive object (as in 'I recuse myself,' 'he recuses himself') was so obvious that it was regularly omitted."
What precedent can you cite for that? Or to use the vernacular, gimme a f'rinstance.
"'Preen' is perhaps an example. Webster's Second shows it as transitive and intransitive; Noah Webster's 1828 edition showed it as transitive only. It rings almost redundant in the modern ear to say 'He was preening himself before the mirror.' So, either because the reflexive object of the infinitive to recuse is implied, or because recuse has already acquired an intransitive meaning, 'proper for me to recuse' is perfectly OK."
Pretty loosey-goosey for a strict constructionist, it seems to me, though I have to admire a Supreme who clings to the majestically prescriptive Merriam-Webster's Second Unabridged, which was succeeded but not replaced by the scandalously descriptive Third Unabridged in 1961. Sure enough, in the entry for preen in my copy of the Third, the lines about the intransitive usages have increased, including Virginia Woolf's "She preened, approving her adolescence."
Whatever happened to "original intent"? With the purist jurist evolving into Swingin' Nino, who is left on the ramparts to refuse recuse without a reflexive object?
Remanded to the public for more usage.