Bruce Norris’s Clybourne Park, winner of this year’s Pulitzer Prize in New York and the Olivier Award in London, is the play I’ve been waiting for since the 1980s. Norris previously wrote six dramas for Chicago’s Steppenwolf Theatre Company, which will finally stage his masterpiece beginning September 8th.
It’s a bitterly funny two-act play set in the same two-bedroom house on Chicago’s Near Northwest Side in white-flight 1959 and then in gentrifying 2009. Norris is superb at writing dialogue in How We Talk Now. While most playwrights live for eloquent speechifying, Norris’s 2009 characters converse realistically in interrupting, overlapping, and apologizing snatches. Moreover, Clybourne Park is the first work I can recall to capture precisely what urbanites talk about most obsessively (real estate); how they converse (euphemistically); and why (the 3Ls of real estate are “location, location, and location,” which in Chicago means, above all else, race).
In my 18 years in Chicago, I was involved in innumerable conversations that included the phrase, “It just needs a little tuckpointing.” Yet how many famous plays or movies center on real estate? Sure, there’s David Mamet’s 1983 drama about Chicago real-estate salesmen, Glengarry Glen Ross.
The most relevant, though, is Lorraine Hansberry’s 1959 Broadway smash, A Raisin in the Sun. Hansberry based it on her family’s famous NAACP-backed lawsuit Hansberry v. Lee that supported their efforts to buy a house in an all-white Chicago neighborhood she calls “Clybourne Park.”
…what if we turned the story around and told it from the opposite angle, the angle of people like my family, the villains, the ones who wanted to keep them out?
Thus, the first act of Clybourne Park is set in 1959 in the house at 406 Clybourne St. that Hansberry’s heroes are trying to buy. Norris has Raisin’s bad guy, Karl Lindner, try to protectively purchase the house not just from the black family moving in, but also from the white family moving out.
SWALLOWING MORE THAN PRIDE
I keep asking all my friends, but I cannot get a straight answer: Is it OK if I do not swallow? I am 42 and married to a man of 40. We have two small children and everything is going just fine. I was at a play date for my oldest the other day when the topic came up again, and I always feel like a fool. Do I have to swallow? I feel like an idiot, but somehow I still don’t know what is right or what is OK.
—Spit or Swallow in Des Moines
Dear Spit or Swallow in Des Moines,
I hope no one ever asks me this question again. The real problem is not whether you spit or swallow; it’s why at 42 with two kids and a husband you are worried about it and talking about it openly. You should have sorted out the spit/swallow conundrum by the age of 16, or at the very latest 20. Why do you care, and why do you think others care? Nobody—and I mean nobody—wants to know about how you and your husband exchange bodily fluids. Trust me, they will judge you more because you brought it up than they would ever judge you for what you do behind closed doors—that’s why the doors are closed. If you are still feeling insecure, the only person to discuss it with is your husband. Since he’s the, eh, “donor” of what you choose to spit or swallow, he’s the only one with a vested interest. Do whatever you want and stop bothering people about it! Ew!
HAPPY OLD (POSSIBLY GAY) BACHELOR
My 58-year-old brother, once wed, once divorced, no children, has been on the market for years, but he can’t seem to get it done. By “get it done,” I mean, “remarry and have children.” He is attractive and young for his age, both in personality and in looks. What can I do to help him find a wife and hopefully have children?
—Concerned Sis in San Francisco
“Lenin is said to have declared that the best way to destroy the Capitalist System was to debauch the currency. By a continuing process of inflation, governments can confiscate, secretly and unobserved, an important part of the wealth of their citizens.”
“Lenin was certainly right,” John Maynard Keynes continued in his 1919 classic, “The Economic Consequences of the Peace.”
“There is no subtler, no surer means of overturning the existing basis of society than to debauch the currency. The process engages all the hidden forces of economic law on the side of destruction, and does it in a manner which not one man in a million is able to diagnose.”
Keynes warned that terrible hatreds would be unleashed against “profiteers” who enriched themselves through inflation as the middle class was wiped out. And he pointed with alarm to Germany, where the mark had lost most of its international value.
By November 1923, the German currency was worthless, hauled about in wheelbarrows to buy groceries. The middle class had been destroyed. German housewives were prostituting themselves to feed their families. That same month, Adolf Hitler attempted his Munich Beer Hall Putsch.
Today a coterie of economists is prodding Federal Reserve Chairman Ben Bernanke to inject inflation into the American economy.
Fearing falling prices, professor Kenneth Rogoff, former chief economist for the International Monetary Fund, is pushing for an inflation rate of 5 to 6 percent while conceding that his proposal is rife with peril and “we could end up with 200 percent inflation.”
Paul Krugman, Nobel Prize winner and columnist for The New York Times, is pushing Bernanke in the same direction.
Bernanke, writes Krugman, should take the advice he gave Japan in 2000, when he urged the Bank of Japan to stimulate the economy with “an announcement that the bank was seeking moderate inflation, ‘setting a target in the 3-4 percent range for inflation, to be maintained for a number of years.’”
And who inspired Bernanke to urge Tokyo to inflate? Krugman modestly credits himself.
“Was Mr. Bernanke on the right track? I think so—as well I should, since his paper was partly based on my own earlier work.”
But Krugman is not optimistic about Bernanke’s injecting the U.S. economy with a sufficient dose of inflation.
Why is Ben hesitant? Two words, says Krugman: “Rick Perry.”
I’ve recently started recovering from forty years among pseudo-academic weirdos in the collegiate loony bin. One persistent aspect of modern college life is its obvious loathing for anything that smacks of Christianity. This includes whiting out Christian symbols and references to Christian holidays from the academic landscape. In the fall of 2006, a bronze cross was carted out of Wren Chapel at William and Mary lest it cause offense to unidentified spectators. Faculty members I’ve had the misfortune of knowing usually vibrate with excitement at such displays of sensitivity, and whenever the possibility exists for replacing “Christmas greetings” with “Have a blessed Kwanzaa,” “Peace to you on Ramadan,” or an inspirational listing of white racist sins, academics will run to make this happen. Staff members once changed a “Yule Bowl Party” to a “Season’s Greeting Festival,” arguing that “Yule” references were an affront to non-Westerners. Perhaps an itinerant Hindu would wander into the gathering and go bonkers at the mention of something once associated with Christianity.
This reaching out to other cultures while pushing away the host culture took a particularly bizarre form at a nearby college about fifteen years ago when a Protestant chapel was being built. Plans had been made to crown the newly constructed steeple with a simple cross to indicate a Christian house of worship at what was then a quasi-denominational institution. But this was not permitted to transpire because a Jewish faculty member protested mightily against the blood-curdling symbol. It seems the cross reminded her of the Holocaust, an association that is perhaps understandable given that authors who are abundantly present in college libraries always make the same dubious connection. The proper answer in this instance would have been to tell the employee to look elsewhere for a job if she found Christian symbols so intolerable. Instead, the “cross was reconsidered”—that is, replaced by a less offensive spherical object.
More recently at the same institution, a memorial service was performed for a former college president who from all accounts was a conventional Protestant churchgoer. A college chaplain was on hand to offer a prayer, but before she began, she fitfully apologized to the attendees. She had planned to offer a non-denominational prayer but was afraid her words might offend someone who wasn’t quite cool with appeals to an Occidental God. With visible discomfort she looked around before mumbling her invocation. Perhaps she was scared the Holocaust Lady would show up and report her to some government agency.
My sister-in-law tells me about a meeting she attended at the English-Speaking Union, where the group announced it would not convene the following week because of Rosh Hashanah. My sister-in-law looked around at the craggy, longish English faces and wondered, “Cui bono?”—for whose benefit was the announcement being made? Perhaps the members were concerned that the Holocaust Lady would show up there as well.
Meddlesome nudnik that she was, Holocaust Lady had a set-to about fifteen years ago with a trendy male religion professor who came into her building (in a sense it was hers by right of bullying) wearing a swastika on his chest. The swastika-attired professor was calling attention to the folk culture of Native Americans, who apparently fancied the same symbol Hitler chose for his Thousand Year Reich. When the two met in the hall, it became a question of whose victim claims would prevail—the perpetually outraged Holocaust victim-by-proxy or the German Protestant who wished to save Native Americans from cultural genocide. Loud shrieking ensued until the celebrant of Native Americana agreed to strip himself of his ancient symbol of Susquehannock tribal life.
Even by Mexican Drug War standards, last Thursday’s death inferno at Monterrey’s Casino Royale seemed a bit much.
At least 52 people died after a group of eight or nine gunmen stormed the casino, began firing randomly at civilians, doused the entrance with gasoline, and torched the joint. Trapped inside, most of the victims were thought to have died of smoke inhalation. Many of the corpses were found clutching cell phones, with which they had vainly called for help as they perished.
This sort of psychotic public-arena violence is nothing new in Monterrey—nor even at Casino Royale. In July, 27 people were shot to death in a Monterrey bar after another group of gunmen burst in and randomly began spraying lead. In June, 34 people were murdered in Monterrey over a single 24-hour period, all of it blamed on an escalating turf war between rival drug syndicates—the Gulf Cartel and Los Zetas Cartel. In May, rifle-wielding psychopaths robbed four Monterrey casinos, including Casino Royale. In January, armed gunmen opened fire on presumed rivals inside Casino Royale.
What’s truly insane is that the insanity is by no means confined to Monterrey. In less than five years, the Mexican Drug War is already thought to have stacked up over 40,000 corpses.
Try counting out loud to 40,000. You’ll likely give up before you reach 100.
And these aren’t merely shoot-’em-dead and hide-the-body murders. What’s uniquely odious about the Mexican Drug War is not only the astronomical death toll, but its exhibitionistic, eyeball-exploding brutality. People aren’t merely shot to death—it’s done on camera, with the results uploaded to YouTube. Victims aren’t just slowly tortured into nonexistence by sadistic teenagers—the perps record it while their friends laugh and then share it with the world. Victims are butchered, decapitated, and then proudly flaunted in public like hunting trophies or parade floats.
In July of 2010, the Gulf Cartel left 15 dead bodies in the middle of a busy road near San Fernando for motorists to see. A month later, four headless and hacked-to-pieces carcasses were hung by their ankles from a bridge in Cuernavaca. In January of this year, 15 headless bodies were strewn amid 15 bodiless heads outside an Acapulco shopping center. In February, seven bodies were found swinging from bridges in Mazatlan. In most such cases, the corpses are accompanied by written threats and taunts.
As with Thursday’s Casino Royale death blaze, random civilians are frequently targeted in attempts to paralyze the public will and force abject complicity. In 2008, at least eight people were killed and over a hundred injured in Morelia, Michoacán, after hand grenades were casually tossed into a crowd of thousands. In 2010, sixteen teenagers, none of them affiliated with drug gangs, were shot dead at a party in Ciudad Juárez. Later that year in Ciudad Juárez, 14 were murdered at a boy’s birthday party. In October 2010, 15 people were shot to death at a car wash in the city of Tepic. According to a source identified only as “Juan” in the Houston Chronicle, the cartels have, merely for amusement, taken to kidnapping bus passengers and forcing them to fight each other in gladiatorial death matches, then dispatching the survivors on suicide missions against rival cartels.
In December of 2010, an estimated five dozen gunmen attacked the village of Tierras Coloradas in Durango, burning down all the houses and dozens of cars, causing all the natives to flee. Many villages, especially along the American border, have turned into ghost towns due to the skull-cracking violence and spirit-crushing terror tactics.
To begin with, it’s white.
In an era when “‘black hole’ is the new ‘niggardly’,” you’d think that particular objection to the new Martin Luther King, Jr. statue in Washington, DC would come up more often.
The focal point of a $120-million memorial, this 30-foot-tall granite figure was scheduled to be unveiled on the 48th anniversary of King’s “I Have a Dream” speech, which he delivered at the nearby Lincoln Memorial (the MLK statue stands 11 feet taller than that iconic seated Lincoln), but the unveiling was postponed due to Hurricane Irene.
No doubt it will also be the only one spared (virtual) destruction in future Hollywood blockbusters. From Earth vs. the Flying Saucers through Independence Day, filmmakers have orchestrated the collapse of the White House and the Washington Monument with palpable glee. (My favorite example is set a ways north: In The Giant Claw, the titular vulture perches atop UN headquarters and takes a bite out of it.) The specter of Al Sharpton bellowing through his bullhorn, “They assassinated Dr. King a second time!” would keep even the most amytal-addicted studio exec up at night.
Given the statue’s size, real-life aliens may conclude that this Martin Luther guy is Earth’s literal “king.” An honest mistake: As King’s deification enters its fifth decade, he remains one of only two Americans, alongside George Washington, with a national holiday to his name.
However, while the King statue was spared during last week’s freak earthquake, cracks have developed in the man’s reputation. These flaws are common knowledge on the “right,” yet they are finally being discussed in (almost) mainstream venues, too. This in-your-face memorial feels like a heavy-handed ploy to distract attention from King’s diminishing sainthood—a kind of clumsy left-wing table magic.
We’re not saying that King wasn’t an incredible person who did more to advance the human race than most of us can ever hope to do. We’re just saying that he was also a plagiarizing butthole.
No, American Renaissance wasn’t hijacked by drunken fratboys. That’s an excerpt from a cracked.com article called “5 Great Men Who Built Their Careers on Plagiarism.” OK, so that popular humor site isn’t exactly The New Republic, but what about Mental Floss, an award-winning magazine with a devoted hipster Utne Reader-ship? They’ve covered King’s “borrowings,” too, albeit in a gentler fashion.
These plagiarism “talking points” were once confined to “white supremacist” chat rooms. Today, the topic (as well as King’s serial adultery) is approaching “everybody knows” velocity. Perhaps now someone will ask King’s litigious, avaricious estate why they charge exorbitant licensing fees for the use of his speeches in everything from academic papers to TV commercials if those words never belonged to King in the first place. Helpfully, excerpts from those speeches are carved into the memorial, inadvertently turning it into a monument to intellectual-property theft. (King’s family billed the memorial foundation $800,000 for that privilege.)
On July 18, 2011, Sean Hoare was found mysteriously deceased by London police. This is notable for two reasons.
The first reason is that Hoare was a primary whistle-blower in the unfolding crisis at News Corporation. The scandal has implicated multitudes in illegal and immoral electronic eavesdropping on everyone from an abducted teenager to Prince William and his brother.
The second and far more ominous reason is that Sean Hoare’s demise was not suspicious in any way. The London police, though already implicated in the scandal for widespread corruption through bribe-taking and craven complicity in shuttering earlier eavesdropping investigations, assure us it is so.
Naturally, only the pitifully paranoid or unsettlingly obsessed would doubt this claim.
Everyone knows whistle-blowers are by far the most likely people to generally have poor timing in all matters of corporeal termination. At the most inopportune times they are prone to hang themselves, have inconvenient heart attacks, overdose on pills, and get into car accidents in which the other parties mysteriously vanish.
Thus, absolutely no one else was involved in Sean Hoare’s accident. It is a certainty that this will be deemed a former narcotics abuser’s drug overdose. Simple as that. We know this because we are told so (and we know to heed the master’s halt). Here are a few historical examples to erase all doubt.
DR. DAVID KELLY
Dr. David Kelly was a British scientist and expert in biological warfare employed by the British Ministry of Defence. He also became a very inconvenient figure when he met for an unauthorized discussion with a BBC reporter about the true state of the government’s dossier concerning Iraq’s supposed stockpiles of chemical weapons.
Dr. Kelly disputed claims that Iraq had the capability to fire biological weapons within 45 minutes, a major British argument for invading Iraq the second time. While Kelly said he believed it likely Iraq secreted some modicum of chemical weapons following international inspections, he was dubious as to their extent and immediate threat. Following his own inspections in Iraq, Kelly became more certain of his suspicions and spoke to a reporter from The Observer to state that Iraq did not possess mobile germ-warfare laboratories and that Iraqi claims of such devices being innocuous hydrogen-balloon production facilities were entirely accurate.
Yet what might have turned into an embarrassment for the British government and a serious impediment to its second war on Iraq was all but erased from memory when Dr. Kelly decided to kill himself. Despite numerous supportive letters and emails, a supposedly despondent Kelly went for his usual evening walk and ingested 29 painkiller tablets. Then for good measure, he cut his wrists. No one saw him do it and his family says they didn’t believe it, but what would they know? Kelly was gone, along with any hope of others in similar positions coming forward to prevent thousands of deaths and billions squandered. Naturally, Dr. Kelly’s keen sense of timing made it a good day for government and gun-sellers the world over.
COMMERCE SECRETARY RON BROWN
Ron Brown was United States Secretary of Commerce in the first Clinton Administration, appointed following exceptionally successful fundraising. He perished in a 1996 air crash near Croatia. Weather was claimed to have been the cause, though anyone who took the extra step of investigating would have learned the storms were not nearly so severe as initially stated. Following the accident, no public contact was made for a full 10 hours. Rescuers, much as they would later be during the JFK, Jr. disappearance, were directed away from the crash site. When they finally reached the scene, a flight attendant was said to be upright and conscious (but died of a broken neck before reaching the hospital).
When additional rescue crews arrived, everyone was dead. Witnesses at Brown’s autopsy claimed he had what appeared to be a bullet hole in his temple region. Obviously, they must have been mistaken. It was merely the age-old “Whistle-Blowers’ Malady” which had struck. Ron Brown (who had already stated that if he was going to jail, he wasn’t “going alone”) had been under investigation by a US independent counsel, was nearing indictment on corruption charges, and was widely rumored to face questioning about the Clintons’ questionable financial dealings.
Television—thanks to the profit motive—is intensely imitative. If any type of show is successful, a horde of usually lesser and shorter-lived knockoffs will suddenly appear. In one season, the networks may be overwhelmed by alien invaders or ghosts, in another by “relevant” comedies. All such series are motivated by the hope of catching at least some overflow fans of this year’s plot du moment. The need for hits has become ever more desperate as the networks have lost ground to cable and now the Internet. Indicative of this desperation is the fact that the current “latch-on” show is itself a cable offering—AMC’s Mad Men.
Scheduled to start its fifth season next year, the series revolves around the adventures of a quintessential group of early 1960s types: advertising men based on New York City’s Madison Avenue. (Longtime readers of Mad magazine will remember how that immortal journal’s home street was always rendered as “MADison” Avenue.) Mad Men has become immensely popular during its run—especially among folk too young to remember the era. Hoping to cash in, ABC is offering Pan Am, featuring the adventures of that era’s quintessential figures, airline stewardesses (most definitely NOT “flight attendants”), and NBC gives us The Playboy Club, with yet another archetypal bunch, the Playboy Bunnies. These two series push the notion that their characters changed America, implying that we owe today’s perfection to them. But if they succeed, it will not be their alleged relevance that captures audiences, but the same retro factors that have made Mad Men so popular.
But what are those? Why should the first half of the 1960s (as opposed to the second, so beloved of the baby boomers) command such attention from the same boomers’ children? Because, although still within the memory of many living (including myself, barely), it was, superficially at least, everything this era is not. Let’s zero in on a few points.
First, elegance. Before feminists burned their bras and hippies let it all hang out, anyone who aspired to anything wanted to look right. Jacket, tie, and hat for men; slip, skirt, high heels, makeup, and jewelry for women. Prior to the cult of dirt emerging from Haight-Ashbury, the truly cool wanted to “look like a million bucks,” even if they had nowhere near that in the bank. As with fashion, so with manners. Without a free-speech movement to tell them that foul language was liberating, anyone outside construction sites, barracks, and stag parties tried to keep their language clean. Do boomer parents’ surviving children make a concerted attempt to be elegant? Only in fits and starts; but they are often aware in a dim way that their progenitors’ grunge was a definite loss from something better.
On Monday, the Cherokee Nation Supreme Court upheld a voter-approved constitutional amendment that revoked tribal membership for the so-called Cherokee Freedmen, the modern descendants of black slaves whom Cherokee tribesmen had owned prior to the Civil War. The 16-page ruling argued that the Cherokee, and only the Cherokee, had a right to define exactly who is a Cherokee.
The ruling was handed down in the tiny town of Tahlequah, OK. It came exactly 150 years and a day after the Cherokee Nation formally allied themselves with the Confederacy in the Civil War in a declaration that, as fate would have it, was also signed in teeny-weeny Tahlequah.
Most Americans these days probably don’t know that several “Native American” tribes allied with the Confederacy during the Civil War. Then again, most of them probably aren’t aware that Injuns also owned slaves, even before Paleface’s ivory-colored toes landed on America’s shores. Most of them also aren’t likely to know that thousands of black freedmen also owned slaves in antebellum America. To be blunt, most Americans these days don’t know much of anything besides what they’re spoon-fed from the boob tube.
It’s nearly impossible to conceive that anyone would choose to live in Oklahoma on purpose. Fittingly, the Cherokee arrived in that flat, dusty, brownish plain of misery at gunpoint in the 1830s after being driven from the East Coast along the notorious “Trail of Tears.” Slave-owning Cherokees brought their black servants along with them, in many cases forcing them to carry their luggage.
The Cherokee—members of what are known as the “Five Civilized Tribes,” a term that seems, quite frankly, like a slap in the face to all the other tribes—held an estimated 4,600 black slaves in 1860, more than any other tribe in the Indian Territory. Monday’s ruling came as a slap in the face to the estimated 2,800 “non-Indian” Afro-American Cherokee Freedmen who previously had been entitled to free healthcare and educational benefits as a result of their conceptual, rather than genetic, membership in the Cherokee tribe.
“This is racism and apartheid in the 21st century,” wailed Marilyn Vann, president of the Descendants of the Freedmen of the Five Civilized Tribes, who are attempting to Sioux sue the Cherokee Nation for what cynics might view as an egregious act of Indian-giving. “This is a continuation of our Trail of Tears as we fight for our rights,” Vann said in a fit of hyperbole. Her organization has planned an emergency meeting at 2PM on Saturday to discuss strategies for overturning the recent ruling. Vann has invited “anybody who is interested in civil rights and the law” to attend the meeting at the Martin Luther King Center in Muskogee, OK, a town which Merle Haggard famously sang he was proud to be from but was actually lying since he was from California.
Ah, reality! That mysterious, untouchable world of mass and energy, of gravity and fire, of genes and synapses. It lurks out of sight beyond our reach, knowable to us only by the occasional fragments that impinge on our senses—fragments that are then, after much error-introducing data compression and many detours around feedback loops of hope, fear, desire, and hate, presented at last to our higher faculties.
She’s a bitch, Miss Reality. She has things she wants to tell you, but many of them are things you’d rather not know. So while she natters on about solitude and pain, failure and humiliation, biology and physics, old age and death, you turn away from her, bury one ear in the bed, pull the quilt up over the other ear, and feign sleep. You almost make it, too. Then, just as you find you can mentally reduce her damn repetitive drone to mere noise, with the words having no significance—suddenly, she whacks you upside the head with a loaded pool cue!
But enough with personification. Let’s talk about a real woman: Laura Ingraham. She’s not just a woman, either, but a lady. I speak from knowledge here, from a personal encounter a couple of years ago. The details don’t matter and aren’t interesting, but I came away from that encounter deeply and honestly impressed, thinking: “What a classy lady!”
So you can log me as strongly pro-Ingraham. Oh, we’ve had our trifling differences, but trifling is how they seem when you’ve had memorably indisputable evidence of a person’s genuinely high quality. I don’t merely like Laura Ingraham: I admire her.
Laura Ingraham is, however, a middle-class white American raised in the last quarter of the 20th century. She is also a law-school graduate and so was immersed for three years in the perfumed warm soak-bath of political correctness that is the modern American law school. She seems to have survived the experience better than most, but given that current US jurisprudence is well-nigh all “ought” and no “is,” her sense of reality—as much of it as she’d been able to hold onto through a standard high-school and college experience—is bound to have suffered some permanent impairment.
This showed up back in June when Ms. Ingraham was guest-hosting The O’Reilly Factor—of which show, with all its silliness, shallowness, and self-congratulating pomposity, I am a long-time addict.
Black “flash mobs” were just starting to make the headlines, and they got a mention on the show. As I reported to Taki’s Mag at the time:
I just watched a segment of the O’Reilly show titled “Violent Teen Mobs Causing Chaos Across Country.” In the entire 6:15 segment, neither Laura Ingraham nor either of her two guests used any of the terms “black,” “African American,” or “colored.”
Worth noting, I thought, and still think, as fair comment on current taboos. Other people thought so, too: I saw three or four mentions in dissident-conservative blogs.
Those sentiments must have been represented in the show’s mailbag. On August 16, flash mobs came up again on the Factor, and again Laura Ingraham was guest-hosting. On this occasion, in what looked to me—though I’m only guessing—like a very reluctant concession to the mailbag, our hostess allowed (at 5:02 here), as briefly as she could have done without slipping the whole sentence in between two passes of the TV camera’s raster scan, that the flash mobs are regrettably but unquestionably lacking in diversity:
In many instances these mostly young people happen to be African American, they’re running into stores…
They happen to be! In many instances!
I have this mental image of a centurion in one of the border forts by the frozen Rhine on that terrible last day of the year 406 AD, dictating to a scribe the message to be pony-expressed back to the Emperor in Rome:
There are thousands of them, tens of thousands…with siege engines and carts full of weapons…swarming across the river…In many instances they happen to be barbarians, in multis casibus forte barbari sunt….