The singular “they”

In the not-too-distant past English used the pronoun “him” to refer to individuals of unspecified gender in phrases such as “Everyone is entitled to his own opinion.” This understandably annoys some people, who prefer “Everyone is entitled to his or her own opinion” or some other alternative. Sometimes the problem can be avoided by making the antecedent plural, as in “All people are entitled to their own various individual opinions.” Another option is to adopt a sort of jump-ball rotation, alternating between “he” and “she.” Of course, this requires a bit of bookkeeping for balance, and it can lead to such difficulties as “From each according to his ability, to each according to her need” or vice versa. There doesn’t seem to be a single solution that makes everyone happy, me included.

A lot of people use a singular “they,” as in “Everyone is entitled to their opinion.” This draws the obvious objection that it’s improper to have a plural pronoun refer to a singular antecedent.

The best defense I’ve yet read of the singular “they” is on the Oxford English Dictionary blog here. It points out that the singular “they” has been used for centuries, and moreover, there’s a good case to be made that a singular they is no more objectionable than the singular “you.” Quoting the blog post:

In 1660, George Fox, the founder of Quakerism, wrote a whole book labeling anyone who used singular “you” an idiot or a fool. Eighteenth-century grammarians like Robert Lowth and Lindley Murray regularly tested students on “thou” as singular, “you” as plural, despite the fact that students used singular you when their teachers weren’t looking, and teachers used singular you when their students weren’t looking.

Note, by the way, that even in modern English, plural verbs are used with you (“you are” and “you have” as opposed to “you art” or “you hast”) even when the antecedent is singular. That pretty clearly hints that “you” is a plural pronoun. Note that people similarly say “they are” and “they have” when using the singular “they.”

The blog post also mentions that “The New Oxford Dictionary of English (1998) not only accepts singular they, they also use the form in their definitions.” The post concludes, rather emphatically,

Former Chief Editor of the OED Robert Burchfield, in The New Fowler’s Dictionary of Modern English Usage (1996), dismisses objections to singular they as unsupported by the historical record. Burchfield observes that the construction is ‘passing unnoticed’ by speakers of standard English as well as by copy editors, and he concludes that this trend is ‘irreversible’. People who want to be inclusive, or respectful of other people’s preferences, use singular they. And people who don’t want to be inclusive, or who don’t respect other people’s pronoun choices, use singular they as well. Even people who object to singular they as a grammatical error use it themselves when they’re not looking, a sure sign that anyone who objects to singular they is, if not a fool or an idiot, at least hopelessly out of date.

So there.




The cost of exploring space

I recently saw some comments under a YouTube video asking why we don’t postpone going into space until we’ve solved our problems on Earth (which would be a pretty long postponement). This is a question people have asked for decades, and it arises in part from an inaccurate notion of the cost, just as many people incorrectly assume that a high percentage of the U.S. budget goes to foreign aid. It also overlooks the important role of space research in dealing with the problems we have on Earth, but for the moment let’s focus on the cost question, starting with a couple of examples.

New Horizons is the space probe that flew past Pluto and its moons in 2015 and later visited an object in the Kuiper Belt. Building and launching the spacecraft cost $565 million, with operations through the Pluto flyby and subsequent collection and analysis of the data by teams of scientists adding another $215.6 million.

This means that over the 17 years of the primary mission, the total cost of New Horizons (a little less than $800 million) averaged less than $47 million per year, or roughly 0.0015% of U.S. federal spending over those years. That’s about 15 ten-thousandths of one percent. This is roughly the median cost for a U.S. robotic space mission.
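For anyone who wants to check the arithmetic, here’s a quick back-of-the-envelope sketch in Python using the cost figures quoted above. The average federal outlay figure is my own rough ballpark for illustration, not an official number.

```python
# Back-of-the-envelope check of the New Horizons figures quoted above.
# The average federal outlay is a rough assumption, not an official figure.
development_and_launch = 565e6     # dollars
operations_and_analysis = 215.6e6  # dollars
total_cost = development_and_launch + operations_and_analysis

mission_years = 17
cost_per_year = total_cost / mission_years

assumed_avg_federal_outlays = 3.1e12  # dollars per year, rough 2002-2018 ballpark (assumption)
share = cost_per_year / assumed_avg_federal_outlays

print(f"Total cost: ${total_cost / 1e6:.1f} million")           # ~ $780.6 million
print(f"Average per year: ${cost_per_year / 1e6:.1f} million")   # ~ $45.9 million
print(f"Share of federal spending: {share:.4%}")                 # ~ 0.0015%
```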

Mars Global Surveyor was an orbiter launched in 1996 that operated for 10 years and was one of the most successful missions of its type. At the time, someone at NASA created a poster showing a series of comparisons between Mars Global Surveyor and the 1995 movie Waterworld. For example, a major goal of Mars Global Surveyor was looking for water, while people in Waterworld would have liked some dry land. The main expense of the former was scientific instrumentation, and of the latter, Kevin Costner. I forget the rest, but the bottom line was a comparison of costs: $154 million for Mars Global Surveyor versus $172 million for Waterworld (actually closer to $235 million for the latter if you include prints and advertising).

Sending humans into space is much more expensive than sending robots. A single launch of the new Space Launch System rocket, under development for NASA’s Artemis program to return astronauts to the Moon, is likely to run nearly a billion dollars, at least initially. A SpaceX Crew Dragon launch carrying astronauts and cargo to the International Space Station (and ultimately back) runs about $400 million.

For comparison, the major video streaming services are spending about $30 billion per year on producing films and series. (That doesn’t count production costs for theatrical movies, broadcast and cable television, etc.)

NASA accounts for less than half of one percent of current federal spending. If you compute the grand total of NASA’s budget for every year from its founding in 1958 through the present, adjusted for inflation, the sum is not much more than U.S. defense-related spending for a single year. (See this earlier post for a graphic illustrating this.)




Censorship on Memorial Day

On May 31 Barnard Kemter, a 77-year-old retired U.S. Army lieutenant colonel, was delivering the keynote speech for a Memorial Day event at Markillie Cemetery in Hudson, Ohio, when his microphone went dead. After a moment’s pause he continued, using the loud voice he had in the past employed for addressing assembled troops, until the sound system started working again.

Immediately afterward, the event’s audio engineer, A.J. Stokes, came over to tell Kemter what happened: One of the event organizers from the local American Legion post had ordered Stokes to turn off the amplifier during that part of the speech, and when Stokes refused, the organizer did so himself. Barney Kemter did not want to disrupt a Memorial Day observance, so he said nothing to event organizers at the time. But local news media covering the event reported what had happened. (See for example here.) The event was also televised on a local community channel. Lieutenant Colonel Kemter’s speech begins about 46 minutes into the video below.

The Ohio Department of the American Legion immediately investigated and just four days later released a statement (PDF) reading in part: “We discovered that the censoring that occurred at the Memorial Day Ceremony in Hudson, Ohio, sponsored by Hudson American Legion Post 464, was pre-meditated and planned by Jim Garrison and Cindy Suchan.” Garrison was the adjutant of Post 464 and apparently the person who turned off Kemter’s microphone, and Suchan, president of the post auxiliary, was in charge of organizing the Memorial Day event.

As a courtesy Kemter had given Suchan an advance copy of the speech, and in response she asked him to omit the part that referenced Memorial Day’s origins. It was a brief but key part of the speech. After considering the matter and discussing it with a city official, Kemter concluded that leaving it out would require a substantial rewrite he didn’t have time to do, so he delivered the speech as written.

In response Suchan and Garrison apparently decided to turn down the amplifier during the roughly two minutes of the eleven-minute speech containing the part they objected to.

In it, Kemter described what was likely the first observance of what later became Decoration Day and then Memorial Day. It took place in Charleston, South Carolina, in 1865. Some 240 United States servicemen who died there as prisoners of war had been dumped into a mass grave. Shortly after Lee’s surrender to Grant, volunteers in the city (which had itself surrendered to Union forces in February) exhumed the bodies, washed and dressed them, and reburied them in newly dug individual graves with honor and respect.

Then on May 1, a day of remembrance was held, during which the graves were decorated with flowers brought in armfuls by a procession of children singing “John Brown’s Body.” According to contemporary accounts, as many as 10,000 persons, including 3,000 children, took part in the observance.

What apparently made this objectionable to Garrison and Suchan was Kemter’s mentioning that all this was inaugurated and carried out by former slaves, including formerly enslaved children.

The Ohio branch of the American Legion deemed the censorship of Kemter’s speech to be inconsistent with the organization’s principles, including a 1923 resolution (PDF), reaffirmed in 1977 and 2017, against discrimination on the basis of race, religion, or class. The state organization demanded and received Jim Garrison’s resignation as an officer of the post in question and as a member of The American Legion and took steps to close that American Legion post entirely. A few days later Cindy Suchan resigned as well. The mayor and city council of Hudson issued a statement against the attempted censorship.

Lieutenant Colonel Kemter has been asked to deliver his address again, this time without interruption, at an event for young people later in June. The full text of the speech is also prominently displayed on The American Legion’s website.

A web search for “Kemter memorial day speech” will turn up a good deal more about the incident.




Juneteenth

Today is Juneteenth, named in honor of the 19th day of June 1865, when Major General Gordon Granger of the United States Army, newly arrived on Galveston Island, issued a declaration that the Emancipation Proclamation would be enforced in Texas.

I’ve heard it suggested that communications were so much slower at the time that most Texans were not even aware of the Proclamation, but this is simply not true. News traveled by telegraph and newspaper nearly as quickly then as it did a century later, and the Proclamation had been issued nearly 30 months earlier. There might at most have been some naïve hope among some slave owners that Texas, having been largely spared the fighting, might for at least a while longer remain an outpost of slavery, but General Granger soon made it clear that wasn’t to be.

A more serious myth is that the Emancipation Proclamation freed no slaves, an oddly common misconception that I’ve written about before in 2012 and in 2015. It’s sometimes even claimed that Lincoln never intended for the Proclamation to free the slaves, since it applied only to the areas in rebellion as of the end of 1862. If Lincoln had been serious about ending slavery, the myth goes, he would not have exempted the four slave states that never seceded, and in any case the Confederate states would not have obeyed an order from Lincoln to free their slaves. This is all kinds of nonsense packed into an impressively small space. The whole legal basis of the Proclamation (see below) made it impossible to apply it to areas not in rebellion, and the Proclamation was an order not to the Confederate states to free their slaves but to the United States Army, Navy, and Marines to do so, which they did starting the very first day the Proclamation went into effect, 1863 January 1.

Lincoln strongly opposed slavery, but as he pointed out (for example in his fourth debate with Douglas in 1858 and in his first inaugural address in 1861), the Constitution gave the federal government no authority to end slavery in the states in which it was practiced. He hoped that the remaining slave states would eventually do away with it themselves, as other states had already done, but at the federal level he called only for an end to slavery’s westward expansion, that being unambiguously within the powers of Congress. Even this was enough for his election to prompt seven slave states to secede and form the Confederacy in the months before Lincoln was inaugurated. A further four slave states seceded soon after.

The war, which the South launched with its attack on Fort Sumter in Charleston harbor in April of 1861, changed things. It was an established principle that military forces could seize enemy goods and chattels (such as wagons and crops) in furtherance of the war effort, and the Second Confiscation Act of 1862 gave the president explicit authority to apply that principle to chattel slaves. On 1862 September 22 Lincoln issued a preliminary Emancipation Proclamation giving the rebellious states 100 days’ notice that he would order the freeing of slaves in states that did not return to the Union. None did, so on the first day of 1863 he issued the final version that listed the areas still in rebellion, declared the slaves there permanently free, and ordered the military and naval forces of the United States to free them in fact.

On that same day, 1863 January 1, at least a few hundred slaves were freed, mainly on the sea islands of the Carolinas and Georgia. Then as Union forces advanced further into the Confederacy they freed thousands upon thousands more, many of whom made their own way to the Union lines in pursuit of liberty, and in the end nearly 4 million former slaves were free men, women, and children.

One of the most noteworthy military actions to enforce the Emancipation Proclamation took place on 1863 June 1, when United States Navy gunboats and about 300 United States Army soldiers staged an extraordinarily successful raid up the Combahee River. (For details see this article.) The soldiers, a company known as the South Carolina Volunteers, were former slaves. The operation was planned and carried out not by a military leader but by a famously courageous and brilliant volunteer spy named Harriet Tubman, who had previously helped run the Underground Railroad that carried escaping slaves to Canada.

Tubman’s image was scheduled to replace that of Andrew Jackson on the U.S. $20 bill this year, but a little more than a year ago Treasury Secretary Steven Mnuchin announced that for supposed technical reasons the redesign had been postponed until 2028 at the earliest and might not feature Tubman. For some reason, however, there would be no such problem releasing new versions of the $10 and $50 bills sooner than that. It’s widely assumed that the “technical reasons” were actually objections from then-President Trump, who during the 2016 campaign attacked putting Tubman’s image on the bill as “political correctness.”

Yet another myth is that Juneteenth marked the end of legal chattel slavery in the United States. But the Emancipation Proclamation only emancipated existing slaves; it did not and could not outlaw slavery, and again it could not apply to the slave states still in the Union.

Of the four slave states that never seceded, Maryland outlawed slavery in 1864 November and Missouri in 1865 January. Both states already had large numbers of free black men and women. Slaves in Washington DC had been freed (with compensation to their owners) by an act of Congress signed into law by President Lincoln on 1862 April 16. The new state of West Virginia came into existence as a free state, a condition for its admission to the Union in mid-1863. At the end of the war there were slaves left only in Delaware, Kentucky, and a few other places exempt from the Proclamation. Slavery finally became illegal, and the remaining slaves were freed, with the ratification of the 13th Amendment in December 1865.

This is not a simple story with a happy ending. After the war, and especially after the end of Reconstruction, many former slaves (and their children and descendants) fell back into de facto slavery by other names. In some areas, for example, many black citizens were falsely convicted of crimes and sentenced to labor on private plantations. Many more were denied the right to vote or acquire an education, and Jim Crow laws enforced segregation and inequality. Even after the Brown v Board of Education Supreme Court decision in 1954, the Civil Rights Act of 1964, and the Voting Rights Act of 1965, American citizens of African ancestry were and remain less than entirely free, and much injustice still exists no matter how much we might hope otherwise.

But that’s no reason not to celebrate the progress that has been made, and Juneteenth is an appropriate time to do it.

(Updated 2020 June 24 and July 1 and 2021 July 4 with some additional details and attempts to improve wording.)




Interesting times

Contrary to popular belief, “May you live in interesting times” isn’t actually a Chinese curse. And even if it were, I can easily think of a lot of interesting times in which the good outweighed the bad.

So it doesn’t bother me that 2020 is turning into an interesting year. Today I happened to be in my old neighborhood and saw “Black Lives Matter” signs in front of a majority of houses on the street where I used to live, and that street is overwhelmingly white.

Polling by UCLA/Nationscape has shown a substantial shift in support for Black Lives Matter over the past three years. In June 2017, Americans who said they supported the Black Lives Matter movement were outnumbered by those who didn’t, putting net support at -4 percentage points. Net support became positive a little over two years ago and has now reached +28 points, with a massive shift over just the past two weeks. Of course, over this time the general public has learned of more and more cases of African Americans being harassed, threatened, and even killed by white vigilantes and the police, increasingly often backed up by graphic video, making the problem more and more obvious. We’ve also seen law enforcement personnel violently assault peaceful demonstrators and journalists, very often including white people, for no justifiable reason.

On most other subjects polled — gun control, immigration, the environment, etc. — changes in opinion have been less dramatic, either because they have been modest or because they have reflected a strengthening of what was already the majority view.

For example, since June 2017 favorable opinion toward Russia as an ally has fallen from a net negative of -39 percentage points to a net negative of -59, which is a shift of 20 points but in the same direction. Support for legalizing marijuana has similarly risen from +32 to +43 over that time frame, and support for same-sex marriage from +28 to +35. Both were pretty strongly opposed by majorities of the public in the past, but the big shift in opinion from opposition to support was some years ago.

The case of same-sex marriage is interesting in that much of the opposition was based on the notion that its legalization would somehow harm traditional marriage and the family. When the Supreme Court legalized same-sex marriage and nothing disastrous happened it was hard to maintain that belief. The Supreme Court’s 6-3 ruling that the 1964 Civil Rights Act prohibits discrimination in employment on the basis of sexual orientation and gender identity may go the same way.

Another recent shift that can be said to reinforce a majority attitude is the drop in support for Donald Trump. A week ago Nate Cohn in The New York Times (echoed the same day by Greg Sargent in The Washington Post) pointed out Trump’s decline in approval ratings over the past couple of months. His net approval fell from -6.7 percentage points on April 15 to an even worse -13.2 on June 7, based on an average of recent polls compiled by fivethirtyeight.com. As I write this it has deteriorated further to -14.3.

In keeping with this, Joe Biden’s lead over Trump in national polls has reached about 10 percentage points, up from 6 in March and April.

The Washington Post has a summary of recent polling as of earlier today, June 16, here. Among other things it notes that Joe Biden now leads in Michigan (which Trump narrowly won in 2016) by 12 points in one poll and by 16 in another. Ann Selzer, considered the gold standard for pollsters in Iowa, has Trump ahead in that state by just 1 point. (He won by 9 in 2016.) A May Fox News poll had Biden up by 8 nationally after being tied in April, pretty much in keeping with polls for The Washington Post/ABC and CNBC. No major poll shows Biden’s lead declining.

Obviously the election is still a good 140 days away, and those polls could change in either direction in the next 20 weeks. For most of 2016 the polls showed Hillary Clinton in the lead and she ended up losing the Electoral College. On the other hand, she won the popular vote by an amount quite close to what the polls were predicting just before the election, and Biden is generally doing better than she was at the same point in 2016.

It’s particularly interesting to look at subsets of the population. In the fall of 2016 polls showed people 65 and older favoring Trump over Clinton by 5 points, but this time the 65+ cohort is pro-Biden by an average of 7 points, a net shift of 12 percentage points toward the Democrats. In 2016 women supported Clinton, the first female major party presidential candidate, by 14 points over Trump, but they now favor Biden over Trump by 25.

On the other hand, Trump is currently doing better among nonwhite voters than he did in 2016. Back then they preferred Clinton by 50 percentage points but Biden’s lead is a narrower (but still impressive) 45. Trump leads among white voters as he did in 2016, though by only 5 points now versus 13 back then. And as in 2016, it’s white voters without a college diploma who give him that lead. White college graduates prefer Biden by 20 points (up from Clinton’s 12 in 2016).

What about the electoral vote? Polls of the larger swing states conducted May 1 through June 12 showed Trump ahead by an average of 1.5 points in Texas and 1 point in Georgia, which is far worse than he fared in 2016. But Biden leads in Pennsylvania, North Carolina, Florida, Arizona, Wisconsin, and Michigan, states that went for Trump last time.

The website electoral-vote.com, which aggregates an immense amount of state-by-state polling data (and also has an excellent if snarky daily compilation of political news), today shows Biden ahead of Trump in the electoral vote by 352 to 148 based on the most recent state-level polls (with Texas — Texas! — shown as tied, something the people running the site seem not to believe even though that’s what the polls indicate).

What’s more interesting is the breakdown by the size of the candidates’ lead in each state. If it’s at least 10 points the site rates the state as “Strongly” Democratic or Republican. If it’s at least 5 but less than 10 it’s considered “Likely.” If it’s less than 5 it’s “Barely.”

As I write this, Biden leads by at least 10 points in states totaling 229 electoral votes, and by 5 to 10 points in states worth another 44. This means that if he wins only those states in which he has at least a 5-point lead and loses all the rest, he ends up with 273 electoral votes, more than the 270 needed to win. In contrast, Trump leads by at least 5 points in states with a total of just 104 electoral votes. To win, Trump would have to take every state in which he currently has a lead, plus every state in which Biden currently leads by less than 5 points, and at least one additional state.
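For those who like to see the tally spelled out, here’s a minimal sketch of the site’s rating rule as described above, applied to the aggregate numbers from this post. The example margins and the function name are mine, invented for illustration.

```python
# Sketch of the electoral-vote.com rating rule described above, plus the
# electoral-vote arithmetic from this post. The thresholds and vote totals
# come from the text; the example margins are made up.

def rating(lead_points: float) -> str:
    """Classify a state by the size of the leader's polling margin."""
    if lead_points >= 10:
        return "Strongly"
    elif lead_points >= 5:
        return "Likely"
    else:
        return "Barely"

print(rating(12), rating(7), rating(3))  # Strongly Likely Barely

# Electoral votes in states where Biden currently leads, per the post.
biden_strong = 229  # lead of at least 10 points
biden_likely = 44   # lead of 5 to 10 points
needed = 270

print(biden_strong + biden_likely)            # 273
print(biden_strong + biden_likely >= needed)  # True
```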

Again, the 10 fortnights from now until the election is a long time, and none of this means Trump won’t win a second term. In fact, there are a lot of pro-Trump Republicans who are predicting a Trump landslide. We’ll see.




The 2020s are here, dagnabbit

The eternal debate on when decades start

In the comic above, Ponytail accuses White Hat of being pedantic. Hah. I’ll show you pedantic.

The 21st century began on the first day of 2001. This has nothing whatsoever to do with whether there was a year zero. It’s just how counting works. If you’re eating donuts like mad, the first dozen comprises donuts number 1 through 12, and you don’t get to the second dozen until you start on donut 13.

(If you think nobody ever eats a dozen donuts, you’ve never been to Britt’s Donuts in Carolina Beach, North Carolina. On one such visit I watched a girlfriend consume over a dozen when she initially insisted she only wanted one. More generally, you don’t have to eat a dozen donuts all at once to eat a dozen donuts. You could eat them over the course of a month or a year or whatever.)

Also, by the straightforward meaning of the words, the 2000s began on the first day of the year 2000, and that’s true whether you’re talking about the 2000s in the sense of a period of 1000 years, 100 years, or 10 years.

Likewise, the 2020s began today, the first day of 2020, but the third decade of the 21st century won’t begin until the first day of 2021. That would also be the start of the 203rd decade, if the 203rd decade were actually a thing.

It really is that simple. What screws it up is the assumption that the 100-year period we call “the 2000s” and the 21st century are synonyms. They’re approximately the same (the same to within one percent), but they’re not exactly the same.

Also, a century can be an ordinal century (as in “the 21st century”), but the term also means any 100-year period. So 2001-2100 is a century, and so are 1951-2050 and 2000-2099.
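If you like your pedantry in executable form, here’s a minimal sketch of the two numbering schemes in Python. The function names are just mine, invented for illustration.

```python
# Two different ways of labeling years, per the discussion above.
# The function names are invented for illustration.

def ordinal_century(year: int) -> int:
    """Counting centuries: years 1-100 are the 1st, 1901-2000 the 20th,
    2001-2100 the 21st."""
    return (year - 1) // 100 + 1

def hundreds_label(year: int) -> int:
    """'The 2000s' in the 100-year sense: 2000 through 2099."""
    return (year // 100) * 100

def decade_label(year: int) -> int:
    """'The 2020s': 2020 through 2029."""
    return (year // 10) * 10

print(ordinal_century(2000))  # 20   -- last year of the 20th century
print(ordinal_century(2001))  # 21   -- first year of the 21st
print(hundreds_label(2000))   # 2000 -- first year of "the 2000s"
print(decade_label(2020))     # 2020 -- first year of "the 2020s"
print(ordinal_century(2021))  # 21   -- and 2021 starts its third decade
```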




Review: Riverdale (2017 television series) and a failed previous effort

The comic book characters Archie Andrews and his friends have been popular since the early 1940s, so most people reading this are likely familiar with Riverdale as home to the wholesome red-haired teenager Archie, his parents and classmates, and various other recurring characters, including his eccentric fellow student Jughead and his two main love interests, Betty and Veronica. There were a number of animated television series based on Archie from the late 1960s through 2000, as well as live-action and animated shows featuring related characters such as Josie and the Pussycats and Sabrina the Teenage Witch. There were also a few stabs at creating a live-action Archie television series, but none of them actually made it onto a network schedule until early 2017, when the CW network introduced Riverdale.

The CW is a joint venture of CBS and Warner Bros (hence the CW) and a large part of its schedule consists of programs adapted from comics, mostly of the superhero sort.

Riverdale is different—different from the superhero series, different from the Archie comics (except some of the most recent ones, anyway), and different from most television series based on comics. For example, when the series begins Archie is having a hot affair with his beautiful high school music teacher. Jughead isn’t a goofy eccentric but a darkly earnest young writer. Rich girl Veronica lives with her mother because her father is in prison for financial crimes. Archie lives with his divorced father, who was played by former teen idol Luke Perry until his death of a stroke in March 2019 at the age of 52.

Other recognizable actors include Robin Givens as Riverdale’s mayor and the mother of head Pussycat Josie and Mädchen Amick as Betty’s control-freak mother and editor of Riverdale’s newspaper.

Cheryl Blossom, a lesser recurring character from the comics, has a much larger role on Riverdale. She’s a sometimes mean and blatantly sexy rich girl who becomes more likable as the series progresses but whose self-absorption is so hilariously over-the-top that it amounts to a running gag. For just one minor example, when Cheryl is describing a late-night scare she doesn’t just say she was in bed at the time, she says she was in her “canopy sleigh bed.” In another episode she barely manages to reach her archery gear in time to defend herself from a bad guy trying to kill her, but she still takes a moment to put on a fashionable hunting cape before shooting an arrow at her would-be assailant, sending him fleeing.

The first season revolves around a mysterious death with a number of parallel subplots, most of them melodramatic. The second begins with a serial killer threatening the town but branches off from that in multiple directions, and I gather the third and fourth seasons, which I haven’t made it to yet, may be even more extreme.

There are a lot of hints beyond the bizarre storylines themselves that the whole thing isn’t meant to be taken too seriously, at least by the audience. For example, there are several references to a prison named “Shankshaw.” Also, while people use modern smart phones, there are landline phones everywhere, many of them with old-fashioned dials rather than pushbuttons.

In its general impression of eccentricity and unreality Riverdale reminds me (and a lot of other people, I gather) of Twin Peaks, the series created by David Lynch and Mark Frost, minus the supernatural elements but with Mädchen Amick. I’ve largely enjoyed Riverdale so far for its weirdness and its mostly likable protagonists, though I have mixed feelings about the fact that some of the characters become darker as the series progresses. But if you’re not put off by the radical deviation from the wholesome Archie of years past, the show is reasonably watchable, if not exactly great art.

Here’s a trailer for the first season:


Link: https://youtu.be/HxtLlByaYTc

And here’s a collection of (mostly mediocre) outtakes:


Link: https://youtu.be/RNkNyijkkAA

Archie: To Riverdale and Back Again (1990 made-for-TV movie)

While Riverdale is the only live-action television series based on Archie comics ever to make it onto a network television schedule, there were previous efforts, including a 1990 made-for-TV movie broadcast on NBC that served as a pilot for a proposed series featuring the usual Archie characters but 15 years older than in the comics.

The movie has them all back in Riverdale for a high school reunion. Archie himself is now a lawyer engaged to be married, Betty teaches second grade and dates a guy who doesn’t really care about her, and Veronica, richer than ever and four times divorced, lives in Paris helping run an arm of her father’s business empire. Predictably Betty and Veronica wind up resuming their pursuit of Archie despite knowing that he’s engaged.

When we first see Jughead he’s lying on a psychiatrist’s couch explaining that his ex-wife, who’s about to get remarried, has just dumped their bratty son on him. He then apologizes for going on about his problems. It turns out he’s the psychiatrist and the guy he’s been talking to is his patient. The patient assures Jughead that he benefits a lot from their sessions, because hearing about Jughead’s problems makes his own seem so minor by comparison. In the half-hour or so I sat through before giving up, this was the only attempt at comedy that I thought even slightly funny.

The cast included David Doyle (Bosley on Charlie’s Angels) as Mr Weatherbee, Matt McCoy (Lloyd Braun on Seinfeld) as Betty’s obnoxious boyfriend, Karen Kopins (Jim Carrey’s very cute non-vampire girlfriend in Once Bitten) as Veronica, and Lauren Holly (who also appeared opposite Jim Carrey in Dumb and Dumber and maintains to this day a very active career in guest star roles on television) as Betty.

As far as I can tell, the only place to see this is YouTube, where a few people have uploaded it from VHS copies. I’m not recommending you take the time to look it up, but if you’re curious you can find a short trailer below that gives the flavor of the thing.


Link: https://youtu.be/7Nz_B2B_QVg




Review: Shazam! (2019 movie)

The original Captain Marvel was the star of a comic published by Fawcett Comics starting in 1940. The premise was clever: A wizard bestowed on a young boy the power to turn into a Superman-like superhero by exclaiming the magic word “Shazam,” which was also the wizard’s name. So every red-blooded boy reading Captain Marvel could imagine himself turning into a superhero.

For a while Captain Marvel was the top-selling superhero comic, even beating Superman, until DC sued the character out of existence for his obvious similarities to Superman. Later Marvel Comics appropriated the name Captain Marvel for an entirely different superhero. Later still DC, which ended up buying the assets of its former competitor Fawcett Comics, tried reviving the original Captain Marvel, only to have Marvel sue over use of the name as a matter of trademark law. Since Fawcett had let the name lapse, Marvel’s appropriation was deemed legitimate and DC was not allowed to keep selling a comic under that name.

If I’m not mistaken, DC actually does have the right to call the character Captain Marvel on the inside pages, just not on the cover, so they started publishing the comic under the name Shazam, which led a lot of younger readers to suppose that it was the name of the superhero as well.

In the new movie the character has trouble deciding on a name and is never called Captain Marvel or Shazam that I recall. Then again, the protagonist of Marvel’s Captain Marvel movie isn’t called “Captain Marvel” either. In fact, my movie ticket called the movie Captain Marv. I’m still going to call the hero of Shazam! Captain Marvel for purposes of this review.

As in the original comic, this Captain Marvel’s powers are magically bestowed on a 14-year-old boy named Billy Batson by an ancient wizard who is looking for a champion but isn’t wearing his glasses. Exclaiming the wizard’s name lets Billy turn into an adult superhero with the characteristic strengths of Solomon, Hercules, Atlas, Zeus, Achilles, and Mercury if I remember right. (Anyway, one Jewish guy and five Greeks, with the first letters spelling “Shazam.”)

I confess I got my hopes up about this film based on the trailer until during the opening credits I realized that it was a DC movie, at which point I became apprehensive. But to my amazement, it’s actually good. I liked it even more than the Marvel Captain Marvel, though I admit that the star of the other film is better looking.

As I said, the original premise of the Captain Marvel character strikes me as clever, but as far as I know, until this film nobody took it to the logical conclusion of letting Captain Marvel retain Billy Batson’s 14-year-old mind and personality in the body of an adult superhero. One might object that this is inconsistent with magic word’s supposedly bestowing upon him the wisdom of Solomon, but let me remind you that Solomon reportedly had 700 wives and 300 porcupines, and who but a 14-year-old boy would think that was a good idea?

Anyway,  as soon as he gets his mind around the fact that he can turn himself into a grown man with superpowers, Billy sets about doing what a 14-year-old boy would do with that ability: buy beer and junk food, figure out ways to get money to buy more fun stuff, and hit on girls in their late teens and early 20s. He even manages to succeed at the first two. When the bad guy asks him how old he is, Billy lies and says he’s “basically fifteen.”

His immaturity is initially his biggest problem, but then he encounters a bigger one, the evil supervillain Dr Sivana, who unlike the mad scientist character in the comics has powers very close to Captain Marvel’s own, as a consequence of being inhabited by the Seven Deadly Sins.

Said Sins had been imprisoned by the wizard Shazam until Dr Sivana came along and freed them, but aside from giving Dr Sivana superpowers, it’s hard to see what practical difference it makes whether the Sins are locked up or not. Even while they were still captive there wasn’t a noticeable lack of sinning anywhere, nor did their incarceration prevent their combined physical manifestation from becoming president of the United States.

Incidentally, some have pointed out that putting a boy’s personality into an adult body has been done before, with the most commonly cited example being the movie Big, starring Tom Hanks. The makers of this movie saw that observation coming, and they inserted a nice reference to Big in the film.

But of course variations on the idea have appeared before, often in the form of an adult and a child swapping bodies. And given the basic idea that Billy Batson turns into Captain Marvel, having him still be Billy Batson on the inside is just carrying the basic premise to its logical conclusion.

Shazam! has a pretty satisfying climax, one in keeping with the spirit of the original Captain Marvel comic. I really enjoyed it.

Link: https://youtu.be/go6GEIrcvFY

By the way, a friend of mine reminded me that back in its early days, when it was still a comic book, Mad Magazine published a spoof by Harvey Kurtzman and Wally Wood poking fun at Superman, Captain Marvel, and in passing the DC-Fawcett lawsuit. In Mad’s version, Captain Marbles’ magic word is Shazoom!, standing for Strength — Health — Aptitude — Zeal — Ox, Power Of — Ox, Power Of Another — Money. If you do an image search in Google for “Superduperman” you can find it online. The parody proved so popular that it gave Mad a major sales boost.




No, Social Security isn’t going to run out of money

At the start of last week the latest report from the Social Security Trustees (link) was released, leading to predictable but misleading warnings that the system was going to “run dry.” For example, the headline on this article from the website of Boston’s WBUR public radio declares: “Unless Congress Acts, Social Security Will Run Out Of Money By 2035, Government Report Says.”

The first sentence under the headline seemingly confirms it: “According to an annual report from the government this week, Social Security will run out of money by 2035.” But then the second sentence reads, “It’s a worrisome reminder for soon-to-be retirees, who, unless Congress acts, will receive just three-quarters of their scheduled benefits.”

Most people are likely to find this confusing, and rightly so. Here’s the straight story:

Social Security is a pay-as-you-go system. That is, the benefits paid out this year are paid for mainly by Social Security taxes coming in this year. This is how the system is supposed to work, how it has always worked, and how it was originally advertised as working. Many people mistakenly think it’s a sort of enforced saving program, but if it had been set up that way it would have been decades before anybody received any significant benefits.

Of course, it’s highly unlikely that the taxes collected would happen to exactly match the benefits being paid out, so the Social Security retirement system has a trust fund that operates like a reservoir in a water system. Taxes flow into the trust fund and benefits are paid out of it, and if the taxes coming in exceed the benefits being paid out, the trust fund grows to build up a reserve to cover a future shortfall.

When people talk about Social Security “running out of money,” what they actually mean is that the reserve is projected to be depleted in about 2035 (give or take), at which point benefits would have to be cut to just the amount that incoming Social Security taxes could cover, which the same report estimates would be about 77 percent of scheduled benefits.
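Here’s a toy model of that reservoir picture in Python. Every number in it is made up for illustration (chosen only so the payable share comes out near the report’s 77 percent figure); nothing here comes from the Trustees’ report.

```python
# Toy model of the trust-fund "reservoir" described above. All numbers are
# made up for illustration; none of them come from the Trustees' report.

reserve = 480       # starting trust-fund balance (arbitrary units)
taxes_in = 100      # taxes collected per year
benefits_due = 130  # scheduled benefits per year (more than taxes, so the reserve drains)

years = 0
while reserve > 0:
    reserve += taxes_in - benefits_due
    years += 1

payable_share = taxes_in / benefits_due
print(f"Reserve depleted after about {years} years")
print(f"After that, incoming taxes cover about {payable_share:.0%} of scheduled benefits")
```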

This might sound a little worse than it is, in that future average benefits are supposed to be higher than benefits being paid out today, even after adjusting for inflation. That’s because Social Security benefits are based on wages, and over time wages tend to grow at least a little faster than inflation. For the same reason, today’s average benefits are likewise larger than they were in the past. So 77% of future average benefits will be more than 77% of current average benefits and more than 100% of average benefits being paid out at some point in the past. (No, I haven’t looked up exactly when that was.)
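And here’s an equally toy illustration of that wage-growth point. The 1 percent real growth rate and the dates are assumptions of mine, purely for illustration; the Trustees’ report uses its own projections, and the real answer to “exactly when” depends on them.

```python
# Toy illustration of the point above: if real (inflation-adjusted) average
# benefits grow over time, then 77% of a future year's average benefit can
# still exceed the average benefit paid in some past year. The 1% real growth
# rate and the dates are assumptions, not figures from the report.

real_growth = 0.01   # assumed annual real growth in average benefits
this_year = 2019
depletion_year = 2035

# Average real benefit in 2035 relative to today's average (today = 1.0),
# then cut to the 77% payable share.
future_benefit = (1 + real_growth) ** (depletion_year - this_year)
reduced_benefit = 0.77 * future_benefit
print(f"77% of the 2035 average benefit is about {reduced_benefit:.2f} of today's")

# Walk backward to find a past year whose average real benefit the reduced
# 2035 amount would still match or exceed (under this made-up growth rate).
year = this_year
while (1 + real_growth) ** (year - this_year) > reduced_benefit:
    year -= 1
print(f"That is roughly the average real benefit paid around {year}")
```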

Now, this doesn’t mean that the reduction is nothing to be concerned about. Quite the contrary. If this scenario is allowed to play out, then people drawing Social Security would take a sudden big hit to their spending power, and that would hurt not just them but the businesses that depend on them and ultimately the whole economy. So definitely Congress ought to do something. I’m just pointing out that even if they don’t (which, alas, lately seems a safe bet), it’s not like Social Security would literally run dry.

(Incidentally, there are actually two trust funds, one for retirement benefits and the other for disability benefits. The latter fund is in better shape, mainly because of a decline in disability claims.)

There are various ways of fixing the system so benefits don’t suddenly drive off a cliff. Benefit growth could be reduced or tax collections increased (for example by doing away with the current cap on what high-income people pay in Social Security taxes) or some combination of the two.

You might not have heard about it yet, but there is a concrete proposal on the table right now to address the problems far into the future and improve the system in a number of ways. It’s proposed by Representative John Larson (D-Connecticut) and colleagues in the House of Representatives. There’s a clear summary here with a link to the actual text of the bill if you want all the details.

Confusion about Social Security is nothing new, of course. Here, for example, is a more or less similar post I wrote back in 2012. A lot of myths turn out to arise from misconceptions about the meaning of life expectancy, leading for example to confused pronouncements on the topic from former Texas governor, former presidential candidate, and U.S. Energy Secretary Rick Perry (see this post) and from former Senator Alan Simpson (see this one). Senator Simpson in particular ought to have known better, since he had just co-chaired the National Debt Commission with Erskine Bowles.

Possibly the best succinct explanation of life expectancy was a four-minute video from Hank Green I previously highlighted in a 2017 post here.


