The Reality of Minimum Wage

There simply are not enough hours in the day to address every foolish thing you see posted on social media, but there are some things that I just cannot let go. In the words of Christian comedian Ken Davis, “You can’t let people fall in the stupid pit” without at least trying to help them. The push for an increased minimum wage is one of those things that I have to address. Lately, it has gotten even worse. Ever since it passed in Seattle, the magic number for many proponents of a minimum wage increase seems to be $15 an hour. According to the Washington Post’s questionnaires sent to Democratic candidates for president, all eight candidates still in the race favor an increase in the federal minimum wage to $15 an hour. (That includes Tulsi Gabbard, despite her exclusion from the debates and the DNC doing its best to shut her out. Of course, Gabbard is also the only one still in the race to favor a universal basic income, an idea most prominently supported by former candidate Andrew Yang). Tom Steyer actually favors a minimum wage of $22/hour.

Sure, a $15/hour minimum wage sounds like a great idea. But in reality it is no better than just printing more money. Does an increased minimum wage put more money in the hands of the people? Yes. Will they spend it? Yes, they will have to, because prices will go up.

Let’s use fast food restaurant employees as an example. The web site fightfor15.org features this statement on its homepage: “McDonald’s: Fast-food workers deserve $15 an hour and a union so we can pay our rent and support our families. Agree? Add your name now.”

If McDonald’s workers get a pay increase to $15 an hour, what will that do? Well, my understanding is that McDonald’s franchises employ about 750,000 people in the U.S. and that there are about 14,150 McDonald’s restaurants in the U.S. That works out to an average of 53 workers per McDonald’s. Let’s narrow it down even more and look specifically at McDonald’s workers in Illinois, since Illinois has passed a law increasing its minimum wage. There are about 650 McDonald’s in Illinois. That would equate to 34,450 McDonald’s workers if we use the average. Let’s suppose only 60% of them are earning minimum wage, though I imagine that estimate is quite low. That would be more than 20,000 people just at McDonald’s restaurants earning minimum wage, and in Illinois this year the minimum wage went up by $1.00 per hour in January and will go up by another 75 cents per hour in July.

So, imagine 20,000 workers working, for the purposes of this illustration, 20 hours per week, and, come July, making $1.75 per hour more than they were in July 2019. That equates to $700,000 per week in wages that have to be paid by Illinois McDonald’s, or more than $36 million over the course of a year. Are the various owners of McDonald’s restaurants in Illinois going to collectively eat that increase (pun intended)? Of course not. They will raise prices. And every industry that has minimum wage workers will raise prices. So, costs will go up and that nice minimum wage increase will be negated. Various studies project that the cost of a Big Mac would increase by 4.3% if McDonald’s workers were paid $15 an hour. The rate of inflation in the U.S. hasn’t been that high since 1990. And that’s just a Big Mac!
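For anyone who wants to check the arithmetic, here is a quick sketch; the worker count, hours, and raise are the illustrative estimates from this paragraph, not official figures:

```python
# Illustrative estimates from the paragraph above -- not official statistics
workers = 20_000        # assumed minimum-wage McDonald's workers in Illinois
hours_per_week = 20     # assumed part-time schedule
raise_per_hour = 1.75   # $1.00 (January) + $0.75 (July) vs. July 2019

weekly_cost = workers * hours_per_week * raise_per_hour
annual_cost = weekly_cost * 52

print(f"Added wages per week: ${weekly_cost:,.0f}")  # Added wages per week: $700,000
print(f"Added wages per year: ${annual_cost:,.0f}")  # Added wages per year: $36,400,000
```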

In Seattle, where the $15/hour minimum wage push all began, housing values have increased by an average of 5.49% annually since 2000. The median rent for a two-bedroom apartment in Seattle is more than $1,600 a month–about $500 per month higher than the national median. The median for a one-bedroom in Seattle is $1,332 a month. The recommended food spending per month for a Seattle resident is 23% above the national average. The cost of a dozen eggs in Seattle is 68 cents above the national average. On average, the price of gas in Seattle is the highest for all major cities in Washington. Oh, and Seattle also has a sales tax of 10.1%! True, Washington has no state income tax, but I doubt you’ll notice any benefit by the time you absorb all those other high rates.

The minimum wage is not necessarily beneficial. When the first minimum wage in America was implemented in 1938 it was twenty-five cents an hour. Had it been increased at the rate of inflation, it would be somewhere between $4.50 and $4.80 an hour today. Instead, it is $7.25 an hour, more than 50% higher than it should be if it were only intended to keep pace with inflation.
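A rough check of that inflation adjustment, assuming prices rose somewhere between 18x and 19x between 1938 and today (an approximation on my part, not an official CPI figure):

```python
# Rough inflation check; the 18x-19x multiplier is an assumption, not official CPI data
original_wage = 0.25                 # the 1938 federal minimum wage

low_estimate = original_wage * 18    # $4.50
high_estimate = original_wage * 19   # $4.75

current_wage = 7.25
premium = current_wage / high_estimate - 1   # roughly 0.53

print(f"Inflation-adjusted range: ${low_estimate:.2f} to ${high_estimate:.2f}")
print(f"Today's $7.25 is about {premium:.0%} above even the high estimate")
```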

When the minimum wage was implemented it was one part of a sweeping piece of legislation, the Fair Labor Standards Act, designed to address a number of Depression-era workforce issues. Other elements of the law addressed overtime pay and child labor. According to the Legal Information Institute of Cornell Law School, “The minimum wage was designed to create a minimum standard of living to protect the health and well-being of employees.” Franklin Roosevelt, in his statement after signing the National Industrial Recovery Act in 1933, said of wages, “It seems to me to be equally plain that no business which depends for existence on paying less than living wages to its workers has any right to continue in this country” and that “by living wages I mean more than a bare subsistence level—I mean the wages of decent living.” That statement has been used by many to argue that the minimum wage was always intended to be at least a living wage.

There is room to debate that assertion too, but let’s suppose for a moment that that has been the intent all along. The current federal minimum wage of $7.25 an hour would pay a worker who works 40 hours a week for fifty weeks a year an annual pre-tax income of $14,500. According to the Department of Health and Human Services, the 2019 Poverty Guideline for the 48 Contiguous States and the District of Columbia was $12,490 for an individual. The 2020 guideline was just released on January 17, and it is $12,760. (And, in case you are wondering, the Census Bureau’s poverty figures and programs based on the poverty level, such as SNAP, are based on gross income).

Maybe you don’t like the notion of basing the minimum wage’s relation to a “living wage” on a single individual. Fair enough. Let us suppose, then, that there is a family with two wage earners, both working 40 hours a week, fifty weeks a year, for minimum wage. The income for that family would be $29,000. That exceeds the poverty level for a two-person, three-person and four-person family. (The 2020 Poverty Guideline for a four-person family is $26,200). And, in case you are still uncomfortable, there is government assistance available for those families, since eligibility for programs based on the poverty level extends to families with incomes at or below 130% of the poverty line.
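The comparison works out in a few lines; the guideline figures are the ones cited above, and the 130% threshold is the gross-income eligibility line used by programs such as SNAP:

```python
# Minimum-wage income vs. the 2020 HHS Poverty Guidelines cited above
MIN_WAGE = 7.25
annual_income = MIN_WAGE * 40 * 50      # 40 hrs/week, 50 weeks/year -> $14,500
two_earner_income = annual_income * 2   # -> $29,000

guideline_individual = 12_760           # 2020 guideline, one person
guideline_family_of_four = 26_200       # 2020 guideline, four people

print(annual_income > guideline_individual)          # True
print(two_earner_income > guideline_family_of_four)  # True
# Still within 130% of the four-person guideline, so assistance remains available
print(two_earner_income <= guideline_family_of_four * 1.30)  # True
```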

According to the Census Bureau’s 2018 report, the official poverty rate was 11.8%, the fourth consecutive annual decline and the first time the rate was significantly lower than it was in 2007, the year before the last major recession. Another interesting fact: the drop in the poverty rate from 2017 to 2018 was highest among African Americans and second-highest among Hispanics. There was a bigger drop for females than males. And when considering educational attainment, the biggest drop was among those with “some college,” while the only area where the poverty rate went up was among those age 25 and older who had no high school diploma. Interesting facts abound, in fact. The lowest poverty rate when considering family characteristics was among married couples, who had a poverty rate of only 4.7% in 2018. The highest percentage of poverty was among female householders with no spouse present, but that demographic also had the largest drop in poverty percentage.

If you consider the three-year average (2016-2018) of percentage of people in poverty by state, the state with the lowest percentage was New Hampshire, at 6.4%–and New Hampshire uses the federal minimum wage of $7.25. The five states with the next lowest percentages of poverty were Maryland (7.1%), Utah (7.9%), Minnesota (8.7%), Colorado (8.9%) and New Jersey (9.1%). Among those five states only Utah has the $7.25 minimum wage, but the minimum wages of the other states are still modest, and the average minimum wage of those five states is $9.31/hour. The states with the highest minimum wages were New York ($13), California ($12), Washington ($11.50), Oregon ($11.25) and Colorado ($11.10). We already saw that Colorado was among the states with the lowest poverty rates, but the other four states on this list did not fare so well. New York (11.8%), California (12.5%), Washington (10.3%) and Oregon (10.6%) were not the worst by any means, but even with Colorado included the average percentage of poverty was 10.82%. That put those states about 1.5 percentage points below the U.S. average that year of 12.3%, but still well above the average of 8.34% for states 2-6 on the list. In other words, the five states with an average minimum wage of $9.31 had a poverty percentage two and a half percentage points below the five states with an average minimum wage of $11.77. I don’t know about you, but I find it at least noteworthy that the states whose average minimum wage was $2.46 higher had a poverty percentage 2.48 percentage points higher. Maybe it’s a coincidence, but you just can’t get much closer than that.
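The averages in this paragraph can be recomputed directly from the state figures cited above:

```python
# State figures as cited above (2016-2018 three-year average poverty rates)
low_group_poverty = [7.1, 7.9, 8.7, 8.9, 9.1]        # MD, UT, MN, CO, NJ
high_group_poverty = [11.8, 12.5, 10.3, 10.6, 8.9]   # NY, CA, WA, OR, CO
high_group_wages = [13.00, 12.00, 11.50, 11.25, 11.10]
low_group_avg_wage = 9.31                             # average cited in the text

def avg(values):
    return sum(values) / len(values)

print(round(avg(low_group_poverty), 2))    # 8.34
print(round(avg(high_group_poverty), 2))   # 10.82
print(round(avg(high_group_wages), 2))     # 11.77
# The gaps the paragraph highlights:
print(round(avg(high_group_poverty) - avg(low_group_poverty), 2))  # 2.48
print(round(avg(high_group_wages) - low_group_avg_wage, 2))        # 2.46
```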

 

____________________________________________________________________________________________

https://www.census.gov

https://www.raisetheminimumwage.com

https://smartasset.com/mortgage/what-is-the-cost-of-living-in-seattle

 

Photo credit: By Fibonacci Blue from Minnesota, USA – Rally demanding $15/hr minimum wage, CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=53672511

President Hamilton?

Though the quote has appeared in several different forms over the years, philosopher George Santayana wrote this: “Those who cannot remember the past are condemned to repeat it.” If I may, I would like to reword this famous statement and apply it to a current event: “Those who never learn the past are condemned to misstate it.”

What has prompted me to mess with the immortal words of Santayana? A monumental Presidents Day blunder by online coupon provider Groupon, that’s what. According to a plethora of major news outlets, Groupon issued a news release last week promoting $10 off local deals over $40, complete with this explanation of the deal: “The $10 bill, as everyone knows, features President Alexander Hamilton — undeniably one of our greatest presidents and most widely recognized for establishing the country’s financial system.”

Now, in Groupon’s defense, Hamilton is generally credited with laying the foundations of the nation’s financial system, having served as the first Secretary of the Treasury. However, like Benjamin Franklin on the $100 bill, Hamilton never served as president of the United States.

Compounding the problem, Fox News has reported that upon being informed of the blunder Erin Yeager, Groupon spokesperson, told MyFoxNY.com, “We’ll just have to agree to disagree.” Agree to disagree? Whether or not someone was ever president of the United States is not a matter of opinion; it is historical fact, easily checked and verified.

Groupon’s press release–which, believe it or not, is still available on its web site–refers to Hamilton as president three times and refers to him once as “our money-minded commander-in-chief.”

In the grand scheme of things this is pathetic but not that big a deal. However, it is evidence of a greater problem: a two-edged sword of ignorance of, and disrespect for, U.S. history. There is no excuse for multiple professionals at a major corporation failing to recognize that Alexander Hamilton was never president of the United States. (Presumably more than one person has to approve press releases and ad campaigns). There is no excuse for a company spokesperson responding “we’ll have to agree to disagree” when the error was identified. The error was a result of ignorance or stupidity (or both), and the explanation once the error was identified was a result of ignorance or stupidity (or both, but most likely the latter).

Furthermore, the explanation is a prime example of the foolishness of relativism. Relativism is the idea that there is no absolute truth, that all beliefs and points of view are relative, subjective, and based on the preferences and viewpoints of those who adhere to them. “Agree to disagree” is a shorthand definition of “tolerance” and it works fine for things like which baseball team has a better starting rotation, which fast food chain has the best French fries or even which U.S. president was the best president. Those are topics subject to legitimate differences of opinion and conviction. There are different ways of defining “best” and legitimate, cogent, rational arguments could be made for multiple answers to those questions. Relativism has its place. I see it demonstrated almost daily at family meal times, for example–particularly when it comes to the vegetable of the meal and the opinions of my children as to how good–or not good–the vegetable may be!

Relativism has no place, however, when it comes to verifiable facts. There can be a difference of opinion as to which fast food chain has the best French fries, but whether or not a fast food chain even exists or even serves French fries is not open for discussion; the answer can be found and proven. Which U.S. president was the best will bring plenty of different answers, and you will probably find plenty of them today in particular, since it is Presidents Day. At a minimum I can guarantee you will find arguments for George Washington, Abraham Lincoln, Franklin Roosevelt, Ronald Reagan and Barack Obama. There is no definitive standard by which one can determine “best president” so that range of opinion is fine–healthy, even. But there is no question as to whether or not Alexander Hamilton was a U.S. president.

It is a sad day when a major company errs on what should be basic elementary school history. My favorite professor in college used to refer to some things by saying, “Every good schoolboy or schoolgirl should know this….” Sadly, the number of things every good schoolboy or schoolgirl knows is rapidly diminishing. That is due in no small part to an observation regularly made by my favorite graduate school professor: “Sometimes there is nothing common about common sense.”