February 28, 2014 2:28PM

A Closer Look at Congress’s Views on Trade

Cato’s congressional trade votes database now includes votes from last year on major trade bills and amendments in both houses of Congress. The purpose of the database is to educate the public about the trade policy preferences of individual members. We do that by recording their votes on major trade bills and amendments and using the data to map a broader ideological profile.

Whether a particular member qualifies as a free trader, an isolationist, an internationalist, or an interventionist based on our methodology depends on their support for (or opposition to) trade barriers and subsidies.

 

[Figure: Trade Votes Matrix]

In previous years, the farm bill and its various amendments have provided a treasure trove of vote data to pin down members’ proclivities on specific commodities and willingness to use public money to distort the economy for the benefit of select cronies. This year was no different, except that votes taken in the House of Representatives on the full package bill have been excluded. Those votes hinged almost entirely on the issue of food stamps, and because the purpose of the database is to reveal members’ trade policy positions, including them in the database would be inappropriate. 

That doesn’t mean, of course, that you shouldn’t be dismayed by Republicans who, after successfully removing food stamps from the bill so that productive debate could be had on reforming farm programs, nevertheless voted en masse to continue our Soviet-style agriculture policy with no significant change.

The new votes on the site include the Senate farm bill, failed votes in both houses to reform the sugar program, an amendment to avoid protectionist regulations on imported olive oil, an extension of "Buy American" policies in government procurement, and a continuation of export marketing subsidies for wealthy agribusiness.

I encourage you to check out the site, read up on our unique methodology, and find out just how protectionist your favorite (or least favorite) member of Congress really is.

February 28, 2014 1:23PM

Some Like It Hot

The Current Wisdom is a series of monthly articles in which Patrick J. Michaels and Paul C. “Chip” Knappenberger, from Cato’s Center for the Study of Science, review interesting items on global warming in the scientific literature that may not have received the media attention that they deserved, or have been misinterpreted in the popular press.

 

With all the stern talk about global warming and widespread concern over climate change, you would think that we humans would have a propensity for cooler temperatures. Everywhere you look, the misery that rising temperatures (and the associated evils) will supposedly heap upon us seems to dominate reports about the coming climate. But do patterns of population movement really support the idea that we prefer cooler locations?

Increased Mobility

Between 1900 and 2010, the population of the United States increased from about 76 million people to about 309 million. Accompanying that population growth were major advances in technology and industry, including vast improvements in our nation’s system of transportation. As planes, trains, and automobiles replaced the horse and buggy, Americans became more mobile, and where we lived was no longer tied primarily to where we were born. Instead, we became much freer to choose our place of residence based on considerations other than ease of getting there.

Where has our new-found freedom of mobility led us? Figure 1 shows the rate of population change from 1900 to 2010 for each of the contiguous 48 states. Notice the increases in states with warm climates, such as Florida, Texas, and California, and also in states with big industry (that is, jobs), such as New York, Michigan, and Ohio.

 


 

Figure 1. The state-by-state population trend (people/year) from 1900 to 2010 (data from U.S. Census Bureau).

Which states are people less likely to choose to live in? States such as North Dakota, South Dakota, Montana, Maine, and Vermont—all of which have harsh climates and low temperatures.

Comparing a map of the change in population (Figure 1) with a map depicting the average temperature of each state (Figure 2) reveals a pretty strong indication that people seem to be seeking out warmer states.

 



Figure 2. The state-by-state average annual temperature for the period 1900-2010 (statewide temperature data available from the U.S. National Climatic Data Center).

Experiential Temperature

Another way of looking at human temperature preferences is to calculate what we’ll call the “average experiential temperature”—that is, the annual temperature that the average person living in the lower 48 states experiences. We can calculate this value by first multiplying the average temperature in each state during a particular year by the state’s population in the same year. Then we sum this product across the 48 contiguous states, and finally divide this sum by the total population of the country. In other words, the temperatures in states with larger populations weigh more heavily on the national composite experiential temperature than do the temperatures in states with sparser populations. As the population of the country redistributes itself over time, we can track how the average person’s climate changes.
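For readers who want to reproduce the arithmetic, here is a minimal sketch in Python. The data layout (temps[year][state] for statewide average temperature and pops[year][state] for population) is our own illustrative assumption, not the actual Census Bureau or NCDC file format.

```python
def experiential_temperature(temps, pops, year, states):
    """Population-weighted average temperature across `states` for `year`.

    temps[year][state] -- statewide average annual temperature (deg F)
    pops[year][state]  -- state population in that year
    """
    total_pop = sum(pops[year][s] for s in states)
    weighted_sum = sum(temps[year][s] * pops[year][s] for s in states)
    return weighted_sum / total_pop
```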

When we do that for each year from 1900 to 2013, we get the result shown in Figure 3—a steadily rising temperature. In fact, the average experiential temperature has risen by a total of about 3.85ºF over the course of the last 114 years (a rate of 0.34ºF per decade).


 

Figure 3. The average experiential temperature of the population of the United States, 1900 to 2013.

But the history of experiential temperatures alone can’t tell us whether the increase has been unwillingly forced upon us by a large-scale warming of the climate from, say, an enhanced greenhouse effect, or whether the change results from Americans seeking out warmer locales of their own accord.

U.S. Average Temperature

To answer this question, we must calculate the area-weighted average temperature of the United States—that is, the combination of the yearly average temperature within each state weighted by that state’s total area. In this case, it is the size of the state, rather than the size of its population, that matters—the bigger the state, the bigger its contribution to the nationwide average.
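The area-weighted calculation uses the same machinery as the experiential temperature; only the weights change. Again a sketch under the same assumed data layout, with areas[state] (land area, assumed constant over time) as an illustrative input of our own.

```python
def area_weighted_temperature(temps, areas, year, states):
    """Area-weighted average temperature across `states` for `year`.

    areas[state] -- land area of the state (e.g., square miles)
    """
    total_area = sum(areas[s] for s in states)
    weighted_sum = sum(temps[year][s] * areas[s] for s in states)
    return weighted_sum / total_area
```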

The result of this calculation is a quite different looking temperature history. In Figure 4, we included the annual U.S. average temperature history along with the annual U.S. “experiential” temperature from Figure 3. We see that, while the actual U.S. temperature has fluctuated a bit, experiencing warm decades such as the 1930s and 1990s and cold ones such as the 1910s and 1970s, it has increased only slightly over the full period—about 0.90ºF (a rate of 0.08ºF per decade).


Figure 4. Average temperature of the United States, 1900 to 2013.

For what it’s worth, when you calculate the national temperature this way (using the state-by-state temperature data from the National Climatic Data Center, NCDC), you get a heckuva lot less warming than is in the “official” NCDC record put out by the U.S. Department of Commerce. The difference lies in the “adjustments” plastered onto the original data. Both records are adjusted for a bias related to the time of day at which the previous 24-hour highs and lows were recorded. It’s complicated, but it does slightly alter the data.

But the official version is additionally massaged more than—well, we can’t say in polite company.  A laundry list can be found here. The sum of all of those adjustments is to put about twice as much warming in the record as is in our state-averaged plot.

Seeking the Heat

Although there has been a slight warm-up in the actual temperature, that rise is nowhere near the increase in the experiential temperature. In fact, the average experiential temperature has climbed at a rate more than four times that of the U.S. average temperature (0.34ºF versus 0.08ºF per decade), where the latter is what the experiential temperature would have been had the population distribution not changed at all. That means that Americans have actively been moving to warmer climates. And there is every indication that they are continuing to do so, as evidenced by the strong rise in experiential temperatures during the past 20 or 30 years.

While climatologists have generally overlooked this fact, it has long been recognized by sociologists. As both people's mobility and their ability to select the climate they prefer have increased over the past century, the core of the U.S. population has moved southward—into warmer climates. The overall migration of people into the southern "Sunbelt" states has created a temperature change over time for the "average American" that far outstrips the most pessimistic measurements of global warming for the past century, and rivals the projections for the next!

Apparently, people, or at least Americans, seem to prefer a warmer climate to a cooler one. The next time climate prognosticators warn of the perils of rising temperatures, remember this: when given the means and a choice, some (or rather, most) like it hot!

(Special thanks to Robert C. Balling Jr. and Randy Cerveny, who assisted with early versions of this research.)

February 28, 2014 12:14PM

Krugman on the TPP

Paul Krugman weighed in yesterday on the Trans-Pacific Partnership (TPP). I agree with one of his points; I disagree with another.

First, the disagreement: Krugman claims protectionism is mostly gone, and thus the TPP is not all that important:

The first thing you need to know about trade deals in general is that they aren’t what they used to be. The glory days of trade negotiations—the days of deals like the Kennedy Round of the 1960s, which sharply reduced tariffs around the world—are long behind us.

Why? Basically, old-fashioned trade deals are a victim of their own success: there just isn’t much more protectionism to eliminate. Average U.S. tariff rates have fallen by two-thirds since 1960. The most recent report on American import restraints by the International Trade Commission puts their total cost at less than 0.01 percent of G.D.P.

He said the same thing a while back, but it's just as wrong now as it was then. Here's what I said at the time:

Tariffs on certain goods are still quite high. A publication called World Tariff Profiles illustrates this nicely. If you look at p. 170 for U.S. statistics, you will see tariff duties for four general product categories of over 10%. You’ll also see maximum tariffs (i.e., the high tariff on particular products) of over 100%!

And if you look at the duty rates for other countries, they are generally much higher.

And none of that includes special “trade remedy” tariffs (anti-dumping, countervailing duties, safeguards), subsidies, discriminatory government procurement, or domestic laws and regulations that discriminate (such as local content requirements).

So, protectionism is alive and well. 

Turning to the part where I agree with him, he says:

But the fact remains that, these days, “trade agreements” are mainly about other things. What they’re really about, in particular, is property rights—things like the ability to enforce patents on drugs and copyrights on movies. And so it is with T.P.P.

... Is this a good thing from a global point of view? Doubtful. The kind of property rights we’re talking about here can alternatively be described as legal monopolies. True, temporary monopolies are, in fact, how we reward new ideas; but arguing that we need even more monopolization is very dubious—and has nothing at all to do with classical arguments for free trade.

Now, the corporations benefiting from enhanced control over intellectual property would often be American. But this doesn’t mean that the T.P.P. is in our national interest. What’s good for Big Pharma is by no means always good for America.

I don't have much to add to his points, which I think are pretty good ones. In my view, there's a need for a real debate on how much intellectual property protection is appropriate (and, in fact, we will be discussing this here at Cato next week). Unfortunately, that's not what we are getting either domestically or in the international trade context, where it seems that more is always better.

February 28, 2014 12:01PM

Grading the Camp Tax Reform Plan

To make fun of big efforts that produce small results, the Roman poet Horace wrote, "The mountains will be in labor, and a ridiculous mouse will be brought forth."

That line sums up my view of the new tax reform plan introduced by Rep. Dave Camp (R-Mich.), chairman of the House Ways and Means Committee.


To his credit, Chairman Camp put in a lot of work. But I can't help but wonder why he went through the time and trouble. To understand why I'm so underwhelmed, let's first go back in time.

Back in 1995, tax reform was a hot issue. The House Majority Leader, Dick Armey, had proposed a flat tax. Congressman Billy Tauzin was pushing a version of a national sales tax. And there were several additional proposals jockeying for attention.

To make sense of the clutter, I wrote a paper for the Heritage Foundation that demonstrated how to grade the various proposals that had been put forward.

As you can see, I included obvious features such as low tax rates, simplicity, double taxation, and social engineering, but I also graded plans based on other features such as civil liberties, fairness, and downside risk.

[Figure: Tax Reform Grading Matrix]



There obviously have been many new plans since I wrote this paper, most notably the Fair Tax (a different version of a national sales tax than the Tauzin plan), Simpson-Bowles, the Ryan Roadmap, Domenici-Rivlin, the Heritage Foundation's American Dream proposal, the Baucus-Hatch blank slate, and—as noted above—the new tax reform plan by Chairman Camp.

Given his powerful position as head of the tax-writing committee, let's use my 1995 methodology to assess the pros and cons of Camp's plan.

Rates: The top tax rate for individual taxpayers is reduced from 39.6 percent to 35 percent, which is a disappointingly modest step in the right direction. The corporate tax rate falls from 35 percent to 25 percent, which is more praiseworthy, though Camp doesn't explain why small businesses (which file using the individual income tax) should pay higher rates than large companies.

Simplicity: Camp claims that he will eliminate 25 percent of the tax code, which certainly is welcome news since the code has swelled to 70,000-plus pages of loopholes, exemptions, deductions, credits, penalties, exclusions, preferences, and other distortions. And his proposal does eliminate some deductions, including the state and local tax deduction (which perversely rewards states with higher fiscal burdens).


Saving and Investment: Ever since Reagan slashed tax rates in the 1980s, the most anti-growth feature of the tax code has arguably been the pervasive double taxation of income that is saved and invested. Shockingly, the Camp plan worsens the tax treatment of capital, with higher taxation of dividends and capital gains and depreciation rules that are even more onerous than current law.

Social Engineering: Some of the worst distortions in the tax code are left in place, including the health care exclusion for almost all taxpayers. This means that people will continue to make economically irrational decisions solely to benefit from certain tax provisions.

Civil Liberties: The Camp plan does nothing to change the fact that the IRS has both the need and power to collect massive amounts of private financial data from taxpayers. Nor does the proposal end the upside-down practice of making taxpayers prove their innocence in any dispute with the tax authorities.

Fairness: In a non-corrupt tax system, all income is taxed, but only one time. On this basis, Camp's plan is difficult to assess. Loopholes are slightly reduced, but double taxation is worse, so it's hard to say whether the system is more fair or less.


Risk: The plan does not include a value-added tax, and keeping a VAT out is a critically important feature of any tax reform plan. As such, there is no risk the Camp plan will become a Trojan Horse for a massive expansion of the fiscal burden.

Evasion: People are reluctant to comply with the tax system when rates are punitive and/or there's a perception of rampant unfairness. It's possible that the slightly lower statutory rates may improve incentives to obey the law, but that will be offset by the higher tax burden on saving and investment.

International Competitiveness: Reducing the corporate tax rate will help attract jobs and investment, and the plan also mitigates some of the worst features of America's "worldwide" tax regime.

Now that we've taken a broad look at the components of Camp's plan, let's look at the grades in comparison to the other plans I've reviewed over the years:

[Figure: Camp Tax Matrix]



You can see why I'm underwhelmed by his proposal.

Camp's proposal may be an improvement over the status quo, but my main reaction is, what's the point?

In other words, why go through months of hearings and set up all sorts of working groups, only to propose a timid plan?


Now, perhaps, readers will understand why I'm rather pessimistic about achieving real tax reform.

We know the right policies to fix the tax code.

And we have ready-made plans—such as the flat tax and national sales tax—that would achieve the goals of tax reform.

Camp's plan, by contrast, simply rearranges the deck chairs on the Titanic.

P.S.: If you need to be cheered up after reading all this, here’s some more IRS humor to brighten your day, including the IRS version of the quadratic formula, a new Obama 1040 form, a list of tax day tips from David Letterman, a cartoon of how GPS would work if operated by the IRS, a sale on 1040-form toilet paper (apparently a real product), and two satirical songs about the tax agency (here and here).

February 28, 2014 9:22AM

Political Poster Week Concludes: Posters and Press Freedom

[Poster: "chained pen" image for freedom of the press]

Whatever its words, a poster without a striking image is a missed opportunity, and incongruous, vaguely disturbing images often work best. (The snake is among the most unsettling creatures on earth to gaze at, yet it figures as the sympathetic subject in not one but two great American political images, the "Don't Tread on Me" Gadsden flag and Ben Franklin's "Join or Die.") For World Press Freedom Day last year, a journalists'-advocacy group in Jordan came up with this simple design. Yes, today's tyrants are more interested in clamping controls on keyboards, blogs, and cellphone transmissions, but for evocativeness it's hard to beat the chained nib of an old-style fountain pen, trembling somewhat as if in resistance.

Today, social media and meme culture endlessly rework classic posters and poster genres for purposes of commentary and satire. That stands in a great tradition: as a means of persuasion, posters are themselves a powerful part of the press. Use them in a good cause, and enjoy them too. [Earlier entries in this series: Monday, Tuesday, Wednesday, Thursday]

February 28, 2014 8:47AM

More Evidence for a Low Climate Sensitivity

Global Science Report is a feature from the Center for the Study of Science, where we highlight one or two important new items in the scientific literature or the popular media. For broader and more technical perspectives, consult our monthly “Current Wisdom.”

We have two new entries for the long (and growing) list of papers appearing in the recent scientific literature that argue that the earth’s climate sensitivity—the ultimate rise in the earth’s average surface temperature from a doubling of the atmospheric carbon dioxide content—is close to 2°C, near the low end of the range of possible values presented by the U.N.’s Intergovernmental Panel on Climate Change (IPCC). With a low-end warming comes low-end impacts and an overall lack of urgency for federal rules and regulations (such as those outlined in the President’s Climate Action Plan) that limit carbon dioxide emissions and restrict our energy choices.

The first is the result of a research effort conducted by Craig Loehle and published in the journal Ecological Modelling. The paper is a pretty straightforward determination of the climate sensitivity. Loehle first uses a model of natural modulations to remove the influence of natural variability (such as solar activity and ocean circulation cycles) from the observed temperature history since 1850. The linear trend in the post-1950 residuals from Loehle’s natural variability model was then assumed to be largely the result, in net, of human carbon dioxide emissions. By dividing the total temperature change (as indicated by the best-fit linear trend) by the observed rise in atmospheric carbon dioxide content, and then applying that relationship to a doubling of the carbon dioxide content, Loehle arrives at an estimate of the earth’s transient climate sensitivity—transient in the sense that, at the time of CO2 doubling, the earth has yet to reach a state of equilibrium and some warming is still to come.

Loehle estimated the equilibrium climate sensitivity from his transient calculation based on the average transient:equilibrium ratio projected by the collection of climate models used in the IPCC’s most recent Assessment Report. In doing so, he arrived at an equilibrium climate sensitivity estimate of 1.99°C with a 95% confidence range of it being between 1.75°C and 2.23°C.
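To make the two-step procedure concrete, here is a minimal sketch of this style of calculation in Python. It assumes the standard logarithmic relationship between CO2 concentration and its warming effect; the function names and the numbers in the usage lines are illustrative placeholders of our own, not Loehle's actual inputs.

```python
import math

def transient_sensitivity(delta_t, co2_start, co2_end):
    """Warming per CO2 doubling implied by an attributed temperature
    change delta_t (deg C) over an observed CO2 rise (ppm), assuming
    warming scales with the logarithm of concentration."""
    return delta_t * math.log(2.0) / math.log(co2_end / co2_start)

def equilibrium_sensitivity(tcs, transient_to_equilibrium_ratio):
    """Scale a transient estimate up to equilibrium using a
    model-derived transient:equilibrium ratio (a value less than 1)."""
    return tcs / transient_to_equilibrium_ratio

# Illustrative placeholder values only, not the paper's inputs:
tcs = transient_sensitivity(delta_t=0.8, co2_start=310.0, co2_end=395.0)
ecs = equilibrium_sensitivity(tcs, transient_to_equilibrium_ratio=0.8)
print(round(tcs, 2), round(ecs, 2))
```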

Compare Loehle’s estimate to the IPCC’s latest assessment of the earth’s equilibrium climate sensitivity, which assigns a 66 percent or greater likelihood that it lies somewhere in the range from 1.5°C to 4.5°C. Loehle’s determination is more precise and decidedly toward the low end of that range.

The second entry on our list of low climate sensitivity estimates comes from Roy Spencer and William Braswell, published in the Asia-Pacific Journal of Atmospheric Sciences. Spencer and Braswell used a very simple climate model to simulate the global temperature variations averaged over the top 2,000 meters of the global ocean during the period 1955-2011. They first ran the simulation using only volcanic and anthropogenic influences on the climate. They ran the simulation again, adding a simple take on the natural variability contributed by the El Niño/La Niña process. And they ran the simulation a final time, adding a more complex configuration involving a feedback from El Niño/La Niña onto natural cloud characteristics. They then compared their model results with the set of real-world observations.

What they found was that the complex configuration involving El Niño/La Niña feedbacks onto cloud properties produced the best match to the observations. And this configuration also produced the lowest estimate of the earth’s climate sensitivity to carbon dioxide emissions—a value of 1.3°C.

Spencer and Braswell freely admit that using their simple model is just the first step in a complicated diagnosis, but they also point out that results from simple models provide insight that should help guide the development of more complex models, and ultimately could help unravel some of the mystery as to why full climate models produce high estimates of the earth’s equilibrium climate sensitivity while estimates based on real-world observations are much lower.

Our figure below helps to illustrate the discrepancy between climate-model estimates and real-world estimates of the earth’s equilibrium climate sensitivity. It shows Loehle’s determination as well as that of Spencer and Braswell, along with 16 other estimates reported in the scientific literature beginning in 2011. Also included are the IPCC’s latest assessment of the literature and the equilibrium climate sensitivity characteristics of the collection of climate models on which the IPCC bases its impact assessments.



Figure 1. Climate sensitivity estimates from new research beginning in 2011 (colored), compared with the assessed range given in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the collection of climate models used in the IPCC AR5. The “likely” (greater than a 66% likelihood of occurrence) range in the IPCC assessment is indicated by the gray bar. The arrows indicate the 5 to 95 percent confidence bounds for each estimate along with the best estimate (median of each probability density function, or the mean of multiple estimates; colored vertical line). Ring et al. (2012) present four estimates of the climate sensitivity, and the red box encompasses those estimates. The right-hand side of the IPCC AR5 range is actually the 90% upper bound (the IPCC does not state the value for the upper 95 percent confidence bound of its estimate). Spencer and Braswell (2013) produce a single ECS value best-matched to ocean heat content observations and internal radiative forcing.

Quite obviously, the IPCC is rapidly losing its credibility.

As a result, the Obama administration would do better to come to grips with this fact and stop deferring to IPCC findings when trying to justify increasingly burdensome federal regulation of carbon dioxide emissions, which has the combined effect of manipulating markets and restricting energy choices.

References:

Loehle, C., 2014. A minimal model for estimating climate sensitivity. Ecological Modelling, 276, 80-84.

Spencer, R.W., and W. D. Braswell, 2013. The role of ENSO in global ocean temperature changes during 1955-2011 simulated with a 1D climate model. Asia-Pacific Journal of Atmospheric Sciences, doi:10.1007/s13143-014-0011-z.

February 27, 2014 4:15PM

Drug Warrior Stumbles During Marijuana Legalization Hearing

From the Washington Post:

Annapolis Police Chief Michael A. Pristoop thought he came prepared when he testified before a Maryland State Senate panel on Tuesday about the perils of legalizing marijuana.

In researching his testimony against two bills before the Judicial Proceedings Committee, Pristoop said, he had found a news article to illustrate the risks of legalization: 37 people in Colorado, he said, had died of marijuana overdoses on the very day that the state legalized pot....

Trouble is, the facts were about as close to the truth as oregano is to pot. After a quick Google search on his laptop, [State Senator Jamin] Raskin—the sponsor of the legalization bill that was the subject of the Senate hearing—advised the chief that the Colorado overdose story, despite its deadpan delivery, had been made up for laughs by The Daily Currant, an online comedy magazine.

Ouch! For more on the momentum of marijuana law reform, check out today's New York Times.