Mercatus Site Feed en Overwhelming Evidence That ACA Caused Premiums to Increase Substantially <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Two scholars at the renowned Brookings Institution, Loren Adler and Paul Ginsburg, have published an <a href=""><span class="s2">analysis</span></a> finding that “average premiums in the individual market actually dropped significantly upon implementation of the ACA [Affordable Care Act].” This contrasts with a plethora of <a href=""><span class="s2">evidence</span></a>, including a rigorous 2014 Brookings <a href=""><span class="s2">study</span></a>, showing that the ACA significantly increased premiums. In this post, I discuss methodological concerns with the Adler and Ginsburg approach as well as evidence that leads most scholars to reach a very different conclusion.</span></p> <p class="p1"><span class="s1">While I will discuss the relevant evidence of the ACA’s effect on premiums in depth, there are three data points worth emphasizing. First, unlike Adler and Ginsburg’s approach, the 2014 Brookings <a href=""><span class="s2">study</span></a>&nbsp;used actual data and found that “enrollment-weighted premiums in the individual health insurance market increased by 24.4 percent beyond what they would have had they simply followed…trends.” Second, S&amp;P Global Institute <a href=""><span class="s2">found</span></a> that average individual market medical costs increased substantially between 2013 and 2015, up an estimated 69%. Third, 2014 insurer data shows that premiums for individual market Qualified Health Plans (QHPs), ACA-compliant plans certified to be sold on exchanges, were much higher than premiums for individual market non-QHPs, mostly plans in existence before 2014 that did not comply with the ACA.
Relative to non-QHPs, insurers collected more than $1,000 per enrollee in higher premiums and more than $2,300 in higher premium revenue per enrollee in 2014 after accounting for large premium subsidy programs for their QHPs.</span></p> <p class="p1"><span class="s1">Adler and Ginsburg do not discuss this previous research, and their analysis also fails to account for other crucial factors. They do not account appropriately for substantial subsidies insurers received to discount individual market ACA plan premiums. They also do not consider the trend in medical claims costs, which is presumably a better measure of the ACA’s effect on the individual market thus far given both the large subsidies and large losses insurers have incurred selling ACA plans. Instead, most of their analysis relies upon crude back-of-the-envelope estimates for the average individual market premium in 2009 as well as for an annual growth rate to inflate their 2009 premium estimate.</span></p> <p class="p1"><span class="s1">Finally, it is worth noting that if Adler and Ginsburg were correct in their analysis, we would expect to see very different results than have occurred. If, as they claim, the ACA was delivering better coverage at lower cost, the&nbsp;exchanges would have attracted a wider cross-section of enrollees and more insurers would be looking to enter these markets.
Instead, we see adverse selection in the individual market, with <a href=""><span class="s2">spiraling premiums</span></a>, <a href=""><span class="s2">sizeable insurer exits</span></a>, and enrollees <a href=""><span class="s2">generally attracted</span></a> to ACA plans only if they are either highly subsidized or relatively old or unhealthy.</span><span style="font-size: 12px; background-color: white;">&nbsp;</span></p> <p class="p1"><span class="s1"><b>Ample Evidence Shows Average Premiums and Spending Exploded After ACA Implementation in 2014</b></span></p> <p class="p1"><span class="s1">While it is important to look at data for several years after 2013 to assess the impact of the ACA, comparing individual market premiums in 2013 with those in 2014—the year its key changes took effect—provides an approximation of the initial change. The Manhattan Institute compared the average of the five least expensive pre-ACA plans in 2013 with the least expensive plans available on exchanges in 2014. Manhattan’s researchers adjusted the pre-ACA plan premiums upward to account for the population facing surcharges or denied coverage because of a pre-existing condition. Manhattan <a href=""><span class="s2">estimated</span></a> that the average state individual market premium increased 41% between 2013 and 2014.
A county-level <a href=""><span class="s2">analysis</span></a> suggested that premiums increased by 49%.</span></p> <p class="p1"><span class="s1">The 2014&nbsp;Brookings <a href=""><span class="s2">study</span></a> on this same subject by Amanda Kowalski—which Adler and Ginsburg do not address—used actual pre-ACA individual market premium data, finding that “[a]cross all states, from before the reform to the first half of 2014, enrollment-weighted premiums in the individual health insurance market increased by 24.4 percent beyond what they would have had they simply followed state-level seasonally adjusted trends.” According to Kowalski, the Manhattan Institute estimates are likely higher because they were not enrollment-weighted, and individuals in areas with high premiums likely selected cheaper plans.</span></p> <p class="p1"><span class="s1">Economists at the University of Pennsylvania, also using actual pre-ACA individual market data, <a href=""><span class="s2">estimated</span></a> that the total expected price of individual market coverage (premiums plus out-of-pocket payments) increased by 14% to 28% as a result of the ACA. According to their findings, “the pre-ACA average premium was lower than the lowest silver plan premium.” Penn’s economists also <a href=""><span class="s2">estimated</span></a> that plans in the individual market before the ACA had similar actuarial value to silver plans, not an average actuarial value that was 10 percentage points less (and 17% less) than assumed by Adler and Ginsburg. Importantly, all of these studies compare gross premiums before and after the ACA without accounting for subsidies that lowered ACA plan premiums.</span></p> <p class="p1"><span class="s1">Perhaps the most striking visual data suggesting that Adler and Ginsburg’s conclusions are wrong is an S&amp;P Global Institute <a href=""><span class="s2">analysis</span></a> of individual market per member per month (PMPM) costs from May.
The figure below from the S&amp;P report shows trends in PMPM costs for the individual and employer-provided markets. The ACA is responsible for the huge spike in individual market PMPM costs clearly visible in the figure; by early 2015, individual market PMPM costs exceeded those in the employer market.</span></p> <p class="p1"><a href=""><img height="431" width="575" src="" /></a></p> <p class="p1"><span class="s1">The data show a huge increase in PMPM costs in the individual market between 2013 and 2015. According to S&amp;P, PMPM costs increased 38% between 2013 and 2014, and another 23% between 2014 and 2015. The two-year increase (69%) comes from compounding the two single-year increases (1.38 × 1.23 ≈ 1.70).</span></p> <p class="p1"><a href=""><img height="489" width="575" src="" /></a></p> <p class="p1"><span class="s1">The comparable PMPM cost increase in the employer market, which the ACA affected much less, amounted to about 11%. Assuming an 11% increase would have happened in the individual market absent the ACA, a very rough initial guess would be that the ACA increased individual market PMPM costs by about 58% between 2013 and 2015. As previously noted, premiums did not increase by this full 58% because of insurers’ large losses and the large subsidies they received.</span></p> <p class="p1"><span class="s1">It is worth noting that the individual market includes both ACA-compliant plans and non-ACA-compliant plans. If only ACA-compliant plans were included in the post-2013 data, the spike would likely be much larger.</span></p> <p class="p1"><span class="s1"><b>Brookings Comparison Fails to Account for Important Subsidy Program that Reduced ACA Premiums</b></span></p> <p class="p1"><span class="s1">In their analysis, Adler and Ginsburg fail to account for the ACA’s reinsurance program—a program that allowed insurers to reduce premiums since it compensated them&nbsp;for a large share of the cost of their most expensive enrollees.
In an April Mercatus <a href=""><span class="s2">working paper</span></a>, my coauthors and I computed that net reinsurance payments paid by the government to insurers selling individual market QHPs totaled about 20.4% of gross premiums. Therefore, regardless of the other problems with Adler and Ginsburg’s methodology, just accounting for reinsurance payments negates their central finding that “2014 premiums in the ACA market were 10-21 percent lower than 2013 individual market premiums.”</span></p> <p class="p1"><span class="s1">In addition to the need to account for government subsidy programs, it is also important to consider that insurers incurred substantial losses in both 2014 and 2015, which I have <a href=""><span class="s2">estimated</span></a>, after accounting for reinsurance, averaged about $400 per enrollee in 2014 and about $1,000 per enrollee in 2015. The fact that ACA plan premiums have been much too low to cover insurer expenses is another reason that it is problematic to compare face value premiums of ACA plans with pre-ACA individual market plans.<b>&nbsp;</b></span></p> <p class="p1"><span class="s1"><b>2014&nbsp;Insurer Data Shows QHPs&nbsp;Much More Expensive than Other Individual Market Plans</b></span></p> <p class="p1"><span class="s1">Individual market QHP premiums were much higher than individual market non-QHP premiums in 2014, a data point that is further evidence that the ACA increased premiums. As noted earlier, QHPs are plans that satisfy all the ACA requirements and are certified to be sold on exchanges. Non-QHPs include ACA-compliant plans but mostly consist of plans in effect prior to 2014 that did not conform to the ACA’s rules and regulations.
The following table contrasts premium income for individual market QHPs and non-QHPs.</span></p> <p class="p1"><a href=""><img height="208" width="575" src="" /></a></p> <p class="p1"><span class="s1">The table shows that even when excluding reinsurance payments and cost sharing reduction (CSR) payments, average QHP premiums exceeded average non-QHP premiums by more than $1,000 in 2014. CSR payments essentially represent additional government premium payments for insurers to reduce deductibles and other copayment amounts for certain exchange enrollees with lower incomes. When net reinsurance and CSR payments are included, average QHP premium revenue exceeded average non-QHP premium revenue by more than $2,300, or about 76%.</span></p> <p class="p1"><span class="s1">It is necessary to include CSR payments in comparing total effective premiums between pre-ACA individual market plans and ACA plans, although Adler and Ginsburg’s decision not to include CSR payments is understandable given that they are mainly looking at the price of the second lowest cost silver (SLS) plan.</span></p> <p class="p1"><span class="s1"><b>Brookings Study Compares Lower-Cost ACA Plans to Inflated Estimates of Average Pre-ACA Costs&nbsp;</b></span></p> <p class="p1"><span class="s1">In their analysis, Adler and Ginsburg compare fundamentally different measures of premiums by contrasting the cost of the SLS plan available in ACA exchanges to their estimated measure of the average individual market premium that would have existed without the ACA. Importantly, the SLS plan is a lower-than-average-priced exchange plan.
As shown in the table, the SLS plan premium they use ($3,800) is about&nbsp;10% less than the average 2014 individual market QHP gross premium and 43% lower than the premium income received by insurers when accounting for reinsurance and CSR payments.</span></p> <p class="p1"><span class="s1"><b>Brookings Failed to Properly Account for CBO’s Premium Projections</b></span></p> <p class="p1"><span class="s1">Adler and Ginsburg support their claim that the ACA led to lower premiums by pointing to the fact that the 2014 SLS plan premium was 15% below CBO’s 2009 <a href=""><span class="s2">estimate</span></a> of what it would be in 2014. This comparison is problematic and misleading because CBO’s 2009 estimate involved significant and generally unforeseeable errors in key underlying assumptions.</span></p> <p class="p1"><span class="s1">First, starting in the mid-2000s, health care inflation slowed, and the extent of this slowdown, largely the result of the recession and weak economic recovery, was not anticipated by CBO in 2009. Adler and Ginsburg note that the 2014 per enrollee cost of workplace coverage, which was much less affected by the ACA, was 12% lower than CBO expected in 2009. Second, CBO also erred in <a href=""><span class="s2">projecting</span></a> that the reinsurance program would only reduce premiums by about 10% in 2014; in fact, the per enrollee effect of the reinsurance program <a href=""><span class="s2">was double</span></a> what CBO expected. Third, as late as February 2014, CBO <a href=""><span class="s2">projected</span></a> that insurers would be profitable in the aggregate selling ACA plans—meaning they thought premiums would adequately cover expenses. Finally, CBO <a href=""><span class="s2">did not anticipate</span></a> how narrow ACA plan provider networks would be, and provider network size is positively related to average premiums, all else equal.
It’s uncertain what CBO assumed about ACA plan deductibles, but they are <a href=""><span class="s2">higher than many ACA advocates expected</span></a>, and deductible size is inversely related to premiums, all else equal.</span></p> <p class="p1"><span class="s1">Without adjusting for all of CBO’s inaccurate assumptions—and I have listed at least four major ones—it is misleading to compare CBO’s 2009 projections for SLS premiums with actual SLS premiums. Properly adjusting for all the above-mentioned factors would plausibly result in CBO’s 2009 estimate of future SLS effective premiums being too low rather than too high.</span></p> <p class="p1"><span class="s1"><b>Conclusion</b></span></p> <p class="p1"><span class="s1">Adler and Ginsburg argue that the ACA both improved coverage and lowered costs. If true, the exchanges would undoubtedly be more successful than they are. Thus far, the result is <a href=""><span class="s2">significantly worse than expected</span></a>, with far fewer enrollees—particularly younger and healthier enrollees—than projected, and substantial market instability. Another Brookings Institution scholar, Stuart Butler, recognizes the problem, <a href=""><span class="s2">writing</span></a> just two weeks ago that “the ACA might be more appropriately labeled the ‘Medicaid Expansion Act’” because “enrollment in the ACA exchanges has been disappointing, with an estimated 10 million fewer people enrolled compared with earlier expectations.”</span></p> <p class="p1"><span class="s1">The new Brookings study does not mention the numerous studies, including the rigorous 2014 Brookings study, that come to opposite conclusions, does not use actual pre-ACA individual market data, does not consider the huge increase in the cost of medical claims in the individual market after 2013, and makes a number of questionable methodological choices.
The authors arrive at a different conclusion than most scholars who have examined the effect of the ACA on health insurance premiums. Most scholars and analysts conclude that, particularly when fully accounting for the various government subsidies for individual insurance coverage, the ACA significantly increased individual market premiums.</span></p> Thu, 28 Jul 2016 14:33:36 -0400 The Mythology of Minimum Wages <h5> Expert Commentary </h5> <p class="p1">The theory of minimum wages says that mandating an increase in the wage paid to workers will increase their total income. But does this theory hold true when subjected to scrutiny?&nbsp;</p> <p class="p1">Election campaigns invariably raise the debate over minimum wages, and this presidential campaign is no different, except perhaps that both campaigns favor increasing the minimum wage and appear to be in a bidding war to see who can offer the most. Politically, offering increases to the federal minimum wage has a lot of attraction; it gives the impression of helping low-income people. However, research on the effect of raising minimum wages tells a different story.&nbsp;</p> <p class="p1">In reality, wage rates are set by internationally competitive forces. Increases in the cost of producing goods and services without accompanying improvements in productivity lead to a loss of competitive edge for the employer, who is now forced to pay a non-competitive wage rate in a competitive world.&nbsp;</p> <p class="p1">In the current economy, that frequently means investing in technology that automates these jobs, resulting in the jobs being replaced by capital invested in machines. Alternatively, the employer cancels overtime or reduces employees’ work hours, resulting in a loss of total income for the employee.
In the worst-case scenario, the employer reduces production or goes out of business, destroying some or all of the jobs.</p> <p class="p1">In an economy where government mandates set wages, who wins and who loses? The employer’s first response to this increased cost is to increase prices for their goods and services; this means that the consumers of those goods and services pay higher prices. Given that the goods and services produced by minimum-wage workers are predominantly consumed by middle- and low-income people, the burden of this new cost falls on them, hardly the outcome policymakers intended. Alternatively, the local goods and services are replaced by goods and services from other places that do not have to pay these higher wages, causing the local jobs to be lost.</p> <p class="p1">However, there is another group of people who are adversely affected by these policies as well: the unemployed who are looking for work and the underemployed who are seeking full-time employment. A higher cost of employment immediately dims the chances of these individuals achieving their ambition of becoming independent, self-supporting workers because it’s illegal to work for less than the mandated wage.</p> <p class="p1">So who are the winners under this policy? They are foreign employers whose goods and services have just become more competitive and can now replace locally produced goods. A second winner is the producer of capital-intensive automating systems that replace workers entirely.</p> <p class="p1">The labor market is just that—a market—where we as individuals trade our skills and time for remuneration. As a market, it is a sensitive, self-balancing mechanism of incredible complexity, responding to millions of actions taken around the world by millions of participants.
Artificially interfering in the sensitive balance of this mechanism will create undesirable outcomes.</p> Thu, 28 Jul 2016 14:22:24 -0400 US Antipoverty Policy and Reform <h5> Publication </h5> <p class="p1">After more than half a century, America’s welfare system can boast few clear successes. Despite decades of proliferating programs, expanding goals, and ballooning budgets, the official poverty rate in the United States has stubbornly refused to break from its narrow historical range.</p> <p class="p1">A new study published by the Mercatus Center at George Mason University provides a history of the welfare system, discusses various reform efforts, and proposes implementing a system of block grants to states to finance income support programs.</p> <p class="p1"><b style="font-family: inherit; font-style: inherit;">KEY POINTS</b></p> <ul class="ul1"> <li class="li4">True measures of poverty have declined considerably over the past five decades. This trend has been driven largely by economic growth, not by government welfare programs.</li> <li class="li4">The current welfare system is needlessly complex and paternalistic, it misaligns incentives for both administrators and participants, and it requires policymakers to make plans based on knowledge that is by nature inaccessible to them.</li> <li class="li4">Several “basic income” and “negative income tax” reforms have been proposed over the past five decades, many of which have been field tested and show varying degrees of success.</li> <li class="li4">Policymakers should learn from past welfare reform successes and replace the current patchwork of programs with block grants to states, leveraging the superior knowledge and incentives of local governments to provide a safety net that is appropriate for the needs of the individuals it serves.</li> </ul> <p class="p2"><span style="font-size: 12px; background-color: white;"><b>THE EVOLUTION OF THE CURRENT SYSTEM</b></span></p> <p class="p1">The federal government first 
assumed dominance in welfare spending during the Great Depression with the advent of the Aid to Dependent Children program (later renamed Aid to Families with Dependent Children, or AFDC), which largely supported single mothers who were unable to provide for their children. Spending on welfare accelerated during the 1960s with President Johnson’s Great Society reforms. Welfare reform finally came in 1996, replacing AFDC with the more work-oriented Temporary Assistance for Needy Families (TANF), which operates as block grants to states. The TANF reform’s effects on program outcomes were broadly positive. Still, the sprawling, piecemeal mess of other federal antipoverty programs went largely untouched, and it has grown significantly under the Obama administration.</p> <p class="p1">The reforms of the 1990s were a tenuous step in the right direction, but the road to a better welfare system is longer still. The problems with the current US system include the following:</p> <ul class="ul1"> <li class="li4"><span class="s4"><i>Complexity.</i> The system’s piecemeal nature and programmatic complexity impose heavy costs and reduce the utility and welfare of the poor.
Many low-income Americans only enroll in one or two of the programs for which they qualify, a problem attributable to welfare complexity.</span></li> <li class="li4"><i>Misaligned bureaucratic incentives.</i> Social planners sometimes craft programs to serve other interests before those of the poor, allowing the state to maintain power over poor recipients by placing strings on the monopoly of public assistance.</li> <li class="li4"><span class="s4"><i>Disincentives to work.</i> The structure of marginal tax rates and the number of state-subsidized alternatives to gainful employment generate considerable disincentives to work that leave beneficiaries uncomfortably dependent on public programs.</span></li> <li class="li4"><i>The knowledge problem.</i> Policymakers assume that they have the proper knowledge to effectively engineer social outcomes. Unfortunately, individuals’ varied needs and means cannot be beneficially distilled into the format that central planning requires.</li> </ul> <p class="p2"><span style="font-size: 12px; background-color: white;"><b>REFORMS</b></span></p> <p class="p5"><span class="s5"><b><i>Income Support Proposals</i></b></span></p> <p class="p1"><span class="s4">Potential reforms to welfare include income support proposals such as guaranteed minimum income and negative income taxes. These proposals are theoretically attractive but contain major flaws.</span></p> <ul class="ul1"> <li class="li4">Guaranteed minimum incomes are cash transfers that would replace the byzantine maze of welfare programs with a single cash transfer to members of qualifying groups.</li> <li class="li4">Negative income taxes aim to provide a basic income through the tax code by paying impoverished people who make less than the lowest tax bracket. 
The earned income tax credit acts as a negative income tax for working people.</li> </ul> <p class="p1">Such proposals could reaffirm the autonomy of the poor while reducing bureaucratic stress and expanding low-income families’ options. They would also better address the knowledge problem that jeopardizes contemporary welfare outcomes.</p> <p class="p1">However, experiments show that income support schemes often create a disincentive to work unless they are combined with explicit work requirements. Furthermore, the purported cost savings that such reforms could yield are dubious, and these programs are vulnerable to political manipulation.</p> <p class="p5"><span class="s5"><b><i>Block Grants</i></b></span></p> <p class="p1">Rather than burdening the federal government with the responsibility to properly target spending, block grants could be issued directly to states, enabling them to tackle poverty themselves.</p> <ul class="ul1"> <li class="li4">Block granting allows the federal government to entrust a lump-sum welfare payment to each state to allocate according to certain conditions.</li> <li class="li4">Such a reform would combine the deep pockets of the federal government with the increased discretion and better knowledge of more localized government bodies.</li> <li class="li4">TANF is a federal experiment in block granting that has been a remarkable success, meeting and exceeding the expectations of supporters and detractors alike.</li> </ul> <p class="p1">States can use block grants to pursue empirically proven strategies that meet the specific needs that come with the varying contexts of poverty.</p> <ul class="ul1"> <li class="li4">Low-income Americans who are incapable of providing for themselves could receive a single payment from their state government to fund living expenses, an improvement over the myriad confusing programs these individuals must navigate today.</li> <li class="li4">Low-income Americans who are simply experiencing temporary 
hardship could receive state transfers conditioned on work requirements structured to minimize or eliminate the disincentives to work.</li> </ul> Thu, 28 Jul 2016 14:14:05 -0400 Can You Have Your Subsidized Peanut Butter Cake and Eat It, Too? <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The federal government is packed full of crony programs, such as the Export-Import Bank and the ethanol mandate. When it comes to the unhealthy marriage between government and the private sector, however, the U.S. Department of Agriculture may take the cake.</span></p> <p class="p1"><span class="s1">With the exception of food stamps, which should have nothing to do with the farm bill, every program in the agency is meant to subsidize or boost the profits of farmers. We have such programs as the Dairy Margin Protection Program and the Dairy Market Stabilization Program. The former effectively guarantees profits for dairy farmers, and the latter is a complicated program meant to drive up milk prices to benefit small-scale dairy farmers. Then there are sugar tariffs, which are meant to artificially boost the profits of a few companies by keeping the price of sugar high in the United States at the expense of consumers and taxpayers.</span></p> <p class="p1"><span class="s1">In the same vein, we also have the peanut programs. The USDA recently announced that U.S. peanut farmers will produce 6.1 billion pounds this fall, on top of 2.9 billion pounds in leftover stockpiles. Total peanut demand isn't that high, and we will start the next year with a 3.2 billion-pound stockpile. And unless you like your peanut butter and jelly sandwich with a side of government subsidies, you should care.</span></p> <p class="p1"><span class="s1">An excessive supply of peanuts depresses the price of peanuts. Unfortunately for us, Uncle Sam won't let the market work its magic to eliminate excess supply.
Instead, it subsidizes the farmers whose income might have otherwise suffered from the reduction in price by paying them most of the difference between a reference price of $535 per ton and the market price. Obviously, the lower the price, the higher the payout. That's a terrible incentive structure, if you ask me.</span></p> <p class="p1"><span class="s1">In addition, the government takes on the extra peanuts and stockpiles them at a cost. Finally, the loans that peanut farmers may have taken based on the value of their crops, which they may not be able to repay, are guaranteed by your hard-earned tax dollars.</span></p> <p class="p1"><span class="s1">Let me sum this up for you: The government gives peanut farmers an incentive to take on loans that they wouldn't have to pay back if the price of peanuts were to fall under an arbitrary price set by politicians and bureaucrats in Washington. The result of this mad-scientist peanut plan is something like this: Overproduce peanuts and depress peanut prices; don't pay your loans; and collect taxpayers' money. According to the Congressional Budget Office, this year's overflow of peanuts will cost about $2 billion in subsidy payments to farmers by 2018.</span></p> <p class="p1"><span class="s1">But misery, as we know, loves company. In a May article for The Wall Street Journal, James Bovard documents how the U.S. government uses its stockpiles of peanuts to decimate peanut farmers' earnings in poor countries by dumping free peanuts under the cover of foreign aid. In particular, he writes about the devastating impact of a plan by the Obama administration to unload a million pounds of its excess peanuts in Haiti through its food donation program.</span></p> <p class="p1"><span class="s1">The country has its own peanut farmers — about 150,000 — and they provide the foundation for the livelihood of about a half-million Haitians.
Imagine what happens to these people when the U.S. government floods the market with free peanuts. Who will buy relatively expensive peanuts if they can get subsidized American ones for free? This has been going on for so long that farmers have long known not to even bother bringing their crops to market when the American government is feeling "generous."</span></p> <p class="p1"><span class="s1">But the USDA's bureaucrats don't want to hear it. They claim that American "peanuts won't hurt Haitian farmers because they will be packaged in one-ounce bags that 'are to be consumed at school only,'" Bovard reports. Never mind that after the massive 2010 earthquake in Haiti, the then-president of Haiti begged the United States to "stop sending food aid" so that Haiti's economy could "recover and create jobs."</span></p> <p class="p1"><span class="s1">Make no mistake: all of the wasted money and the devastation created in Haiti only serve American farmers. And it all happens at the expense of taxpayers, consumers and competitors around the world.</span></p> Wed, 27 Jul 2016 16:13:32 -0400 Get Bureaucrats out of Medical Decisions <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The stories are heartbreaking and horrifying. People who are dying and just want to live go before the Food and Drug Administration in hopes that the FDA will listen to them and approve a drug that may give them a chance to live.</span></p><p class="p1"><span class="s1">But as a former FDA medical officer <a href=""><span class="s2">recently revealed</span></a>, when the FDA holds a public speaking session where these desperate people come to make their case at advisory committee meetings, “It is all for show.
I can recall my FDA supervisors and colleagues checking their emails, doodling, texting, and the like to avoid listening to individuals who would often travel great distances to pour out their hearts to the advisory committee and FDA reviewers, literally begging them to approve a new drug.”</span></p><p class="p1"><span class="s1">Yet Right to Try legislation, which allows terminally ill patients to access drugs that have only passed the safety tests required by the FDA, is now the law in 28 states (and has been introduced in 15 more). As the former FDA official said, “It makes sense for people who don’t understand the complexities of the drug development process.”</span></p><p class="p1"><span class="s1">By “making sense,” of course, he was referring to the pleading for the chance to save a life. The real question is: Why is the federal bureaucracy entitled to make a life or death decision about an individual?</span></p><p class="p1"><span class="s1">Even if your state — or a state you travel to — has a Right to Try law, the odds are against a company giving you a drug that’s still undergoing trials, which average 12 years in length. In the current system, drug companies can only lose by giving patients, particularly dying patients, a drug before it is approved by the FDA.</span></p><p class="p1"><span class="s1">If the patient dies, even if they were already very sick, the company <a href=""><span class="s2">must report it to the FDA</span></a> as an “adverse event.” The agency can then suspend the clinical trials for the drug, and the company loses, even though it has already invested a considerable amount of money in testing the product, possibly millions of dollars. If the patient recovers, it is not counted as a success by the FDA.
Investors may also walk away from the drug, and the company receives no compensation for creating a small batch.</span></p><p class="p1"><span class="s1">Drug companies don’t keep statistics on the number of people they turn down, but of the potentially millions of possible applicants, only about 1,200 applications (those accepted by companies) are actually sent to the FDA each year. Doctors, who must fill out the FDA compassionate use form, may not find it worthwhile, even though the FDA has now thoughtfully reduced the work required to fill out the forms.</span></p><p class="p1"><span class="s1">After a very small study of a drug to treat victims of Duchenne’s disease, a fatal illness that affects about <a href=""><span class="s2">9,000 to 12,000 Americans</span></a> each year, mostly boys, an FDA advisory panel voted the drug down because the study did not meet the “gold standard” of a double-blind clinical trial. Because the disease affects relatively few people, only 12 boys were treated, with 13 in the control group. Of the 12 boys, 10 regained the ability to walk.</span></p><p class="p1"><span class="s1">Hundreds of patients and family members showed up at the hearing with their doctors, begging for approval. 
The director of the FDA’s division of neurology products <a href=""><span class="s2">informed them</span></a>, “Anecdote and emotion do not change the data with which we are confronted, no matter the attendance.” It’s odd that the FDA chose to call this “compassionate use.”</span></p><p class="p1"><span style="font-size: 12px; background-color: white;">And yet, the FDA says in its own </span><a href="" style="font-size: 12px; background-color: white;"><span class="s2">guidance</span></a><span style="font-size: 12px; background-color: white;"> that the “FDA recognizes that patients have a unique and valuable perspective on these considerations and believes that drug development and FDA’s review process could benefit from a more systematic and expansive approach to obtaining the patient perspective.”</span></p><p class="p2"><span style="font-size: 12px; background-color: white;">A bill has been introduced in the House of Representatives to make Right to Try a </span><a style="font-size: 12px; background-color: white;" href=""><span class="s2">federal law</span></a><span style="font-size: 12px; background-color: white;">. As long as the FDA is able to stop clinical trials after the death of a “Right to Try” patient, however, it’s unlikely that a federal law is going to make much difference.</span></p><p class="p1"><span class="s1">The real question bears repeating: Why is any federal bureaucrat allowed to come between a doctor and her patient and prevent a chance at life? It’s time to try another way.</span></p> Wed, 27 Jul 2016 10:40:44 -0400 What Not to Do as Robots Take More Jobs <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Minimum wage hikes will likely be touted at the Democratic National Convention this week, but such policies can potentially divide society more than unify it. 
Raising the minimum wage is not the best way to fight income inequality because it will increase the rate of job automation that already disproportionately affects people with less education and lower incomes.</span></p> <p class="p1"><span class="s1">Leading computer scientists at the Future of Life Institute last year wrote an open letter warning policymakers of the risks posed by artificial intelligence. A primary concern is that many jobs will become automated, filled by machines. It's a problem if people do not adapt, but one piece of good news is that high-paying jobs in the tech sector are motivating millions of people to learn how to use computers.</span></p> <p class="p1"><span class="s1">Until more people leave low-wage jobs for high-wage jobs, we will have income inequality. Because of income inequality, jobs exist for people who are not ahead of the technological revolution, and higher wages in the tech sector are directing people toward the jobs that will be more profitable and secure in the future. A young person who is about to go to college can see that a computer programmer can make a lot of money. This motivates students to work through difficult technical courses. And you don't even have to go to college: many college-level courses are available online for free.</span></p><p class="p1"><span class="s1"><a href="">Continue reading</a></span></p> Tue, 26 Jul 2016 17:30:44 -0400 How the FAA Killed Supersonic Flight—and How It Can Revive It <h5> Expert Commentary </h5> <p><span style="font-size: 12px; background-color: white;">Flying is the worst. With each commercial flight, Americans get groped, jostled, cramped, and corralled like cattle. But the Transportation Security Administration (TSA) isn't the only government agency that needlessly adds to our jet-setting woes. 
If not for Federal Aviation Administration (FAA) meddling in supersonic flight innovation, we could zip around the world in a fraction of the time.</span></p><p>Five decades ago, the future of aerospace engineering was incredibly bright. As international rivals raced to put humans into space, scientists applied new technologies to improve the speed, comfort, and safety of terrestrial air travel. The U.S.-based McDonnell Douglas and Boeing spearheaded the era of commercial jet transportation with the introduction of the Douglas DC-8 and Boeing 707 in the late 1950s. Our European friends got into the aviation game in the 1970s with the Airbus A300, while Lockheed followed up with the L-1011 TriStar around the same time. With competition and research came innovation. By the mid-1970s, engineers were building "supersonic" aircraft capable of traveling well above 767 miles per hour (mph), the speed of sound. In one decade, the top airplane speed on record more than doubled, from 698.5 mph in 1952 to 1,665.9 mph in 1962. By 1976, U.S. military pilot Eldon W. Joerz managed to navigate the Lockheed SR-71 "Blackbird" to a blazing 2,193 mph, almost three times the speed of sound.</p><p>Commercial craft never traveled quite that fast. But supersonic civilian flight was indeed once a reality. From 1976 to 2003, passengers in a hurry could hop on a majestic Concorde supersonic airliner in London's Heathrow Airport and arrive in New York City in a little over three measly hours. Try the same on a boring old Boeing 747 and you're looking at a travel time of at least 7 or 8 hours.</p><p>Very few of us have enjoyed the thrill and convenience of a supersonic flight. This is partly due to economics: Both the Concorde (1,350 mph) and the Soviet Union's Tupolev Tu-144 (1,200 mph) faced early retirements for financial reasons.</p><p>But this does not explain why top airplane speeds have lagged behind for so many decades. 
Today, most airliners cruise at speeds well below the speed of sound, with a standard Boeing 747-B clocking in at a ho-hum 570 mph cruising speed. My Mercatus Center colleagues Eli Dourado and Michael Kotrous recently dug into airspeed data compiled by the Fédération Aéronautique Internationale (FAI), an international sports aviation measurement and standard-setting body. Their finding: Innovation in air travel speeds came to a grinding halt after Joerz's still-unbroken record-setting feat in 1976.&nbsp;</p><p>You can thank the FAA for this continued mediocrity in air travel. In 1973, amid ample developments in supersonic flight, the FAA bizarrely decided to prohibit supersonic travel (SST) over the U.S. Why? When an aircraft travels faster than the speed of sound, it generates shock waves that become compressed into one super-loud "sonic boom." The FAA and civilian activists alike were concerned about the potential damage that SST flights could do to the environment or to civil infrastructure.</p><p>Unfortunately, evidence-based policy-making did not guide the FAA's supersonic ban. Knee-jerk techno-skepticism did.</p><p>When the development of the Concorde was announced in 1962, a group of anti-SST scientists and concerned laypeople rallied to stop the march of progress in aviation. A Swedish aeronautics engineer named Bo Lundberg provided much of the academic antagonism, publishing articles through his aviation research institute suggesting that the public would reject the nuisance of SST sonic booms.</p><p>A British primary-school teacher and environmentalist named Richard Wiggs also mobilized the public in an anti-SST campaign. His Anti-Concorde Project took out full-page ads in papers of record, alarming readers that the "sonic bangs" would be "by far the loudest noise they have ever heard." 
His other main argument—that British and French taxpayers should not have been compelled to subsidize the development of the Concorde—may be more sympathetic to Reason readers. But unfortunately, the misleading and paranoiac public-outrage campaign disseminated by the Anti-Concorde Project and the allied Citizens League Against the Sonic Boom served as the catalyst for innovation-killing policies.</p><p>Obviously, very few of us would tolerate aviation technology that—elegant and rapid though it may be—ripped holes through homes and shattered windows with each passing flight. Supersonic flights that merely made annoying, but non-destructive, booms every now and then might generate some heated discussion, but that's something that more people could live with. What is imperative is that scientists and policymakers have enough time to sift through factual evidence before assessing what level of risk or downside the public is willing to tolerate.</p><p>To its credit, the FAA did team up with NASA and the U.S. Air Force to preemptively test and measure the outcomes of SST from 1958 through 1968. Contrary to the city-shattering rhetoric emanating from the anti-SST crowd, the sonic booms generated by SST over land in the U.S. were predictable and mild. In several of the tests, people down below did not notice that anything was amiss at all. In one test over St. Louis in 1962, only about 35 percent of the interviewees were even mildly annoyed by the sonic booms, and fewer than 10 percent thought about registering a formal complaint (only 1 percent bothered to actually do it).</p><p>In another series of tests over Oklahoma City in 1964, the scientists carefully selected a representative sample of buildings to observe the effects of regular SST flights on day-to-day life for half a year. A surprising 73 percent of the participants felt they could live with the everyday effects of SST travel without any issue. 
Around 40 percent of respondents said they believed SST caused structural damage to their home, but the scientists were unable to find physical evidence for this themselves. Perhaps the participants were simply imagining things, or perhaps the scientists were not being as judicious as they should have been in finding damage. More research was needed.</p><p>To better understand the specific structural effects of SST travel, scientists constructed an entire miniature town of fake "houses"—like those atomic testing sites so favored in Hollywood movies—at the White Sands Missile Range in New Mexico in 1964 and 1965 to get an inside peek into just how well those windows and tiles would hold up. Sixteen types of buildings were created, fitted out with standard construction materials of varying quality to get a better idea of how homes of all income brackets would fare. After almost 1,500 booms, the results were broadly encouraging: plasters of all kinds were just fine, glass only chipped when poorly mounted (and not by much), and stucco held up. And if you were worried about the chickens, fear not—sonic booms do not negatively affect egg hatchability.</p><p>Alas, paranoid conjectures about crumbling cities and bursting eardrums proved more salient to policymakers than measured data on plasterboard resiliency. The FAA ban on supersonic flight over the U.S. has persisted to this day, and with it, our continued misery in the skies.</p><p>But things could be changing. The Denver-based start-up Boom Technology is developing aircraft to travel at speeds up to 1,451 mph—twice the speed of sound. Aviation veteran Virgin is teaming up with Boom to revive the London-New York SST route blazed by Concorde so many decades ago. 
And speaking of Concorde, its passionate fans in Club Concorde just might pull off their crazy dream of crowdfunding enough money to put the beloved beauty back in the sky.</p><p>Japan and NASA are jumping into the heady world of SST travel as well.&nbsp;Yet all of this capital investment will be for nothing if policy remains the same. Allowing SST over U.S. territory would do a lot to improve the profitability odds for this historically challenging market. The FAA has an opportunity to open the gates to major innovation by making one small policy change: Remove the outright ban on supersonic travel. Instead, designate a maximum noise level that entrepreneurs must meet to be airworthy. That's it!</p><p>The agency has considered this policy before, but remained unfortunately stubborn in a 2008 statement: it will not issue a noise standard until the "noise impacts of supersonic flight are shown to be acceptable." But this Kafkaesque stance is virtually no different from an outright ban: entrepreneurs can't show which noise impacts of SST flight are "acceptable" without having a definition of "acceptable" to shoot for.</p><p>Fortunately, as technologists and even policymakers begin to see airspace as another platform for innovation, the appetite for "permissionless innovation" policy reforms is growing. The FAA could be a champion for the next great American industry as soon as it decides to change.&nbsp;</p> Tue, 26 Jul 2016 17:26:46 -0400 Seeing China through Its Economic History <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Is it possible to better understand China today by looking back to the country's economic history? 
I don't mean the years of communism under Chairman Mao, but rather earlier times, those which seem to many Western observers like a blurred sequence of one dynasty after another.</span></p> <p class="p1"><span class="s1">Enter Richard von Glahn's “The Economic History of China: From Antiquity to the Nineteenth Century,” a <a href=""><span class="s2">book</span></a> likely to go down as one of the year's best. Over the last 15 years, the economics profession has gone from a poor understanding of China's economic history to knowing quite a bit. Von Glahn's exhaustive but readable book is the best guide to this rapidly growing body of knowledge.</span></p> <p class="p1"><span class="s1">I took away several overall lessons, noting these are my extrapolations and not necessarily the opinions of von Glahn, a professor at the University of California at Los Angeles.</span></p> <p class="p2"><span class="s1"><a href=""><b>China's Economic Data</b><span class="s3"><b>&nbsp;</b></span></a></span></p> <p class="p1"><span class="s1">First, in thousands of years of Chinese history there isn't much of a trend toward democracy or representative government. In an age when Turkey and Russia have been rejecting open and transparent representation, it hardly seems obvious that China will move toward greater political freedom. When Chinese leaders tell their citizens that <a href=""><span class="s2">Brexit</span></a> and the Trump candidacy represent failures of democracy in action, a lot of Chinese citizens believe them.&nbsp;</span></p> <p class="p3"><span style="font-size: 12px; background-color: white;">Furthermore, a lot of autocratic Chinese regimes in history have proven stable even in periods of fairly slow economic growth. 
It can take them centuries to fall and be replaced, and even then a foreign invasion, like ones by the Mongols or Manchus, may be required.</span></p> <p class="p1"><span class="s1">From today’s media, one sometimes receives the impression that a Chinese growth rate below 4 or 6 percent could mean radical instability and a rapid fall of the government, but Chinese history does not show this pattern. That is hardly proof of how things will run in the future, but it should shift our expectations in the direction of greater Chinese political stability.</span></p> <p class="p1"><span class="s1">It is striking how many contemporary Chinese economic policy ideas have parallels in earlier times. Going through this history, von Glahn explains the importance of state-owned enterprises, the use of fiscal policy to keep people working, commodity monopolies (then tea, now cigarettes) and population registration across many centuries. I take those continuities as signs that China today is embodying what the country was for a long time, rather than inhabiting a transitional state before morphing into something different.</span></p> <p class="p1"><span class="s1">The book also explains how China adopted an earlier series of modernizing, market-oriented reforms during what is called the Tang-Song transition of 755-1127. The most striking feature of this relatively successful time is that it lasted for almost 400 years, in spite of periodic territorial losses to outside conquerors. The extreme instability of the 19th and much of the 20th century in China is the historical outlier, not the norm, and so China today may have fallen back into one of its relatively stable episodes.</span></p> <p class="p1"><span class="s1">If there is a single common theme running through the many centuries covered by this book, it is the never-fully-successful quest of the Chinese state for revenue and fiscal stability. 
One reason China fell behind Western Europe in the 18th century is simply that the Chinese state spent less on creating valuable public goods and infrastructure.</span></p> <p class="p1"><span class="s1">In 1993, 15 years after it began making market-oriented reforms, the Chinese central government’s direct revenue was only 3 percent<sup>1</sup>&nbsp;of gross domestic product, with the usual caveat that no Chinese numbers should be taken as exact measures. Only in the last 10 years has that <a href=""><span class="s2">revenue share</span></a> exceeded 10 percent of GDP; by comparison, in the U.S. in normal times that number sits in the range of <a href=""><span class="s2">17 to 18 percent</span></a>. For all the images Americans might have of China’s government as a communist behemoth, the country’s political order is better understood as still somewhat immature.</span></p> <p class="p4"><span style="font-size: 12px; background-color: white;">State-owned enterprises, local governments and direct Communist Party control all filled the gap to boost the power of the rulers, and that helps explain why the Chinese find it hard to fully modernize their economy. Their government has had too little facility in grabbing flows of revenue and thus it has overspecialized in taking, owning and controlling assets. Given that choice, the central government is reluctant to reform, shut down or unload its state-owned enterprises, if only for fear that too much bankruptcy and unemployment would result, not to mention a broader loss of control. 
Furthermore, local Chinese governments often still do not have enough revenue, and so they rely too heavily on sales or rental income from land to keep things up and running, a revenue model that cannot last forever.</span></p> <p class="p6"><span class="s1">In the view of this reader, China’s most likely near-term economic future is that of a relatively stable political regime whose artificial schemes for raising revenue and staying in power keep the economy distorted. From this perspective, China’s recent move toward a more comprehensive value-added tax was bigger news than many observers realized, and mostly positive, though the country has not yet achieved fiscal maturity.</span></p> <p class="p6"><span class="s1">Like the country’s ultimate resiliency, China's most serious economic problems may prove to be some of its most longstanding features. “The Economic History of China” may be an academic tome, but it is also an acute lens on the Middle Kingdom that you won't find in your daily news feed.</span></p> <p class="p7"><span class="s1"><i>1. See page 440 of “China’s Fiscal System: A Work in Progress” by Christine P.W. Wong and Richard M. 
Bird, included in “China’s Great Transformation” edited by Loren Brandt and Thomas Rawski.</i></span></p> Tue, 26 Jul 2016 17:19:59 -0400 'Pokemon Go' Represents the Best of Capitalism <h5> Expert Commentary </h5> <p class="p1"><span class="s1">A recent article by Timothy Lee, “<a href=""><span class="s2">Pokémon Go is everything that is wrong with late capitalism,”</span></a> has caused quite a stir, since it was fairly critical of the “Pokémon Go economy.” Given the popularity of the game and our concern that some players would be alarmed that their lighthearted entertainment was somehow destroying the economy, we wanted to offer a different perspective on&nbsp;some of the points made in the article.</span></p> <p class="p1"><span class="s1">In fact, we think “Pokémon Go” actually represents the&nbsp;<i>best</i>&nbsp;of capitalism. In less than a week, the game topped&nbsp;<a href=""><span class="s2"><i>15 million</i>&nbsp;downloads</span></a>&nbsp;and the&nbsp;<a href=""><span class="s2">21 million active daily users</span></a>&nbsp;spend an average of&nbsp;<a href=""><span class="s2">33 minutes a day playing</span></a>. That amounts to&nbsp;<i>more than 11.5 million hours</i>&nbsp;of playing per day, and those numbers only look to increase. The app doesn’t cost anything to download and play, which means Nintendo and Niantic (the game developer) are essentially giving away tens of millions of dollars of value to the eager players.</span></p> <p class="p1"><span class="s1">We know that’s a bold statement. But this is why it’s true: A person’s time is scarce and valuable. Every moment he&nbsp;spends playing “Pokémon Go” is a moment he&nbsp;could spend doing something else. 
The fact that people are&nbsp;voluntarily choosing to play means the benefit of playing is more than the cost.</span></p> <p class="p3"><span class="s1"><b>At Least $11.5 Million in Value Every Day</b></span></p> <p class="p1"><span class="s1">Economists call this the&nbsp;<a href=""><span class="s2">“consumer surplus”</span></a></span><span class="s3"> </span><span class="s1">—</span><span class="s3"> </span><span class="s1">the difference between a customer’s willingness to pay for a good or service and the price it actually costs. It’s a measurement of the dollar value the consumer gains in the exchange. If a person were to buy a game of bowling for $5 that he valued at $7 (a $2 surplus) instead of playing an hour of Pokémon for free that he valued at $3 (a $3 surplus), that person would lose out on value that would have made his life better.</span></p> <p class="p1"><span class="s1">So even if the average consumer surplus is only a measly dollar an hour, consumers are getting $11.5 million in value each day. The fact that customers are buying special items to use in the game, spending upwards of&nbsp;<a href=""><span class="s2">$1.6 million each day</span></a>, implies that the value players receive from the game is actually higher.</span></p> <p class="p1"><span class="s1">The article laments that local economies are harmed because people are turning toward forms of entertainment that don’t have high production costs, and away from options like movie theaters or bowling alleys that need expensive buildings or numerous employees selling buckets of popcorn. It misses that the economic activity associated with traditional entertainment options represents the&nbsp;<i>costs</i>&nbsp;of providing the entertainment. The reality we have now is much better, since we not only gain the value of the entertainment, but we have the money we would have paid for it to purchase other things as well. 
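For readers who want to check the back-of-the-envelope math, here is a quick sketch. The user count and playtime figures come from the article itself; the $1-per-hour surplus is the article's deliberately conservative assumption, not a measured value.

```python
# Back-of-the-envelope consumer-surplus estimate using the article's numbers.
daily_users = 21_000_000   # active daily users (from the article)
minutes_per_user = 33      # average minutes played per user per day (from the article)
surplus_per_hour = 1.00    # assumed consumer surplus, dollars per hour of play

# Total hours played per day across all users.
daily_hours = daily_users * minutes_per_user / 60

# Total consumer surplus generated per day at the assumed rate.
daily_surplus = daily_hours * surplus_per_hour

print(f"{daily_hours:,.0f} hours played per day")   # 11,550,000 hours
print(f"${daily_surplus:,.0f} in surplus per day")  # $11,550,000
```

Even at this "measly" dollar-an-hour assumption, the total lands at roughly $11.5 million in value created every day, which is the figure the article reports.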
It’s almost like getting something for nothing, and our lives</span><span class="s3"> </span><span class="s1">—</span><span class="s3"> </span><span class="s1">and the economy in general</span><span class="s3"> </span><span class="s1">—</span><span class="s3"> </span><span class="s1">are better as a result.</span></p> <p class="p3"><span class="s1"><b>Yay: Less Scarcity!</b></span></p> <p class="p1"><span class="s1">This is the core of economic growth</span><span class="s3"> </span><span class="s1">—</span><span class="s3"> </span><span class="s1">decreasing the scarcity of goods and services that limits our lives. The article makes it seem as if economic growth comes from&nbsp;<a href=""><span class="s2">simply spending money</span></a>. This view can lead us astray because it ignores the importance of entrepreneurs, whose role is critical in creating new products and services that improve everyone’s well-being.</span></p> <p class="p1"><span class="s1">“Pokémon Go” is actually a great example of this. The game developers and their investors thought they could make something customers might like, and they took the entrepreneurial risk to create the game without the certainty that it was going to be a success. Obviously, it was a good gamble, but we’re sure <a href=""><span class="s2">even they are amazed at the results</span></a>. Imagine if the game development funds had been used to build some bowling alleys instead. Wow. What fun.</span></p> <p class="p1"><span class="s1">Think of&nbsp;<a href=""><span class="s2">what would have</span></a>&nbsp;<a href=""><span class="s2">been lost to society</span></a>&nbsp;if entrepreneurs didn’t have the funds and the freedom to take that gamble. Their success has also spawned a sub-industry of&nbsp;<a href=""><span class="s2">“Poképreneurs”</span></a>&nbsp;who are&nbsp;<a href=""><span class="s2">selling drinks</span></a>&nbsp;and&nbsp;<a href=""><span class="s2">providing rides</span></a>&nbsp;to Pokémon players. 
Economic growth</span><span class="s3"> </span><span class="s1">—</span><span class="s3"> <a href=""><span class="s4">and our increased social well-being</span></a> </span><span class="s1">—</span><span class="s3"> </span><span class="s1">depends on this kind of <a href=""><span class="s2">permissionless innovation</span></a>.</span></p> <p class="p1"><span class="s1">In short, “Pokémon Go” represents the very&nbsp;<i>best</i>&nbsp;of capitalism because it’s premised on voluntary exchange</span><span class="s3"> </span><span class="s1">—</span><span class="s3"> </span><span class="s1">no one is forced to download the game, players can stop playing any time they like, and if they value the special items available in the game store they can buy them to enhance their fun. Furthermore, the entrepreneurs who had the foresight and the guts to dare to make the world a better place are being rewarded for their accomplishment.</span></p> <p class="p1"><span class="s1">Most importantly, that success only comes about because they have made people’s lives better in the process. That’s something Team Rocket could never learn to do.</span></p> Tue, 26 Jul 2016 17:08:18 -0400 Worry about Lack of New Banks, Not 'Record Profits' <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The House Committee on Oversight and Government Reform just had a <a href=""><span class="s2">hearing</span></a> about the Federal Deposit Insurance Corporation's (FDIC) application process for <i>de novo</i> (new) banks. The purpose of the hearing was to uncover why we have so few new banks of late: Is it the slow economy and low interest rates, the regulations or something else driving away new applicants? 
Maybe the answer is: all of the above.</span></p> <p class="p2"><span style="font-size: 12px; background-color: white;">As a recent Federal Reserve Bank of Richmond </span><a style="font-size: 12px; background-color: white;" href=""><span class="s2">study</span></a><span style="font-size: 12px; background-color: white;"> showed, the number of new banks has declined significantly since 2009. The decline coincided with the end of the crisis and the FDIC's prolonging of the de novo period, during which new banks must adhere to their capital plans, from three to seven years. (Interestingly, an earlier Richmond Fed </span><a style="font-size: 12px; background-color: white;" href=""><span class="s2">study</span></a><span style="font-size: 12px; background-color: white;"> observed how the FDIC similarly tried to restrict entry after the much larger number of Depression-era failures.) That the FDIC </span><a style="font-size: 12px; background-color: white;" href=""><span class="s2">reversed</span></a><span style="font-size: 12px; background-color: white;"> this rule earlier this year could mean staff there acknowledge the rule's adverse effects.</span></p> <p class="p1"><span class="s1">On the other hand, another <a href=""><span class="s2">study</span></a> by Federal Reserve Board economists found that non-regulatory economic factors, such as low interest rates and other measures of economic activity that drive down profitability, explained at least 75 percent of the decline in new banks. The authors do not measure the effects of specific regulations, merely the increase or decrease in the number of de novo banks after the passage of a law. Still, the study does find that the <a href=""><span class="s2">Dodd-Frank Act</span></a>, together with the FDIC's rule change, had the largest (and most negative) effect of any previous law on new bank applications. 
That means laws and regulations probably have some effect, even if we cannot precisely measure it.</span></p> <p class="p1"><span class="s1">Moreover, as my colleague Patrick McLaughlin shows in a recent co-authored <a href=""><span class="s2">study</span></a>, the regulatory restrictions embodied in the Code of Federal Regulations may have reduced gross domestic product (GDP) in recent years by as much as $4 trillion. With a relatively smaller economy, it's easy to believe fewer opportunities would exist for bankers.</span></p> <p class="p1"><span class="s1">A related and unresolved issue that came up during the recent congressional hearing was the "record profits" in the banking industry and whether those profits could be consistent with reports of declining profitability. Yes, industry profits in dollars may be at record high levels, but so are total U.S. banking assets and GDP (which might be higher with fewer regulatory restrictions). "Record profits" in dollar terms could just reflect the fact that the economy as a whole (as measured in dollar values), not just the banking industry, is larger than ever before. Just as people may experience a decline in income while GDP rises to record highs, banks may experience declining profitability even with "record profits" for the industry.</span></p> <p class="p1"><span class="s1">Finally, some at the hearing voiced concerns about the decline in the total number of banks. Barriers to entry exist, which could present problems for the longer-term performance of the industry, but the total number of banks is also affected by industry consolidation.</span></p> <p class="p1"><span class="s1">Written before the <a href=""><span class="s2">1994 Riegle-Neal Act</span></a>, which paved the way for interstate banking and consolidation at the federal level, a Richmond Fed study <a href=""><span class="s2">conjectured</span></a> that interstate banking would mean the number of banks would decline. 
The study suggested, as a simple thought experiment, comparing the number of banks to the size of the population to get a rough estimate of how many banks might exist without the geographic barriers to banking.</span></p> <p class="p1"><span class="s1">According to the study, inferring from California's banking industry and population, the number of banks under interstate banking might be as high as 3,700. On the other hand, inferring from Canada's banking industry and population, the number of banks under interstate banking might be as low as 75 (with most operating branches nationwide). All this is to say, we're likely still way above the range we might expect to see with competitive interstate banking.</span></p> <p class="p1"><span class="s1">Going forward, proposed laws and regulations that favor banks of a certain size or scope, such as the <a href=""><span class="s2">GOP's new plan</span></a> to bring back the Glass-Steagall Act, reflect a return to our error-prone past. Neither the Great Depression nor the recent crisis was about the <a href=""><span class="s2">divide between commercial and investment banking</span></a>. And as Professors Charles Calomiris and Stephen Haber wrote in their book "<a href=""><span class="s2">Fragile by Design</span></a>," we've had many banking crises throughout U.S. history, largely because of such policies. 
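The Richmond Fed study's thought experiment is easy to reproduce in miniature: scale a benchmark region's banks-per-capita ratio up to the U.S. population. The sketch below illustrates the method only; the benchmark bank counts and populations are hypothetical figures chosen to land near the study's rough bounds, not the study's actual inputs.

```python
# Sketch of the banks-per-capita thought experiment.
# All benchmark figures are illustrative assumptions, not the study's data.

US_POPULATION = 260_000_000  # approximate early-1990s U.S. population (assumption)

def scaled_bank_count(banks: int, population: int) -> int:
    """Estimate a national bank count by scaling a region's ratio of
    banks to population up to the U.S. population."""
    return round(banks * US_POPULATION / population)

# A bank-dense U.S. state versus a nationally branched system (hypothetical inputs):
high_estimate = scaled_bank_count(banks=430, population=30_000_000)  # California-style
low_estimate = scaled_bank_count(banks=8, population=27_000_000)     # Canada-style

print(high_estimate)  # about 3,700
print(low_estimate)   # about 75
```

The point of the exercise is the spread: depending on which banking structure you treat as the competitive benchmark, the "natural" number of U.S. banks under full interstate banking could differ by a factor of roughly fifty.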
<a href=""><span class="s2">Let's let customers decide</span></a> the number, size and scope of their banks, not lawmakers and regulators, no matter how well-intentioned.</span></p> Tue, 26 Jul 2016 16:59:29 -0400 Preparing for the Future of Artificial Intelligence <h5> Publication </h5> <p class="p1">The Office of Science and Technology Policy (OSTP) has requested comments pertaining to the governance of artificial intelligence (AI) technologies.&nbsp;</p> <p class="p2"><span style="font-size: 12px; background-color: white;">The Technology Policy Program of the Mercatus Center at George Mason University is dedicated to advancing knowledge of the impact of regulation on society. It conducts careful and independent analyses employing contemporary economic scholarship to assess policy issues from the perspective of the public interest.&nbsp;</span></p> <p class="p1">We write here to comment on the appropriate policy framework for artificial intelligence (AI) technologies at this nascent stage of their development and to make the case for prudence, patience, and a continuing embrace of “permissionless innovation.” Permissionless innovation refers to the idea that “experimentation with new technologies and business models should generally be permitted by default. Unless a compelling case can be made that a new invention will bring serious harm to society, innovation should be allowed to continue unabated and problems, if they develop at all, can be addressed later.”&nbsp;</p> <p class="p1">Policymakers may be tempted to preemptively restrict AI technologies out of an abundance of caution for the perceived risks these new innovations might seem to pose. 
However, an examination of the history of US technology policy demonstrates that these concerns can be adequately addressed without quashing a potentially revolutionary new industry.&nbsp;</p> <p class="p1">Specifically, as policymakers consider the governance of AI, they would be wise to consider the lessons that can be drawn from our recent experience with the Internet. The United States made permissionless innovation the basis of Internet policy beginning in the early 1990s, and it soon became the “secret sauce” that propelled the rise of the modern digital revolution.&nbsp;</p> <p class="p1">If policymakers wish to replicate America’s success with the Internet, they need to adopt a similar “light-touch” approach for the governance of AI technologies. To highlight the benefits of permissionless innovation, the Mercatus Center at George Mason University has recently published a book, a series of law review articles, and several agency filings that explain what this policy vision entails for different technologies and sectors. A summary of the major insights from these studies can be found in a recent Mercatus Center paper called “Permissionless Innovation and Public Policy: A 10-Point Blueprint.”&nbsp;</p> <p class="p1">If one’s sole conception of a technology comes from Hollywood depictions of dystopian science fiction or killer robotic systems run amok, it is understandable that one might want to use the force of regulation to clamp down decisively on these “threats.” But these fictional representations are just that: fictional. In reality, AI technologies are both more benign and more fantastic than these depictions suggest.</p> <p class="p1">The economic benefits of AI are projected to be enormous. One recent study used benchmarks derived from methodologically conservative studies of broadband Internet, mobile phones, and industrial robotics to estimate that the economic impact of AI could be between $1.49 trillion and $2.95 trillion over the next ten years. 
With less strict assumptions, the economic benefits could be greater still.</p> <p class="p1">However, some skeptics are already making the case for a preemptive regulation of AI technologies. The rationales for control are varied, including concerns ranging from deindustrialization to dehumanization, as well as worries about the “fairness” of the algorithms behind AI systems. &nbsp;</p> <p class="p1">Due to these anxieties associated with AI, some academics argue that policymakers should “legislate early and often” to “get ahead of” these hypothetical problems. Specifics are often in short supply, with some critics simply hinting that “something must be done” to address amorphous concerns.&nbsp;</p> <p class="p1">Other scholars have provided more concrete regulatory blueprints, however. They propose, among other things, the passage of broad-based legislation such as an “Artificial Intelligence Development Act,” as well as the creation of a federal AI agency or possibly a “Federal Robotics Commission” or “National Algorithmic Technology Safety Administration.” These proposed laws and agencies would establish a certification process requiring innovators to subject their technologies to regulatory review to “ensure the safety and security of their A.I.” Or, at a minimum, such agencies would advise other federal, state, and local officials and organizations on how to craft policy for AI and robotics.&nbsp;</p> <p class="p1">Such proposals are based on “precautionary principle” reasoning. The precautionary principle refers to the belief that new innovations should be curtailed or disallowed until their developers can prove that they will not cause any harms to individuals, groups, specific entities, cultural norms, or various existing laws, norms, or traditions.</p> <p class="p1">It is certainly true that AI technologies might give rise to some of the problems that critics suggest. 
And we should continue to look for constructive solutions to the potentially thorny problems that some of these critics discuss. That does not mean that top-down, technocratic regulation is sensible, however.&nbsp;</p> <p class="p1">Traditional administrative regulatory systems have a tendency to be overly rigid, bureaucratic, and slow to adapt to new realities. This is particularly problematic as it pertains to the governance of new, fast-moving technologies.</p> <p class="p1">Prior restraints on innovative activities are a recipe for stagnation. By focusing on preemptive remedies that aim to predict hypothetical problems that may not ever come about, regulators run the risk of making bad bets based on overconfidence in their ability to predict the future. Worse yet, by preempting beneficial experiments that yield new and better ways of doing things, administrative regulation stifles the sort of creative, organic, bottom-up solutions that will be needed to solve problems that may be unforeseeable today.</p> <p class="p1">This risk is perhaps more pronounced when dealing with AI technologies. Debating <i>how </i>“artificial intelligence” should be regulated makes little sense until policymakers define <i>what </i>it actually entails. The boundaries of AI are amorphous and ever changing. AI technologies are already all around us—examples include voice-recognition software, automated fraud detection systems, and medical diagnostic technologies—and new systems are constantly emerging and evolving rapidly. 
Policymakers should keep in mind the rich and distinct variety of opportunities presented by AI technologies, lest regulations more appropriate for one kind of application inadvertently stymie the development of another.</p> <p class="p1">Toward that end, we suggest that a different policy approach for AI is needed, one that is rooted in humility and a recognition that we possess limited knowledge about the future.&nbsp;</p> <p class="p1">This does not mean there is no role for government as it pertains to AI technologies. But it does mean that policymakers should first seek out less restrictive remedies to complex social and economic problems before resorting to top-down proposals that are preemptive and proscriptive.&nbsp;</p> <p class="p1">Policymakers must carefully ensure they have a full understanding of the boundaries and promises of all of the technologies they address. Many AI technologies pose little or no risks to safety, fair market competition, or consumer welfare. These applications should not be stymied due to an inappropriate regulatory scheme that seeks to address an entirely separate technology. They should be distinguished and exempted from regulations as appropriate.</p> <p class="p1">Other AI technologies may warrant more regulatory consideration if they generate substantial risks to public welfare. 
Still, regulators should proceed cautiously.&nbsp;</p> <p class="p1">To the extent that policymakers wish to spur the development of a wide array of new life-enriching technologies, while also looking to devise sensible solutions to complex challenges, they should consider a more flexible, bottom-up, permissionless innovation approach as the basis of America’s policy regime for AI technologies.</p> <p class="p2">&nbsp;</p> Fri, 22 Jul 2016 13:56:24 -0400 Government Report Finds That ACA Medicaid Enrollees Much More Expensive Than Expected <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The Department of Health and Human Services’ (HHS) annual <a href=""><span class="s2">report</span></a> on Medicaid’s finances contains a stunning update: the average cost of the Affordable Care Act’s Medicaid expansion enrollees was nearly 50% higher in fiscal year (FY) 2015 than HHS had projected just one year prior. Specifically, HHS found that the ACA’s Medicaid expansion enrollees cost an average of $6,366 in FY 2015—49% higher than the $4,281 amount&nbsp;that the agency projected in last year’s report.</span></p> <p class="p1"><span class="s1">The government’s chief financial experts appear not to have anticipated how states would respond to the federal government’s 100% financing of the cost of people made eligible for Medicaid by the ACA. It appears that the enhanced federal funding for the ACA expansion population has led states to set outrageously high capitation rates—the amount the government pays insurers—for the ACA Medicaid expansion population. 
The rates are much higher than the amounts&nbsp;for previously eligible Medicaid adult enrollees and suggest that states are inappropriately funneling federal taxpayer money to insurers, hospitals, and other health care interests through the ACA Medicaid expansion.</span></p> <p class="p1"><span class="s1">The magnitude of HHS’ error reveals a major flaw in the government’s ability to estimate the ACA’s costs, and worse, that the actual costs of the ACA’s Medicaid expansion appear much higher than expected. Both problems require the immediate attention of policymakers.<b>&nbsp;</b></span></p> <p class="p1"><span class="s1"><b>Medicaid Expansion Enrollees Are Much More Expensive Than Expected</b></span></p> <p class="p1"><span class="s1">Most experts, particularly proponents of the ACA, <a href=""><span class="s2">projected</span></a> that newly eligible adult enrollees would be less expensive than previously eligible adult enrollees. For example, HHS’ financial and actuarial experts <a href=""><span class="s2">projected</span></a> that adult Medicaid enrollees made eligible by the ACA would be 30% less costly than previously eligible adults enrolled in Medicaid. Apparently, their models did not account for states responding to the incentive of the elevated reimbursement rate to spend freely.</span></p> <p class="p1"><span class="s1">In last year’s Medicaid <a href=""><span class="s2">report</span></a>, HHS estimated that newly eligible adults had an average cost 19% higher ($5,517) in FY 2014 than the average cost for previously eligible adults ($4,650). In projecting future per enrollee costs, HHS’ experts assumed “that the effects of pent-up demand and adverse selection” would substantially diminish after 2014. 
HHS projected that the per enrollee cost of the newly eligible adults would decline by 22% in FY 2015 and would be about 11% less than the costs for previously eligible adults.</span></p> <p class="p1"><span class="s1">It turns out that those projections were way off. Instead of a decline in per enrollee costs from FY 2014 to FY 2015, the newly eligible adult per enrollee cost increased significantly, reaching an estimated $6,366. HHS now projects that the newly eligible adult Medicaid enrollees will cost about 23% more than the previously eligible Medicaid enrollees in FY 2015. It is worth noting that pregnant women are included in the previously eligible Medicaid enrollment category; without them, the differences would be even more pronounced.</span></p> <p class="p1"><span class="s1">According to the 2015 report, HHS’ actuaries and financial experts expected much lower managed care capitation rates for the ACA expansion population than occurred. But the higher payment rates should not be too surprising given the incentives created for states by the elevated federal reimbursement rate for the expansion population.</span></p> <p class="p1"><span class="s1">The elevated rate presents states with an incentive to create high fees for services commonly used by expansion enrollees, as well as high capitated payment rates for the insurers participating in Medicaid managed care. The health care interest groups within the states, particularly hospitals and insurers, benefit from the higher rates while federal taxpayers are left footing the bill. Although HHS expects that a risk sharing program will return money to the government (HHS expects the federal government will receive 9% of the payments back to&nbsp;lower the per enrollee costs of newly eligible enrollees to $5,001 and $5,796 respectively for FY 2014 and FY 2015), risk sharing creates an incentive for insurers to spend freely since unspent funds generally have to be returned. 
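The percentage relationships reported in this piece can be checked directly against the dollar figures it cites. A quick arithmetic check (all inputs are the HHS per-enrollee amounts quoted above):

```python
# Per-enrollee Medicaid costs from HHS's reports, as cited in this piece.
newly_eligible_fy14 = 5_517
previously_eligible_fy14 = 4_650
newly_eligible_fy15 = 6_366
projected_fy15 = 4_281               # HHS's year-earlier projection for FY 2015
risk_adjusted_fy15 = 5_796           # after risk-sharing payments are returned

# FY 2014: newly eligible adults cost 19% more than previously eligible adults.
assert round((newly_eligible_fy14 / previously_eligible_fy14 - 1) * 100) == 19

# FY 2015: actual cost came in 49% above HHS's prior-year projection.
assert round((newly_eligible_fy15 / projected_fy15 - 1) * 100) == 49

# Risk sharing is expected to return roughly 9% of FY 2015 payments.
assert round((1 - risk_adjusted_fy15 / newly_eligible_fy15) * 100) == 9
```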
Moreover, the elevated federal reimbursement rate removes the incentives for states to make sure that insurers are not overspending on providers since overpayments come at the expense of federal, not state, taxpayers.</span></p> <p class="p1"><span class="s1"><b>ACA’s Medicaid Expansion Costs Are Increasing</b></span></p> <p class="p1"><span class="s1">In the 2015 report, HHS indicated that Medicaid spending reached $554.3 billion in FY 2015—5% higher than its&nbsp;projection ($529.0 billion) for FY 2015 from the previous year’s report and 12.1% above FY 2014 spending. In March, before the release of this new data, the Congressional Budget Office (CBO) increased its projection of federal spending for Medicaid by $146 billion over the 2016-2025 period—a substantial escalation from its projection just one year earlier. According to CBO, “the number of people estimated to have been enrolled in Medicaid in 2015 who were made eligible for the program by the ACA was significantly higher than … previously projected.” As CBO digests the much higher per expansion enrollee costs, its future estimates of Medicaid’s costs will undoubtedly increase.</span></p> <p class="p1"><span class="s1"><b>Congress Needs to Act</b></span></p> <p class="p1"><span class="s1">Recent <a href=""><span class="s2">evidence</span></a> that new Medicaid enrollees receive only about 20 to 40 cents of benefit for each dollar of spending on their behalf and that Medicaid expansion in Oregon was <a href=""><span class="s2">not</span></a> related to significant health improvements has already prompted major concerns about the ACA Medicaid expansion. The much higher than expected costs of ACA Medicaid expansion enrollees should increase concern about the ACA Medicaid expansion and should prompt robust oversight from federal policymakers. There is much work to be done after these troubling findings. 
As a start, HHS should make all the data available, including the average costs of expansion enrollees by state. In addition,&nbsp;Congress should closely scrutinize managed care contracts that states are making with insurers as well as any actions that HHS is taking to guard against outrageously high federal payments for the expansion population.</span></p> Thu, 21 Jul 2016 10:47:57 -0400 Stop Bleeding Red Ink, Make America Sustainable Again <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The Congressional Budget Office recently released its long-term budget outlook. There isn't much new there; we are still in the red, and it will only continue to get worse. Considering the extent of the problem, you would think someone on the campaign trail would pay attention. Yet no presidential candidate really is.</span></p> <p class="p1"><span class="s1">First, CBO projects that the federal public debt-to-GDP ratio will go from its current 75 percent (up from 39 percent in 2008) to 86 percent in 2026 and 141 percent in 2046. On the deficit side, CBO projects that by 2020, our deficit level will reach $1 trillion, up from its current level of $534 billion. Today's deficit-to-GDP ratio is 2.9 percent, and it may be close to 5 percent in 10 years and 8.8 percent in 2046.</span></p> <p class="p1"><span class="s1">There are a lot of assumptions going into these projections. As we know, a small change in these assumptions can have a significant impact. For instance, the newest projections show a slight improvement over previous projections because of lower-than-expected interest rates. However, CBO warns, a 1 percentage point increase in interest rates would propel the debt-to-GDP level to 188 percent. 
Gross debt would be much higher.</span></p> <p class="p1"><span class="s1">In addition, we know that many of these assumptions (e.g., that there will not be a depression in the next 30 years and that the unemployment rate will stay consistently at 5 percent over the next 30 years) are unlikely to materialize, which would make the final numbers look way worse than they do now.</span></p> <p class="p1"><span class="s1">But even without assuming the worst, CBO talks about our dire fiscal outlook, "with debt growing larger in relation to the economy than ever recorded in U.S. history." Indeed, down the road, debt is projected to reach much higher levels than in the aftermath of World War II, when it stood at 106 percent of gross domestic product. But these levels of debt today are more worrisome than in the 1940s. For one thing, the debt levels in the '40s were the product of significant increases in war spending, which naturally went down after the war. In addition, the postwar era experienced a fast-growing economy, which also helped lead to major reductions in debt levels.</span></p> <p class="p1"><span class="s1">That is not going to happen today. CBO projects weak economic growth all the way to 2046, along with large increases in spending levels. That means that unless we get a major breakthrough in technology or a life-altering discovery (which could happen, of course), I wouldn't count on a repeat of the post-WWII reduction in deficits and debt.</span></p> <p class="p1"><span class="s1">But debt and deficits are only a symptom of a deeper problem; spending is growing faster than revenue. 
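The deficit figures cited above line up arithmetically with CBO's revenue and spending shares of GDP, also cited in this piece: the deficit-to-GDP ratio is simply spending's share minus revenue's share. A quick check:

```python
# CBO shares of GDP cited in this piece (percent of GDP).
revenue_today, spending_today = 18.2, 21.1
revenue_2046, spending_2046 = 19.4, 28.2

# Deficit-to-GDP ratio = spending share minus revenue share.
assert round(spending_today - revenue_today, 1) == 2.9  # today's deficit ratio
assert round(spending_2046 - revenue_2046, 1) == 8.8    # CBO's 2046 projection
```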
While revenue will grow from 18.2 percent of GDP today to 19.4 percent by 2046 (when the 50-year average will be 17.4 percent), spending will explode from 21.1 percent of GDP today to 28.2 percent of GDP in 2046 (when the 50-year average will be 20.2 percent).</span></p> <p class="p1"><span class="s1">The drivers of our future debt, CBO reminds us, are still the so-called entitlement programs — government-provided health care spending, in particular. It doesn't mean that Social Security is not a problem, because it is — as is the large growth in interest payments on our debt. But you wouldn't know that by listening to the vague policy options on the campaign trail or in Washington, where talks of expanding Social Security, adding a public option to the Affordable Care Act and not touching Medicare are very popular.</span></p> <p class="p1"><span class="s1">Each day of the Republican National Convention had a different theme. Monday's theme was "Make America Safe Again." Another was "Make America First Again." Maybe someone should suggest that we "Make America Sustainable Again."</span></p> Thu, 21 Jul 2016 10:40:43 -0400 Recessions Don't Have The Same Impact On Every City <h5> Expert Commentary </h5> <p class="p1"><span class="s1">It has been just over seven years <a href=""><span class="s2">since the Great Recession</span></a> ended. The national economy has been expanding—<a href=""><span class="s2">albeit slowly</span></a>—over the last seven years, but there are still some measures, such as the <a href=""><span class="s2">labor force participation rate</span></a>, that have yet to fully bounce back. Even though the national statistics indicate a growing economy, <a href=""><span class="s2">some areas are still struggling.</span></a></span></p> <p class="p1"><span class="s1">The U.S. is a large country made up of hundreds of local economies, each with its own mix of industries and residents along with different local policies. 
The economic fluctuations of these local economies determine the business cycle of the country as a whole, and during any national recession some local economies may be doing fine. But exactly how much local variation is concealed by the national measures?</span></p> <p class="p1"><span class="s1">A <a href=""><span class="s2">new paper published in the Journal of Urban Economics</span></a> sheds some light on this question by examining the business cycles of local economies. The authors—Maria Arias and Charles Gascon of the St. Louis Fed and David Rapach of Saint Louis University—have created monthly economic activity indices for the 50 most populated Metropolitan Statistical Areas (MSAs) using data from 1990 to 2015. This period <a href=""><span class="s2">covers three national recessions</span></a> (July 1990 – March 1991, March 2001 – Nov. 2001, and Dec. 2007 – June 2009) and the indices show how the economies of each of these MSAs performed during these economic downturns.</span></p> <p class="p1"><span class="s1">The table below is taken from the paper and lists the 50 MSAs along with the dates of their recessions over this 26-year period.</span></p> <p class="p1"><a href=""><img src="" width="575" height="533" /></a></p> <p class="p1"><span class="s1">The table shows that many of these MSAs did not experience a recession in the early 1990s. In fact, only 26 of the 50 had their first recession (columns 2 and 3) in the 90s. The later recessions were more widespread: 32 out of 50 in the early 2000s and 49 out of 50 during the most recent recession. The only exception during the Great Recession was Oklahoma City, which according to this study has not experienced a single recession since 1990.</span></p> <p class="p1"><span class="s1">In addition to the variation across recessions, there is also variation across MSAs within a recession. 
In the early 90s, Los Angeles’ recession lasted over three years (3/90 to 4/93) while Memphis’ started a month later and only lasted a year (4/90 to 4/91). Detroit’s version of the Great Recession lasted nearly four years (10/05 – 6/09) while Boston’s was just over a year (7/08 – 8/09).</span></p> <p class="p1"><span class="s1">There is even variation across MSAs in the same state: Neither Columbus nor Cincinnati experienced a recession in the early 90s while Cleveland had one that lasted almost two years. The authors also point out that the “dot-com” recession of the early 2000s had a large effect on the tech hubs of San Francisco and San Jose while Sacramento, also in California, did not experience a recession during that time period.</span></p> <p class="p1"><span class="s1">Table 4 from the paper gives us a clearer picture of the overlap between each MSA’s economy and the national economy.</span></p> <p class="p1"><a href=""><img src="" width="575" height="339" /></a></p> <p class="p1"><span class="s1">The columns show the number of months (out of 305) that the local economy was expanding when the national economy was expanding (columns 1 and 8); was in a recession when the national economy was in a recession (columns 3 and 9); was in a recession when the national economy was expanding (columns 4 and 10); and was expanding when the national economy was in a recession (columns 5 and 11). Columns 6 and 12 show the percentage of months in which the local and national economy were in the same phase of the business cycle.</span></p> <p class="p1"><span class="s1">Atlanta (96%) and Charlotte (95%) closely track the national economy while Detroit (78%) and Hartford (80%) were in the recession phase more often than the national economy. 
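The match rates discussed above are just the share of the 305 sample months in which the local and national economies were in the same business-cycle phase (the sum of columns 1/8 and 3/9). A minimal sketch of that calculation, using hypothetical month counts chosen only to illustrate an Atlanta-like 96% rate (they are not the paper's actual figures):

```python
def match_rate(both_expanding: int, both_in_recession: int,
               total_months: int = 305) -> float:
    """Percent of months in which the local and national economies
    were in the same phase of the business cycle."""
    return (both_expanding + both_in_recession) / total_months * 100

# Hypothetical counts for an Atlanta-like MSA (illustrative only):
rate = match_rate(both_expanding=270, both_in_recession=23)
assert round(rate) == 96
```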
New Orleans’ match rate of only 35% is an outlier and the authors attribute some of its poor economic performance <a href=""><span class="s2">to Hurricane Katrina</span></a>, which occurred near the middle of the sample period (2005).</span></p> <p class="p1"><span class="s1">The authors also examine the relationship between the severity of local economic recessions and various MSA characteristics. They find robust evidence that MSAs with less-educated populations and with <a href=""><span class="s2">more inelastic housing supplies</span></a> experience more severe recessions. An inelastic supply of housing means that the quantity of housing is not very responsive to changes in the price of housing.</span></p> <p class="p1"><span class="s1">There is a large body of evidence showing that the proportion of educated residents in a city <a href=""><span class="s2">has a positive effect on subsequent economic and population growth</span></a>. It is not surprising to see that the economies of MSAs with more educated populations are more resilient as well.</span></p> <p class="p1"><span class="s1">The authors note that the elasticity of housing supply effect is also consistent with other studies. Places with more <a href=""><span class="s2">inelastic housing supplies experience larger housing price fluctuations</span></a> which can make these areas more susceptible to “boom-bust” housing cycles. 
There is also evidence <a href=""><span class="s2">that households decrease their spending in response to changes in housing net worth</span></a>, and since economic shocks that decrease housing demand will have larger price effects in areas with more inelastic housing supplies, it follows that these areas will experience larger declines in spending and thus longer and more severe recessions on average.</span></p> <p class="p1"><span class="s1">A map (below) from the study shows that areas that experienced some of the largest increases and subsequent decreases in housing prices prior to the Great Recession—Jacksonville, Tampa, and Orlando in Florida and Riverside, Sacramento, and Las Vegas in the West—were in a recession 6 months before the country as a whole (shading indicates a recession, according to the study).</span></p> <p class="p1"><a href=""><img src="" width="575" height="271" /></a></p> <p class="p1"><span class="s1">The finding that an inelastic housing supply can deepen or prolong a recession is another reason for local policymakers to free up their housing markets. <a href=""><span class="s2">Other research</span></a> argues that local housing restrictions make housing more expensive, especially in the most productive areas of the country, which makes it harder for people to migrate to those areas. Ultimately this results <a href=""><span class="s2">in less GDP and makes us all worse off</span></a>.</span></p> <p class="p1"><span class="s1">The economic variation across the U.S. is considerable. Such variation calls into question the usefulness of top-down, federal fiscal policy designed to smooth out recessions. 
At any given time only some local economies are contracting, and <a href=""><span class="s2">the haphazard way in which fiscal stimulus is often implemented</span></a> may lead to some economies overheating while others are left languishing.</span></p> Thu, 21 Jul 2016 10:33:07 -0400 Against Regulatory Complexity <h5> Expert Commentary </h5> <p class="p1"><span class="s1">With the July 21 anniversary of the Dodd-Frank Wall Street Reform and Consumer Protection Act now upon us, it’s a good time to reflect on how this type of Byzantine legislation spawns a convoluted network of tangled regulations.&nbsp;</span></p> <p class="p1"><span class="s1">When recently unveiling his Financial CHOICE Act, House Financial Services Committee Chairman Jeb Hensarling <a href=""><span class="s2">highlighted</span></a> a key principle behind his efforts to combat this overgrowth: “Simplicity must replace complexity.” The chairman’s focus on regulatory complexity is appropriate.</span></p> <p class="p1"><span class="s1">In many ways, regulations are like a computer’s operating system, establishing processes and parameters within which programs must operate. But anyone who has undergone the experience of “upgrading” an operating system only to find her computer sluggish and unresponsive knows that complexity is not always a desirable feature. Steven Teles, a political scientist with Johns Hopkins, made a similar comparison when he <a href=""><span class="s2">famously referred</span></a> to American policy as a “kludgeocracy,” an ever-expanding series of “inelegant patch(es)” meant to solve short-term problems, but which ultimately hinder system performance.</span></p> <p class="p1"><span class="s1">A recent <a href=""><span class="s2">analysis</span></a> showed that Dodd-Frank accounted for nearly 30,000 new regulatory restrictions — more than all other laws passed during the Obama administration combined. 
These new regulations, authorized by a Congress in crisis mode, were piled on top of more than one million existing regulatory restrictions. Even former Senator Chris Dodd, one of the bill’s namesakes, <a href=""><span class="s2">admitted</span></a> just after the bill’s passage that “no one will know until this is actually in place how it works.” Scholars subsequently <a href=""><span class="s2">argued</span></a> that the regulatory uncertainty exacerbated by Dodd-Frank could explain the slow recovery. At the time, however, some facts were clear: Dodd-Frank would increase regulatory complexity, induce uncertainty, and line the pockets of regulatory compliance experts.</span></p> <p class="p1"><span class="s1">To an unprecedented degree, simply ascertaining the relevance of regulations stemming from an act of Congress now requires regulatory compliance expertise. To illustrate, consider a simple visualization of regulatory restrictions originating from another major financial regulatory law, the Sarbanes-Oxley Act of 2002. Sarbanes-Oxley, which dealt with audits and financial reporting, affected public companies in all sectors of the economy and induced some regulations that specifically targeted a handful of industries. Textual analysis of those regulations shows that five industries were directly targeted by regulations from two federal agencies:</span></p> <p class="p1"><img height="388" width="500" src="" /></p> <p class="p1"><span class="s1">Sarbanes-Oxley was, of course, a significant regulatory overhaul in its own right. In 2012, the Wall Street Journal Editorial Board went so far as to <a href=""><span class="s2">call it</span></a> one of the reasons for slow economic growth. 
Furthermore, much of the effect of Sarbanes-Oxley stems from the creation of the Public Company Accounting Oversight Board, a regulatory entity that awkwardly straddles the public-private divide with considerable control over auditing firms and — indirectly — the public companies they audit.</span></p> <p class="p1"><span class="s1">Nonetheless, even allowing for the additional complexity of referencing accounting standards that are not formally published as regulations, Sarbanes-Oxley is a model of simplicity compared to Dodd-Frank. Consider a similar visualization of the agency-industry relationships emerging from Dodd-Frank — which, for the sake of visualization, is limited to only 10 agencies and 10 industries. In fact, at least 32 different agencies have promulgated rules under the statutory authority of Dodd-Frank:</span></p><p><img height="492" width="500" src="" /></p> <p class="p1"><span class="s1">In the post-Dodd-Frank world, understanding which regulations are relevant to a business’s activities has become immensely more difficult. Many sectors of the economy were newly exposed to regulations from a multitude of unfamiliar agencies. Duplicative and contradictory rules became a <a href=""><span class="s2">fact of life</span></a>.</span></p> <p class="p1"><span class="s1">In 1788, James Madison <a href=""><span class="s2">worried</span></a> that laws may become “so voluminous that they cannot be read, or so incoherent that they cannot be understood.” He was right to worry: current regulatory code is so complex and voluminous that, rather than spend <a href=""><span class="s2">three years</span></a> reading it, I helped create text analysis software that uses machine learning to assess the probability that a given regulatory restriction targets a specific industry. But even with the insights of machine learning and text analysis software — or regulatory compliance experts who bill by the hour — considerable uncertainty remains. 
Regulatory agencies themselves are increasingly <a href=""><span class="s2">unfamiliar</span></a> with their own regulations.</span></p> <p class="p1"><span class="s1">When there are more rules in place than anyone can read, and interpretation of those rules and their scope is determined by the regulators themselves, businesses must pay for experts to filter the rules that are truly relevant from the rest. Meanwhile, businesses must also keep an eye on new rules coming down the pipeline and the possibility of reinterpretation of old rules. For both federal regulations and statutes, an irrelevant requirement only remains irrelevant until a bureaucrat, or a <a href=""><span class="s2">federal prosecutor</span></a>, decides otherwise.</span></p> <p class="p1"><span class="s1">Regulatory complexity engenders uncertainty. That may not be a problem for some politicians, but for anyone who must comply with regulations, complexity and uncertainty can be paralyzing. Simplifying the complex regulatory regime imposed by Dodd-Frank is an application of another lesson from the world of computer programming: iterative design can correct serious errors and reduce unnecessary complexity.</span></p> Thu, 21 Jul 2016 10:20:22 -0400 Where The Financial CHOICE Act Goes Wrong <h5> Expert Commentary </h5> <p class="p1"><span class="s1">New landmark financial services legislation recently introduced by Chairman Jeb Hensarling (R., Texas) of the House Financial Services Committee would go a long way toward addressing Dodd-Frank’s mistaken approach to financial regulation. Dodd-Frank assumes government can run financial markets through a combination of micromanagement, regulatory discretion, and high penalties.</span></p> <p class="p1"><span class="s1">The Financial CHOICE Act looks to make market actors—not government bureaucrats—responsible for running financial firms and picking up the pieces when they run them into the ground. 
Out of step with the rest of the legislative package, however, is a plan to hike corporate penalties.</span></p> <p class="p1"><span class="s1">The desire to appear “tough on Wall Street” is understandable given critics’ predictable mischaracterization of the bill as “a wet kiss for Wall Street.” These same critics welcome corporate penalties even when innocent shareholders foot the bill. Rather than embrace the critics’ flawed logic, Congress may want to look at how such policies worked in the past.</span></p> <p class="p1"><span class="s1">When Congress passed a massive bailout for the savings and loan industry in the Financial Institutions Reform, Recovery, and Enforcement Act (FIRREA) in 1989, the legislation included banking sector penalties of up to $1 million per day for violations of banking rules. The structure of those penalties allowed for draconian recoveries for small infractions—the banking law equivalent of a million-dollar ticket for speeding on the highway.</span></p> <p class="p1"><span class="s1">And in recent years, the Department of Justice has abused FIRREA by bringing multibillion-dollar cases built on increasingly novel interpretations of the act. Just a couple of months ago, the Second Circuit rejected one of these cases and overturned a FIRREA penalty of more than a billion dollars against Bank of America for lack of culpable intent (even under the low culpability standards of that law).</span></p> <p class="p1"><span class="s1"><b>Corporate penalty increases are not the right answer</b></span></p> <p class="p1"><span class="s1">Increasing those penalties by 50%, as the House draft would do, is not the answer. If it is, why stop the bidding there? If raising financial sector civil penalties is an absolute good, then why not a 100% increase in daily penalties? This legislation includes a requirement that financial regulatory agencies conduct cost-benefit analysis for new rules. 
Where is the cost-benefit analysis for these penalty enhancements?</span></p> <p class="p1"><span class="s1">The draft legislation also includes provisions enhancing penalty powers for the Securities and Exchange Commission that build on the controversial SEC enforcement powers and penalties granted in the Dodd-Frank Act and the Sarbanes-Oxley Act.</span></p> <p class="p1"><span class="s1">The draft increases corporate fraud penalties such that they mirror investor harm. It’s difficult to argue against that concept on its face, but corporate penalties are almost always paid by shareholders—the very same shareholders who were harmed by wrongdoing in the first place.</span></p> <p class="p1"><span class="s1">The draft contains enhanced penalties for insider trading as well. Again, Congress could learn from the past. While insider trading is illegal, it’s not clearly defined in statute. The SEC has a long history of expanding the doctrine of insider trading to include activity which most reasonable people wouldn’t consider culpable.</span></p> <p class="p1"><span class="s1">The SEC’s enforcement of the Foreign Corrupt Practices Act (FCPA) is another case in point for the dangers of penalties poorly metered to harm. The typical FCPA case has resulted in corporate legal and compliance expenses of <a href=""><span class="s2">nearly 10 times the actual fine</span></a>. In the 35 years of enforcing this law, and after billions in settlements and hundreds of cases, <a href=""><span class="s2">only one has actually gone to trial.</span></a></span></p> <p class="p1"><span class="s1"><b>Optimal penalty design is a dangerous job to leave in the hands of the Congress</b></span></p> <p class="p1"><span class="s1">While the bill contains helpful due process reforms, like requiring the SEC chief economist to weigh in on the impact of settlements on shareholders, the reforms will not counteract leverage that poorly metered penalties offer in settlement discussions outside of trial. 
A comprehensive examination of the balance between financial penalties on the books and due process might begin with the FCPA.</span></p> <p class="p1"><span class="s1">The financial penalty enhancements being considered in this legislation must be weighed in light of this history. When the law is defined one case at a time, in a highly charged polemical environment that encourages pursuit of non-culpable activity, the rule of law comes under threat.</span></p> <p class="p1"><span class="s1">The fact is that an optimal penalty design is a dangerous job to leave in the hands of the Congress. Over the last 25 years, polemics have always trumped policy on this issue in the legislative branch. A better approach would be to order the financial regulators to impanel an advisory commission of enforcement experts, including government lawyers, defense counsel, and economists, to advise them on both the optimal design of civil and criminal penalties and the problem of unpredictable doctrine creep in the law.</span></p> <p class="p1"><span class="s1">The CHOICE Act has much to commend, but incorporating historical lessons would make it more effective at ensuring that the right parties are held accountable for their misdeeds without harming the innocent in the process.</span></p> Thu, 21 Jul 2016 10:03:53 -0400 Airplane Speeds Have Stagnated for 40 Years <h5> Publication </h5> <p class="p1">This year marks the 40th anniversaries of two of the greatest achievements in manned flight. In 1976, US military pilot Eldon W. Joersz set the still-standing airspeed record of 2,193.2 mph in the Lockheed SR-71 “Blackbird.” That same year, the Concorde introduced the world to supersonic commercial travel with the first passenger flights to break the sound barrier.</p> <p class="p3"><a href=" copy.jpg"><img src=" copy.jpg" width="575" height="444" /></a></p> <p class="p1">In the decades to follow, the speed of aviation stagnated—and even regressed. 
The SR-71 retired from service in 1999, and no commercial airliner in service today flies at Mach 1 (the speed of sound), much less the Mach 2 speeds reached by the Concorde. The time required to fly from Los Angeles to New York or across the Atlantic Ocean is no different than it was 40 years ago for the average airline passenger.</p> <p class="p1">The initial progress and current stagnation of airplane speeds are plain to see by looking at manned, air-breathing flight airspeed records since the Wright brothers’ first flight on December 17, 1903, which is estimated to have reached 6 mph. Airspeed record data have been gathered since 1905 by the Fédération Aéronautique Internationale (FAI), an international organization that sets standards for tracking, measuring, and verifying aviation records.</p> <p class="p1">The blue trend line shows the rapid pace at which the airspeed record was bested in the first three-fourths of the 20th century. The fastest growth in achievable airspeeds occurred in the decades following the Second World War. The sound barrier was famously first broken by Chuck Yeager in 1947, but the FAI did not recognize Yeager’s flight for the record because the plane was rocket-powered and launched by a drop from a B-29 bomber. The first supersonic flight recorded by the FAI was made in 1955 by US Air Force Colonel Horace Hanes.</p> <p class="p1">Shortly thereafter, the FAI recorded its first Mach 2 flight in 1958. That speed was matched in commercial flight less than two decades later, a testament to the high level of innovation in air travel during the mid-20th century.</p> <p class="p1">What happened to this high level of innovation in air travel? Civil supersonic aviation was banned over the United States in 1973 because of fears that sonic booms would damage buildings and constitute an intolerable nuisance. 
The outright ban limited the market for the Concorde to transoceanic routes and destroyed incentives for research and development of new supersonic transports. Since 1973, airplane manufacturers have innovated on margins other than speed, and as a result, commercial flight is safer and cheaper than it was 40 years ago. But commercial flight isn’t any faster—in fact, today’s flights travel at less than half the Concorde’s speed.</p> <p class="p1">If we want to restore mid-century levels of aviation innovation and break the sound barrier again, we must first break regulatory barriers. The FAA should lift its ban on civil supersonic flight. Legitimate concerns about supersonic flight can be handled by specific policies that address concerns directly, such as a clear standard from the FAA for acceptable noise levels. It would be a shame to suffer another four decades of aviation stagnation.&nbsp;</p> Wed, 20 Jul 2016 14:29:17 -0400 Thirty Years after the Nobel: James Buchanan's Virginia Political Economy ( <h5> Events </h5> <p>Thirty years ago, in October 1986, James M. Buchanan was awarded the Nobel Prize in economics “for his development of the contractual and constitutional bases for the theory of economic and political decision-making.” His contributions in these areas as well as those in methodology, social philosophy, public policy economics, and political science continue to have a lasting influence on scholarship today.</p> <p>Please join us on October 6, 2016 for a keynote speech and panel discussion to reflect on the significance of Buchanan’s Nobel Prize and the various strands of influence his work has had in subsequent decades of scholarship. We will discuss his contributions in the fields of social and political philosophy, social contract theory, and constitutional political economy, together with his influence on the research of other prominent economic thinkers. In keeping with the F. A. 
Hayek Program’s view of political economy as a progressive research program, we will explore key themes in Buchanan’s research and see where they may lead us for the future of the discipline.</p> <p>From 2:00 to 3:30 p.m., <b><a href="">Michael Munger</a></b>, Professor of Political Science, Public Policy, and Economics at Duke University, will deliver a keynote address on James Buchanan’s contributions to political economy and social philosophy.</p> <p>From 4:00 to 5:30 p.m., we will have a roundtable discussion with:</p><ul><li><b style="font-family: inherit; font-style: inherit; background-color: white;"><a href="">David Schmidtz</a></b><span style="font-size: 12px; background-color: white;">, Kendrick Professor of Philosophy at the University of Arizona</span></li><li><b style="font-family: inherit; font-style: inherit; background-color: white;"><a href="">Barry Weingast</a></b><span style="font-size: 12px; background-color: white;">, Ward C. Krebs Family Professor of Political Science at Stanford University</span></li><li><b style="font-family: inherit; font-style: inherit; background-color: white;"><a href="">Luigi Zingales</a></b><span style="font-size: 12px; background-color: white;">, Robert C. McCormack Distinguished Service Professor of Entrepreneurship and Finance at the University of Chicago Booth School of Business</span></li></ul> <p>If you have any questions, please contact Courtney Dunn at <a href=""></a></p> <p><b>About the Buchanan Speaker Series</b></p> <p>The Buchanan Speaker Series promotes Nobel laureate James Buchanan’s intellectual legacy by applying Buchanan’s ideas to the pressing matters of our time.</p> <p>James Buchanan moved to George Mason University in the early 1980s. His influence on the developing agenda at the Mercatus Center has been important in at least two ways. One is how it fostered a broad research and educational vision that seeks to embrace both political economy and social philosophy. 
As Buchanan once put it when establishing his first academic center at the University of Virginia in the late 1950s—the Thomas Jefferson Center for Studies in Political Economy—the faculty will</p> <blockquote><p style="padding-left: 30px;">“strive to carry on the honorable tradition of ‘political economy’—the study of what makes for a ‘good society.’ Political economists stress the technical economic principles that one must understand in order to assess alternative arrangements for promoting peaceful cooperation and productive specialization among free men. Yet political economists go further and frankly try to bring out into the open the philosophical issues that necessarily underlie all discussions of the appropriate functions of government and all proposed economic policy measures.”</p></blockquote> <p>Buchanan’s other lasting influence is his motto “dare to be different.” Mercatus is grounded in the intellectual traditions best exemplified by F. A. Hayek, but our scholars also draw from the best work in contemporary social science and the humanities. As Buchanan noted in a 1979 essay honoring Hayek, “The diverse approaches of the intersecting ‘schools’ must be the bases for conciliation, not conflict. 
We must marry the property-rights, law-and-economics, public-choice, Austrian subjectivist approaches.” At George Mason and the Mercatus Center this intellectual marriage has taken place.</p> Mon, 25 Jul 2016 13:56:52 -0400 The Rationalia Fallacy <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Science is a noble witness, but a nefarious judge.</span></p> <p class="p1"><span class="s1">Celebrity astrophysicist Neil deGrasse Tyson recently launched a hail of hubris across the internet with <a href=""><span class="s2">this tweet</span></a>: "Earth needs a virtual country: #Rationalia, with a one-line Constitution: All policy shall be based on the weight of evidence."</span></p> <p class="p1"><span class="s1">Tyson's fantasy – a rational, science-based society – is nothing new. France's Reign of Terror near the close of the 18th century was sanctified in the revolution's Temples of Reason. Stalin and Mao murdered tens of millions in pursuit of Marx's scientific socialism. In the half-century before World War II, the world's greatest scientific minds conjured up eugenics – a dark pseudoscience of human breeding – and browbeat servile policymakers into a scourge of forced sterilizations in America and genocide in Europe. Nazi Deputy Führer Rudolf Hess stated – probably sincerely – that "National Socialism is nothing but applied biology."</span></p> <p class="p1"><a href="">Continue reading</a></p> Mon, 18 Jul 2016 19:01:25 -0400 Why Banks Should Beware of “Misbehaving” <h5> Publication </h5> <p class="p1"><b>Introduction</b></p> <p class="p1">Over the last few decades, psychologists have challenged economists on the notion that people always make rational decisions. Economists, of course, recognize that people are not always perfectly rational. Modeling them as such often adds to the precision of the model’s result, without reducing its relevance. 
Put another way, economists assume that most of the time people act rationally enough that modeling them as perfectly rational does not get in the way of discovering new insights into human behavior.</p> <p class="p1">Nevertheless, behavioral psychologists found this rational choice–based method wanting and have amassed a sizeable body of research demonstrating certain “anomalies” in laboratory studies that break from rational choice predictions. For example, behavioral psychologists Amos Tversky and Daniel Kahneman famously claimed that people are susceptible to certain biases that make them more risk averse when facing gains (and more risk seeking when facing losses) than the standard rational choice model would predict. Furthermore, they claimed that framing choices in different ways elicits inconsistent behavior.</p> <p class="p1">These ideas eventually coalesced into the field known as “behavioral economics” and have since made their way into public policy. An example of this is the Consumer Financial Protection Bureau (CFPB), which regulates consumer credit products, such as mortgages and credit cards, and consumer credit providers, such as banks, payday lenders, and cell phone providers. This agency was largely influenced by behavioral economics in setting its organizational mission and goals, such as protecting consumers from exploitation and manipulation by credit providers.</p> <p class="p1">Despite these behavioral-based foundations (or perhaps because of them, as I will explain below), the CFPB has been criticized from both sides of the political divide for its aggressive bureaucratic expansion and failure to adhere to its original congressional mandate. 
Furthermore, the actions of the agency have directly led to the significant reduction in volume of certain credit products (e.g., residential mortgages, auto loans) in a manner that calls into question whether the agency is helping or harming consumers.</p> <p class="p1">The purpose of this paper is to outline the impact of behavioral economics on public policy by examining its central influence on the CFPB. In particular, it explains how behavioral ideas have been converted into policies that fail to account for actual government practice, which has led to mixed results for consumers. While understanding just how people are susceptible to market influence is important, the premature application of behavioral economics to public policy risks undermining the goal of helping consumers.</p> <p class="p1"><b>What is Behavioral Economics?</b></p> <p class="p1">Behavioral economics, simply put, is psychology applied to traditional economic concepts. What is novel about this approach is that it couches its critique in a language economists can understand. So, for example, when people are more likely to insure against risk because they fear losses more than they enjoy gains, behavioral economists position this outcome within the standard utility maximization framework employed by economists, but with the added flourish of describing such behavior as exhibiting “loss aversion.”</p> <p class="p1">Best-selling books, including <i>Nudge</i>, <i>Predictably Irrational</i>, and <i>Thinking, Fast and Slow,</i> have provided the public with accessible entries into the world of behavioral economics. 
Be it by showing how we process information and awareness through two corresponding mental systems (<i>Thinking, Fast and Slow</i>) or exposing why we react differently while in a panicked state (<i>Blink</i>) or demonstrating how government can be used to improve our everyday choices (<i>Nudge</i>), these books represent a growing and popular topic of inquiry among academics, policymakers, and even the general public.</p> <p class="p1">Whether this is a fad or something deeper, behavioral economics is already making a noticeable impact on several regulatory fronts, most notably in consumer finance, enough to be labeled by some as the “new paternalism.” For example, the CFPB implemented a provision that defined so-called “qualified mortgages,” a category of loans in which lenders adhere to certain parameters such as setting nonadjustable interest rates, determining the borrower’s ability to repay, etc. This is all predicated on the assumption that consumers do not understand what they are agreeing to—and that assumption, at the very least, constitutes a departure from the traditional justification for regulatory intervention, which is market failure.</p> <p class="p1">This policy outcome, like others from the CFPB as noted below, can be traced back to behavioral roots. In this case, it is from the book <i>Nudge</i>, which outlines a number of possible “soft” interventions into the marketplace to correct for common mistakes people make. The authors of <i>Nudge</i>, Richard Thaler and Cass Sunstein, have done more than anyone else to bring behavioral economics from mere laboratory studies of human behavior out into the world of policy. In chapter 8 of <i>Nudge</i>, they criticize mortgage products with low introductory interest rates and balloon payments as being too complicated for consumers to understand. 
They argue that products with simpler terms and conditions (e.g., a 30-year fixed mortgage) make choices easier for consumers and thus provide the standard by which all alternatives should be compared.</p> <p class="p1">The 2008 scribbling of two behavioral economists has become our new reality, as the financial industry must now work within regulations that penalize mortgage products that fail to adhere to federal guidelines. Qualified mortgages are restricted to those with fixed terms and interest rates. Mortgage products with features like adjustable rates and amortization fees are unlikely to pass muster. Banks can, of course, offer nonqualified mortgages, but they risk being sued by the borrower if the borrower defaults, and there is a stigma associated with such a product label.</p> <p class="p1">Moreover, much of the financial industry has reacted to the implementation of the rule by withdrawing from the mortgage market altogether. The figure below, generated from data provided by the Mortgage Bankers Association, shows a steep reduction in the volume of residential loans in 2013, as originally reported. This decline is most pronounced with refinancing mortgage loans. This trend is in tandem with the implementation of the Ability-to-Repay and Qualified Mortgage Rule on January 10, 2014, which formally defined “qualified mortgages.”</p> <p class="p2">&nbsp;</p> <p class="p1">Figure 1. US Residential Loan Origination Trends</p> <p class="p3"><img height="444" width="575" src=" jpg.jpg" /></p> <p class="p1">Source: Mortgage Bankers Association, “Annual Mortgage Origination Estimates,” February 2016.</p> <p class="p1">The recent uptick in these originations indicates that the market may finally be adjusting to the new rules, though it is unclear whether volume will return to its previous level.</p> <p class="p1">This may be all well and good for those who believe we should all consume “plain vanilla” products. 
But for those who understand—and indeed, conduct their business on—the flexibility that alternative credit products provide, the new regulations stifle credit markets in a way that most assuredly hurts not just the financial industry but consumers, too.</p> <p class="p1"><b>Behavioral Economics in the Consumer Finance Industry</b></p> <p class="p1">These interventions into what behavioral economists describe as the “choice architecture” of the marketplace constitute a very real and problematic constraint for the financial industry, as the figure above illustrates. This stems from the work of the CFPB—one of Washington’s newest agencies and a major part of the larger Dodd-Frank Wall Street Reform and Consumer Protection Act—which regulates virtually any consumer practice in the financial industry. Even practices that the agency was explicitly told to ignore, such as auto lending, have become significant targets for the agency’s efforts.</p> <p class="p1">That this new agency is so aggressive should be no surprise given its lack of congressional oversight or budgetary approval. Indeed, these features are so extraordinary that the constitutionality of the CFPB is now facing a challenge in the US Court of Appeals for the DC Circuit. While Congress is certainly no safety valve against bureaucracy, it can create limits to certain excesses, particularly when those excesses affect the interests of constituents. The fact that the CFPB need not concern itself with the interests of the market participants it regulates, or the full range of consumers these regulations ultimately impact, is in large part responsible for the mixed results.</p> <p class="p1">What is novel about the agency is that its roots go deeply into the world of behavioral economics. Senator Elizabeth Warren, who was the driving force behind the agency’s inception, relied on behavioral assumptions in making her original case for the agency. 
Senator Warren later teamed up with well-known behavioral economist Oren Bar-Gill in an expanded law review article to make the case for an agency dedicated to consumer finance. They cited cognitive shortcomings that people exhibit—including dealing poorly with complicated information, displaying inertia in switching to new products, and not providing for their true long-term interests—as justifications for such an agency. The agency would, therefore, be justified in targeting products based upon a preconceived notion of what is best for the consumer (as occurred in the example above with qualified mortgages).</p> <p class="p1">What is “best” for the consumer is defined by the agency itself. Rulemaking has been opaque at best, though one interesting fact is that <i>Nudge </i>coauthor Richard Thaler is an official member of the agency’s advisory board. In fact, many of the targets of <i>Nudge </i>have become the targets of the agency. In addition to complex mortgages, the agency has targeted add-on products like credit card insurance and overdraft protection. These latter products represent what behavioral economists call “shrouded fees,” which they claim are meant to mislead consumers into making unwise purchases.</p> <p class="p1">In congressional testimony, law professor Todd Zywicki explained how the resulting decline in overdraft fees has also caused a precipitous decline in free checking accounts. Since the passage of Dodd-Frank, the number of banks offering free checking accounts has declined by half, with the accompanying result that banks have doubled the required account balance needed to maintain these checking accounts.</p> <p class="p1">The underlying trouble with closing credit channels is that this does nothing to boost consumer income. 
Instead, it simply takes away “undesirable” choices, as defined by bureaucrats, without replacing those choices with more viable ones.</p> <p class="p1">Overdraft protection in general has been a constant source of discussion within the CFPB and the Federal Reserve (Fed), where the CFPB is located. In 2010, the Fed required all banks to ask their customers to “opt in” to continue using overdraft services. Behavioral economists claimed that survey evidence indicated people do not really value the service and would not opt in if asked to use the service directly. But in fact, the opposite occurred, as those who most used the service were three times as likely to opt in as normal users. The regulation of overdraft protection has since passed on directly to the CFPB, which has only continued the trend the Fed started.</p> <p class="p1">This example would seem to challenge the claim that people are not rational in their decision-making. Either people are using a service that does not benefit them because they are not rational, and therefore should become aware of this when given information required by regulators, or they were rational in using the service in the first place and would, therefore, obviously choose to opt back into the service when asked.</p> <p class="p1">Some behavioral economists have instead argued that these supposed “high-value” users are only highly deluded and are now calling for restrictions on overdraft fees in general, despite the fact that other evidence shows closing off such channels encourages the use of payday lenders, another perennial target of the CFPB, or even loan sharks.&nbsp;</p> <p class="p1">This exposes a larger issue with using behavioral economics as a platform for policy prescription—it is not clear ex ante what behavior is considered rational and what is not. Shifting the definition of what constitutes rational behavior undermines the scientific basis for behavioral remedies. 
The result is a series of just-so stories that can appeal to the very same biases behavioral economics seeks to redress (i.e., confirmation bias among regulators).</p> <p class="p1"><b>Why Should the Financial Industry Care?</b></p> <p class="p1">The massive number of financial regulations that emerged from Dodd-Frank has perhaps obscured the growing influence of behavioral economics in this policy arena. But once adopted, regulations can become very hard to undo, particularly when they reflect a larger political movement, in this case propelled by a growing portion of the academy. Bottom line, behavioral economics is here to stay and will likely continue to drive regulatory reform in financial markets.</p> <p class="p1">While this will be welcomed by some who appreciate a more nuanced framework for addressing consumer missteps, others will be troubled by the idea that an agency can target products based on bureaucrats’ ideas of what is best for the consumer. The examples above show how this heavy-handed approach, guided by academic thinking, can lead to poor outcomes—not only for the financial industry but for financial consumers as well.</p> <p class="p1">The example of overdraft protection specifically demonstrates the growing influence of behavioral economics in this policy area. Use of overdraft protection declined in 2010, resulting in a loss of $2 billion to the industry itself. This is why a better framework is needed for dealing with consumer finance. Behavioral economics will most assuredly be part of the discussion. But that should be tempered by an understanding of how politics and regulatory reform work in practice and what constraints on government activity are needed to keep consumers’ true interests at heart.</p> <p class="p1">The financial industry can provide help in this area in a number of ways. 
First, it can provide its own set of private nudges to help consumers make choices—but in a way that reflects the greater feedback and competitive pressures of the marketplace, as opposed to the less nuanced direction of government bureaucracy. To a large extent, the industry already does provide such nudges, but pointing to examples where “choice architecture” is clearly improved for the consumer could in part help challenge the notion that only government can improve people’s choices.</p> <p class="p1">On that note, the financial industry should be prepared to show evidence that its consumers are indeed happy with the products they receive. Marketplace solutions have already arrived with rating sites like Angie’s List, Yelp, etc. But the industry bears a still greater burden: consumer satisfaction alone is not enough, because a product must also coincide with what regulators believe to be appropriate. Greater specificity from regulators in what they are looking for in the set of choices open to the consumer would be ideal, though regulators rarely state this explicitly. Firms face the uncertainty of what regulators will “choose” for consumers based on the questionable advice of behavioral economists.</p> <p class="p1">This brings us to regulators themselves. Regulators should be challenged on what criteria they use to define their “normal” consumer. Studying just how people arrive at their choices and what parts of the environment trigger different responses is fascinating work that can potentially lead to a better set of choices for consumers. But premature emphasis on policy solutions risks stretching this new work beyond its competence. After all, if people have limited abilities to make decisions, then we must understand not just the behavior of financial companies and financial consumers, but the behavior of financial regulators as well. 
Only when we study the “choice architecture” of all three can we begin to understand how to arrive at better choices in practice.</p> Mon, 18 Jul 2016 15:53:16 -0400