Mercatus Site Feed Preparing for the Future of Artificial Intelligence <h5> Publication </h5> <p class="p1">The Office of Science and Technology Policy (OSTP) has requested comments pertaining to the governance of artificial intelligence (AI) technologies.&nbsp;</p> <p class="p2"><span style="font-size: 12px; background-color: white;">The Technology Policy Program of the Mercatus Center at George Mason University is dedicated to advancing knowledge of the impact of regulation on society. It conducts careful and independent analyses employing contemporary economic scholarship to assess policy issues from the perspective of the public interest.&nbsp;</span></p> <p class="p1">We write here to comment on the appropriate policy framework for AI technologies at this nascent stage of their development and to make the case for prudence, patience, and a continuing embrace of “permissionless innovation.” Permissionless innovation refers to the idea that “experimentation with new technologies and business models should generally be permitted by default. Unless a compelling case can be made that a new invention will bring serious harm to society, innovation should be allowed to continue unabated and problems, if they develop at all, can be addressed later.”&nbsp;</p> <p class="p1">Policymakers may be tempted to preemptively restrict AI technologies out of an abundance of caution over the risks these new innovations might pose. However, an examination of the history of US technology policy demonstrates that these concerns can be adequately addressed without quashing a potentially revolutionary new industry.&nbsp;</p> <p class="p1">Specifically, as policymakers consider the governance of AI, they would be wise to heed the lessons of our recent experience with the Internet. 
The United States made permissionless innovation the basis of Internet policy beginning in the early 1990s, and it soon became the “secret sauce” that propelled the rise of the modern digital revolution.&nbsp;</p> <p class="p1">If policymakers wish to replicate America’s success with the Internet, they need to adopt a similar “light-touch” approach for the governance of AI technologies. To highlight the benefits of permissionless innovation, the Mercatus Center at George Mason University has recently published a book, a series of law review articles, and several agency filings that explain what this policy vision entails for different technologies and sectors. A summary of the major insights from these studies can be found in a recent Mercatus Center paper called “Permissionless Innovation and Public Policy: A 10-Point Blueprint.”&nbsp;</p> <p class="p1">If one’s sole conception of a technology comes from Hollywood depictions of dystopian science fiction or killer robotic systems run amok, it is understandable that one might want to use the force of regulation to clamp down decisively on these “threats.” But these fictional representations are just that: fictional. In reality, AI technologies are both far more benign and far more remarkable than these depictions suggest.</p> <p class="p1">The economic benefits of AI are projected to be enormous. One recent study used benchmarks derived from methodologically conservative studies of broadband Internet, mobile phones, and industrial robotics to estimate that the economic impact of AI could be between $1.49 trillion and $2.95 trillion over the next ten years. With less strict assumptions, the economic benefits could be greater still.</p> <p class="p1">However, some skeptics are already making the case for preemptive regulation of AI technologies. The rationales for control are varied, including concerns ranging from deindustrialization to dehumanization, as well as worries about the “fairness” of the algorithms behind AI systems. 
&nbsp;</p> <p class="p1">Due to these anxieties associated with AI, some academics argue that policymakers should “legislate early and often” to “get ahead of” these hypothetical problems. Specifics are often in short supply, with some critics simply hinting that “something must be done” to address amorphous concerns.&nbsp;</p> <p class="p1">Other scholars have provided more concrete regulatory blueprints, however. They propose, among other things, the passage of broad-based legislation such as an “Artificial Intelligence Development Act,” as well as the creation of a federal AI agency or possibly a “Federal Robotics Commission” or “National Algorithmic Technology Safety Administration.” These proposed laws and agencies would establish a certification process requiring innovators to subject their technologies to regulatory review to “ensure the safety and security of their A.I.” Or, at a minimum, such agencies would advise other federal, state, and local officials and organizations on how to craft policy for AI and robotics.&nbsp;</p> <p class="p1">Such proposals are based on “precautionary principle” reasoning. The precautionary principle refers to the belief that new innovations should be curtailed or disallowed until their developers can prove that they will not cause any harm to individuals, groups, or specific entities, or run afoul of existing laws, cultural norms, or traditions.</p> <p class="p1">It is certainly true that AI technologies might give rise to some of the problems that critics suggest. And we should continue to look for constructive solutions to the potentially thorny problems they raise. That does not mean that top-down, technocratic regulation is sensible, however.&nbsp;</p> <p class="p1">Traditional administrative regulatory systems have a tendency to be overly rigid, bureaucratic, and slow to adapt to new realities. 
This is particularly problematic as it pertains to the governance of new, fast-moving technologies.</p> <p class="p1">Prior restraints on innovative activities are a recipe for stagnation. By focusing on preemptive remedies aimed at hypothetical problems that may never come about, regulators run the risk of making bad bets based on overconfidence in their ability to predict the future. Worse yet, by preempting beneficial experiments that yield new and better ways of doing things, administrative regulation stifles the sort of creative, organic, bottom-up solutions that will be needed to solve problems that may be unforeseeable today.</p> <p class="p1">This risk is perhaps more pronounced when dealing with AI technologies. Debating <i>how </i>“artificial intelligence” should be regulated makes little sense until policymakers define <i>what </i>it actually entails. The boundaries of AI are amorphous and ever changing. AI technologies are already all around us—examples include voice-recognition software, automated fraud detection systems, and medical diagnostic technologies—and new systems are constantly emerging and evolving rapidly. Policymakers should keep in mind the rich and distinct variety of opportunities presented by AI technologies, lest regulations more appropriate for one kind of application inadvertently stymie the development of another.</p> <p class="p1">Toward that end, we suggest that a different policy approach for AI is needed, one that is rooted in humility and a recognition that we possess limited knowledge about the future.&nbsp;</p> <p class="p1">This does not mean there is no role for government as it pertains to AI technologies. 
But it does mean that policymakers should first seek out less restrictive remedies to complex social and economic problems before resorting to top-down proposals that are preemptive and proscriptive.&nbsp;</p> <p class="p1">Policymakers must ensure they fully understand the boundaries and promise of all of the technologies they address. Many AI technologies pose little or no risk to safety, fair market competition, or consumer welfare. These applications should not be stymied due to an inappropriate regulatory scheme that seeks to address an entirely separate technology. They should be distinguished and exempted from regulations as appropriate.</p> <p class="p1">Other AI technologies may warrant more regulatory consideration if they generate substantial risks to public welfare. Still, regulators should proceed cautiously.&nbsp;</p> <p class="p1">To the extent that policymakers wish to spur the development of a wide array of new life-enriching technologies, while also looking to devise sensible solutions to complex challenges, they should consider a more flexible, bottom-up, permissionless innovation approach as the basis of America’s policy regime for AI technologies.</p> <p class="p2">&nbsp;</p> Fri, 22 Jul 2016 13:56:24 -0400 Government Report Finds That ACA Medicaid Enrollees Much More Expensive Than Expected <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The Department of Health and Human Services’ (HHS) annual <a href=""><span class="s2">report</span></a> on Medicaid’s finances contains a stunning update: the average cost of the Affordable Care Act’s Medicaid expansion enrollees was nearly 50% higher in fiscal year (FY) 2015 than HHS had projected just one year prior. 
Specifically, HHS found that the ACA’s Medicaid expansion enrollees cost an average of $6,366 in FY 2015—49% higher than the $4,281 amount&nbsp;that the agency projected in last year’s report.</span></p> <p class="p1"><span class="s1">The government’s chief financial experts appear not to have anticipated how states would respond to the federal government’s 100% financing of the cost of people made eligible for Medicaid by the ACA. It appears that the enhanced federal funding for the ACA expansion population has led states to set outrageously high capitation rates—the amount government pays insurers—for the ACA Medicaid expansion population. The rates are much higher than the amounts&nbsp;for previously eligible Medicaid adult enrollees and suggest that states are inappropriately funneling federal taxpayer money to insurers, hospitals, and other health care interests through the ACA Medicaid expansion.</span></p> <p class="p1"><span class="s1">The magnitude of HHS’ error reveals a major flaw in the government’s ability to estimate the ACA’s costs, and worse, that the actual costs of the ACA’s Medicaid expansion appear much higher than expected. Both problems require the immediate attention of policymakers.<b>&nbsp;</b></span></p> <p class="p1"><span class="s1"><b>Medicaid Expansion Enrollees Are Much More Expensive Than Expected</b></span></p> <p class="p1"><span class="s1">Most experts, particularly proponents of the ACA, <a href=""><span class="s2">projected</span></a> that newly eligible adult enrollees would be less expensive than previously eligible adult enrollees. For example, HHS’ financial and actuarial experts <a href=""><span class="s2">projected</span></a> that adult Medicaid enrollees made eligible by the ACA would be 30% less costly than previously eligible adults enrolled in Medicaid. 
Apparently, their models did not account for states responding to the incentive of the elevated reimbursement rate to spend freely.</span></p> <p class="p1"><span class="s1">In last year’s Medicaid <a href=""><span class="s2">report</span></a>, HHS estimated that newly eligible adults had an average cost 19% higher ($5,517) in FY 2014 than the average cost for previously eligible adults ($4,650). In projecting future per enrollee costs, HHS’ experts assumed “that the effects of pent-up demand and adverse selection” would substantially diminish after 2014. HHS projected that the per enrollee cost of the newly eligible adults would decline by 22% in FY 2015 and would be about 11% less than the cost for previously eligible adults.</span></p> <p class="p1"><span class="s1">It turns out that those projections were way off. Instead of a decline in per enrollee costs from FY 2014 to FY 2015, the newly eligible adult per enrollee cost increased significantly, reaching an estimated $6,366. HHS now estimates that the newly eligible adult Medicaid enrollees cost about 23% more than the previously eligible Medicaid enrollees in FY 2015. It is worth noting that pregnant women are included in the previously eligible Medicaid enrollment category; without them, the differences would be even more pronounced.</span></p> <p class="p1"><span class="s1">According to the 2015 report, HHS’ actuaries and financial experts expected much lower managed care capitation rates for the ACA expansion population than actually occurred. But the higher payment rates should not be too surprising given the incentives created for states by the elevated federal reimbursement rate for the expansion population.</span></p> <p class="p1"><span class="s1">The elevated rate presents states with an incentive to create high fees for services commonly used by expansion enrollees, as well as high capitated payment rates for the insurers participating in Medicaid managed care. 
The health care interest groups within the states, particularly hospitals and insurers, benefit from the higher rates while federal taxpayers are left footing the bill. Although HHS expects that a risk-sharing program will return money to the government (the agency anticipates receiving 9% of the payments back, lowering the per enrollee costs of newly eligible enrollees to $5,001 for FY 2014 and $5,796 for FY 2015), risk sharing creates an incentive for insurers to spend freely since unspent funds generally have to be returned. Moreover, the elevated federal reimbursement rate removes the incentive for states to make sure that insurers are not overspending on providers since overpayments come at the expense of federal, not state, taxpayers.</span></p> <p class="p1"><span class="s1"><b>ACA’s Medicaid Expansion Costs Are Increasing</b></span></p> <p class="p1"><span class="s1">In the 2015 report, HHS indicated that Medicaid spending reached $554.3 billion in FY 2015—5% higher than its projection ($529.0 billion) for FY 2015 from the previous year’s report and 12.1% above FY 2014 spending. In March, before the release of this new data, the Congressional Budget Office (CBO) increased its projection of federal spending for Medicaid by $146 billion over the 2016-2025 period—a substantial escalation from its projection just one year earlier. 
According to CBO, “the number of people estimated to have been enrolled in Medicaid in 2015 who were made eligible for the program by the ACA was significantly higher than … previously projected.” As CBO digests the much higher per expansion enrollee costs, its future estimates of Medicaid’s costs will undoubtedly increase.</span></p> <p class="p1"><span class="s1"><b>Congress Needs to Act</b></span></p> <p class="p1"><span class="s1">Recent <a href=""><span class="s2">evidence</span></a> that new Medicaid enrollees receive only about 20 to 40 cents of benefit for each dollar of spending on their behalf and that Medicaid expansion in Oregon was <a href=""><span class="s2">not</span></a> related to significant health improvements has already prompted major concerns about the ACA Medicaid expansion. The much higher than expected costs of expansion enrollees should heighten that concern and prompt robust oversight from federal policymakers. There is much work to be done after these troubling findings. As a start, HHS should make all the data available, including the average costs of expansion enrollees by state. In addition, Congress should closely scrutinize managed care contracts that states are making with insurers as well as any actions that HHS is taking to guard against outrageously high federal payments for the expansion population.</span></p> Thu, 21 Jul 2016 10:47:57 -0400 Stop Bleeding Red Ink, Make America Sustainable Again <h5> Expert Commentary </h5> <p class="p1"><span class="s1">The Congressional Budget Office recently released its long-term budget outlook. There isn't much new there; we are still in the red, and it will only continue to get worse. Considering the extent of the problem, you would think someone on the campaign trail would pay attention. 
Yet no presidential candidate really is.</span></p> <p class="p1"><span class="s1">First, CBO projects that the federal public debt-to-GDP ratio will go from its current 75 percent (up from 39 percent in 2008) to 86 percent in 2026 and 141 percent in 2046. On the deficit side, CBO projects that by 2020, our deficit level will reach $1 trillion, up from its current level of $534 billion. Today's deficit-to-GDP ratio is 2.9 percent, and it may be close to 5 percent in 10 years and 8.8 percent in 2046.</span></p> <p class="p1"><span class="s1">There are a lot of assumptions going into these projections. As we know, a small change in these assumptions can have a significant impact. For instance, the newest projections show a slight improvement over previous projections because of lower-than-expected interest rates. However, CBO warns, a 1 percentage point increase in interest rates would propel the debt-to-GDP level to 188 percent. Gross debt would be much higher.</span></p> <p class="p1"><span class="s1">In addition, we know that many of these assumptions (e.g., that there will not be a depression in the next 30 years and that the unemployment rate will stay consistently at 5 percent over the next 30 years) are unlikely to materialize, which would make the final numbers look way worse than they do now.</span></p> <p class="p1"><span class="s1">But even without assuming the worst, CBO talks about our dire fiscal outlook, "with debt growing larger in relation to the economy than ever recorded in U.S. history." Indeed, down the road, debt is projected to reach much higher levels than in the aftermath of World War II, when it stood at 106 percent of gross domestic product. But today's debt levels are more worrisome than those of the 1940s. For one thing, the debt levels in the '40s were the product of significant increases in war spending, which naturally went down after the war. 
In addition, the postwar era experienced a fast-growing economy, which also helped lead to major reductions in debt levels.</span></p> <p class="p1"><span class="s1">That is not going to happen today. CBO projects weak economic growth all the way to 2046, along with large increases in spending levels. That means that unless we get a major breakthrough in technology or a life-altering discovery (which could happen, of course), I wouldn't count on a repeat of the post-WWII combination of strong growth and shrinking deficits and debt.</span></p> <p class="p1"><span class="s1">But debt and deficits are only a symptom of a deeper problem: spending is growing faster than revenue. While revenue will grow from 18.2 percent of GDP today to 19.4 percent by 2046 (when the 50-year average will be 17.4 percent), spending will explode from 21.1 percent of GDP today to 28.2 percent of GDP in 2046 (when the 50-year average will be 20.2 percent).</span></p> <p class="p1"><span class="s1">The drivers of our future debt, CBO reminds us, are still the so-called entitlement programs — government-provided health care spending, in particular. It doesn't mean that Social Security is not a problem, because it is — as is the large growth in interest payments on our debt. But you wouldn't know that by listening to the vague policy options on the campaign trail or in Washington, where talks of expanding Social Security, adding a public option to the Affordable Care Act and not touching Medicare are very popular.</span></p> <p class="p1"><span class="s1">Each day of the Republican National Convention had a different theme. Monday's theme was "Make America Safe Again." Another was "Make America First Again." 
Maybe someone should suggest that we "Make America Sustainable Again."</span></p> Thu, 21 Jul 2016 10:40:43 -0400 Recessions Don't Have The Same Impact On Every City <h5> Expert Commentary </h5> <p class="p1"><span class="s1">It has been just over seven years <a href=""><span class="s2">since the Great Recession</span></a> ended. The national economy has been expanding—<a href=""><span class="s2">albeit slowly</span></a>—over the last seven years, but there are still some measures, such as the <a href=""><span class="s2">labor force participation rate</span></a>, that have yet to fully bounce back. Even though the national statistics indicate a growing economy, <a href=""><span class="s2">some areas are still struggling.</span></a></span></p> <p class="p1"><span class="s1">The U.S. is a large country made up of hundreds of local economies, each with its own mix of industries and residents along with different local policies. The economic fluctuations of these local economies determine the business cycle of the country as a whole, and during any national recession some local economies may be doing fine. But exactly how much local variation is concealed by the national measures?</span></p> <p class="p1"><span class="s1">A <a href=""><span class="s2">new paper published in the Journal of Urban Economics</span></a> sheds some light on this question by examining the business cycles of local economies. The authors—Maria Arias and Charles Gascon of the St. Louis Fed and David Rapach of Saint Louis University— have created monthly economic activity indices for the 50 most populated Metropolitan Statistical Areas (MSA) using data from 1990 to 2015. This period <a href=""><span class="s2">covers three national recessions</span></a> (July 1990 – March 1991, March 2001 – Nov. 2001, and Dec. 
2007 – June 2009) and the indices show how the economies of each of these MSAs performed during these economic downturns.</span></p> <p class="p1"><span class="s1">The table below is taken from the paper and lists the 50 MSAs along with the dates of their recessions over this 26-year period.</span></p> <p class="p1"><a href=""><img src="" width="575" height="533" /></a></p> <p class="p1"><span class="s1">The table shows that many of these MSAs did not experience a recession in the early 1990s. In fact, only 26 of the 50 had their first recession (columns 2 and 3) in the 90s. The later recessions were more widespread: 32 out of 50 in the early 2000s and 49 out of 50 during the most recent recession. The only exception during the Great Recession was Oklahoma City, which, according to this study, has not experienced a single recession since 1990.</span></p> <p class="p1"><span class="s1">In addition to the variation across recessions, there is also variation across MSAs within a recession. In the early 90s, Los Angeles’ recession lasted over three years (3/90 to 4/93) while Memphis’ started a month later and lasted only a year (4/90 to 4/91). Detroit’s version of the Great Recession lasted nearly four years (10/05 – 6/09) while Boston’s was just over a year (7/08 – 8/09).</span></p> <p class="p1"><span class="s1">There is even variation across MSAs in the same state: neither Columbus nor Cincinnati experienced a recession in the early 90s while Cleveland had one that lasted almost two years. 
The authors also point out that the “dot-com” recession of the early 2000s had a large effect on the tech hubs of San Francisco and San Jose while Sacramento, also in California, did not experience a recession during that time period.</span></p> <p class="p1"><span class="s1">Table 4 from the paper gives us a clearer picture of the overlap between each MSA’s economy and the national economy.</span></p> <p class="p1"><a href=""><img src="" width="575" height="339" /></a></p> <p class="p1"><span class="s1">The columns show the number of months (out of 305) that the local economy was expanding when the national economy was expanding (columns 1 and 8); was in a recession when the national economy was in a recession (columns 3 and 9); was in a recession when the national economy was expanding (columns 4 and 10); and was expanding when the national economy was in a recession (columns 5 and 11). Columns 6 and 12 show the percentage of months in which the local and national economy were in the same phase of the business cycle.</span></p> <p class="p1"><span class="s1">Atlanta (96%) and Charlotte (95%) closely track the national economy while Detroit (78%) and Hartford (80%) were in the recession phase more often than the national economy. New Orleans’ match rate of only 35% is an outlier, and the authors attribute some of its poor economic performance <a href=""><span class="s2">to Hurricane Katrina</span></a>, which occurred near the middle of the sample period (2005).</span></p> <p class="p1"><span class="s1">The authors also examine the relationship between the severity of local economic recessions and various MSA characteristics. They find robust evidence that MSAs with less-educated populations and with <a href=""><span class="s2">more inelastic housing supplies</span></a> experience more severe recessions. 
An inelastic supply of housing means that the quantity of housing is not very responsive to changes in the price of housing.</span></p> <p class="p1"><span class="s1">There is a large body of evidence showing that the proportion of educated residents in a city <a href=""><span class="s2">has a positive effect on subsequent economic and population growth</span></a>. It is not surprising to see that the economies of MSAs with more educated populations are more resilient as well.</span></p> <p class="p1"><span class="s1">The authors note that the effect of housing supply elasticity is also consistent with other studies. Places with more <a href=""><span class="s2">inelastic housing supplies experience larger housing price fluctuations</span></a>, which can make these areas more susceptible to “boom-bust” housing cycles. There is also evidence <a href=""><span class="s2">that households decrease their spending in response to changes in housing net worth</span></a>, and since economic shocks that decrease housing demand will have larger price effects in areas with more inelastic housing supplies, it follows that these areas will experience larger declines in spending and thus longer and more severe recessions on average.</span></p> <p class="p1"><span class="s1">A map (below) from the study shows that areas that experienced some of the largest increases and subsequent decreases in housing prices prior to the Great Recession—Jacksonville, Tampa, and Orlando in Florida and Riverside, Sacramento, and Las Vegas in the West—were in a recession 6 months before the country as a whole (shading indicates an MSA in recession, according to the study).</span></p> <p class="p1"><a href=""><img src="" width="575" height="271" /></a></p> <p class="p1"><span class="s1">The finding that an inelastic housing supply can deepen or prolong a recession is another reason for local policymakers to free up their housing markets. 
<a href=""><span class="s2">Other research</span></a> argues that local housing restrictions make housing more expensive, especially in the most productive areas of the country, which makes it harder for people to migrate to those areas. Ultimately this results <a href=""><span class="s2">in less GDP and makes us all worse off</span></a>.</span></p> <p class="p1"><span class="s1">The economic variation across the U.S. is considerable. Such variation calls into question the usefulness of top-down, federal fiscal policy designed to smooth out recessions. At any given time only some local economies are contracting, and <a href=""><span class="s2">the haphazard way in which fiscal stimulus is often implemented</span></a> may lead to some economies overheating while others are left languishing.</span></p> Thu, 21 Jul 2016 10:33:07 -0400 Against Regulatory Complexity <h5> Expert Commentary </h5> <p class="p1"><span class="s1">With the July 21 anniversary of the Dodd-Frank Wall Street Reform and Consumer Protection Act now upon us, it’s a good time to reflect on how this type of Byzantine legislation spawns a convoluted network of tangled regulations.&nbsp;</span></p> <p class="p1"><span class="s1">When recently unveiling his Financial CHOICE Act, House Financial Services Committee Chairman Jeb Hensarling <a href=""><span class="s2">highlighted</span></a> a key principle behind his efforts to combat this overgrowth: “Simplicity must replace complexity.” The chairman’s focus on regulatory complexity is appropriate.</span></p> <p class="p1"><span class="s1">In many ways, regulations are like a computer’s operating system, establishing processes and parameters within which programs must operate. But anyone who has undergone the experience of “upgrading” an operating system only to find her computer sluggish and unresponsive knows that complexity is not always a desirable feature. 
Steven Teles, a political scientist at Johns Hopkins, made a similar comparison when he <a href=""><span class="s2">famously referred</span></a> to American policy as a “kludgeocracy,” an ever-expanding series of “inelegant patch(es)” meant to solve short-term problems, but which ultimately hinder system performance.</span></p> <p class="p1"><span class="s1">A recent <a href=""><span class="s2">analysis</span></a> showed that Dodd-Frank accounted for nearly 30,000 new regulatory restrictions — more than all other laws passed during the Obama administration combined. These new regulations, authorized by a Congress in crisis mode, were piled on top of more than one million existing regulatory restrictions. Even former Senator Chris Dodd, one of the bill’s namesakes, <a href=""><span class="s2">admitted</span></a> just after the bill’s passage that “no one will know until this is actually in place how it works.” Scholars subsequently <a href=""><span class="s2">argued</span></a> that the regulatory uncertainty exacerbated by Dodd-Frank could explain the slow recovery. At the time, however, some facts were clear: Dodd-Frank would increase regulatory complexity, induce uncertainty, and line the pockets of regulatory compliance experts.</span></p> <p class="p1"><span class="s1">To an unprecedented degree, simply ascertaining the relevance of regulations stemming from an act of Congress now requires regulatory compliance expertise. To illustrate, consider a simple visualization of regulatory restrictions originating from another major financial regulatory law, the Sarbanes-Oxley Act of 2002. Sarbanes-Oxley, which dealt with audits and financial reporting, affected public companies in all sectors of the economy and induced some regulations that specifically targeted a handful of industries. 
Textual analysis of those regulations shows that five industries were directly targeted by regulations from two federal agencies:</span></p> <p class="p1"><img height="388" width="500" src="" /></p> <p class="p1"><span class="s1">Sarbanes-Oxley was, of course, a significant regulatory overhaul in its own right. In 2012, the Wall Street Journal Editorial Board went so far as to <a href=""><span class="s2">call it</span></a> one of the reasons for slow economic growth. Furthermore, much of the effect of Sarbanes-Oxley stems from the creation of the Public Company Accounting Oversight Board, a regulatory entity that awkwardly straddles the public-private divide with considerable control over auditing firms and — indirectly — the public companies they audit.</span></p> <p class="p1"><span class="s1">Nonetheless, even allowing for the additional complexity of referencing accounting standards that are not formally published as regulations, Sarbanes-Oxley is a model of simplicity compared to Dodd-Frank. Consider a similar visualization of the agency-industry relationships emerging from Dodd-Frank — which, for the sake of visualization, is limited to only 10 agencies and 10 industries. In fact, at least 32 different agencies have promulgated rules under the statutory authority of Dodd-Frank:</span></p><p><img height="492" width="500" src="" /></p> <p class="p1"><span class="s1">In the post-Dodd-Frank world, understanding which regulations are relevant to a business’s activities has become immensely more difficult. Many sectors of the economy were newly exposed to regulations from a multitude of unfamiliar agencies. 
Duplicative and contradictory rules became a <a href=""><span class="s2">fact of life</span></a>.</span></p> <p class="p1"><span class="s1">In 1788, James Madison <a href=""><span class="s2">worried</span></a> that laws may become “so voluminous that they cannot be read, or so incoherent that they cannot be understood.” He was right to worry: the current regulatory code is so complex and voluminous that, rather than spend <a href=""><span class="s2">three years</span></a> reading it, I helped create text analysis software that uses machine learning to assess the probability that a given regulatory restriction targets a specific industry. But even with the insights of machine learning and text analysis software — or regulatory compliance experts who bill by the hour — considerable uncertainty remains. Regulatory agencies themselves are increasingly <a href=""><span class="s2">unfamiliar</span></a> with their own regulations.</span></p> <p class="p1"><span class="s1">When there are more rules in place than anyone can read, and interpretation of those rules and their scope is determined by the regulators themselves, businesses must pay for experts to filter the rules that are truly relevant from the rest. Meanwhile, businesses must also keep an eye on new rules coming down the pipeline and the possibility of reinterpretation of old rules. For both federal regulations and statutes, an irrelevant requirement remains irrelevant only until a bureaucrat, or a <a href=""><span class="s2">federal prosecutor</span></a>, decides otherwise.</span></p> <p class="p1"><span class="s1">Regulatory complexity engenders uncertainty. That may not be a problem for some politicians, but for anyone who must comply with regulations, complexity and uncertainty can be paralyzing. 
Simplifying the complex regulatory regime imposed by Dodd-Frank is an application of another lesson from the world of computer programming: iterative design can correct serious errors and reduce unnecessary complexity.</span></p> Thu, 21 Jul 2016 10:20:22 -0400 Where The Financial CHOICE Act Goes Wrong <h5> Expert Commentary </h5> <p class="p1"><span class="s1">New landmark financial services legislation recently introduced by Chairman Jeb Hensarling (R., Texas) of the House Financial Services Committee would go a long way toward addressing Dodd-Frank’s mistaken approach to financial regulation. Dodd-Frank assumes government can run financial markets through a combination of micromanagement, regulatory discretion, and high penalties.</span></p> <p class="p1"><span class="s1">The Financial CHOICE Act looks to make market actors—not government bureaucrats—responsible for running financial firms and picking up the pieces when they run them into the ground. Out of step with the rest of the legislative package, however, is a plan to hike corporate penalties.</span></p> <p class="p1"><span class="s1">The desire to appear “tough on Wall Street” is understandable given critics’ predictable mischaracterization of the bill as “a wet kiss for Wall Street.” These same critics welcome corporate penalties even when innocent shareholders foot the bill. Rather than embrace the critics’ flawed logic, Congress may want to look at how such policies worked in the past.</span></p> <p class="p1"><span class="s1">When Congress passed a massive bailout for the savings and loan industry in the Financial Institutions Reform, Recovery, and Enforcement Act of 1989 (FIRREA), the legislation included banking sector penalties of up to $1 million per day for violations of banking rules. 
The structure of those penalties allowed for draconian recoveries for small infractions—the banking law equivalent of a million-dollar ticket for speeding on the highway.</span></p> <p class="p1"><span class="s1">And in recent years, the Department of Justice has abused FIRREA by bringing multibillion-dollar cases built on increasingly novel interpretations of the act. Just a couple of months ago, the Second Circuit rejected one of these cases and overturned a FIRREA penalty of more than a billion dollars against Bank of America for lack of culpable intent (even under the low culpability standards of that law).</span></p> <p class="p1"><span class="s1"><b>Corporate penalty increases are not the right answer</b></span></p> <p class="p1"><span class="s1">Increasing those penalties by 50%, as the House draft would do, is not the answer. If it were, why stop the bidding there? If raising financial sector civil penalties is an absolute good, then why not a 100% increase in daily penalties? This legislation includes a requirement that financial regulatory agencies conduct cost-benefit analysis for new rules. Where is the cost-benefit analysis for these penalty enhancements?</span></p> <p class="p1"><span class="s1">The draft legislation also includes provisions enhancing penalty powers for the Securities and Exchange Commission that build on the controversial SEC enforcement powers and penalties granted in the Dodd-Frank Act and the Sarbanes-Oxley Act.</span></p> <p class="p1"><span class="s1">The draft increases corporate fraud penalties such that they mirror investor harm. It’s difficult to argue against that concept on its face, but corporate penalties are almost always paid by shareholders—the very same shareholders who were harmed by wrongdoing in the first place.</span></p> <p class="p1"><span class="s1">The draft contains enhanced penalties for insider trading as well. Again, Congress could learn from the past. 
While insider trading is illegal, it’s not clearly defined in statute. The SEC has a long history of expanding the doctrine of insider trading to include activity that most reasonable people wouldn’t consider culpable.</span></p> <p class="p1"><span class="s1">The SEC’s enforcement of the Foreign Corrupt Practices Act (FCPA) is another case in point for the dangers of penalties poorly metered to harm. The typical FCPA case has resulted in corporate legal and compliance expenses of <a href=""><span class="s2">nearly 10 times the actual fine</span></a>. In 35 years of enforcing this law, and after billions in settlements and hundreds of cases, <a href=""><span class="s2">only one has actually gone to trial.</span></a></span></p> <p class="p1"><span class="s1"><b>Optimal penalty design is a dangerous job to leave in the hands of Congress</b></span></p> <p class="p1"><span class="s1">While the bill contains helpful due process reforms, like requiring the SEC chief economist to weigh in on the impact of settlements on shareholders, the reforms will not counteract the leverage that poorly metered penalties offer in settlement discussions outside of trial. A comprehensive examination of the balance between financial penalties on the books and due process might begin with the FCPA.</span></p> <p class="p1"><span class="s1">The financial penalty enhancements in this legislation must be considered in light of this history. When the law is defined one case at a time, in a highly charged polemical environment that encourages pursuit of non-culpable activity, the rule of law comes under threat.</span></p> <p class="p1"><span class="s1">The fact is that optimal penalty design is a dangerous job to leave in the hands of Congress. Over the last 25 years, polemics have always trumped policy on this issue in the legislative branch. 
A better approach would be to order the financial regulators to impanel an advisory commission of enforcement experts, including government lawyers, defense counsel, and economists, to advise them on both the optimal design of civil and criminal penalties and the problem of unpredictable doctrine creep in the law.</span></p> <p class="p1"><span class="s1">The CHOICE Act has much to commend, but incorporating historical lessons would make it more effective at ensuring that the right parties are held accountable for their misdeeds without harming the innocent in the process.</span></p> Thu, 21 Jul 2016 10:03:53 -0400 Airplane Speeds Have Stagnated for 40 Years <h5> Publication </h5> <p class="p1">This year marks the 40th anniversaries of two of the greatest achievements in manned flight. In 1976, US military pilot Eldon W. Joersz set the still-standing airspeed record of 2,193.2 mph in the Lockheed SR-71 “Blackbird.” That same year, the Concorde introduced the world to supersonic commercial travel with the first passenger flights to break the sound barrier.</p> <p class="p3"><a href=" copy.jpg"><img src=" copy.jpg" width="575" height="444" /></a></p> <p class="p1">In the decades to follow, the speed of aviation stagnated—and even regressed. The SR-71 retired from service in 1999, and no commercial airliner in service today flies at Mach 1 (the speed of sound), much less the Mach 2 speeds reached by the Concorde. The time required to fly from Los Angeles to New York or across the Atlantic Ocean is no different than it was 40 years ago for the average airline passenger.<span style="font-size: 12px; background-color: white;">&nbsp;</span></p> <p class="p1">The initial progress and current stagnation of airplane speeds are plain to see by looking at manned, air-breathing flight airspeed records since the Wright brothers’ first flight on December 17, 1903, which is estimated to have reached 6 mph. 
Airspeed record data have been gathered since 1905 by the Fédération Aéronautique Internationale (FAI), an international organization that sets standards for tracking, measuring, and verifying aviation records.</p> <p class="p1">The blue trend line shows the rapid pace at which the airspeed record was bested in the first three-fourths of the 20th century. The fastest growth in achievable airspeeds occurred in the decades following the Second World War. The sound barrier was famously first broken by Chuck Yeager in 1947, but the FAI did not recognize Yeager’s flight for the record because the plane was rocket-powered and launched by a drop from a B-29 bomber. The first supersonic flight recorded by the FAI was made in 1955 by US Air Force Colonel Horace Hanes.</p> <p class="p1">Shortly thereafter, the FAI recorded its first Mach 2 flight in 1958. That speed was matched in commercial flight less than two decades later, a testament to the high level of innovation in air travel during the mid-20th century.</p> <p class="p1">What happened to this high level of innovation in air travel? Civil supersonic aviation was banned over the United States in 1973 because of fears that sonic booms would damage buildings and constitute an intolerable nuisance. The outright ban limited the market for the Concorde to transoceanic routes and destroyed incentives for research and development of new supersonic transports. Since 1973, airplane manufacturers have innovated on margins other than speed, and as a result, commercial flight is safer and cheaper than it was 40 years ago. But commercial flight isn’t any faster—in fact, today’s flights travel at less than half the Concorde’s speed.</p> <p class="p1">If we want to restore mid-century levels of aviation innovation and break the sound barrier again, we must first break regulatory barriers. The FAA should lift its ban on civil supersonic flight. 
Legitimate concerns about supersonic flight can be addressed directly through specific policies, such as a clear standard from the FAA for acceptable noise levels. It would be a shame to suffer another four decades of aviation stagnation.&nbsp;</p> Wed, 20 Jul 2016 14:29:17 -0400 Thirty Years after the Nobel: James Buchanan's Virginia Political Economy <h5> Events </h5> <p>Thirty years ago, in October 1986, James M. Buchanan was awarded the Nobel Prize in economics “for his development of the contractual and constitutional bases for the theory of economic and political decision-making.” His contributions in these areas as well as those in methodology, social philosophy, public policy economics, and political science continue to have a lasting influence on scholarship today.</p> <p>Please join us on October 6, 2016, for a keynote speech and panel discussion to reflect on the significance of Buchanan’s Nobel Prize and the various strands of influence his work has had in subsequent decades of scholarship. We will discuss his contributions in the fields of social and political philosophy, social contract theory, and constitutional political economy, together with his influence on the research of other prominent economic thinkers. In keeping with the F. A. 
Hayek Program’s view of political economy as a progressive research program, we will explore key themes in Buchanan’s research and see where they may lead us for the future of the discipline.</p> <p>From 2:00 to 3:30 p.m., <b><a href="">Michael Munger</a></b>, Professor of Political Science, Public Policy, and Economics at Duke University, will deliver a keynote address on James Buchanan’s contributions to political economy and social philosophy.</p> <p>From 4:00 to 5:30 p.m., we will have a roundtable discussion with:</p><ul><li><b style="font-family: inherit; font-style: inherit; background-color: white;"><a href="">David Schmidtz</a></b><span style="font-size: 12px; background-color: white;">, Kendrick Professor of Philosophy at the University of Arizona</span></li><li><b style="font-family: inherit; font-style: inherit; background-color: white;"><a href="">Barry Weingast</a></b><span style="font-size: 12px; background-color: white;">, Ward C. Krebs Family Professor of Political Science at Stanford University</span></li><li><b style="font-family: inherit; font-style: inherit; background-color: white;"><a href="">Luigi Zingales</a></b><span style="font-size: 12px; background-color: white;">, Robert C. McCormack Distinguished Service Professor of Entrepreneurship and Finance at the University of Chicago Booth School of Business</span></li></ul> <p>If you have any questions, please contact Martha Anderson at <a href=""></a></p> <p><b>About the Buchanan Speaker Series</b></p> <p>The Buchanan Speaker Series promotes Nobel laureate James Buchanan’s intellectual legacy by applying Buchanan’s ideas to the pressing matters of our time.</p> <p>James Buchanan moved to George Mason University in the early 1980s. His influence on the developing agenda at the Mercatus Center has been important in at least two ways. One is how it fostered a broad research and educational vision that seeks to embrace both political economy and social philosophy. 
As Buchanan once put it when establishing his first academic center at the University of Virginia in the late 1950s—the Thomas Jefferson Center for Studies in Political Economy—the faculty will</p> <blockquote><p style="padding-left: 30px;">“strive to carry on the honorable tradition of ‘political economy’—the study of what makes for a ‘good society.’ Political economists stress the technical economic principles that one must understand in order to assess alternative arrangements for promoting peaceful cooperation and productive specialization among free men. Yet political economists go further and frankly try to bring out into the open the philosophical issues that necessarily underlie all discussions of the appropriate functions of government and all proposed economic policy measures.”</p></blockquote> <p>Buchanan’s other lasting influence is his motto “dare to be different.” Mercatus is grounded in the intellectual traditions best exemplified by F. A. Hayek, but our scholars also draw from the best work in contemporary social science and the humanities. As Buchanan noted in a 1979 essay honoring Hayek, “The diverse approaches of the intersecting ‘schools’ must be the bases for conciliation, not conflict. 
We must marry the property-rights, law-and-economics, public-choice, Austrian subjectivist approaches.” At George Mason and the Mercatus Center this intellectual marriage has taken place.</p> Thu, 21 Jul 2016 14:19:30 -0400 The Rationalia Fallacy <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Science is a noble witness, but a nefarious judge.</span></p> <p class="p1"><span class="s1">Celebrity astrophysicist Neil deGrasse Tyson recently launched a hail of hubris across the internet with <a href=""><span class="s2">this tweet</span></a>: "Earth needs a virtual country: #Rationalia, with a one-line Constitution: All policy shall be based on the weight of evidence."</span></p> <p class="p1"><span class="s1">Tyson's fantasy – a rational, science-based society – is nothing new. France's Reign of Terror near the close of the 18th century was sanctified in the revolution's Temples of Reason. Stalin and Mao murdered tens of millions in pursuit of Marx's scientific socialism. In the half-century before World War II, the world's greatest scientific minds conjured up eugenics – a dark pseudoscience of human breeding – and browbeat servile policymakers into a scourge of forced sterilizations in America and genocide in Europe. Nazi Deputy Führer Rudolf Hess stated – probably sincerely – that "National Socialism is nothing but applied biology."</span></p> <p class="p1"><a href="">Continue reading</a></p> Mon, 18 Jul 2016 19:01:25 -0400 Why Banks Should Beware of “Misbehaving” <h5> Publication </h5> <p class="p1"><b>Introduction</b></p> <p class="p1">Over the last few decades, psychologists have challenged economists on the notion that people always make rational decisions. Economists, of course, recognize that people are not always perfectly rational. Modeling them as such often adds to the precision of the model’s result, without reducing its relevance. 
Put another way, economists assume that most of the time people act rationally enough that modeling them as perfectly rational does not get in the way of discovering new insights into human behavior.</p> <p class="p1">Nevertheless, behavioral psychologists found this rational choice–based method wanting and have amassed a sizeable body of research demonstrating certain “anomalies” in laboratory studies that break from rational choice predictions. For example, behavioral psychologists Amos Tversky and Daniel Kahneman famously claimed that people are susceptible to certain biases that make them more risk averse when facing gains (and more risk seeking when facing losses) than the standard rational choice model would predict. Furthermore, they claimed that framing choices in different ways elicits inconsistent behavior.</p> <p class="p1">These ideas eventually coalesced into the field known as “behavioral economics” and have since made their way into public policy. An example of this is the Consumer Financial Protection Bureau (CFPB), which regulates consumer credit products, such as mortgages and credit cards, and consumer credit providers, such as banks, payday lenders, and cell phone providers. This agency was largely influenced by behavioral economics in setting its organizational mission and goals, such as protecting consumers from exploitation and manipulation by credit providers.</p> <p class="p1">Despite these behavioral-based foundations (or perhaps because of them, as I will explain below), the CFPB has been criticized from both sides of the political divide for its aggressive bureaucratic expansion and failure to adhere to its original congressional mandate. 
Furthermore, the actions of the agency have directly led to a significant reduction in the volume of certain credit products (e.g., residential mortgages, auto loans) in a manner that calls into question whether the agency is helping or harming consumers.</p> <p class="p1">The purpose of this paper is to outline the impact of behavioral economics on public policy by examining its central influence on the CFPB. In particular, it explains how behavioral ideas have been converted into policies that fail to account for actual government practice, which has led to mixed results for consumers. While understanding just how people are susceptible to market influence is important, the premature application of behavioral economics to public policy risks undermining the goal of helping consumers.</p> <p class="p1"><b>What is Behavioral Economics?</b></p> <p class="p1">Behavioral economics, simply put, is psychology applied to traditional economic concepts. What is novel about this approach is that it couches its critique in a language economists can understand. So, for example, when people are more likely to insure against risk because they fear losses more than they enjoy gains, behavioral economists position this outcome within the standard utility maximization framework employed by economists, but with the added flourish of describing such behavior as exhibiting “loss aversion.”</p> <p class="p1">Best-selling books, including <i>Nudge</i>, <i>Predictably Irrational</i>, and <i>Thinking, Fast and Slow,</i> have provided the public with accessible entries into the world of behavioral economics. 
Be it by showing how we process information and awareness through two corresponding mental systems (<i>Thinking, Fast and Slow</i>) or exposing why we react differently while in a panicked state (<i>Predictably Irrational</i>) or demonstrating how government can be used to improve our everyday choices (<i>Nudge</i>), these books represent a growing and popular topic of inquiry among academics, policymakers, and even the general public.</p> <p class="p1">Whether this is a fad or something deeper, behavioral economics is already making a noticeable impact on several regulatory fronts, most notably in consumer finance, enough to be labeled by some as the “new paternalism.” For example, the CFPB implemented a provision that defined so-called “qualified mortgages,” a category of loans in which lenders adhere to certain parameters such as setting nonadjustable interest rates, determining the borrower’s ability to repay, etc. This is all predicated on the assumption that consumers do not understand what they are agreeing to—and that assumption, at the very least, constitutes a departure from the traditional justification for regulatory intervention, which is market failure.</p> <p class="p1">This policy outcome, like others from the CFPB as noted below, can be traced back to behavioral roots. In this case, it is from the book <i>Nudge</i>, which outlines a number of possible “soft” interventions into the marketplace to correct for common mistakes people make. The authors of <i>Nudge</i>, Richard Thaler and Cass Sunstein, have done more than anyone else to bring behavioral economics from mere laboratory studies of human behavior out into the world of policy. In chapter 8 of <i>Nudge</i>, they criticize mortgage products with low introductory interest rates and balloon payments as being too complicated for consumers to understand. 
They argue that products with simpler terms and conditions (e.g., a 30-year fixed mortgage) make choices easier for consumers and thus provide the standard by which all alternatives should be compared.</p> <p class="p1">The 2008 scribbling of two behavioral economists has become our new reality, as the financial industry must now work within regulations that penalize mortgage products that fail to adhere to federal guidelines. Qualified mortgages are restricted to those with fixed terms and interest rates. Mortgage products with features like adjustable rates and amortization fees are unlikely to pass muster. Banks can, of course, offer nonqualified mortgages, but they risk being sued by the borrower if they default, and there is a stigma associated with such a product label.</p> <p class="p1">Moreover, much of the financial industry has reacted to the implementation of the rule by withdrawing from the mortgage market altogether. The figure below, generated from data provided by the Mortgage Bankers Association, shows a steep reduction in the volume of residential loans in 2013, as originally reported by This decline is most pronounced with refinancing mortgage loans. This trend is in tandem with the implementation of the Ability-to-Repay and Qualified Mortgage Rule on January 10, 2014, which formally defined “qualified mortgages.”</p> <p class="p2">&nbsp;</p> <p class="p1">Figure 1. US Residential Loan Origination Trends</p> <p class="p3"><img height="444" width="575" src=" jpg.jpg" /></p> <p class="p1">Source: Mortgage Bankers Association, “Annual Mortgage Origination Estimates,” February 2016.</p> <p class="p1">The recent uptick in these originations indicates that the market may be finally adjusting to the new rules, though it is unclear whether volume will return to its previous level.</p> <p class="p1">This may be all well and good for those who believe we should all consume “plain vanilla” products. 
But for those who understand—and indeed build their businesses on—the flexibility that alternative credit products provide, the new regulations stifle credit markets in a way that most assuredly hurts not just the financial industry but consumers, too.</p> <p class="p1"><b>Behavioral Economics in the Consumer Finance Industry</b></p> <p class="p1">These interventions into what behavioral economists describe as the “choice architecture” of the marketplace constitute a very real and problematic constraint for the financial industry, as the figure above illustrates. This stems from the work of the CFPB—one of Washington’s newest agencies and a major part of the larger Dodd-Frank Wall Street Reform and Consumer Protection Act—which regulates virtually any consumer practice in the financial industry. Even practices that the agency was explicitly told to ignore, such as auto lending, have become significant targets for the agency’s efforts.</p> <p class="p1">That this new agency is so aggressive should be no surprise given its lack of congressional oversight or budgetary approval. Indeed, these features are so extraordinary that the constitutionality of the CFPB is now facing a challenge in the US Court of Appeals for the DC Circuit. While Congress is certainly no safety valve against bureaucracy, it can impose limits on certain excesses, particularly when those excesses affect the interests of constituents. The fact that the CFPB need not concern itself with the interests of the market participants it regulates, or the full range of consumers these regulations ultimately impact, is in large part responsible for the mixed results.</p> <p class="p1">What is novel about the agency is that its roots go deeply into the world of behavioral economics. Senator Elizabeth Warren, who was the driving force behind the agency’s inception, relied on behavioral assumptions in making her original case for the agency. 
Senator Warren later teamed up with well-known behavioral economist Oren Bar-Gill in an expanded law review article to make the case for an agency dedicated to consumer finance. They cited cognitive shortcomings that people exhibit—including dealing poorly with complicated information, displaying inertia in switching to new products, and not providing for their true long-term interests—as justifications for such an agency. The agency would, therefore, be justified in targeting products based upon a preconceived notion of what is best for the consumer (as occurred in the example above with qualified mortgages).</p> <p class="p1">What is “best” for the consumer is defined by the agency itself. Rulemaking has been opaque at best, though one interesting fact is that <i>Nudge </i>coauthor Richard Thaler is an official member of the agency’s advisory board. In fact, many of the targets of <i>Nudge </i>have become the targets of the agency. In addition to complex mortgages, the agency has targeted add-on products like credit card insurance and overdraft protection. These latter products represent what behavioral economists call “shrouded fees,” which they claim are meant to mislead consumers into making unwise purchases.</p> <p class="p1">In congressional testimony, law professor Todd Zywicki explained how the resulting decline in overdraft fees has also caused a precipitous decline in free checking accounts. Since the passage of Dodd-Frank, the number of banks offering free checking accounts has declined by half, and banks have doubled the account balance required to maintain these checking accounts.</p> <p class="p1">The underlying trouble with closing credit channels is that this does nothing to boost consumer income. 
Instead, it simply takes away “undesirable” choices, as defined by bureaucrats, without replacing those choices with more viable ones.</p> <p class="p1">Overdraft protection in general has been a constant source of discussion within the CFPB and the Federal Reserve (Fed), where the CFPB is located. In 2010, the Fed required all banks to ask their customers to “opt in” to continue using overdraft services. Behavioral economists claimed that survey evidence indicated people do not really value the service and would not opt in if asked to use the service directly. But in fact, the opposite occurred: the heaviest users of the service were three times as likely to opt in as typical users. The regulation of overdraft protection has since passed directly to the CFPB, which has only continued the trend the Fed started.</p> <p class="p1">This example would seem to challenge the claim that people are not rational in their decision-making. Either people are using a service that does not benefit them because they are not rational, and therefore should become aware of this when given information required by regulators, or they were rational in using the service in the first place and would therefore obviously choose to opt back into the service when asked.</p> <p class="p1">Some behavioral economists have instead argued that these supposed “high-value” users are simply deluded, and they now call for restrictions on overdraft fees in general, despite the fact that other evidence shows closing off such channels encourages the use of payday lenders, another perennial target of the CFPB, or even loan sharks.&nbsp;</p> <p class="p1">This exposes a larger issue with using behavioral economics as a platform for policy prescription—it is not clear ex ante what behavior is considered rational and what is not. Shifting the definition of what constitutes rational behavior undermines the scientific basis for behavioral remedies. 
The result is a series of just-so stories that can appeal to the very same biases behavioral economics seeks to redress (i.e., confirmation bias among regulators).</p> <p class="p1"><b>Why Should the Financial Industry Care?</b></p> <p class="p1">The massive number of financial regulations that emerged from Dodd-Frank has perhaps obscured the growing influence of behavioral economics in this policy arena. But once adopted, regulations can become very hard to undo, particularly when they reflect a larger political movement, in this case propelled by a growing portion of the academy. Bottom line, behavioral economics is here to stay and will likely continue to drive regulatory reform in financial markets.</p> <p class="p1">While this will be welcomed by some who appreciate a more nuanced framework for addressing consumer missteps, others will be troubled by the idea that an agency can target products based on bureaucrats’ ideas of what is best for the consumer. The examples above show how this heavy-handed approach, guided by academic thinking, can lead to poor outcomes—not only for the financial industry but for financial consumers as well.</p> <p class="p1">The example of overdraft protection specifically demonstrates the growing influence of behavioral economics in this policy area. Use of overdraft protection declined in 2010, resulting in a loss of $2 billion to the industry itself. This is why a better framework is needed for dealing with consumer finance. Behavioral economics will most assuredly be part of the discussion. But that should be tempered by an understanding of how politics and regulatory reform work in practice and what constraints on government activity are needed to keep consumers’ true interests at heart.</p> <p class="p1">The financial industry can provide help in this area in a number of ways. 
First, it can provide its own set of private nudges to help consumers make choices—but in a way that reflects the greater feedback and competitive pressures of the marketplace, as opposed to the less nuanced direction of government bureaucracy. To a large extent, the industry already does provide such nudges, but pointing to examples where “choice architecture” is clearly improved for the consumer could in part help challenge the notion that only government can improve people’s choices.</p> <p class="p1">On that note, the financial industry should be prepared to show evidence that its consumers are indeed happy with the products they receive. Marketplace solutions have already arrived with rating sites like Angie’s List, Yelp, etc. But the industry bears a still greater burden: consumer satisfaction alone is not enough, because a product must also coincide with what regulators believe to be appropriate. Greater specificity from regulators about what they are looking for in the set of choices open to the consumer would be ideal, though regulators rarely state their criteria explicitly. Firms face the uncertainty of what regulators will “choose” for consumers based on the questionable advice of behavioral economists.</p> <p class="p1">This brings us to regulators themselves. Regulators should be challenged on what criteria they use to define their “normal” consumer. Studying just how people arrive at their choices and what parts of the environment trigger different responses is fascinating work that can potentially lead to a better set of choices for consumers. But premature emphasis on policy solutions risks stretching this new work beyond its competence. After all, if people have limited abilities to make decisions, then we must understand not just the behavior of financial companies and financial consumers, but the behavior of financial regulators as well. 
Only when we study the “choice architecture” of all three can we begin to understand how to arrive at better choices in practice.</p> Mon, 18 Jul 2016 15:53:16 -0400 Rethinking Taxi Regulations: The Case for Fundamental Reform <h5> Publication </h5> <p class="p1"><span class="s1">New technology can cause significant changes in an industry, potentially improving both consumer welfare and governance. The initial reaction of many regulators to the advent of “ridesharing” platforms such as Uber and Lyft was either to outlaw them or to burden them with the same level of regulations as taxis. But policymakers are now beginning to take a new approach. They are aiming to achieve regulatory parity between ridesharing platforms and taxis by deregulating taxis. In a new study, “Rethinking Taxi Regulations: The Case for Fundamental Reform,” Mercatus research fellows Michael Farren and Christopher Koopman and senior research fellow Matthew Mitchell determine that taxi regulation is outdated in light of the transformative technology changes and business innovations of the last few years. Now is an opportune time for fundamental reform of the entire regulatory regime to create a fair, open, and competitive transportation market.</span></p> <p class="p3">BACKGROUND: CURRENT APPROACH TO TAXI REGULATION</p> <p class="p4">Most US cities, including the District of Columbia, extensively regulate the taxicab industry. These regulations are often defended on the theory that customers lack sufficient information to make good decisions about taxi drivers, companies, prices, and proper routes. The purpose of detailed regulation of the companies, their drivers, and their cars is to solve this alleged problem of informational disparity in the market. 
The District of Columbia for-hire industry, while less regulated than that of other cities such as New York, still shoulders a considerable burden, amounting to 33 regulatory procedures and up to $2,643 in costs to drive a single car as a taxi.</p> <ul class="ul1"> <li class="li5"><i>Regulations impacting drivers.</i> Drivers must submit multiple forms, including a medical history, letters of recommendation, a criminal background history, a driving record, and proof of tickets paid. Further, they must pay hundreds of dollars in fees to drive.</li> <li class="li5"><i>Regulations impacting vehicles.</i> Before a car can be driven as a taxi in DC, it must have several additions installed, including a taximeter (costing $150), an approved dome light (costing as much as $700), and—in some cases—a vehicle condition monitoring device (costing between $100 and $169). The vehicle must also get a standard paint job (costing between $400 and $600) to comply with coloring and marking regulations, submit to annual inspections, and comply with vehicle retirement rules. Additionally, taxi companies must pay a licensing fee of $275 per vehicle.</li> <li class="li5"><i>Regulations for operation.</i> The District also sets rates and fees for customers, imposes procedures for collecting and remitting surcharges, and requires drivers to offer printed receipts. The regulations further mandate that drivers carry sufficient cash, display taxi-related signs and identification, maintain a record of all trips, carry insurance, and follow requirements regarding where and how passengers can be picked up.</li> </ul> <p class="p6"><span> </span></p> <p class="p3">COSTS OF REGULATION VS. BENEFITS OF NEW TECHNOLOGY</p> <p class="p7">Taxi Regulations Impose Significant Costs and Harm Consumers</p> <p class="p4">Taxi regulations undermine personal choice, voluntary exchange, and free and open entry. 
These regulations are often defended on the grounds that they protect consumers, but economists have demonstrated that protective regulations can undermine consumer welfare.</p> <ul class="ul1"> <li class="li5">The taxicab industry has “captured” the regulatory process so that regulations serve the interests of established taxi firms rather than the interests of consumers and newcomers.</li> <li class="li5">Many of the regulatory requirements, such as price controls, licensing fees, and mandated business practices, have undermined competition and growth in the industry to the detriment of consumers.</li> </ul> <p class="p7">New Technology Is Changing the Industry</p> <p class="p4">Ridesharing firms such as Uber and Lyft operate outside of the taxicab regulatory process and employ new technology to interact with consumers.</p> <ul class="ul1"> <li class="li5"><span class="s1">Between 2014 and 2016, business travelers increased their use of ridesharing firms from 8 percent to 46 percent, while they decreased their use of taxicabs from 37 percent to 14 percent.</span></li> <li class="li5">There are many benefits to the new ridesharing technology. It allows consumers to give feedback on their drivers and vice versa. This capacity for instant feedback ensures consistent, high-quality service because future consumers and drivers can understand more about each other before entering into a transaction.</li> <li class="li5">Ridesharing technology also decreases costs for consumers. In the District of Columbia, taxi fare estimates were 80 percent to 310 percent higher than UberX fare estimates, based on a sampling of some popular DC routes. These data demonstrate that consumers can benefit from the new technologies.</li></ul> <p class="p3">CONCLUSION</p> <p class="p4">Taxi regulations limit competition, thereby yielding higher prices, lower quality, and antiquated technologies and practices. Now is an opportune time to rethink the entire structure of taxi regulations. 
Policymakers should start with a blank slate, identifying the systemic market failure that the regulations aim to address and then proposing alternative solutions with their expected benefits and costs. These alternatives should include the option of no regulation. Only then can ridesharing regulatory policies best serve consumers.<span style="font-size: 12px; background-color: white;">&nbsp;</span></p> Wed, 20 Jul 2016 10:23:37 -0400 Does FDA Funding Increase Drug and Medical Device Innovation? <h5> Publication </h5> <p><b>INTRODUCTION</b></p><p>The US FDA receives funding from two sources: appropriations from the general fund and user fees paid by the regulated industries. Specifically, the drug industry funds FDA through the Prescription Drug User Fee Act (PDUFA), and the medical device industry funds FDA through the Medical Device User Fee Act (MDUFA). Both acts are considered successful because they require FDA to improve approval times for drugs and devices. However, decreased approval times have not resulted in more drug and device innovation.<span style="font-size: 12px;">&nbsp;</span></p> <p>In fact, the same number of products are still submitted for approval to FDA. They are just approved more quickly. FDA does not have an incentive to actually increase innovation—its only incentive is to meet its MDUFA and PDUFA approval times to keep its funding flowing. The expense of putting drugs and devices through this system is almost unimaginable. The cost of bringing low- to medium-risk 510(k) medical devices to market averages $31 million, $24 million (75 percent) of which is dedicated solely to attaining FDA approval within an average of about six months. Any significant improvement to the device requires reapplication. For higher-risk medical devices where there may be significant health gains, the costs are about $94 million, $75 million (80 percent) of which is dedicated to attaining FDA approval.</p> <p>For drugs, the situation is much worse. 
It costs an average of $2.6 billion simply to get a drug through the FDA process and onto the market. This does not include postmarket monitoring, the terms of which are laid out by FDA upon approval. These costs are up from about $1 billion in the 1983–1994 period.</p> <p>In addition, the primary laws governing devices and drugs are now 40 and 50 years old, respectively. These laws, in conjunction with other incentives, attenuate progress in the device and drug arenas. As one congresswoman describes it, “Health research moves at a rapid pace, but the federal drug and device approval process is in many ways a relic of another era.” Yet we continue to increase the funding and authority for FDA and assume that we will somehow boost innovation in medical products (drugs and devices) despite the growing obstacles. This has not happened.</p> <p>FDA has grown in both resources and statutory authority, and to continue those increases, it must meet user fee goals and avoid bad publicity.</p> Wed, 20 Jul 2016 12:18:37 -0400 Monetary Rules for a Post-Crisis World <h5> Events </h5> <p>Central banks' role in the Great Recession, and the lackluster recovery since, have revived interest in monetary rules. That revival raises crucial questions. 
Might the Federal Reserve and other central banks have performed better if they’d adhered to monetary policy rules? Could rules have avoided the crisis altogether? Can they avoid future crises? If so, which rules work best? Can a monetary policy rule work even in a world of near-zero, or negative, interest rates?</p><p>On September 7, the Mercatus Center at George Mason University and the Cato Institute’s Center for Monetary and Financial Alternatives will team up for a day-long academic conference, hosting a distinguished group of scholars, to explore these pressing questions about monetary policy rules.</p><p>Four panels will discuss:&nbsp;</p><ul><li><span style="font-size: 12px; background-color: white;">The Evolving Case for Monetary Rules</span><span style="white-space: pre;">&nbsp;</span></li><li><span style="font-size: 12px; background-color: white;">Monetary Rules and Monetary Stability</span></li><li><span style="font-size: 12px; background-color: white;">Monetary Rules and Emergency Lending&nbsp;</span></li><li><span style="font-size: 12px; background-color: white;">Monetary Rules in Light of the Crisis</span></li></ul><p>For additional details or questions, please contact Elizabeth Leibundguth at <a href=""></a></p><p><b>Scholars:</b></p><p><a href="">John B. Taylor</a>, Mary and Robert Raymond Professor of Economics, Stanford University and George P. Shultz Senior Fellow in Economics, Hoover Institution&nbsp;</p><p><a href="">David Laidler</a>, Professor Emeritus of Economics, University of Western Ontario&nbsp;</p><p><a href="">Mark Calabria</a>, Director of Financial Regulation Studies, Cato Institute</p><p><a href="">Robert Hetzel</a>, Staff Economist, Federal Reserve Bank of Richmond</p><p><a href="">Scott Sumner</a>, Ralph G. Hawtrey Chair of Monetary Policy and Director, Program on Monetary Policy, Mercatus Center and Professor of Economics at Bentley University</p><p><a href="">David Papell</a>, Joel W. 
Sailors Endowed Professor of Economics and Chair, Department of Economics, University of Houston&nbsp;</p><p><a href="">Perry Mehrling</a>, Professor of Economics, Barnard College, Columbia University&nbsp;</p><p><a href="">Kevin Sheedy</a>, Assistant Professor of Economics, London School of Economics</p><p><a href="">Walker F. Todd</a>, Trustee, American Institute for Economic Research, and former Assistant General Counsel and Economics Officer, Federal Reserve Bank of Cleveland</p><p><a href="">David Beckworth</a>, Senior Research Fellow, Program on Monetary Policy, Mercatus Center</p><p><a href="">Peter Ireland</a>, Murray and Monti Professor of Economics, Boston College</p><p><a href="">Miles Kimball</a>, Professor of Economics and Research Professor of Survey Research, University of Michigan Department of Economics and Survey Research Center</p><p><a href="">George Selgin</a>, Director, Center for Monetary and Financial Alternatives, Cato Institute</p><p><a href="">David Glasner</a>, Economist, Federal Trade Commission</p> Tue, 19 Jul 2016 16:01:39 -0400 Tracing the Roots of Today’s Fiscal Policy: How Systematic Deficits Got Their Start <h5> Publication </h5> <p>US federal debt held by the public is at a peacetime high of 75 percent of GDP, and the Congressional Budget Office (CBO) <a href="">expects it to reach</a> 86 percent by 2026 before soaring to 155 percent by 2046. Similarly, CBO projects annual deficits to triple in the next decade, from $438 billion in 2015 to $1.36 trillion in 2026. Entitlements and interest are the most widely discussed drivers of this fiscal outlook. However, the roots of today’s fiscal problems go much deeper, both historically and politically.</p> <p>When looking at the history of deficits and debt, most studies focus on the modern budget era (1974 onward) using either inflation-adjusted dollars or debt as a percentage of GDP. 
Instead, in our recent Mercatus Working Paper, <a href="">we present deficits and debt using dollars not adjusted for inflation</a>, going all the way back to the nation’s founding, because the nominal figures offer a way to compare short-term fiscal patterns over a long period. We find that deficit patterns gradually transform from infrequent deficits during emergencies to systematic deficits every year.</p><p><a><img src=" C1 fixed.png" alt="US Federal Deficit and Debt: 1702–1857" width="585" height="425" /></a></p><p><a><img src=" C2 fixed.png" alt="US Federal Deficit and Debt: 1858–1914" width="585" height="425" /></a></p> <p>The first chart shows that for most of the nineteenth century, the pattern was to run deficits only during periods of genuine national emergency (War of 1812, Mexican-American War, financial crises of the 1840s, and Civil War), but to run surpluses during normal times to pay down the accumulated debt. However, as the second chart demonstrates, between about 1880 and 1914, deficits became slightly more routine, and accumulated debt was paid down more slowly. After the exceptional period of the two World Wars, peacetime deficits became even more frequent and grew larger in magnitude, as seen in the third chart. Finally, the modern budget era, in the fourth chart, features large and growing deficits every year since 1974 (except 1998–2001), in both emergency periods and normal times.</p><p><a><img src=" C3 fixed.png" alt="US Federal Deficit and Debt: 1915–1973" width="585" height="425" /></a></p><p><a><img src=" C4 fixed.png" alt="US Federal Deficit and Debt" width="585" height="425" /></a></p> <p>We trace the political roots of this gradual change to the decades surrounding the turn of the twentieth century. Between about 1880 and 1930, two <i>informal</i> <i>rules</i> that govern fiscal policymakers shifted. 
First, new norms of federal spending began to emerge in the electorate, as new demands for spending emerged in the areas of economic security at the household level and economic stability at the macroeconomic level. Second, norms began changing within government as well, as elected office transformed from a temporary duty into a career-long endeavor.</p> <p>By the early decades of the twentieth century, current voters and professionalized politicians had powerful incentives to increase spending. The new informal norms were thereby codified into durable legislation that would determine how future policymakers must spend. This is the root of entitlement programs and debt-financed stimulus, and this is how chronic deficits got their start. Although the modern budget era has seen numerous legislative and constitutional attempts to constrain spending—or shifts in the <i>formal constraints</i>—the reality is that policymakers have difficulty controlling deficits, and federal debt has marched steadily toward unsustainable levels. In short, <a href="">informal rules have trumped formal constraints</a>.</p><p>Whether today’s policymakers intend it or not, the evolution of informal and formal fiscal rules continues to shape today’s fiscal policy outcomes. This has led to chronic deficits, mounting debt, a dizzying complexity of tax and budget procedures, and unsustainably large unfunded obligations—all adding up to a bad and worsening fiscal outlook. Any serious discussion of reform must start by recognizing the current incentive structure embedded in the budget process. Piling on more formal constraints, without addressing the shifts in the informal rules, will be futile.</p> Wed, 20 Jul 2016 10:23:13 -0400 Labor Department Works Overtime to Harm Workers <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Starting Dec. 
1, the Department of Labor will force businesses to pay millions of salaried workers time and a half for every hour over 40 that these workers work weekly. Great news for salaried workers, right?</span></p> <p class="p1"><span class="s1">Wrong. The luckiest of these workers will experience no change in their pay or work hours while many less fortunate workers will be priced out of their jobs.</span></p> <p class="p1"><span class="s1">Here's an example: Jones is a night manager at O'Burger's Restaurant. He works an average of 45 hours each week for a weekly salary of $750. Because the Labor Department calculates Jones' hourly rate of pay based on a 40- (not 45-) hour work week, it concludes that Jones' hourly rate of pay is $18.75 (which is $750 divided by 40). Under the new Labor rule, Jones must be paid time and a half — $28.13 — for every hour each week that he works over 40.</span></p> <p class="p1"><span class="s1">Therefore, if O'Burger's continues to work Jones 45 hours weekly, it will have to pay Jones each week, not $750, but $890.65. That's a 19 percent increase in O'Burger's cost of employing Jones for an average of 45 hours weekly.</span></p> <p class="p1"><span class="s1">Unlike when they discuss minimum wages, Obama administration officials here correctly understand that government-enforced hikes in the cost of employing labor prompt employers to cut back on the use of now-more-costly labor. 
Specifically, Labor predicts that, in this example, O'Burger's will simply reduce Jones' weekly hours from 45 to 40 and hire an additional worker — at straight time — to perform the other five hours of work.</span></p> <p class="p1"><span class="s1">Yet while Labor officials are correct that employers will take steps to avoid the higher costs of employing workers such as Jones, these officials are mistaken in their prediction of how employers will do so.</span></p> <p class="p1"><span class="s1">The most obvious and easiest way that O'Burger's will protect itself from the higher mandated labor cost is to cut Jones' hourly rate of pay to $15.79. At this base rate, Jones will get paid a total of $750 weekly when he works a 45-hour week and is paid time-and-a-half for five of those hours. For Jones, nothing changes.</span></p> <p class="p1"><span class="s1">Another possible way for O'Burger's to adjust to Labor's mandate is to cut the value of Jones' benefits — for example, offer Jones fewer days of paid vacation or contribute less to Jones' pension plan.</span></p> <p class="p1"><span class="s1">The government is naive to suppose that O'Burger's would instead simply reduce Jones' weekly hours to 40 and hire an additional worker for the other five hours. Because Jones has managerial duties, it's just not feasible to shut Jones down after 40 hours each week and to then have someone else do the managing for the remaining five hours.</span></p> <p class="p1"><span class="s1">Not all workers will be as lucky as Jones. Because of the minimum wage, some workers' hourly pay rate — unlike Jones' — will be too low to cut in order to keep these workers' weekly pay unchanged. The effect of the overtime-pay mandate on these workers will be to raise employers' costs of employing them. 
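The arithmetic behind these figures is easy to check. A minimal Python sketch (the function names are illustrative; the $750 salary, 45-hour week, and 40-hour divisor come from the example above):

```python
def weekly_pay(base_rate, hours, ot_multiplier=1.5):
    """Weekly pay under the overtime rule: straight time for the
    first 40 hours, time and a half for each hour beyond 40."""
    ot_hours = max(hours - 40, 0)
    return base_rate * 40 + base_rate * ot_multiplier * ot_hours

def neutral_base_rate(target_weekly, hours, ot_multiplier=1.5):
    """Base hourly rate that keeps total weekly pay at the old
    salary once overtime must be paid."""
    ot_hours = max(hours - 40, 0)
    return target_weekly / (40 + ot_multiplier * ot_hours)

salary = 750.0                            # Jones's current weekly salary
implied_rate = salary / 40                # $18.75, the imputed hourly rate
mandated = weekly_pay(implied_rate, 45)   # about $890.63 (the article rounds
                                          # the overtime rate to $28.13 first,
                                          # giving $890.65)
cut_rate = neutral_base_rate(salary, 45)  # about $15.79
```

The second function shows why the mandate can leave Jones's total pay unchanged: $750 divided over 47.5 "effective" hours (40 straight plus 5 at time and a half) yields the $15.79 base rate cited above.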
With the cost of employing these workers forced higher by the government, some of these workers will simply lose their jobs.</span></p> Wed, 20 Jul 2016 10:32:08 -0400 Why the FDA Prevents Well-Informed Patients From Getting New Drugs <h5> Expert Commentary </h5> <p class="p1"><span class="s1">In April of this year, an advisory panel to the Food and Drug Administration (FDA) voted <span class="s2"><a href="">against</a>&nbsp;</span>approval of the drug eteplirsen, a treatment for Duchenne muscular dystrophy. This came despite testimony from patients and their families who believed the drug gave them hope and who voiced their support for the drug’s approval. The conflict between FDA experts and patients underscores the contradictions in the current drug-approval process. In addition, it raises the question of whether the FDA should be in a position to prevent well-informed patients from gaining access to drugs.</span><span style="font-size: 12px; background-color: white;">&nbsp;</span></p> <p class="p3"><span class="s1">Under the 1962 amendment to the Food, Drug, and Cosmetic Act, the FDA must ensure both the safety and the efficacy of drugs. The FDA panel <a href=""><span class="s2">faulted</span></a> Sarepta, the drugmaker behind eteplirsen, for failing to provide sufficient proof that its treatment is in fact effective. Consequently, the panel recommended against approving the drug.</span></p> <p class="p1"><span class="s1">The fundamental problem with the FDA's approval process is that it is trying to answer two distinct questions: which drugs the agency should permit and which drugs it is willing to recommend. Since the approval process allows only a single answer to both questions, the FDA does not permit any new drug unless it is also willing to recommend it. Consequently, the process needlessly restricts patient access to new drugs.</span></p> <p class="p1"><span class="s1">Safety trials determine which drugs the FDA is willing to allow on the market. 
The public health disaster in 1937 when the drug&nbsp;<a href=""><span class="s2">Elixir Sulfanilamide</span></a> fatally poisoned over 100 people clearly demonstrated the need for safety regulation. In the wake of the tragedy, Congress required all new drugs to receive FDA approval before they could be marketed. The safety regulations were further tightened in 1962 after the drug&nbsp;<a href=""><span class="s2">Thalidomide</span></a>, commonly prescribed to pregnant women to reduce morning sickness, was found to cause birth defects in newborn babies.</span></p> <p class="p1"><span class="s1">The primary aim of this standard is to ensure that new drugs do not lead to adverse effects in patients; it is not concerned with therapeutic qualities of the drugs. For example, under the compassionate use clause, the FDA <a href=""><span class="s2">allows</span></a> patients access to unapproved drugs that are still undergoing clinical trials. At this stage, the FDA does not have sufficient evidence that the drug would actually help patients. However, the FDA allows expanded access only when there is sufficient data on the experimental drug's safety. It does not allow access to drugs with serious safety concerns.</span></p> <p class="p1"><span class="s1">In contrast, efficacy trials determine which drugs the FDA is willing to recommend. The agency ensures that the new drug is not only safe but also offers substantial health benefits to patients. Since many physicians are too <a href=""><span class="s2">busy</span></a> to <a href=""><span class="s2">examine</span></a> the published clinical studies on a new drug's efficacy, they rely on the FDA's seal of approval to tell them which drugs will actually benefit their patients.</span></p> <p class="p1"><span class="s1">The two standards serve different purposes and therefore require different processes. The approval process makes sense when enforcing drug safety — the FDA is correct to keep unsafe drugs from the market. 
Yet the same restrictive approach does not make sense in the case of the efficacy standard. The problem is that new drugs must demonstrate effectiveness across a broad population — clinical trials testing efficacy typically require <a href=""><span class="s2">thousands</span></a> of volunteers. Since the FDA does not have specific information on patients' cases, it must ensure the drug's efficacy in a wide variety of cases.</span></p> <p class="p1"><span class="s1">Yet, as the compassionate use cases indicate, there may be valid reasons for physicians to recommend experimental drugs to their patients. There may be no alternative treatments, as in the case of Duchenne muscular dystrophy, or patients may not react well to the existing treatments.</span></p> <p class="p1"><span class="s1">A better approach to ensuring drugs' efficacy would be a certification process. Under such a process, the FDA would bestow a seal of approval on drugs that have demonstrated their efficacy. The certification would provide guidance to busy physicians with regard to a new drug's efficacy. At the same time, it would not restrict patients' access to safe drugs. In fact, separating the safety and efficacy processes may relieve the political pressure that patient groups place on the FDA to approve drugs prematurely.</span></p> Thu, 14 Jul 2016 11:42:02 -0400 David Cameron's Great Blunder <h5> Expert Commentary </h5> <p class="p1"><span class="s1">First Britain, now Italy; what’s next? The Brexit has triggered the EU’s worst political crisis.</span></p> <p class="p1"><span class="s1">And to make matters worse, Britain’s Prime Minister David Cameron steps down on Wednesday, having bet his entire political future on the likelihood that a vote for Brexit would lose at the polls.&nbsp;Pollsters and policy experts alike did not take the likelihood of a vote for Brexit seriously. 
After all, if David Cameron and his advisors believed for a minute that they’d lose at the polls, they would never have initiated a referendum.</span></p> <p class="p1"><span class="s1">Their arrogance rests on a conviction that when countries join the global economy and become more prosperous, they gradually experience a sociopolitical “convergence.” In part, it’s the same narrative that has dictated Western global development and defense policy, and global diplomacy, since the Cold War.</span></p> <p class="p1"><span class="s1">In one of David Cameron’s favorite books, “Why Nations Fail,” political economists Daron Acemoğlu and James Robinson explain that there is an iron law of political and economic development, and that countries that attempt to deviate from its grip will fail.</span></p> <p class="p1"><span class="s1">But the iron law did not sway England’s masses. Every conceivable economic argument was made by policy pundits from the entire spectrum of British politics, including both the Tories and Labour, to communicate the risks of exiting. Informed that leaving was the riskier option, why then did the British population exhibit such incredible boldness? Who — the experts or the population — made the bigger miscalculation?</span></p> <p class="p1"><span class="s1">Maybe the voters sensed that building a “European identity” upon shared fiscal and commercial interests is, after all, nothing but “a grand illusion”? That’s the very phrase the late historian Tony Judt used in a prescient essay of the same name written 20 years ago. He described what he called the “reductionist fallacy, the curiously nineteenth century belief shared by classical economists and Marxists alike, that social and political institutions and affinities naturally follow economic ones.”</span></p> <p class="p1"><span class="s1">Led by this fallacy of socioeconomic convergence, Britain’s political elite concluded that a Brexit win was unthinkable. 
They were unable to grasp why a majority of British voters consider that the nation state, and not the EU, offers them the best assurance that the social fabric will not be torn apart by globalization. Far better suited to civic responsibility and effective participation, a well-governed nation state can promise to ease the disruptive effects of transnational trade, protect the disadvantaged and distribute resources according to some shared standard of equity. Thus Judt cautioned that it is wrong to read nationalism and the nation-state as anachronisms; they can be the most modern of institutions.</span></p> <p class="p1"><span class="s1">The current generation of globalists who stand with Cameron and the nation’s cosmopolitan elites can be excused for believing that history moves in one direction. During their lifetime, they have seen the Berlin Wall torn down, China turn capitalist, Marxism lose its grip on Third World revolutionaries, and Sweden and France give up their socialist yearnings.</span></p> <p class="p1"><span class="s1">Yet there are deviations — and very noticeable ones — from the globalization narrative that cultural change must follow economic integration. These examples show up in countries with narrow democratic credentials and limited democratic experience. China, Russia, Saudi Arabia, Turkey, Hungary, the Czech Republic and Poland all actively participate in economic globalization while embracing nationalist visions of the future.</span></p> <p class="p1"><span class="s1">Vladimir Putin dramatizes Russia’s past military glory, and Recep Tayyip Erdoğan dreams of a Turkish-centric Middle East. Xi Jinping seeks to revive the ancient Silk Road. The Saudis seek to spread Koranic laws to the world’s Muslim populations.</span></p> <p class="p1"><span class="s1">These deviations from a shared cosmopolitan ethos might have been attributable to immature institutions. 
But this time it is the British, a central pillar and source of global liberalism, defecting from the very narrative the country’s own history has inspired.</span></p> <p class="p1"><span class="s1">Yet it is also wrong to view the Brexit as signifying a desire to move backward toward an idyllic little England for the English only. No country in Europe plays a greater role on the global stage. Britain’s contributions to the defeat of fascism, communism and authoritarianism far exceed those of either France or Germany. No country in Europe has done more in the past 25 years to welcome immigrants. No country in Europe does more to promote open economies, and few countries have benefited more.</span></p> <p class="p1"><span class="s1">The narrative that economic growth will replace historic identities with a more cosmopolitan one no longer bears repeating. That narrative cannot chart, solve or analyze the crisis that Europe now faces.</span></p> <p class="p1"><span class="s1">The Brexit will not mean that Britain has withdrawn from globalization. What the Brexit shows us is that change does not move in one direction. That vote brings an end to illusions of global uniformity, not to globalization itself. The future course of globalization is likely to be charted by populations that can be both global and national, and in this regard the Brits, by choosing memory over materialism, may continue to be trendsetters.</span></p> Thu, 14 Jul 2016 11:28:24 -0400 Make Disaster Recovery a Success <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Frontline and NPR recently aired a special on the "<a href=""><span class="s2">Business of Disaster</span></a>," which investigated issues with flood insurance and aid distribution after Hurricane Sandy. 
In it, they find that disaster victims have been systematically underpaid on flood insurance claims, aid programs have been slow to distribute funds, and homeowners have spent years dealing with regulatory red tape just to get back to the homes and communities they love.</span></p> <p class="p1"><span class="s1">The main takeaway from the special was that the public-private partnerships tasked with implementing disaster recovery programs failed to provide the resources and support communities needed to rebuild after the storm. For instance, the companies that were selected to handle flood insurance found ways to <a href=""><span class="s2">increase their profits</span></a> and <a href=""><span class="s2">limit payouts</span></a>. Additionally, the Federal Emergency Management Agency failed to adequately oversee the actions of its partners in the flood insurance program and to enforce best practices. Similarly, the <a href=""><span class="s2">NYC "Build It Back" program</span></a> has failed to effectively distribute funds to homeowners in need and has had several issues with unqualified contractors.</span></p> <p class="p2"><span class="s1">The entangled relationship between public and private entities results in high administrative and overhead costs with little relief to those who need it the most. As Brad Gair, a New York City disaster recovery manager, <a href=""><span class="s3">said during the special</span></a>, "Did we put a bunch of money out? Yes. Is everybody mad? Yes. Did people get what they needed to get back into a home? No."</span></p><p class="p2"><span class="s1"><a href="">Continue reading</a></span></p> Wed, 20 Jul 2016 10:32:51 -0400 Bursting the Pentagon Spending Bubble <h5> Expert Commentary </h5> <p class="p1"><span class="s1">Next year the White House will have a new occupant, but one thing is almost certain not to change: a U.S. foreign policy driven by mind-boggling sums of taxpayer money. 
With the exception of Bernie Sanders, all the major-party presidential candidates during this election season have said they would oppose military spending cuts. Even the relatively non-interventionist Sen. Rand Paul wanted to bust the military spending caps put in place by the Budget Control Act of 2011, while the other Republican candidates essentially fought over who wanted to increase spending most.</span></p> <p class="p2"><span class="s1">Donald Trump, the presumptive GOP nominee, has left no doubt that he intends to keep the military gravy train rolling. Trump may have said that President George W. Bush lied about Iraq (a war that he claims, falsely, to have been publicly against since the start), but he has nonetheless earned the endorsements of former Vice President (and Hawk in Chief) Dick Cheney; interventionist former U.N. Ambassador John Bolton; and Arizona Sen. John McCain, a man who has rarely met an international problem he does not want to fix with American force.</span></p> <p class="p2"><span class="s1">Don't count on presumptive Democratic nominee Hillary Clinton to cut Pentagon spending, either. Clinton's track record of supporting more and bigger interventions paid for with a growing military budget makes her virtually indistinguishable from the Republican White House hopefuls. As Reason's Nick Gillespie put it, "a vote for Hillary is a vote for war."</span><span style="font-size: 12px; background-color: white;">&nbsp;</span></p> <p class="p2"><span class="s1">Sadly, the American public is also warming up to the idea of more military spending. A recent Pew Research Center poll found that 35 percent say the U.S. should increase spending on national defense; that's a 12 percentage-point hike just since 2013. As Pew notes, "Most of the increase has come among Republicans. 
Fully 61% of Republicans favor higher defense spending, up 24 percentage points from 2013."</span></p><p class="p2"><span class="s1"><a href="">Continue reading</a></span></p> Wed, 13 Jul 2016 15:04:01 -0400 Are Political Parties Obsolete? <h5> Expert Commentary </h5> <p class="p1"><span class="s1">This year’s struggle in both major parties — Democratic and Republican — to identify and rally behind a clear-cut presidential nominee tells us that there is more going on than just the actions of unorthodox candidates.</span></p> <p class="p1"><span class="s1">Although confronting a “bridge too far” in gaining the nomination, Sen. Bernie Sanders of Vermont has shown what social media funding mechanisms can do. Remarkably, he raised more money using crowd-funding techniques than his better-established Democratic opponent, former Secretary of State Hillary Clinton.</span></p> <p class="p1"><span class="s1">On the Republican side, Donald Trump refuses to be painted with traditional GOP hues and colors. He has shown how a person with no previous experience in party politics, who never held public office, can self-fund a primary campaign and generate massive turnouts for rallies and votes. And he has done so without spending huge amounts on advertising — by using Twitter to communicate constantly with his supporters.</span></p> <p class="p1"><span class="s1">If people interested in holding office can communicate directly, at low cost, with millions of voters — and can use crowd-funding techniques or private wealth to fund their candidacies — are political parties really necessary?</span></p> <p class="p1"><span class="s1">In a 2011 Huffington Post piece titled “The Parties Are Over,” former presidential aspirant Gary Hart talked about the history of America’s party system and then, with a rush to the moment, had this to say: “In recent years, however, the parties’ entire role and therefore their power has been collapsing. 
If a candidate is clever enough and has something to say, he or she can get direct access to the media. As political entrepreneurs, most candidates now raise their own financing and depend on money from the parties less and less.”</span></p> <p class="p1"><span class="s1">He went on to observe, “Candidates form their own policy groups or court the flourishing idea forums that span the political spectrum. Self-confident and ambitious candidates put themselves forward for any office they desire, up to and including the presidency, without seeking the approval of party officials. Individual office-seekers form their own coalitions by shopping for support among the smorgasbord of interest groups.”</span></p> <p class="p1"><span class="s1">Put another way, the old party approach for identifying and bringing forth candidates may be becoming obsolete, and for one simple reason: The old way may be too costly for the participants, relative to other available approaches.</span></p> <p class="p1"><span class="s1">But what do I mean by costly?</span></p> <p class="p1"><span class="s1">Writing in 1937, Nobel laureate Ronald H. Coase explained that business and other firms, such as political parties, exist because it’s too costly for people who wish to organize production and marketing to contract on a daily basis with large numbers of individual workers and owners of capital. The firm, as we know it, organizes resources with low-cost, long-term contracts. Indeed, that is the essence of the firm. It’s an intermediary between individuals with labor services and capital to sell and the market for final goods and services. By eliminating lots of transactions, the firm (or political party!) reduces costs.</span></p> <p class="p1"><span class="s1">Coase recognized that if transaction costs fall for other reasons, say because of smartphone technologies, then firms could become more temporary, or even nonexistent. Alternate approaches for organizing work would emerge. 
People might meet at Starbucks and organize business ventures (or another political party) while transacting on Facebook! (Next time you’re in a coffee shop, look around.)</span></p> <p class="p1"><span class="s1">But what about political parties? They are the counterpart of Coase’s firms. As Jonathan Rauch pointed out recently in The Atlantic (July/August 2016), party leadership forms an intermediary that lies between voters, primaries and caucuses, on one side, and the final achievement of political success by aspiring candidates, on the other. Until now, party leadership (and the ability of parties to gather resources) has been a lower-cost substitute for an individual candidate going it alone. But if the cost of gathering resources directly by candidates falls dramatically, relative to the old party mechanism’s costs, then market-savvy politicians will rely less on political parties. Party leaders and parties themselves will lose power unless they change their ways.</span></p> <p class="p1"><span class="s1">Let’s face it: Smartphone and social media technologies have significantly reduced the cost of organizing political resources. But will parties go away completely? Hardly. Consider smartphone-supported Uber, which offers transportation services in lots of cities — but the old cabbies are still present, although they may operate differently. In fact, many of them now offer smartphone apps, making it easier to “hail a cab.” Political parties are here to stay, but how they operate will change.</span></p> Wed, 13 Jul 2016 14:33:58 -0400