Housing building fatigue: an alternative interpretation of the origin of the Great Recession.

Economists have pointed out three contributing factors for the Great Recession, all of them closely related to the financial and banking system: leverage, complexity, and liquidity (Blanchard, 2013). In this article, we try to advance an alternative interpretation of the origin of the Great Recession by looking at the housing market and at its links to mortgage borrowers’ expectations. More precisely, we look at one reason why a mortgage-backed loan (MBL) may go underwater: what we call “mortgage default risk derived from housing building fatigue”. Here, we advance the first hypothesis of a series of three that compose the entire proposal. We assume that the Great Recession was largely an effect of a substantial decrease in consumer confidence. Likewise, we consider that the Great Recession arguably has many similarities with the U.S. recession of 2001. The work cites those two crises in order to determine stylized facts of U.S. recessions.

In the third part of the article we look at the logic behind the boom in housing prices, situating the discussion in the frame of capital inflows to the housing market. We see this as a reasonable movement since nominal interest rates in the financial markets were at record lows. After presenting the macroeconomic context based on the stylized facts, we propose a model of what we call “mortgage default risk derived from housing building fatigue”, followed by linear regressions to estimate the correlation posited in the first hypothesis. Finally, we present some conclusions about the limits of the work and the open windows for further research.

Hypothesis:

We propose the following hypothesis as a contributing factor to the origin of the Great Recession: default risk on mortgage payments is higher for low-income Americans, and mortgages on old housing buildings exacerbate that risk. Low-income Americans tend to dwell in aged housing buildings, and there are reasons to believe that a mortgage loan goes underwater more often as the housing building gets older. This basically means that the older the building, the greater the chances of default on mortgage payments.

We put forth the following three hypotheses extending the contributing factors of the Great Recession:
Hypothesis number one: low-income Americans tend to live in aged housing buildings.
Hypothesis number two: mortgage loans go underwater more often as housing buildings get older.
Hypothesis number three: default on mortgage payments is higher both for low-income Americans and for mortgage loans on old housing buildings.

Methodology:

We use descriptive statistics to give an account of our hypotheses. We first take data from the American Housing Survey, which is processed by the United States Census Bureau. This survey inquires into various aspects of housing buildings; data on the age of buildings and household income are described using it. Then, simple linear regressions are run to establish the correlation posited in the first hypothesis. We certainly acknowledge the limitations of the database and of the conclusions we draw. Nonetheless, the aim of this paper is to point out some aspects of the hypotheses so that they can be tested on either larger samples or panel comparative data. A limited number of observations may be the main source of statistical violations, methodologically speaking.
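As a minimal sketch of this descriptive step, the snippet below assumes a hypothetical CSV extract of the American Housing Survey; the file name and column names are illustrative placeholders, not the survey’s actual layout.

```python
# Minimal sketch of the descriptive step, assuming a hypothetical
# AHS extract; file and column names are illustrative placeholders.
import pandas as pd

ahs = pd.read_csv("ahs_extract.csv")  # hypothetical extract of the survey

# Cross-tabulate households by income bracket and year-built bracket,
# the two dimensions the hypotheses rely on.
counts = pd.crosstab(ahs["income_bracket"], ahs["year_built_bracket"])
print(counts)
print(counts.sum(axis=1))  # total households per income bracket (cf. Table #1)
```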

Stylized facts of the Great Recession:

Stylized facts of the Great Recession relate to the analytical framework of the IS-LM model. More precisely, the stylized facts refer to the immediate effects of the financial crisis on the macroeconomy. It is arguably accepted that, in terms of the IS-LM model, the IS curve shifted sharply to the left, thereby fostering the Great Recession (Graph #1). The Great Recession was largely an effect of a substantial decrease in consumer confidence preceded by a large increase in interest rates. The United States government reacted wisely but late: by the time it realized there was a crisis, the Great Recession had spread throughout the economy. By 2008, banks were reluctant to lend to each other, which is a signal of how pervasive mortgage-backed securities were considered to be (Blanchard, 2013). After Lehman Brothers declared bankruptcy on September 15, 2008, the government of the United States decided to intervene by putting in place two programs: the Troubled Asset Relief Program (TARP) and, later on, the American Recovery and Reinvestment Act (ARRA). TARP was intended to back toxic assets already in the market, whereas ARRA meant an increase in the government deficit (Blanchard, 2013). Clearly, both ARRA and TARP consisted of an effort to use fiscal policy as a tool for pushing back the IS curve, since it is hard to regard TARP as a monetary tool.

A more detailed account of the events that fostered the Great Recession focuses on the role of banks in the economy. Economists have pointed to three contributing aspects across the financial system: leverage, complexity, and liquidity (Blanchard, 2013). First, bank leverage reached high levels during the Great Recession; through the usage of “innovative” financial instruments, banks were able to hide off-balance-sheet leverage ratios (Kalemli-Ozcan et al., 2011). Second, complexity: the securitization of mortgage-backed loans into mortgage-backed securities (MBS) and the proliferation of collateralized debt obligations (CDO) helped hide lending risk as well. Rating agencies were unable to detect and discriminate toxic assets grouped in large security bundles (Horton, 2013). Finally, liquidity-related regulation was circumvented by banks via credit default swaps (CDS). Through the issuance of security insurance, banks were able to mask their liabilities with respect to the level of risk they took. In other words, riskier mortgage loans, or subprime loans, were masked as riskless through CDSs and bundled into MBSs (Blanchard, 2013). This structure helped the crisis spread deeper into the financial system. These three aspects cogently support the explanations of the Great Recession centered on the financial system.

Generally speaking, a fall in consumer confidence affects consumption negatively. Given that investment depends partially on sales, a fall in consumer spending will make non-residential investment drop. Private savings will decrease, since consumption depends on income. The overall effect of a decrease in consumer confidence would eventually be a deferral of household spending, hurting output through a deferral of non-residential investment.

Graph #1. [Figure: leftward shift of the IS curve in the IS-LM model.]
Thus, housing prices increased from 2000 to 2006; in some indexes, such as the Case-Shiller, that increase represents more than a 200% change in less than a decade (Blanchard, 2013). This increase in demand for loans was an outcome of low interest rates on mortgages. Economists have come to a consensus that between 2000 and 2006 banks apparently were more willing to lend to borrowers, to the extent that by 2006 20% of U.S. loans were mortgages (Blanchard, 2013). Additionally, both borrowers and lenders seemed to agree on the assumption that housing buildings do not depreciate over time.

U.S. Recession of 2001:

To some extent, the Great Recession has many similarities with the U.S. recession of 2001. The years preceding 2001 were characterized by a similar economic expansion, though investment declined 4.5% in 2001 (Blanchard, 2013). Data at the time did not show negative growth or signals of slowdown. Both recessions were triggered by a decrease in autonomous spending. The Fed pursued a very expansionary monetary policy after realizing there was a recession in the U.S. economy. The Fed chair of the time, Alan Greenspan, called the preceding expansion that led the economy to the slowdown a period of “irrational exuberance” (Blanchard, 2013). There were high levels of economic optimism that lasted for almost a decade before 2001. On the fiscal policy side, President George W. Bush pushed through legislation that lowered income taxes and increased spending in the homeland security sector. On the monetary policy side, the Fed increased the money supply in 2001. Graph #2 depicts the effect of a decline in investment demand as well as the effects of the policies put forth by the government. Note that both graphs look similar.

Graph #2. [Figure: effects of the 2001 decline in investment demand and of the policy response.]

Nonetheless, as Horton (2013) points out, financial derivatives have always existed and performed well, so an explanation that considers only the financial crisis seems insufficient. Perhaps a deeper look into consumer expectations and into the history of the housing market may shed light on some complementary conclusions. Thus, the model presented in this paper emulates the real interest rate calculation to define an interest rate in terms of housing prices. In a context of low nominal interest rates, analyses considering either the appreciation or the depreciation of housing prices become more relevant. Investors’ decisions on whether to hold bonds or equity in a house make the difference for the real estate market, and particularly for the housing market. When interest rates are at low levels, it is expected that investments seeking profits turn to the real sector, all the more so under the perceived reality that housing prices do not depreciate with time.

Capital inflows to the housing market are reasonable movements when nominal interest rates are at record lows. Additionally, expected inflation has also been around 1.1% (Blanchard, 2013). Nominal and real interest rates in the American economy have been moving closely together since 1995, with both fluctuating below 4% (Blanchard, 2013). This is an outcome of the so-called liquidity trap. Monetary institutions in the United States aim at stimulating economic growth by lowering nominal interest rates. The outcome has been not so much an increase in household spending as a movement of capital into more profitable markets.

The effect of changes in the interest rate shows up in two places. The first is the effect the nominal interest rate has on the money stock. The other effect, which is the one this paper is concerned with, is the effect on the goods market, where the direct effect is a change in investment:

$$Y = C(Y - T) + I(Y, i - \pi^e) + G$$

This view may explain how expectations about the interaction between the nominal interest rate and inflation led investors to the housing market. Such capital inflows helped create the increase in housing prices often cited through the Case-Shiller index, more precisely the increase from 2000 to 2006. Other indexes indicate something similar. The index of monthly housing prices created by the United States Census Bureau, which has 1991 as its base year, also shows the housing price increase from 2000 to 2006. Graph #3 shows the evolution of housing prices for the Census Bureau index. Perhaps the only difference with the Case-Shiller index is a lag of a couple of months in the case of the Census Bureau index. Nonetheless, both indexes show the sharp increase.
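To make the mechanics concrete, the following is a minimal numerical sketch of the goods-market relation above, assuming linear consumption and investment functions; all parameter values are illustrative placeholders, not estimates.

```python
# Minimal numerical sketch of Y = C(Y - T) + I(Y, i - pi_e) + G,
# assuming linear behavioral forms. Parameter values are illustrative.

def equilibrium_output(i, pi_e, T=1.0, G=1.0,
                       c0=0.6, c1=0.6,          # consumption: C = c0 + c1*(Y - T)
                       b0=0.8, b1=0.2, b2=8.0):  # investment: I = b0 + b1*Y - b2*(i - pi_e)
    """Solve Y = c0 + c1*(Y - T) + b0 + b1*Y - b2*(i - pi_e) + G for Y."""
    autonomous = c0 - c1 * T + b0 - b2 * (i - pi_e) + G
    return autonomous / (1.0 - c1 - b1)

# A lower real rate (i - pi_e) raises investment and hence output:
print(equilibrium_output(i=0.04, pi_e=0.011))  # higher real rate
print(equilibrium_output(i=0.02, pi_e=0.011))  # lower real rate -> higher Y
```

Lowering the nominal rate with expected inflation fixed raises investment demand and, through the multiplier, equilibrium output, which is the channel the paper emphasizes for capital moving into housing.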

Graph #3. [Figure: Census Bureau monthly housing price index, base year 1991.]

That increase in housing demand can also be seen in the number of housing units completed over the same period. Data taken from the United States Census Bureau indicate that (not seasonally adjusted) completions of housing units in the United States also went sharply up during the same period: from 1990 through 2006, completions of new housing buildings doubled. This reaffirms the statement that housing prices increased due to an increase in demand as well as to financial capital seeking a real sector in which to invest. Graph #4 shows that the increase from 1990 through 2006 was about one hundred percent. In 1991 the number of new housing units built in the United States was roughly one million; by 2006 the same indicator had grown to roughly two million.
Graph #4. [Figure: new housing units completed in the United States, 1990-2006.]

Total numbers of households organized by income are in Table #1. Graph #5 shows the aggregate housing building stock in the United States: 47 percent of housing buildings in the United States are between 40 and 60 years old, 33 percent are less than 35 years old, and roughly 20 percent are more than 60 years old.
Table #1. Total number of households by income (numbers in thousands).

Less than $10,000        11,401
$10,000 to $19,999       12,228
$20,000 to $29,999       13,882
$30,000 to $39,999       11,434
$40,000 to $49,999       10,212
$50,000 to $59,999        8,581
$60,000 to $79,999       14,330
$80,000 to $99,999       10,054
$100,000 to $119,999      7,195
$120,000 or more         16,577

Graph #5. [Figure: aggregate housing building stock in the United States by age.]

A bigger picture of the correlation between age of housing buildings and household income in the United States is depicted in Graph #6. That graph shows that the bulk of the buildings currently occupied in the United States were built from the 1950s through the late 1970s; average households in the United States tend to live in buildings built during those years. It is also reasonable to think that those same buildings comprise a great percentage of mortgage-backed loans (MBLs). If we assume those buildings require repairs and maintenance, we can assume building fatigue comes into play as a contributing factor for the Great Recession.
Graph #6. [Figure: occupied buildings by year built and household income.]

The graph clearly indicates the distribution of occupied buildings by household income. Note that the greatest concentration of buildings is among those built between the 1950s and the late 1970s. Those four columns may shed light on our hypothesis: roughly 54 million housing buildings may be running progressively into building fatigue, and this effect may be exacerbated by household income.

The model:

We look at the reasons why a mortgage-backed loan (MBL) may go underwater. As defined by Blanchard, “underwater” means that the amount of the mortgage exceeds the market value of the house. Without trying to contest any of the current explanations of this phenomenon, it is important to consider how the age of buildings combined with household income may have an effect on payment defaults. In the aggregate, we claim that the United States is currently in a transition toward renewing its housing building stock: more than half of the housing stock has aged over 40 years. Under such conditions, structural aspects of the buildings deteriorate, and that deterioration may affect the expectations behind mortgage-backed loans.
Therefore, we claim that there is a higher risk of defaulting based on building fatigue. We call that risk “mortgage default risk derived from housing building fatigue”. We would argue that the default risk from building fatigue is a function of household income (x) plus the age of the house (y) times the ratio of the average house price to the average mortgage loan, with coefficients β and δ:

$$f(d) = \beta x + \delta y \left(\frac{\text{Avg Housing Price}}{\text{Avg Mortgage Loan}}\right)$$
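A minimal sketch of how the proposed score could be computed is below. The coefficients β and δ are hypothetical placeholders (in practice they would be estimated), and the sign convention assumes risk falls with income (β < 0) and rises with age (δ > 0).

```python
# Stylized sketch of f(d) = beta*x + delta*y*(avg_house_price / avg_mortgage_loan),
# where x is household income and y is the age of the house.
# beta and delta are hypothetical placeholders, not estimates.

def fatigue_default_risk(income, house_age,
                         avg_house_price, avg_mortgage_loan,
                         beta=-0.5, delta=0.8):
    """Default-risk score: higher for lower incomes (beta < 0)
    and older houses (delta > 0)."""
    return beta * income + delta * house_age * (avg_house_price / avg_mortgage_loan)

# Illustrative comparison (income in $10,000s, age in decades):
print(fatigue_default_risk(income=2.5, house_age=5.0,
                           avg_house_price=250_000, avg_mortgage_loan=200_000))
print(fatigue_default_risk(income=9.0, house_age=1.0,
                           avg_house_price=250_000, avg_mortgage_loan=200_000))
```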

If we take as a fact that low-income people live in aged housing buildings, it is feasible to start correlating further aspects of that stylized fact. Our second hypothesis claims that mortgage loans go underwater more often as the housing building gets older, basically because aged buildings require more repairs than newer ones. The empirical foundations for intuitively thinking this is the case are the following: 3% of the total housing buildings in the United States have moderate structural problems, mostly in kitchens. Data available in the survey for 2010 show that roughly 3,939,000 buildings in the United States have problems of this type. A smaller number, but in bigger trouble, are the 1,950,000 housing units with severe physical problems, basically plumbing-related deficiencies; they make up about 1% of the total housing buildings. One percent may not sound alarming, but it certainly adds up, especially when such problems are coupled with signs of mice in up to 9% of buildings and signs of cockroaches in about ten percent of the total housing buildings in the United States.

This combination of factors is what we call housing building fatigue. Living conditions after a housing purchase may change rapidly, creating investment depreciation. The economic activity known as house flipping may be inflating expectations on the borrower’s side. Renovations performed in old buildings may temporarily increase the value of the property. In order for the purchase value to keep up instead of depreciating, the borrower depends on neighbors’ own renovations. This kind of externality will have an effect on the initial value at which the borrower purchased the house. If the borrower bought a renovated house older than, say, 40 years, its price will depend heavily on neighbors’ renovations: if the neighbors renovate their houses, our borrower’s house will at least keep its value; if the neighbors do not renovate, our borrower’s house may depreciate, and therefore the mortgage loan may go underwater.

The neighbors’ externality takes place in a housing market populated with housing buildings demanding renovations, and that market segment happens to be concentrated in the low-income brackets. The borrower’s property appreciation is analogous to the real interest rate. Mortgage loans for many families represent the investment of their lives, and as such, these investments are expected to give a return on capital. There is a borrower’s expectation of a future price increase that incentivizes the purchase of the house; this is the same incentive that brings capital to the housing market. Thus, we can claim that there is a housing appreciation rate that mimics the interest rate in terms of the basket of goods, the real interest rate. We can call it the real-real-estate interest rate. We just need to replace the price level with the average housing price in order to calculate this rate. Unlike the real interest rate, housing appreciation need not follow investors’ decisions on whether to hold bonds or equity, because the investment is not necessarily made to extract profits. Nonetheless, this type of investment is still expected to show some return; otherwise it goes underwater. As we said, we just need to replace the price level (P) in the real interest rate formula:

$$(1 + r_t) = \frac{(1 + i_t)\, P_t}{P^e_{t+1}}$$

Replacing P with the average house price:

$$(1 + r_t) = \frac{(1 + i_t)\, \text{AVG House Price}_t}{\text{AVG House Price}^e_{t+1}}$$
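A minimal sketch of this calculation, with illustrative numbers rather than data:

```python
# The "real-real-estate interest rate": the Fisher-style real rate with
# the price level replaced by the average house price, per the formula
# above. Input values are illustrative, not data.

def real_real_estate_rate(nominal_rate, avg_price_now, expected_avg_price_next):
    """(1 + r_t) = (1 + i_t) * P_t / P^e_{t+1}, with P the average house price."""
    return (1.0 + nominal_rate) * avg_price_now / expected_avg_price_next - 1.0

# Expected 5% house appreciation with a 3% nominal rate implies a
# negative rate in housing terms, favoring holding the house:
print(real_real_estate_rate(0.03, 200_000, 210_000))  # ~ -0.019
```

With expected appreciation above the nominal rate, the rate in housing terms is negative, which is consistent with the incentive to hold equity in a house rather than bonds.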

Empirical testing of the first hypothesis:

Hypothesis number one: low-income Americans tend to live in aged housing buildings.
We did not find a strong correlation between the two variables. However, there is a slight sign of such a correlation, which may get stronger as data allow for more observations and comparisons. That is, low-income Americans tend to live in aged housing buildings.

We break down the income data into the following brackets: less than $10,000; from $10,000 to $19,999; from $20,000 to $29,999; from $30,000 to $39,999; from $40,000 to $49,999; from $50,000 to $59,999; from $60,000 to $69,999; from $70,000 to $79,999; from $80,000 to $89,999; from $90,000 to $99,999; from $100,000 to $109,999; from $110,000 to $119,999; and $120,000 and over. We also break housing age into fifteen brackets, ranging from houses built in 1919 or earlier to houses built in 2013.
The first bracket, depicted in Graph #1 of the annex, shows a correlation between household income and the year the building the household lives in was built. The correlation is negative, telling us that it is unlikely to find households with income of less than ten thousand dollars living in buildings built in the last decade. This correlation gradually reverses as income rises: the highest household income bracket, which for this regression is $120,000 and over, gives a positive correlation, unlike the lower income brackets. This basically lets us generalize, with some caveats, some of the results for the hypothesis: low-income people tend to live in old houses, whereas high-income people tend to live in newer buildings. More details can be seen in the regressions themselves. As income increases, the chances of finding newer buildings occupied by those households rise as well. The threshold where the correlation starts to be positive is at the $80,000 income level (Graph #10 of the annex); from that level of income onward, it is likely to find such households living in newer buildings. The correlation and the r² become higher as income rises.
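A minimal sketch of these bracket-by-bracket regressions follows, assuming the same hypothetical AHS extract as above (file and column names are placeholders, and year-built brackets are assumed to be coded numerically):

```python
# Sketch of the bracket-level regressions; file and column names are
# hypothetical placeholders for the AHS extract used in the paper.
import pandas as pd
import statsmodels.api as sm

ahs = pd.read_csv("ahs_extract.csv")

# One simple regression per income bracket: occupied-unit counts on the
# numeric year-built bracket code (15 brackets, "1919 or earlier" to 2013).
for bracket, group in ahs.groupby("income_bracket"):
    X = sm.add_constant(group["year_built_bracket"])
    fit = sm.OLS(group["occupied_units"], X).fit()
    print(bracket, round(fit.params["year_built_bracket"], 3), round(fit.rsquared, 3))
```

A negative slope for the lowest brackets and a positive slope from roughly the $80,000 bracket upward would reproduce the pattern described above.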

Conclusions:

We proposed that a contributing factor to the origin of the Great Recession can be what we identify as default risk on mortgage payments for low-income Americans due to housing building fatigue. In this first article we established empirically that there is a correlation whereby low-income Americans dwell in aged housing buildings. This correlation makes up just a first step in a series of hypothesis-testing regressions in which the general hypothesis will be tested.
Second, by looking at the descriptive statistics we could state that the bulk of housing buildings in the U.S. were built between the 1950s and the late 1970s, which helps support our hypothesis intuitively. More precisely, 47% of housing buildings in the United States are between 40 and 60 years old; 33% are less than 35 years old; and roughly 20% are more than 60 years old.

Third, this article could also show data that confirm findings comparable to those of the Case-Shiller index. That is, the increase in housing demand could also be seen in the number of housing units completed over the period 2000-2006, which allowed us to reaffirm that housing prices increased due to an increase in demand as well as to financial capital seeking a real sector in which to invest.

Fourth, leverage, complexity, and liquidity in the financial and banking sector may help explain how the crisis spread and how it was amplified. However, looking at aspects related to housing age may help us better understand how the housing sector affects consumption via housing repair needs. Furthermore, housing building conditions may shed light on an explanation of why mortgages go underwater, since roughly 5% of the existing housing buildings in the United States have reported some sort of structural deficiency. Certainly, we cannot conclude so far that the age of the house and its neighbors’ effect exacerbate the risk of mortgage payment defaults, but we can intuitively point to that as a possible factor for increasing the risk.

Neither can we claim that renovations performed in old buildings temporarily increase the value of the property, nor that prices of these houses depend on neighbors’ renovations. We are certainly far from proving that the risk of defaulting on a mortgage increases with the age of the house, or that housing prices depend on the pace at which old housing buildings are renovated.
Annex:
Graphs #1 through #10. [Figures: regressions of household income brackets against year built.]

References:

Blanchard, Olivier & Johnson, David (2013). Macroeconomics. Sixth edition. Pearson.
Kalemli-Ozcan, S. & Sorensen, B. (2011). Leverage across firms, banks and countries. NBER Working Paper No. 17354.
Horton, B. (2013). Toward a more perfect substitute: how pressure on the issuers of private-label mortgage-backed securities can improve the accuracy of ratings. Boston University Law Review.

On the mathematization of Economics in the 21st Century.

The origins:

The origins of the mathematization of economics are apparently a contested issue even for economists. The genesis of this phenomenon may change depending on the measure used for establishing when, historically, math joined economics. However, and regardless of the precise historical development of that relationship, what seems useful to note are Wassily Leontief’s and Gerard Debreu’s main conclusions: mathematics has allowed economics to progress as a science; and economics has the same epistemological boundaries as every social science, given the nature of its object of study, society. The former conclusion seems robust and uncontested, whereas the latter can be questioned given the rise of “Big Data”. This essay contributes to those authors’ conclusions by pointing out that the rise of “Big Data” may make applied statistics a more suitable data science than either calculus or matrix algebra in a vacuum. The essay starts by summarizing the accounts provided by Gerard Debreu and Wassily Leontief regarding the mathematization of economics. Then it takes Cukier & Mayer-Schoenberger’s essay about the rise of Big Data to complement both Leontief’s and Debreu’s main concerns. The main conclusion is that economics may benefit more from the digitization of society than from the mathematization of its core concepts.

First, Gerard Debreu (1991) introduces an account of events that traces the origins of interdisciplinarity in economics. Debreu states that the stimulus for the mathematization of economics derives from an ideal, not feasible, emulation of physics. Starting in 1933 with the launch of the journals Econometrica and the Review of Economic Studies, Debreu places that year as the beginning of a fruitful interdisciplinary relation between mathematics and economics. Debreu also looks historically into other possible measures for dating the integration of both fields, such as the rise of game theory in 1944, the growing use of mathematical expressions in refereed publications, and even the increasing number of Econometric Society fellows among faculty members of departments of economics (Debreu, 1991).

For Gerard Debreu (1991), though, there exists a limit to economics’ ideal pursuit of the scientific formalization of physics. That limit is given both by the possibilities of performing experiments and by the use of mathematical deductive reasoning itself. Regarding the former, Debreu states that “being denied a sufficiently secure experimental base, economic theory has to adhere to the rules of logical discourse and must renounce the facility of internal consistency” (Debreu, 1991, p. 2), meaning there is no way other than logic in which economists can inquire into their object of study. As in any other social science, the limits set by epistemology apply to economics. Nonetheless, it is this use of mathematical logic which has given economics the chance to progress scientifically through a constant dialogue with previously proposed ideas and theories. In Debreu’s words, “the great logical solidity of more recent analysis has contributed to the rapid contemporary construction of economic theory. It has enabled researchers to build on the work of their predecessors and to accelerate the cumulative process in which they are participating” (Debreu, 1991, p. 3). Such a dialogue plays a key role in the structure of scientific revolutions and scientific progress (Kuhn, 1962).

A Second View:

The second interpretation of the mathematization of economics is introduced by Wassily Leontief (1954). In his account, Leontief sorts out the contributions mathematics has made to some working concepts in economics. Working concepts such as the idea of maximizing behavior, the theory of games and interdependent choices, and dynamic approaches for analyzing repeated economic phenomena, among others, have benefited from the use of calculus, statistics and matrix algebra (Leontief, 1954). For Leontief, as for Debreu, the boundaries of epistemology limit the possibilities of assessing social-economic reality. The eternal dichotomy between subjectivism and objectivism is introduced by Leontief as a struggle between theorists and empiricists (Leontief, 1954). The tension between inductive and deductive reasoning does not spare economics.

For both authors, the use of mathematics in economics has been the vehicle for advancing the field. Mathematics, as a shared and precise language, has allowed economists to compare ideas, refine concepts, and, in general, to speak to each other in order to build upon predecessors’ work. To mention that economics has been able to build and progress upon previous works sounds odd; however, it is a key consideration when taken in the context of other social sciences, which barely speak even within themselves.

Likewise, both authors point out the boundaries for cogently understanding economic phenomena. For both of them, the economic “laboratory” is not as feasible to build and use as it is for physics. Therefore, economics will always face limits in really knowing the parameters of the measures it inquires about. However, such a limitation may be coming to an end with the rise of “Big Data”. Here is where “The rise of big data” comes into place (Cukier & Mayer-Schoenberger, 2013). Cukier & Mayer-Schoenberger’s essay aims at unfolding the potential use of the information generated by the growing consumption of the internet. They propose a twofold argument: the rise of an endless possibility of new sources of information and, therefore, a paradigm shift in research, possibly for the social sciences. They articulate this argument through four pillars. First, the authors point to the evidence of the latest technological revolution; second, they show a bit of evidence of a growing data collection practice that they call “datafication”; third, they stress the convenience of getting massive data at a fraction of the current cost, which basically means overcoming the cost barriers to “approaching the N=ALL”; and fourth, Cukier and Mayer-Schoenberger remark on the usefulness of working with correlations instead of just causations.

In spite of Cukier and Mayer-Schoenberger’s implicit assumption that research methods may be undergoing a paradigm shift, the claim really deserves more of both attention and solid evidence. The authors claim that “All those digital bits that have been gathered can now be harnessed in novel ways to serve new purposes and unlock new forms of value. But this requires a new way of thinking and will challenge institutions and identities”. Evidently, the authors’ use of newly proposed terms such as “datafication” is strong evidence of the emergence, or at least the improvement, of what may be an obsolete set of methodological tools. Thus, if a brand-new field is emerging and a paradigm is being shifted from the current use of statistics toward the innovative interpretation of Big Data, then we are indeed facing an unprecedented turn in social science research, and particularly in economics research. In the context of Big Data, apparently, economics may benefit more from the digitization of society than from the mathematization of its concepts. In other words, it is reasonable to expect that “Big Data” will yield enough material for a better, more data-grounded economic theory, and a larger information dataset used for the purpose of interpreting socio-economic reality.

References:

Debreu, G. (1991). The mathematization of economic theory. The American Economic Review, Vol. 81, No. 1.

Leontief, W. (1954). The twenty-seventh Josiah Willard Gibbs lecture, delivered at Baltimore, Maryland, December 1953.

Kuhn, T. (1962). The Structure of Scientific Revolutions. University of Chicago Press.

Cukier, K. N. & Mayer-Schoenberger, V. (2013). The rise of big data. Foreign Affairs.

“Cutting edge” technology and its impact on democracy: the case of QUBE (1977) in Columbus, Ohio.

Sometimes innovation does not let people think historically. By historically, we mean considering past, present and future at the same time. Most of the time, innovation leads us to think that any innovation is entirely new and therefore will have unprecedented consequences in the future. However, history is full of innovation experiments and their subsequent consequences. Western societies have known information technologies and the internet for decades now, and there has always been a belief about their relation to the political system. The question has been: will the new technologies make our society more democratic? From the experts’ point of view, democracy is still a stage to achieve along a developmental line. Internet experts strongly believe that someday the political system will benefit from technology to the extent that direct democracy will emerge. This paper goes back in history to show how innovation in communication has affected the political system. By looking at the QUBE experiment of 1977 in Columbus, Ohio, it offers a historical perspective on innovation and its effects on democracy. The second part of this short paper briefly summarizes some beliefs about the relationship between democracy and the internet. The third part brings up two examples of using technology for politics and policy decision-making processes. The fourth part goes into the analysis of the QUBE experiment, and finally the paper presents some conclusions.

Democracy and technology:

Rostow’s idea of development assumes that all innovations made in information technologies belong to the latest stage of growth. In his view, developed countries have reached the age of high mass consumption and beyond, and the technology itself allegedly proves that transition. From there, innovation will always be cutting edge. The way western societies understand the concept of development applied to technology is by constantly pushing technological matters close to a time-cliff in which neither future nor past can be interpreted, basically because innovations are so unique that they cannot be compared to previous experiences, and technology always moves “up” and “ahead”. Once again, that restriction on innovations’ non-comparability was created by the western idea of history in which stages follow one another until reaching the age of high mass consumption.
Regarding political systems, the story is not much different. We have come to believe that democracy is the latest stage of liberal political institutions. Francis Fukuyama’s thesis dramatically ends the debate about the further development of democracy. For him, the western world does not look for substantial changes in democratic systems as it does in social and moral issues (Fukuyama, 2000): democracy is as it currently is, and it is not going anywhere else. Somehow, western societies believe that there was a time when tyranny ruled societies; there was a time when God ruled the world; there was a time when direct democracy took place; those stages were successfully passed, and now there is a time in which people govern themselves.
These two development lines (political institutions and technology) merge in the belief that technology transforms representative democracy into direct democracy. Experts such as Darrell West, David Coursey and Donald Norris agree in formulating four general stages (or more) in the development of the relationship between democracy and information technology. Summarizing their concept of e-government evolution: there is a first stage called the billboard stage; then comes the partial service-delivery stage; after that, the portal stage with full service delivery shows up; and finally, we get to the point at which interactive democracy with public outreach and accountability-enhancing features is presented (West, 2007, pp. 8-9).
Furthermore, West, Norris and Coursey reach a conclusion about the confluence of democracy and information technologies. They “…find evidence that is consistent with models of incremental change. Most public sector sites have not made much progress in incorporating public outreach or personalization into their websites” (West, 2007, p. 102). They also find that “local e-government is mainly informational, with a few transactions but virtually no indication of the high-level function predicted in the models” (Coursey & Norris, 2008). These two propositions envision a stage in which direct democracy can be fully implemented. Innovation tends to make us think that we have never seen experiments regarding direct democracy; history may prove otherwise.

Previous experiences:

Going backwards in history, the first example Americans may have in mind when it comes to using information technologies for public decisions is the Howard Street Bridge in Baltimore, Maryland. This case always represents an opportunity to illustrate how governance has been affected by information technologies, to the extent that a mayor may solve a public controversy by simply asking his fellow citizens for their preference about the color of a popular bridge. The example is presented as an engagement of citizens in public decision-making. Mayor Martin O’Malley set up an online poll in 2003 on the city hall web site in order to decide on the color the Howard Street Bridge should be painted. The mayor favored a Kelly green, and his proposal was defeated in the online “plebiscite”: “In a close vote, the 5,139 voters who used a city hall web site chose a rust red-brown favored by Baltimore artist Stan Edmister” (The Baltimore Sun, Nov. 1, 2003). The question that remains is: would the mayor call for a similar consultation in order to determine citizens’ preferences on personal income tax policies or reforms?
Continuing backwards in history, the second most popular example of mixing technologies with democracy is the 900 number used by ABC News in the Reagan-Carter debate of 1980. As Jeffrey Abramson recorded in 1992, “After the first Reagan-Carter debate in 1980, ABC News invited viewers to declare a winner by calling the 900 number at 50 cents a call” (New York Times, 1992). What Abramson, from Brandeis University, pointed out in 1992 about the use of such technology and its impact on democracy is that the results proved to be excessively biased against Carter, giving Ronald Reagan an overstated perception of real leadership. Issues such as the time difference between West and East could affect the outcome of the poll; also, the fact that the caller had to pay 50 cents should have discouraged low-income potential callers, and so on and so forth.

QUBE:

QUBE was a system developed by the Warner-Amex Corporation in 1977 based on cable television technology. QUBE introduced for the very first time the possibility of interaction between people in different places in real time. Columbus, Ohio marked the starting point of the network experiment. Initially, QUBE was designed for selling Warner TV shows and products to householders; however, it rapidly showed its potential on political matters, mainly polling. QUBE was used for registering preferences, casting ballots, ordering movies and playing games. Basically, “their cable box was outfitted with five buttons, which could be programed to offer various choices. Buttons could be coded for yes, no, or undecided preferences…” (West, 2007, p. 104). It rapidly extended to several other cities, such as Dallas, Cincinnati, Pittsburgh, Milwaukee and Houston, among others. QUBE died in 1984.
Now, what were the conclusions reached after this “cutting edge” technology was used for political purposes? Did the QUBE innovation make Ohio more democratic than the rest of the United States?

Results:

This paper uses a historical approach in order to analyze “cutting edge” technology and its impact on democracy in the case of QUBE. The sources for this short research were mainly news and media articles, limited to three newspapers: The New York Times, The Nation and The Baltimore Sun. From those articles we took a handful of excerpts which constitute a sample of what may be considered the debate at the time. The following table shows the results.
Table #1.

The New York Times, June 21, 1992: “Perot’s ‘electronic town hall’ wouldn’t work.”
- “Previous attempts to bring democracy into the electronic age have produced mixed, sometimes frightening results.”
- “It (QUBE) promised to let viewers ‘talk back’ to government and was used in several cities.”
- “Rates of participation were low; those who participated were not a scientifically polled sample of the population.”
- “The cost of subscribing to QUBE reintroduced a poll tax on voting.”
- “Children could be pushing the buttons.”

The New York Times, July 14, 1978: “Planning commission gets its turn.”
- “Votes were not binding, and most questions put to the audience amounted to a multiple-choice preference quiz on the town’s future rather than the expected referendum on planning board recommendations to the city council.”
- “I think this is going to put Gallup and Harris in the 19th century.”

The New York Times, December 6, 1977: “The Black Box Tyranny.”
- “One night during its tryout period in March, the audience was asked whether a show featuring rock records ought to continue. The vote, by Black Box, was thumbs down.”

The New York Times, August 29, 1982: “Are those ‘instant polls’ democratic?”
- “Albert H. Cantril, president of the National Council of the Public Polls, subsequently chastised NBC, noting that the ‘polls’ results cited on the air were not an accurate measure of opinion even in Columbus, much less the Nation.”

The Nation, August 7, 1982: “Democracy and the QUBE tube.”
- “Will the new television technologies make our society more democratic?”
- “But the advocates of interactive television display a misapprehension of the nature of real democracy, which they confuse with a plebiscite system.”
- “Plebiscitism is compatible with authoritarian politics carried out under the guise of, or with the connivance of, majority opinion.”

Conclusions:

This short paper tried to identify precedents for the debate on innovative applications of technology and their impact on democracy. By looking briefly at the QUBE experience as a tangible example of the application of information technology, we may conclude that although information technologies are fairly recent developments, history can give some comparable examples that help us understand the entire phenomenon.
a) The first aspect to note is that the debate about information technologies and their impact on democracy can be traced far back in history.
b) There has been a concern about the legitimacy and transparency of technological tools in previous cases in which technology has helped governments reach decisions. Such is the case of Columbus, Ohio.
c) Network coverage and access also decrease the chances of legitimacy; even a large number of subscribers could not support statistically meaningful values.
d) Opinions made public through these channels were not considered for final policy implementation. Unlike Baltimore’s bridge, public consultations were not mandatory.

References:

Fukuyama, Francis. (2000). The great disruption: human nature and the reconstruction of social order. New York: Free Press.
Coursey, D. & Norris, D. (2008). Models of e-government: are they correct? an empirical assessment. Public Administration Review. May/June 2008.
West, Darrell. (2007). Digital Government: Technology and Public Sector Performance. NJ: Princeton University Press.

Information society and the future of democracy.


Western world governments take advantage of the tech revolution. However, non-democratic regimes may benefit from the technology revolution as well.

Liberal political institutions in the western world have developed alongside capitalism and information. The wisdom we inherited from the Cold War is that capitalism fosters liberty, and therefore democracy, as long as information and free speech rights are protected. Likewise, regardless of information and technology changes, Francis Fukuyama’s thesis ends the debate about the further development of democracy. For him, the western world does not see changes in democracy as it does in social and moral issues (Fukuyama, 2000): it is as it currently is, and it is not going anywhere else. However, now that capitalism seems to be crossing its most dramatic crisis since the Great Depression while at the same time facing technological change, it is relevant to think about how political institutions will change in the future as well. This essay takes a look at the future of liberal democracy in a small sample of twenty-nine countries by considering the so-called Information Age as a conditioning factor for institutional change and democratic quality. Our conclusion is that political regimes, regardless of their type or definition, adapt to technological changes and legitimize themselves by using electronic tools. It basically means that the convergence of liberal democracy and technological advances does not end in more democracy in non-democratic regimes.
Western world governments take advantage of the tech revolution:
Developed societies have been exploring the consequences of becoming increasingly service-oriented economies, a phenomenon that has been characterized as the Information Age. Castells states it as follows: “Although econometric equations are obscure in identifying the precise source of the new productivity pattern, the ‘statistical residual’ found to be critical in the new production function has been often assimilated to the new inputs represented by the deeper penetration of science, technology, labor skills and managerial know-how in the production process” (Castells, The Information Economy). Technology is becoming more accessible nowadays; internet literacy expands every day as broadband initiatives populate public policies. The same countries have also established democratic political institutions, and most of them rank high when measured in terms of stability of the rule of law, competitive elections, a system of checks and balances, and free markets.
If we consider knowledge as the intensive factor of production in postindustrial societies, then we should assume its scarcity at the same time, which necessarily means higher costs of getting information. Yet the cost of information seems to be decreasing, since developed societies enjoy faster access to news and reports, and citizens seem to be more informed now than they were before. Facing that challenge, governments started to take a stake in this problem by providing access to communities and, at the same time, by providing access to services and procedures electronically. Government services started to take advantage of modern technology. Some have candidly claimed that there is a new evolution of the democratic system: electronic democracy. “Electronic democracy can be understood as the capacity of the new communication environment to enhance the degree and quality of public participation in government. For example, the Internet can enable certain citizens (namely, those with access to IT) to vote electronically in election, referendums, and plebiscites. The Internet also can facilitate opinion polling” (Kakabadse et al., 2003, p. 47).
Non-democratic regimes may benefit from the technology revolution as well:
In this process, which has taken place over the last two decades, e-government models have made a transition from mere e-mail and internal work use of technology toward a joined-up government which involves participation (Coursey & Norris, 2008). “In this model, e-government is clearly expected to evolve to a higher plane at which citizens have moved beyond accessing information and services, interacting with governmental officials, and transacting business with government. At this stage, citizens participate electronically in the very activities of governance” (Coursey & Norris, 2008, p. 525). So far, the process does not refer in any way to the kind of political system.
Furthermore, challenges are understood in terms of public administration rather than in terms of change in political institutions. To summarize: political systems face crises that public administration must tackle in terms of accountability, governance and legitimacy (Huddleston, 2000). None of what Huddleston identifies as challenges is strongly related to regime type; all of those challenges may be tackled by authoritarian regimes as well as democratic ones. A solution for effectively facing those three foreseen crises relies on e-government, or a better engagement of citizens in public service and democratization where it takes place.
Empirical evidence from twenty-nine countries:
The following table matches two different indexes. The first column shows scores of Citizen Participation from the Digital Governance in Municipalities Worldwide report of 2007. Citizen Participation is defined as online participation in decision making, allowing policy feedback by constituencies, publishing public information, polling key issues, or citizen surveys (Holzer & Kim, 2007, p. 27). The second column reports the scores of the Polity IV index produced at the University of Maryland in 2010. The score that we take here is the general score, which summarizes variables such as competitive elections and public service recruitment, executive constraints and political participation. The index also considers political stability and the rule of law as a proxy for determining how democratic a country is.
Table #1. (Countries are listed by their Citizen Participation ranking, per Holzer & Kim, 2007.)

Country                 Polity IV Index
Republic of Korea        8
Singapore               -2
Thailand                 3
Finland                 10
Netherlands             10
Austria                 10
Bulgaria                 9
Latvia                   8
Japan                   10
Slovakia                10
Russia                   4
USA                     10
Switzerland             10
Canada                  10
Ireland                 10
South Africa             9.5
Croatia                  9.5
Germany                 10
Italy                   10
Spain                   10
Venezuela                1
France                  10
UK                      10
Australia               10
Belarus                 -7
Bosnia                   5
Costa Rica              10
Indonesia                8
Serbia Montenegro        8
Table #1 illustrates a paradox. The information age allegedly brings more democracy; however, countries such as Singapore, Thailand, Russia, Venezuela, Belarus and Bosnia-Herzegovina have gotten low scores on the Polity IV democracy index. Clearly, it is not possible to establish a relationship between the usage of information technology and democratic regimes. Electronic political participation does not translate into a more democratic system. Actually, within democracies such as the United States, “contrary to the hopes of some advocates, the internet is not changing the socioeconomic character of civic engagement in America. Just as in offline civic life, the well to do and well-educated are more likely than those less well-off to participate in online political activities such as emailing a government official, signing an online petition or making a political contribution” (Smith et al., 2009, p. 3). More internet access does not mean a better democracy.
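As a minimal sketch of how the table’s claim could be checked numerically: the Polity IV scores below follow Table #1, while the citizen-participation values are random placeholders, since the individual Holzer & Kim (2007) scores are not reproduced in this essay.

```python
# Rank correlation between e-participation and democracy scores.
# polity_iv follows Table #1; citizen_participation is a random
# placeholder for the Holzer & Kim (2007) scores, not real data.
import numpy as np
from scipy.stats import spearmanr

polity_iv = np.array([8, -2, 3, 10, 10, 10, 9, 8, 10, 10, 4, 10, 10, 10,
                      10, 9.5, 9.5, 10, 10, 10, 1, 10, 10, 10, -7, 5,
                      10, 8, 8])
citizen_participation = np.random.default_rng(0).uniform(0, 100, polity_iv.size)  # placeholder

rho, p_value = spearmanr(citizen_participation, polity_iv)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")
```

A weak or insignificant rank correlation on the real scores would be consistent with the essay’s argument that electronic participation does not track democratic quality.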
Conclusion: 
By looking at the outliers, which happen to be mostly non-European countries, we may conclude that more internet access may help totalitarian regimes legitimize themselves. Therefore, we cannot expect more democracy as an outcome of the technological revolution. As we can see in the table, cases such as Singapore may represent a paradox for participation, in which a society bolsters a one-party regime. Although the People’s Action Party in Singapore is very popular among the population and the electorate, there is also evidence that the party has controlled the media and intimidated the opposition. Hugo Chavez in Venezuela may also represent a case in which democracy is kidnapped by a political party that publicizes its policies through an effective use of media, including the internet. Alexander Lukashenko in Belarus is also a good example of how, by using the media and information technologies, a government may legitimize constitutional changes that severely affect political rights, ending up in a closed regime.
Bibliography:
Fukuyama, Francis. (2000). The great disruption: human nature and the reconstruction of social order. New York: Free Press.
Castells, Manuel. The Information Economy. 
Kakabadse, A., Kakabadse, N.K.& Kouzmin, A. (2003). Reinventing the democratic governance project through information technology? a growing agenda for debate. Public Administration Review. 63 (1), 44-60.
Smith, A., Schlozman, K., Verba, S. & Brady, H. (2009). The Internet and Civic Engagement. Pew Internet and American Life Project.
Holzer & Kim (2007). Digital Governance in Municipalities Worldwide. The E-Government Institute. Rutgers, the State University of New Jersey. 
Coursey, D. & Norris, D. (2008). Models of e-government: are they correct? an empirical assessment. Public Administration Review. May/June 2008. 
Marshall, M. Polity IV Project: Political Regime Characteristics and Transitions, 1800-2010. http://www.systemicpeace.org/polity/polity4.htm

Performance of a federal public agency for economic development: the Economic Development Administration (EDA), 1965-2010.

The eternal and controversial role of the state in fostering and bolstering economic development has been advanced, substantially and theoretically, since the second half of the twentieth century. The administration of JFK, encouraged by some scholars, marks the beginning of a series of agencies devoted to tackling poverty. Amid the Cold War (1961), the federal government of the United States established what could be seen as the first American agency for economic development: USAID. Four years later, in 1965, the Economic Development Administration (EDA) was institutionalized under the Public Works and Economic Development Act. The latter is the agency we will focus on by reviewing the agency-related literature. In that regard, this literature review found that the discussion about the agency has focused mainly on its expected outcomes and on its needed reforms; only a few works have been dedicated to analyzing aspects of management within the EDA.
Thus, the following review presents a critical approach to what some scholars have said about the EDA so far. Despite the fact that the academic fields of both economic development and public management are enormous, the search was narrowed mainly by geography (U.S.), agency (EDA), and period of time (1965-2010). We focus on accredited journals and specialized articles; edited books, web sites and visual sources, among other formats, were omitted from this work. Furthermore, this literature review is limited mainly with respect to theoretical issues about the state’s mission in intervening in the economy. We assume that the EDA itself belongs to a specific theoretical framework grounded in economic development theories (Rostow, 1962; Hirschman, 1958; Rosenstein-Rodan, 1961).
Therefore, if we exclude the theoretical framework of economic development issues, we can identify three categories of academic production. First, the most prolific debate over the EDA is about its impacts and economic outcomes. Traditionally, that sort of analysis is intended to bolster ideological positions on the role that public agencies should pursue in order to foster economic development; therefore, most of them are not empirical research with exhaustive data-supported conclusions. A second, less developed area of study is the performance of the EDA itself: the structure of the agency (its institutional arrangements, its bureaucracy and its culture) has not been tackled intensively, and few authors have specialized in examining the EDA. Third, by looking at the EDA’s performance and outcomes, from an almost ideological perspective, its critics have furthered a debate over the set of reforms that the EDA needs. We present our results in the same order: the first section of this literature review is this introduction; the second section shows what experts have said about EDA outcomes and impacts; the third section considers the EDA’s performance-related literature; and the fourth looks at the proposed-reforms debate. Finally, we present our conclusions.
Impacts and Outcomes:
Regardless of the theories of either public management or economic growth, the administration of economic development requires the measurement of its goals and expectations in terms of objective metrics for unemployment rates, capital growth and poverty alleviation. Scholars have focused on looking for impacts mainly in labor rates. Two works depicted the EDA’s employment effect during two major time periods: the first assessment covers the period from the EDA’s creation to 1975 (Barrows & Bromley, 1975), and the second covers up to 1999 (Haughwout, 1999). These works have concluded, with relatively solid empirical evidence, that the impact of the EDA’s programs on the unemployment rate has been positive so far. The conclusion for the agency’s first decade of work is that its programs were not successful in fostering employment in large metropolitan areas, while they did have a different outcome in rural areas (Barrows & Bromley, 1975). In the authors’ words, “the empirical analysis in this study indicates that EDA investments in large urban center or regions did not produce employment impacts as large as projects located in less densely populated areas” (Barrows & Bromley, 1975).
Besides Barrows and Bromley, Haughwout's later analysis of labor markets states that the agency has made positive improvements, in spite of one externality of its programs: the labor market gains were advanced at the expense of the areas surrounding the project sites, and the EDA's investments had no significant effect on employee compensation (Haughwout, 1999). Methodologically, both works used multiple regression; in particular, Barrows and Bromley used county population, the non-agricultural employment rate, and average household income, among other proxies. Finally, it is important to note that no similar works covering the agency's most recent historical period (from 2000 onward) have been published yet.
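To make the empirical strategy concrete, the following minimal sketch illustrates the kind of county-level multiple regression these studies employ. All variable names and data below are hypothetical and synthetic; they merely stand in for the proxies listed above, and the actual specifications in Barrows and Bromley (1975) and Haughwout (1999) differ.

```python
# Illustrative sketch only: a county-level employment regression in the
# spirit of Barrows & Bromley (1975). Variable names and data are
# hypothetical; the original papers' exact specifications differ.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical sample of counties

df = pd.DataFrame({
    "eda_investment": rng.gamma(2.0, 1.0, n),    # EDA spending per capita (synthetic)
    "population": rng.lognormal(10, 1, n),       # county population (synthetic)
    "nonag_emp_rate": rng.uniform(0.4, 0.9, n),  # non-agricultural employment rate
    "avg_income": rng.normal(45_000, 8_000, n),  # average household income
})
# Synthetic outcome: employment growth with a small built-in EDA effect plus noise
df["emp_growth"] = (0.02 * df["eda_investment"]
                    + 0.1 * df["nonag_emp_rate"]
                    + rng.normal(0, 0.05, n))

X = sm.add_constant(df[["eda_investment", "population",
                        "nonag_emp_rate", "avg_income"]])
model = sm.OLS(df["emp_growth"], X).fit()
# The coefficient on eda_investment is the quantity of interest: a significant
# positive estimate would correspond to the positive employment impacts above.
print(model.summary())
```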
Another metric for measuring the EDA's real impact on economic development is the change in the personal income rate, a proxy for gauging economic growth. This metric helps explain why state intervention matters for alleviating poverty and lifting people above the poverty line. In a paper published in 1980, Martin and Graham provided some evidence of the effect of the EDA's programs "on the aggregate economic growth of recipient areas" (Martin & Graham, 1980, p. 52). They take personal income as a proxy and, by comparing aided counties to non-aided counties, conclude that "EDA-assisted counties experienced significant improvements in relative personal income growth rates during the period of aid receipt" (Martin & Graham, 1980, p. 6).
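The core of that aided-versus-non-aided comparison can be sketched as a simple difference in mean personal income growth rates. The sketch below is an illustration under our own assumptions (synthetic data, a hypothetical aided indicator); Martin and Graham's actual design is more elaborate than a two-sample test.

```python
# Illustrative sketch only: comparing personal income growth between aided
# and non-aided counties, in the spirit of Martin & Graham (1980).
# The data are synthetic and the "aided" indicator is hypothetical.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)
n = 300  # hypothetical sample of counties

df = pd.DataFrame({
    "aided": rng.integers(0, 2, n).astype(bool),  # received EDA assistance?
})
# Synthetic personal income growth rates, with a small premium for aided counties
df["income_growth"] = rng.normal(0.03, 0.02, n) + 0.01 * df["aided"]

aided = df.loc[df["aided"], "income_growth"]
control = df.loc[~df["aided"], "income_growth"]

# Welch's t-test for a difference in mean growth rates between the two groups
t, p = stats.ttest_ind(aided, control, equal_var=False)
print(f"aided mean: {aided.mean():.4f}, non-aided mean: {control.mean():.4f}")
print(f"t = {t:.2f}, p = {p:.4f}")
```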
Finally, there are some publications that read mostly like propaganda (Burchel, 1997; EDA, 1974; EDA Annual Report, 2000). It is hard to define them as strictly scholarly production. However, within those publications it is possible to find not only some interesting facts but also contributions from serious scholars. The work led by Burchel in 1997 is a good example: it was developed by six universities and academic institutions, including Rutgers University, Columbia University, Princeton University, and the University of Cincinnati. Despite the fact that it may have been a paid, non-independent study, the authors conclude that most of the EDA's projects were achieved as follows: 99% were completed as planned; 91% were completed on time; and 52% were completed under budget estimates (Burchel, 1997). In this way, they approached measuring the agency's performance through its outcomes, which is our next topic.

EDA’s Performance: 

Within the literature we accessed, there are no works that assess management performance by comparing goals, objectives, and expectations against outcomes and impacts. In fact, independent scholars analyze the agency's outcomes and impacts without regard to its stated objectives. Nonetheless, there is a relatively consistent effort to follow the agency's historical evolution. Penn State University professor Amy Glasmeier has recently devoted research resources to studying the agency's expenditure patterns (Glasmeier & Wood, 2005) and its possible overhaul (Glasmeier & Markusen, 2008). It is important to note that Glasmeier's work focuses mainly on poverty reduction, and within that effort she has tackled the EDA's performance.
The first analysis she wrote about the agency concerned its expenditure patterns over the EDA's history. There, along with Lawrence E. Wood, she described the historical background of the agency, stating that since the 1930s the American state has built this kind of public agency out of "a fear of the political implications of the failure of market-led models of development…" (Glasmeier & Wood, 2005, p. 1262). They note that the greatest pitfall the agency has fallen into is policy intervention. As James Wilson would describe it (Wilson, 1989, chapter 13), Congress not only asks the agency to do things without allocating enough resources, but also increases the EDA's responsibilities over time (Glasmeier & Wood, 2005). For instance, in 1969 Congress's mandate to the agency was to cover urban areas as well as rural ones, broadening the eligibility of potential recipients; in the early 1970s, Congress required the agency to provide technical assistance to firms that had been affected by free trade; Congress has even charged the agency with responsibilities for libraries, drug rehabilitation, and other programs such as "defense adjustment assistance to areas that had recently experienced military base closures" (Glasmeier & Wood, 2005). The authors argue that all these responsibilities were added in a context of dramatic financial cuts and adjustments.
The other important work by Glasmeier was co-authored with Ann Markusen. There, the authors tackle one of the principal criticisms scholars have made of the agency: that it has focused excessively on infrastructure and housing projects. In the authors' words: "Across these three agency programs, the bulk of funds have been spent providing basic infrastructure and housing" (Glasmeier & Markusen, 2008, p. 84), and "over the years, water and sewer provision has comprised between 25% and 75% of program expenditure" in EDA programs (Glasmeier & Markusen, 2008, p. 85). They also note that "EDA is largely viewed as a rural economic development agency because many older cities and deteriorating suburbs have had difficulty accessing EDA's development and infrastructure programs" (Glasmeier & Markusen, 2008, p. 88). From there, they argue that the EDA should devote efforts to improving human capital, while identifying the agency's limitations in meeting that challenge, since it lacks the expertise required to work with other agencies that deal with educational issues. Therefore, coordinating with other agencies, redefining the policy itself, and defining the agency's constituency base are part of the reforms it needs in order for its performance to be understood and judged fairly.

Reforms:

Less rigorous analyses have been written on the EDA's need for reform. Mark Drabenstott and Jeffrey Finkle might be the most bitter critics of the EDA's policy. Both have written works that barely look at empirical evidence: Drabenstott's work on the EDA focuses more on policy recommendations than on empirical evidence to support his perspective and criticisms, and so does Finkle in a short essay of barely four pages.
In a Federal Reserve Bank of Kansas City publication entitled "Rethinking Federal Policy for Regional Economic Development" (2006), Drabenstott explains why the EDA is lagging behind economic progress. Generally speaking, he states that the EDA's policy must adapt to the needs of post-industrial economies. Drabenstott, who was at the time director of the Center for the Study of Rural America at the Federal Reserve Bank of Kansas City, has led the argument for reshaping EDA policy, basically by stating that "the problem is quite simple: most of the federal programs for economic development were written for the economy of the 20th century, not for the 21st century" (Drabenstott, 2006, p. 115). As Glasmeier and Markusen also noted, Drabenstott acknowledges that the EDA has placed heavy emphasis on infrastructure, housing, and airports. His proposal consists of a comprehensive regional strategy, a regional governance improvement program, and the bolstering of regional innovation initiatives and entrepreneurship (Drabenstott, 2008). For his part, Jeffrey Finkle, CEO of the International Economic Development Council, agrees with Glasmeier, Markusen, and Drabenstott that "for successful consolidation of these programs to occur, each program must undergo a review and performance evaluation, coordinate across departmental lines, and place more emphasis on human capital" (Finkle, 2008, p. 112).
   

Conclusions: 

Generally speaking, the EDA is seen as a small, unimportant agency within the economic establishment. That might be why the reviewed bibliography is developed with neither strong arguments nor passion and exhaustiveness. Certainly, economic policy focuses on monetary issues and public budget allocation, but scholars should devote more resources and effort to examining the agency's major projects. Considering the current economic recession, future research should look more closely at how the EDA has performed and helped the U.S. economy move forward during the last economic downturn. More research is definitely needed to clarify the role of public management in pursuing economic recovery. In that sense, a comprehensive understanding of the agency and of the role it should play during economic crises will contribute not only to the debate over policies but also to the debate over the nature of the public sector itself.
Specific conclusions: 
  1. Analyses of economic impact and agency outcomes are not up to date. The most recent analyses we had access to were published in the late 1990s, which leaves a gap of more than a decade without serious, data-driven discussion. In that regard, poverty alleviation theories should also frame future research, complementing the personal income and economic growth perspectives. That would help the agency reshape its policy.
  2. Management performance analysis must match previously set goals and objectives against the outcomes and impacts obtained. Our current knowledge in this regard is deficient, basically because economic growth does not depend only on the agency's work or policy. That is why it is important to limit the agency's scope when judging its performance.
  3. Although it may be a matter of ideology, the debate on EDA reforms requires more empirical evidence. Data-based discussion would improve not only the debate but also the policies the agency needs.
Bibliography: 
  1. Glasmeier, Amy, & Markusen, Ann (2008). Overhauling and Revitalizing Federal Economic Development Programs. Economic Development Quarterly, 22(2), 83-91.
  2. Barrows, R. L., & Bromley, D. W. (1975). Employment Impacts of the Economic Development Administration's Public Works Program. American Journal of Agricultural Economics, 57(1), 46-54.
  3. Glasmeier, A., & Wood, L. (2005). Analysis of US Economic Development Administration Expenditure Patterns over 30 Years. Regional Studies, 39(9), 1261-1274.
  4. Haughwout, A. F. (1999). New Estimates of the Impact of EDA Public Works Program Investments on County Labor Markets. Economic Development Quarterly, 13(4), 371-382.
  5. Finkle, Jeffrey A. (2008). A Cautious Look Into Reconfiguring Federal Economic Development Programs. Economic Development Quarterly, 22(2), 112-114.
  6. Drabenstott, Mark (2008). An Effective Overhaul of Federal Economic Development Policy: Response to Markusen and Glasmeier. Economic Development Quarterly, 22(2), 92-98.
  7. Drabenstott, Mark (2006). Rethinking Federal Policy for Regional Economic Development. http://geography.tamu.edu/class/bednarz/feddevpol.pdf
  8. Martin, R. C., & Graham, R. (1980). The Impact of Economic Development Administration Programs: Some Empirical Evidence. Review of Economics and Statistics, 62(1), 52-62.
  9. Burchel, R. (1997). Public Works Program: Performance Evaluation. Washington, DC: Economic Development Administration. http://www.eda.gov/ImageCache/EDAPublic/documents/pdfdocs/1g3_5f19_5fpweval_2epdf/v1/1g3_5f19_5fpweval.pdf
  10. Rostow, Walt Whitman (1962). The Stages of Economic Growth. London: Cambridge University Press.
  11. Hirschman, Albert (1958). The Strategy of Economic Development. New Haven, CT: Yale University Press.
  12. Rosenstein-Rodan, Paul (1961). "Notes on the Theory of the Big Push", in Ellis (Ed.), Economic Development for Latin America (1961).