Roxane Googin's predictions and the telecom world

Andrew Odlyzko
Digital Technology Center
University of Minnesota
http://www.dtc.umn.edu/~odlyzko

Revised version, February 20, 2002.

1. Introduction

The telecom crash that started in the year 2000 has been brutal to the traditional long distance carriers and devastating to the new carriers, whether in local or long distance segments of the market. The companies that have emerged almost unscathed are the established local carriers, the ILECs, which still have an effective monopoly on "first mile" connectivity. To many, they appear as the undisputed winners of the recent turmoil. Roxane Googin's interview with Gordon Cook (and her earlier, shorter one with David Isenberg) serves as a useful antidote to any temptation towards complacency on the part of ILEC managers or ILEC investors. She warns that ILECs might collapse, as their expensive infrastructure fails to hold its own in competition with new carriers deploying modern technology. Roxane Googin is surely right that ILECs are not immune to the pressure for change that new technologies are creating. Telecom business models will have to change, and the eventual winners may include none of the ILECs. Still, the ILEC situation does not appear dire, at least not yet, and there is time for corrective action. The main reason is that the drastic technology changes that some hope for, and others fear, will not occur on "Internet time." In particular, the local ILEC monopoly will take time and huge efforts to erode. The spectacular collapses of carriers such as Iridium, Winstar, Teligent, 360Networks, and Global Crossing were the result of the irrational overinvestment during the Internet bubble in companies that did not have solid business plans. In particular, they did not have a stable customer base, and were in markets with low barriers to entry. Neither of these applies to ILECs.
What follows are some brief comments and speculations on how the telecom industry might evolve. Historical analogies are drawn, especially with the evolution of the computer industry. They do suggest that telecommunications will also be moving towards a horizontal structure. The greatest opportunities for established carriers are likely to be in providing network outsourcing services. The changes required to exploit these opportunities will be major, and it is very likely that there will be huge bankruptcies and writedowns. However, the telecom industry, as measured by revenues, is likely to grow, as it has in the past. There will be greater diversity in available communication services, which should provide for a more resilient infrastructure, even when individual companies restructure.

2. Growth and rates of change

Telecommunications is a growth industry, and has been for centuries, as can be seen in the statistics in the manuscript "The history of communications and its implications for the Internet." The telecom crash was a crash only for the telecom suppliers, the result of an unsustainable explosion in capital expenditures. (That explosion, in turn, was stimulated primarily by the competition unleashed by the Telecommunications Reform Act of 1996 and by unrealistic views of how fast Internet traffic was growing and how quickly new technologies would be adopted, with an extra boost from Y2K spending thrown in.) Total revenues of service providers have been growing all along at rates of 5 to 10% a year for the whole sector, with some companies going out of business, while others boom. That rate of growth is very respectable, and, following historical precedent, faster than the growth rate of the economy as a whole. The crash occurred because even this fast growth rate was simply not sufficient to sustain the boom in capital expenditures. The growth of the entire telecom pie allows for relatively gentle transformations.
In particular, new services can grow without destroying old ones. That has been the historical pattern, and can be seen today. Let us consider some data for the United Kingdom. I am using this example because, unlike in the US, the UK regulator, Oftel, requires carriers to provide detailed statistics on usage of their networks. Here is some data extracted from the reports available from Oftel. The quarters listed are calendar quarters (not the British government fiscal quarters used in the reports), and the columns are as follows:

A = millions of minutes of outgoing calls from fixed phones
B = millions of minutes of outgoing voice calls from fixed phones (i.e., A excluding Internet access)
C = millions of minutes of outgoing cell phone calls
D = millions of SMS messages

             A        B        C       D
  1999q2   47220    36979     4956     159
  1999q3   50608    37590     5804     297
  1999q4   53786    38869     7092     599
  2000q1   56728    38806     7848    1306
  2000q2   58339    37783     8388    1421
  2000q3   62783    38237     9340    1648
  2000q4   68289    38536    10525    2215
  2001q1   73525    39349    11064    2758
  2001q2   71940    37166    10874    2762
  2001q3       ?        ?    11222    3069

(Note: Column B figures are derived from those in Column A by subtracting the volume of "Other" calls, as Oftel calls them. Hence Column B suffers from underestimates because it ignores the voice calls to directory assistance and other categories that are counted in "Other." Column B also suffers from overestimates because probably between 10% and 20% of voice call volume is for fax transmission. Thus the numbers in Column B are only approximations, but the trend they show is probably correct. A final remark is that, based on prior experience, all the figures for 2001q2, which show drops in usage, are likely to be revised upwards.)

Basically wireline voice usage has been pretty stable, while Internet access, cell phone usage, and SMS have all been booming. Note in particular that Internet access minutes have gone from 28% of voice minutes in 1999q2 to 94% just two years later.
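The ratios quoted above can be checked directly against the table; a minimal sketch in Python, taking Internet access minutes to be Column A minus Column B:

```python
# UK fixed-line minutes (millions), from the Oftel-derived table above:
# quarter -> (A = all outgoing fixed minutes, B = voice-only fixed minutes)
data = {
    "1999q2": (47220, 36979),
    "2001q2": (71940, 37166),
}

def internet_share_of_voice(quarter):
    """Internet access minutes (A - B) as a fraction of voice minutes (B)."""
    a, b = data[quarter]
    return (a - b) / b

# Internet access as a share of voice minutes: roughly 28% -> 94% in two years.
share_1999 = internet_share_of_voice("1999q2")   # ~0.28
share_2001 = internet_share_of_voice("2001q2")   # ~0.94

# Total wireline usage (Column A) grew about 52% over the same two years.
total_growth = data["2001q2"][0] / data["1999q2"][0] - 1   # ~0.52
```

Only the two endpoint quarters are included here; the intermediate quarters in the table follow the same trend.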
Thus the total wireline usage has increased by 52%. I do expect (and will discuss in more detail later) that voice usage will largely migrate to wireless links. However, by that time Internet access is likely to grow further, so this may not be devastating to wireline telephony.

A basic principle to keep in mind is that new technologies take time to diffuse. As one example, consider the statistics for the number of cell phones in the US (year-end figures; those for 2001 are estimates):

  year    millions of subscribers
  1986        0.7
  1991        7.6
  1996       42.8
  2001      115

So here we have something that many people consider indispensable (and for which they are paying, in aggregate, at least twice as much as they are paying for the Internet, in other words, voting with their pocketbooks for mobility over broadband), yet it took over 15 years to reach this stage. There is much public concern about the slow spread of broadband in the US. Yet, with the number of households with DSL or cable modems about tripling in 2001, that rate is quite fast, certainly by comparison with the spread of cell phones. Some countries (especially South Korea) have experienced much faster broadband penetration (with over half the households in South Korea having cable modems or DSL by the end of 2001), but then their prices have been much lower. For a service that costs $40-50 a month (comparable to the average cell phone bill), the US is not doing too badly in speed of broadband adoption. Lower price would certainly increase US penetration, but this price sensitivity just reinforces the point that the perceived utility of broadband is not all that great, at least not yet. At a low enough price, people will buy just to get a modestly improved performance with email. At the higher price that prevails in the US, consumers have to be convinced that the higher bandwidth and the always-on capability provide real value, and that takes time.
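The diffusion speed in the cell phone table can be quantified with a compound annual growth rate; a quick sketch from the year-end figures above:

```python
# Year-end US cell phone subscribers (millions), from the table above.
subscribers = {1986: 0.7, 1991: 7.6, 1996: 42.8, 2001: 115.0}

def cagr(start_year, end_year):
    """Compound annual growth rate between two year-end subscriber counts."""
    years = end_year - start_year
    return (subscribers[end_year] / subscribers[start_year]) ** (1 / years) - 1

# Even at roughly 40% compound annual growth, reaching 115 million
# subscribers took a decade and a half -- diffusion is not "Internet time."
overall = cagr(1986, 2001)   # ~0.405, i.e. ~40% per year over 15 years
late = cagr(1996, 2001)      # ~0.22 -- growth slows as the market matures
```

The slowdown in the final five years is the usual flattening of an adoption curve, which is part of why even a "fast" technology takes on the order of a decade to become widespread.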
In a 1997 paper entitled "The slow evolution of electronic publishing," I argued that this speed of diffusion is rather typical, that "Internet time" is a myth, and that new technologies still take on the order of a decade to be widely adopted, just as the "grandfather of the Internet," J. C. R. Licklider, had observed. (A shorter piece on this theme, "The myth of Internet time," was published in 2001.) That thesis is overwhelmingly supported by available data, as well as by numerous projections for the near future. For example, the January 2002 Forrester report, "Sizing US consumer telecom," by Charles Golvin with D. M. Cooperstein, G. J. Scaffidi, J. Schaeffer, and A. van Giffen, predicts that by 2006, 20 million circuit lines in the US will be dropped as a result of usage of wireless, broadband, and packet technologies. Yet, because of natural growth, this means that there will still be 116.1 million residential lines in 2006, as opposed to 126.4 million in 2001. While the actual numbers are bound to be different, the magnitude of the shift seems about right. A drop of 8% of the lines in 5 years is not negligible, but not catastrophic. Of course, in a low marginal cost business such as telecommunications, even a drop of 8% is noticeable, and would surely be reflected in stock market valuation of the companies. (It would also likely be followed by much faster deterioration, as a snowball effect takes hold.) The main issue is whether some compensating sources of revenues can be developed in the meantime. New technologies, such as wavelength switching, are on the horizon, and will surely transform the industry. However, that will take time. Right now, only a few of the largest ISPs can utilize wavelengths productively. Even very large and sophisticated enterprises, such as universities, typically have at most DS3 (45 Mb/s) or OC3 (155 Mb/s) links to the commercial Internet. Allowing them to direct OC48 to specific destinations is of little use now.
Eventually their demands will grow, but not right away. At the moment, though, lambdas are relevant primarily at the carrier level, where they are indeed contributing (and have contributed already) to the drastic restructuring in long distance transport. Roxane Googin is right to point out that debts can bring a carrier down. Leverage is great on the way up, but deadly when a business is contracting. However, we should keep in mind that the ILEC debt burden is not overwhelming. Global Crossing had debts of more than twice its annual revenues. AT&T until recently had debts about equal to its annual revenues. Qwest, a hybrid of a new long distance carrier and an ILEC, has long-term debt about equal to its annual revenues. On the other hand, SBC, to take just one ILEC example, has long-term debts of under 40% of annual revenues. Furthermore, ILEC revenues are largely shielded from the fierce competition that prevails in the long distance segment of the industry. That provides more security and freedom of action. Whether the ILECs will use this well is another question, of course.

3. The mysterious economics of telecoms

The telecommunications industry is extremely complex. What is beyond dispute, though, is that most of the costs are at the edges of the network. Furthermore, technology over the last two decades has been shifting the balance of costs even further towards the edges than before. The main reason the established long distance carriers have been suffering is that their consumer voice business is not viable. When the Bell System was broken up, long distance transport was still expensive. However, for several years now wholesale prices for long distance voice calls in the US have been under a penny a minute (reflecting lower costs). Those costs and a competitive marketplace are not compatible in the long run with consumer prices that are 10 times as high.
It was inevitable that for ordinary customers, long distance was eventually going to be provided by their local carriers on a flat rate basis. The decline in long distance transport costs that doomed consumer long distance voice carriers predated the Internet. However, the Internet helped speed up the transformation we are witnessing by leading to the wild overbuilding of long distance transport and by accelerating technical advances. The costs on the Internet are even more heavily concentrated at the edges, especially if one considers the Internet as just a component of the entire IT infrastructure. (See the papers "The economics of the Internet: Utility, utilization, pricing, and Quality of Service," and "Smart and stupid networks: Why the Internet is like Microsoft.") The costs on the edges, in contrast to those in long haul, have been decreasing slowly. The electronics (and optics) have been getting less expensive, but the labor has not, and in particular the costs of the final few hundred feet of a connection to the network have been remarkably stable and remarkably high. Is the "first mile" a natural monopoly? That is what the failure of the CLECs has led many observers to conclude. Yet there are some contrary indicators. After all, most households do have three separate communication systems, the copper-based one from their ILEC, a coax-based one from their cable TV provider, and a cell phone from a wireless carrier. Thus a much deeper look is needed to understand what is going on, far beyond the scope of this note. A key factor, though, is that change is slow but inevitable. Hence a static analysis of technology choices, without taking into account how quickly consumers are likely to move, is bound to be inadequate. In particular, whether DSL or cable is better technically is not overwhelmingly important. Both are capable of speeds in the hundreds of megabits per second in both directions over short distances.
Further, the relevant distances will be getting smaller as fiber is pulled closer and closer to the home by both cable and DSL carriers. Eventually fiber will go all the way into the home. Whoever manages to accomplish this will then likely have a true natural monopoly, with the ability to increase the bandwidth of the connection at low cost. Victory in this race to bring fiber to the home will likely depend mostly on the strategies and tactics of the competing players, and not so much on technology. The arcane economics of local connectivity can be seen also by considering capital costs and stock market valuations. The cell phone industry has been making capital expenditures of around $1,000 per subscriber, while copper or coax connections (with the associated central office and related facilities) to add additional subscribers run about twice as high. In the financial markets, wireless carriers are valued at about $2,000 per subscriber (adding stock market valuation and debt). ILECs are valued at $2,000 to $3,000 per line, while cable TV networks are valued at $4,000 to $5,000 per subscriber. These valuations, much higher than the cost of replacing the assets of these businesses, all may very well be too high, as Roxane Googin warns. What these valuations do is reflect the inertia in the system, the difficulty of building a competing carrier and attracting customers without destroying the economic basis of the business. (This applies also to general valuations of US businesses, for which the Tobin Q ratio has been above 1 for several years, although it is not as high as it is in telecoms.) Investors may be overestimating this inertia, and the ability of managers to exploit it profitably, but so far their faith has been justified. While change has been slow, it is likely to accelerate. The huge glut of fiber in long haul means that long distance transport is becoming easily available at a small cost. This means that one can build viable local carriers.
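The gap between the per-subscriber capital costs and the market valuations quoted above can be read as a rough Tobin-Q-style ratio. A sketch, with the quoted ranges collapsed to midpoints for illustration (the midpoints are my simplification, not figures from the text):

```python
# Per-subscriber capital cost and market valuation (US dollars), using
# the figures cited above; ranges are reduced to midpoints for illustration.
per_subscriber = {
    # segment: (cost to add a subscriber, market valuation per subscriber)
    "wireless": (1000, 2000),   # ~$1,000 capex; valued ~$2,000
    "ILEC":     (2000, 2500),   # wireline ~2x wireless capex; valued $2,000-3,000
    "cable":    (2000, 4500),   # wireline ~2x wireless capex; valued $4,000-5,000
}

def implied_q(segment):
    """Valuation divided by replacement cost -- a rough Tobin-Q analogue."""
    cost, valuation = per_subscriber[segment]
    return valuation / cost

# Every segment trades well above replacement cost (ratio > 1), with cable
# the most extreme -- a measure of the premium investors assign to inertia.
ratios = {s: implied_q(s) for s in per_subscriber}
```

The point of the exercise is only that all three ratios exceed 1, which is what makes the valuations vulnerable if the inertia erodes faster than investors expect.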
We see growing ranks of companies that expand slowly, without being seduced by the myth of "Internet time" into the huge upfront investments that a national footprint requires. Further, there is fast growth in availability of fiber in the metro. It often still takes a year to install fiber to a commercial building, but it is being done. Also, data transport revenues are growing much faster than those for voice, as they have done for decades, but they are still far smaller, and will take some time to catch up.

4. Competition in local access

In 1996, the US passed the Telecommunications Reform Act that was expected to create vigorous competition in all segments of communications. Six years later, after the telecom boom and bust, the entrenched ILECs appear to be even more securely entrenched, apparently the sole winners to emerge from all the turmoil. The reformers are beginning to concede that the "first mile" is a natural monopoly, and are being driven to consider remedies such as separating the ILECs into monopoly wholesale providers of basic network connectivity and retail service providers that alternative carriers might have a hope of competing with. However, given the long lead times that US legislative and judicial systems operate on, especially where drastic changes are involved, there is very little hope of any such transformation being forced through in less than 5 years, and potentially it could take much longer. Even without any ILEC separation, though, we are likely to see several serious challenges to the ILEC monopoly emerge in the next half a dozen years. The main challenge is likely to come from diversion of voice traffic to cell phones. As is shown in the UK statistics cited above, wireless phone usage is under a quarter of wired phone usage. (Similar estimates apply in the US, although the data there is not as complete.)
Already a substantial fraction of the population regards cell phones as their primary communication tool, and that fraction is likely to increase. My view is that this increase will become striking as soon as the wireless industry increases the available bandwidth (through deployment of 2.5G and 3G technologies), gives up its preoccupation with the "mobile Internet," and concentrates on increasing voice usage. (This view is expounded in detail in the paper "Content is not king," published in February 2001 in First Monday, and in a shorter note in Forbes in August 2001, entitled "Talk, Talk, Talk: So who needs streaming video on a phone?") Policy makers who are interested in promoting competition could help this move along by forcing those ILECs that have not yet done so to completely sever their ties with cellular carriers. This would be a much simpler move, both technically and politically, than the separation of the wireline industry that is widely discussed. Competition from cellular carriers for voice is likely to force ILECs to concentrate on exploiting their natural advantage in bandwidth, and to emphasize Internet access. (Note again the UK statistics, where Internet access traffic on the voice network is fast approaching that of voice itself, especially since the latter figure includes some modem and fax traffic.) This will likely also force them to emphasize broadband, as a way to segment the market, and to create a natural progression path for their customers, towards higher and higher bandwidth. ILECs will also be forced to emphasize broadband by competition from cable TV companies. So far the cable and phone industries have been coexisting amicably without serious competition. However, that is likely to change. Cable is likely to be the aggressor. The reason is that connectivity is much more important than content (see "Content is not king" for supporting data and arguments), and there is much more money in it.
Thus cable has much more to gain. Furthermore, the prices that are being paid for cable networks can only be justified on the assumption that a large bundle of services will be sold to customers. (You can't recover a $4,500 price per subscriber if you only get $45 per month for entertainment, and a third of that has to be spent to buy the content you have to provide.) Thus we are likely to see vigorous competition developing, which will be bringing fiber closer and closer to the home, and eventually into the home. A third important force threatening the ILECs is fixed and nomadic wireless access. As we have seen with the spread of cordless phones, the combination of low cost with mobility and simplicity is irresistible. For broadband communication, wireless distribution inside the house is especially attractive, given the costs of alternatives. Depending on how issues of scaling and business models are resolved, this approach could extend to a much wider communication network. So far wireless data communication has been a great disappointment, and the 802.11 standards are the first ones that appear to be gaining wide acceptance. As wireless data spreads, and especially as voice transport capability is provided (which is essential, since that is where the money is, and will continue to be for quite a while), we might see it offering serious competition to DSL and cable modem systems. What is especially attractive about wireless communication is that at least potentially it scales with the size of the population that is served, and so avoids the huge upfront costs of wireline infrastructure. In summary, ILECs will be exposed to competition in the residential segment from three sources, cellular, cable, and fixed and nomadic wireless, and in business from those three as well as wireline CLECs. Thus Roxane Googin's warnings should not be taken lightly.

5. Computer and telecom industry evolution

The traditional model for the telecom industry has been that of a carrier that provides all the equipment and systems and sells a service. This model has been fading for a long time, with customers buying their own phone sets, fax machines, and PBXs. We are likely to see a continued shift towards customers owning more and more of the facilities, including fiber (or wavelengths). However, that is not the full story. With ownership comes complexity (see the paper "Smart and stupid networks ..." referenced above), which will likely lead to a different type of service-oriented industry. A comparison with the computer industry is likely to be productive, especially since communication, information, and computation are becoming increasingly interrelated. The general trend in the modern economy has been away from vertical integration and towards horizontal integration. This trend is consistent with what one would expect in view of Coase's theory of the firm, as improved communication makes it possible to use market mechanisms between different layers, and at the same time makes coordination easier. Among the main advantages of a division into horizontal layers, with separate companies operating in each layer, is that it provides a method of handling complexity and rapid change. It is worth noting that such separation was already visible inside the vertically integrated enterprises. For example, in telecommunications, the wireline carriers have for quite a long time had essentially separate divisions handling the optical network and the voice and data networks that rode on top of it and were the only ones visible to most customers. The computer industry is a prime example of the trend away from vertically integrated enterprises. A standard example of such an enterprise was IBM in the 1960s through much of the 1980s. It developed and produced its own processors, memories, hard disks, operating systems, application software, and so on.
The current paradigm is that of horizontal players, in which Micron produces memory chips, Intel microprocessors, Microsoft the operating system and basic applications, Dell assembles the components and sells them, etc. The most successful player in this game is Microsoft. In the computer industry, everybody wants to be Microsoft, but there is only one Microsoft, and there cannot be too many Microsofts. (There simply aren't enough sufficiently large niches in the economy that can be dominated in a similar way and which can yield comparable profits. Perhaps the most ludicrous aspect of the Internet bubble was that hundreds of the startups were all expected to attain profits on the scale of Microsoft's.) Microsoft has annual revenues approaching $30 billion, after-tax profits approaching $12 billion, $38 billion in cash, and a stock market valuation (as of Feb. 8, 2002) of $330 billion. It has attained this enviable financial position by dominating the desktop software market, and appropriating the lion's share of the profits in PC software. It has been careful to stay largely within a single horizontal layer of the industry, remaining a modest part of it, so that it benefits from the contributions of many other (much less profitable) players. One pertinent comment is that Microsoft's dominance of the computer industry was not a foregone conclusion. It was predictable (and predicted by many clear thinkers) that PCs were going to dominate. However, it was not inevitable that control of the most popular operating system would lead to control of the industry. Some people projected that applications were going to be the dominant layer, and it is possible that Lotus might have been able to build up its 1-2-3 spreadsheet into a suite of products that might have controlled the industry. (If this seems improbable, recall the huge effort Microsoft put into squashing Netscape.
Clearly Microsoft viewed the upstart as a serious threat, potentially able to supplant Microsoft's dominant platform of the Windows operating system with another one, based on the browser, even at that late stage.) The conclusion from this discussion is that discerning broad technology trends does not necessarily enable one to determine who the winners will be. Much depends on individuals and strategies. In communications, the trend towards a horizontal structure of the industry also appears inevitable. Who will adapt to it best is uncertain. The natural division would seem to be into providers of physical connectivity, providers of basic data transport, and outsourcers, who manage customer networks. The established long distance carriers, with their expertise in serving large enterprise customers, would seem to be best positioned to dominate in basic data transport and outsourcing. On the negative side, they have heavy debt loads and also the (rapidly shrinking) legacy of the consumer long distance business as a burden. The ILECs have the financial resources to attempt moving into outsourcing, but lack the expertise, and would require a wrenching cultural change to adapt. Their natural niche seems to be in providing physical access and basic data transport, although even there, one could have separation into several layers. (The separation that reformers would like to force on the ILECs for the sake of creating competition may be sound business strategy.) Can anyone duplicate Microsoft's feat and occupy a dominant position? The lofty valuations accorded to many startups during the bubble (both in communications and other areas) seemed to be based on blind and irrational faith that they would all become like Microsoft. That was absurd. There aren't enough opportunities like that seized by Microsoft, and today, when everybody is aware of what such dominance means, there are many fewer opportunities to repeat Microsoft's maneuvers.
In particular, in communications, there does not seem to be much hope of duplicating Microsoft's feat. Cisco comes closest, but even its position seems to be fragile, in that the network effects in routers are not as strong as in PC software. Juniper showed that Cisco's dominance in the core of the network can be seriously dented, and it would not be a surprise if others started to make inroads at the edges as well. Some parts of the telecom world appear destined for commoditization, especially long distance transport. The Internet at its core is a "stupid network," as David Isenberg has called it. With technological progress and the fierce competition created by the massive overinvestment in that area, backbone transport is not and will not be all that big a piece of the telecom pie. Now there is nothing wrong with a commodity business (just consider Dell in PCs, or, for that matter, the oil companies). It can be profitable, and Internet backbones are likely to be profitable, once we go through a period of consolidation. However, there will be room for far fewer players than are contending in that market now, and the revenues and profits in this sector are not likely to be large. Local access is likely to be a large part of the industry. The costs are going to remain high, and economies of scale will limit competition. Hence revenues are going to be large, and profits are likely to be substantial. However, the potential for exorbitant profits will be slim, both because of regulation and growing competition. The greatest growth opportunities in telecommunications are likely to lie in services. As was mentioned above, the "stupid network" metaphor for the Internet ignores the huge costs at the edges of the network, costs that are often not explicit, since they involve time and aggravation for the customers. This again mirrors what happened in the computer industry. 
It again brings up the example of IBM, a company that is in the process of transforming itself from a vertically integrated producer of computer systems to a service company, running other companies' IT operations. IBM has about three times the revenues of Microsoft, two thirds of the profits, a lower growth rate, and a stock market valuation about 55% that of Microsoft. However, it is regarded as a remarkable success story, and its stock price has not been devastated to the extent that shares of most of the other computer companies have been. IBM apparently sees its future in dominating enterprise software integration, thus operating in a horizontal layer different from that of Microsoft. (It has given up the fight for the desktop with Microsoft that it carried on for a long time with OS/2, but supports Linux, presumably largely to keep Microsoft from exploiting its monopoly. Similarly, continued development of its own hardware serves to keep Intel's dominance in microprocessors in check.) The IBM story is instructive for several reasons. One is that change has taken a long time. Gerstner, the architect of the evolution, arrived in 1993, and it is only recently that services have started to provide more than half of IBM's revenues. Another is that this strategy is being emulated by other computer companies. In particular, the proposed merger between HP and Compaq is supposed to allow those enterprises to follow in IBM's footsteps better. Finally, the IBM story shows that it is possible for a large company, with a monolithic culture, to transform itself. While there are many enterprises that failed to adapt to a new environment (such as DEC and Wang), IBM shows that required changes are possible. The conclusion is that the long distance carriers should aim to model themselves after IBM. The most promising area for them is to manage networks that are largely owned by their customers. 
This will be a huge change, but the IBM example shows that it is possible, and also that there is time to do it. The ILECs might be tempted to follow in this same direction, but are less likely to succeed, and may have to resign themselves to operating at lower levels of the networking hierarchy. However, there is likely to be enough opportunity for them even there to thrive.

Acknowledgement: I thank Reed Burkhart and Charlie Sands for their comments on earlier drafts of this essay.

Addendum: Some general observations and speculations about long-haul telecommunications

1. There is a fiber glut in the long-haul area, not just the perception of a fiber glut. This is clear from what is happening to prices for transmission, the bankruptcies of a whole flock of carriers, as well as from some simple calculations. If one looks at the estimates for Internet traffic on Internet backbones, say using the numbers in the Larry Roberts (Caspian Networks) studies, some of which I do question, but which do seem to be of the right order of magnitude, we see that it would take just a few strands to carry all of it from coast to coast using state-of-the-art equipment. Similarly, if we look at the statistics on the number of core router ports sold, we see that they cannot be using more than a small fraction of the fiber capacity. Only a small fraction of the fiber in the ground is lit. I have not seen any reliable statistics on this, but it is pretty clear that expenditures on optical equipment would be far higher otherwise. For another view, with the same conclusion, let us look at the supply and demand factors over the last 4 years, from year-end 1997 to year-end 2001. (a) At the level of fiber itself, at the end of 1997, there were (according to FCC figures, which appear to have provided pretty good coverage through the end of 1998) around 3.4 million fiber miles in the long distance market (of which under half was lit).
By the end of 2001, it appears that someplace between 15 and 20 million fiber miles were in place, growth by a factor of between 4 and 6. Furthermore, DWDM has expanded the capacity of each fiber. Back in 1997, most of the lit fiber was running at no more than 1.7 Gb/s. Today we have a huge variety of systems, but just to be ultra-conservative, if we were to install just the 40-wavelength 2.4 Gb/s systems (far below state-of-the-art today), we would have more than a 50-fold increase in the capacity of each fiber. Thus we have increased the potential capacity of the long-haul system (if it were to be lit with reasonably run-of-the-mill equipment) by a factor of several hundred. (b) On the other hand, let us consider the sizes of networks that actually are used to carry traffic for customers (thus ignoring dark fiber, as well as the spare capacity in SONET rings, and various other factors). According to the estimates in my 1998 paper with Kerry Coffman, "The size and growth rate of the Internet," at year-end 1997, only about 10% of the network at this level was used for Internet traffic, with about 45% each for voice and for private line. The tremendous buildout of the last few years was the result of two fundamental mistakes. One was to assume astronomical and unrealistic growth rates for Internet traffic (the mythical "doubling every three or four months"). The other was to compound that mistake by extrapolating this unrealistically high growth rate to the entire network. Yet it was only the 10% Internet piece that was growing really fast, namely about doubling each year. A doubling of the Internet piece every year, plus growth of 10% per year in voice, and around 30-40% in private line (plus ATM and Frame Relay networks, which I am throwing into private line here just for simplicity), means that the network did not have to grow by more than about 4x over those 4 years, from year-end 1997 to year-end 2001.
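The supply and demand arithmetic above can be checked with a quick back-of-the-envelope calculation. This is only a sketch using the approximate figures quoted in the text (17.5 million fiber miles is the midpoint of the 15-20 million estimate, and 35% is the midpoint of the 30-40% private line growth range); the true numbers are of course uncertain:

```python
# Back-of-the-envelope check of the long-haul supply/demand estimates,
# using the approximate figures from the text.

# Supply side, year-end 1997 to year-end 2001:
fiber_mile_growth = 17.5 / 3.4        # ~15-20M fiber miles in 2001 vs ~3.4M in 1997
old_per_fiber_gbps = 1.7              # most lit fiber in 1997 ran at no more than 1.7 Gb/s
new_per_fiber_gbps = 40 * 2.4         # conservative 40-wavelength DWDM at 2.4 Gb/s
per_fiber_growth = new_per_fiber_gbps / old_per_fiber_gbps
supply_growth = fiber_mile_growth * per_fiber_growth
print(f"per-fiber capacity up ~{per_fiber_growth:.0f}x")   # more than 50-fold
print(f"potential supply up ~{supply_growth:.0f}x")        # a factor of several hundred

# Demand side: 1997 traffic shares and assumed annual growth over 4 years.
shares = {"internet": 0.10, "voice": 0.45, "private_line": 0.45}
growth = {"internet": 2.00, "voice": 1.10, "private_line": 1.35}  # per-year multipliers
demand_growth = sum(shares[k] * growth[k] ** 4 for k in shares)
print(f"network demand up ~{demand_growth:.1f}x")          # about 4x
```

The mismatch is the point: potential supply grew by a factor of several hundred while the network only needed to grow about 4x, which is why a glut was unavoidable.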
(Even today, when Internet traffic has just exceeded, or is just about to exceed, the combined traffic on private line and voice networks, the growth rate of the entire network is probably closer to 50% than to 100% per year, even though the Internet itself appears to be on the 100% growth path.) There are a few other factors to consider, such as distance dependence and redundancy, but they are not likely to be major, and are very technical, so I won't go into them here. The one thing to note is that utilization of lit fiber is not a useful metric to consider. Lighting fiber costs quite a bit, so it is not done unless there is demand for extra capacity. The general conclusion is that far more fiber was deployed in the long-haul market than was necessary, too much by a factor of at least 10. (The same conclusion does not apply to the metro area, however.) Thus in principle there is no need to deploy any more fiber in long-haul for 5 or more years. What will actually happen, though, is another issue, as the industry will be trading off deployment of new fiber against deployment of equipment that can use existing fiber more intensively. 2. The prospects for the next couple of years are extraordinarily murky. The problem is that we have not just technology trends and the basic growth rate in demand for transmission (which is still high, approaching the roughly 100% per year that still seems to hold for the Internet) to contend with, but also the huge fiber glut (with smaller gluts of routers, etc.), and the dynamics of the financial markets and bankruptcy courts. Thus while I was confident a few years ago in predicting disaster, on the grounds that demand was not going to materialize for the supply that was being built, I don't feel I can predict how things will play out in the next 2-4 years. There are certainly many interesting possibilities for exploiting the existing fiber glut, especially as more fiber is deployed in the metro.
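The parenthetical claim about the growth rate of the whole network follows from the same kind of traffic-share-weighted average. The shares assumed below are purely illustrative (Internet at roughly half of total traffic, the rest split between voice and private line), not figures from the text:

```python
# Blended growth rate of the whole network, as a traffic-share-weighted
# average of per-segment growth rates. Shares here are illustrative only.
shares = {"internet": 0.50, "voice": 0.25, "private_line": 0.25}
growth = {"internet": 1.00, "voice": 0.10, "private_line": 0.35}  # annual growth rates
blended = sum(shares[k] * growth[k] for k in shares)
print(f"whole-network growth ~{blended:.0%} per year")
```

Even with the Internet half of all traffic and doubling annually, the slower-growing voice and private line segments pull the blended rate down toward 50-60% per year, which is the point of the parenthetical remark.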
Here are just some thoughts: (a) The fiber glut is a done deal, a sunk cost. Since fiber does not deteriorate much with age and use, though, it makes sense to use it. This means that we may see a lot of enterprises leasing dark fiber and lighting it with inexpensive (often second-hand) equipment, say running a single OC12 on a fiber strand that is theoretically capable of carrying 80 OC192s. (I have heard anecdotal evidence that some of this is happening, for example for enterprise database mirroring.) Thus statistics on lit fiber and the like will be very tricky to obtain and interpret. (b) The telecom supplier sector in general may revive sooner than many expect. (This may not apply to the fiber segment, though, because of the long-haul fiber glut.) Many observers look at capex statistics and conclude that the telecom industry is getting back to its traditional pattern of spending around 15% of revenues on capex. The approximately 30% of revenues spent on capex in 2000 was clearly an aberration that we are unlikely to see again. However, my guess is that capex will be higher than the traditional 15%, because technology is advancing more rapidly. Instead of switches that are good for a decade or two, we have routers that are obsolete in 3 to 5 years. The fact that capex has not collapsed even further over the last year or so, in spite of the glut of fiber and other capital equipment on the market, may mean that the service companies are already unable to resist the pressure to upgrade. In the long run, capex is not going to grow faster than service revenues (which, by historical precedent, are unlikely to grow more than 8 to 10% per year), but there could be a few years of ramping up as capex increases from 15% to 20% of revenues, or even a bit higher. A related factor (at least in long-haul, probably much less so in metro, but I would really like to see some data on this) is that as time goes on, capex is likely to tilt more towards high-tech.
Much of the early expenditure was for trenching, putting up huts for regeneration, and so on. Now that this is done, most of the expenditure will likely be for electronic and optical equipment, which is good news for the telecom supplier segment. (But this is probably not true in metro.) (c) If the prediction in (b) about increased capex as a fraction of revenues is realized, we will see a greater emphasis on simplicity, to lower operational expenses. This would go somewhat counter to the current trend, in which, to improve their financials, carriers are clamping down on capex and pushing for higher utilization rates. The smart thing to do in the long run is to throw capex at the problem and eliminate labor (as in getting rid of SONET, etc.). This would mean lighting as few fibers as possible, at speeds as high as possible (which, though, would go against the trend towards wavelength switching). (d) Point (c) above suggests that demand for fiber might decline. However, there is another factor that cuts the other way. Namely, the fiber glut we have is also accompanied by a conduit glut. Most of the recent fiber deployments involved putting down several empty conduits in addition to the main one (which was sometimes not fully filled, either). This means that if considerably improved fiber becomes available, it could be deployed inexpensively, and pressure to reduce opex might lead to such moves.
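The capex arithmetic in point (b) can be made concrete. This is a sketch under assumed numbers, not figures from the text: 9% annual revenue growth (the middle of the 8-10% range) and a capex/revenue ratio ramping from 15% to 20% over three years:

```python
# Implied annual capex growth while the capex/revenue ratio ramps from
# 15% to 20% over three years and service revenues grow ~9% per year.
revenue_growth = 1.09                       # assumed annual revenue multiplier
ratio_growth = (0.20 / 0.15) ** (1 / 3)     # annual rise in the capex/revenue ratio
capex_growth = revenue_growth * ratio_growth - 1
print(f"implied capex growth ~{capex_growth:.0%} per year during the ramp")
```

During such a ramp, capex would grow at roughly twice the rate of service revenues, which is why the supplier sector could revive even while the revenue side grows at only its historical single-digit pace.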