Book contents
- Frontmatter
- Contents
- Preface
- List of abbreviations
- 1 A brief history of the Internet
- 2 How the Internet works
- 3 Measuring the global Internet
- 4 The Internet's large-scale topology
- 5 Modeling the Internet
- 6 Internet robustness
- 7 Virtual and social networks in the Internet
- 8 Searching and walking on the Internet
- 9 Epidemics in the Internet
- 10 Beyond the Internet's skeleton: traffic and global performance
- 11 Outlook
- Appendix 1 Graph theory applied to topology analysis
- Appendix 2 Interface resolution and router topology
- Appendix 3 Numerical analysis of heavy tailed distributions
- Appendix 4 Degree correlations
- Appendix 5 Scale-free networks: scaling relations
- Appendix 6 The SIR model of virus propagation
- References
- Index
6 - Internet robustness
Published online by Cambridge University Press: 12 January 2010
Summary
The Internet is composed of thousands of different elements – both at the hardware and software level – which are naturally susceptible to errors, malfunctions, or other kinds of failures, such as power outages, hardware problems, or software errors (Paxson, 1997; Labovitz, Ahuja, and Jahanian, 1999). Needless to say, the Internet is also subject to malicious attacks. The most common of these are denial-of-service attacks, which encompass a broad set of attacks aimed at a diversity of Internet services, ranging from the consumption of limited resources to the physical destruction of network components (C.E.R. Team, 2001). Given so many open chances for errors and failures, it may sometimes seem surprising that the Internet functions at all.
The design of a computer network resilient to local failures (either random malfunctions or intentional attacks) was indeed one of the main motivations for the original study of distributed networks by Paul Baran (1964). Considering the worst possible scenario of an enemy attack directed towards the nodes of a nationwide computer network, Baran analyzed the “survivability” (defined as the average fraction of surviving nodes capable of communication with any other surviving node) of the various network designs available at that time. His conclusion was that the optimal network, from the survivability point of view, was a mesh-like graph with a sufficient amount of redundancy in the paths between vertices. Even in the case of a severe enemy strike depleting a large number of components, such a network topology would ensure connectivity among the surviving computers by diverting communications along the ensemble of alternative paths.
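Baran's survivability measure can be illustrated with a small simulation. The sketch below is hypothetical (the `build_mesh` topology, a ring with random shortcut links, is only a stand-in for the redundant mesh designs Baran studied, not his actual networks): it removes a set of nodes to model an attack and reports the fraction of surviving nodes that can still reach one another, i.e. the size of the largest connected cluster of survivors relative to all survivors.

```python
import random

def build_mesh(n, extra_edges):
    """Hypothetical redundant mesh: a ring backbone plus random shortcuts."""
    adj = {i: set() for i in range(n)}
    for i in range(n):                      # ring backbone
        adj[i].add((i + 1) % n)
        adj[(i + 1) % n].add(i)
    for _ in range(extra_edges):            # redundant random links
        a, b = random.sample(range(n), 2)
        adj[a].add(b)
        adj[b].add(a)
    return adj

def survivability(adj, removed):
    """Fraction of surviving nodes in the largest connected cluster of
    survivors (a proxy for Baran's survivability measure)."""
    alive = set(adj) - set(removed)
    if not alive:
        return 0.0
    seen, best = set(), 0
    for start in alive:                     # BFS/DFS over each component
        if start in seen:
            continue
        seen.add(start)
        stack, size = [start], 0
        while stack:
            u = stack.pop()
            size += 1
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best / len(alive)

random.seed(1)
g = build_mesh(200, extra_edges=200)
attacked = set(random.sample(range(200), 60))   # remove 30% of the nodes
print(survivability(g, attacked))
```

With enough redundant links, the surviving nodes typically remain in a single large cluster even after removing a sizeable fraction of the network, which is the behavior Baran's mesh designs were chosen for.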
- Evolution and Structure of the Internet: A Statistical Physics Approach, pp. 112–139. Cambridge University Press, 2004.