Network Modeling
[Alderson-Li-Willinger-Doyle-2005 (doi)]
David Alderson, Lun Li, Walter Willinger, and John C. Doyle
“Understanding Internet topology: principles, models, and validation”,
IEEE/ACM Transactions on Networking, vol. 13, no. 6, December 2005, pp. 1205–1218
ResiliNets Keywords: network modeling
Keywords: Degree-based generators, heuristically optimal topology, network design, network topology, router configuration, topology metrics
Abstract: “Building on a recent effort that combines a first-principles approach to modeling router-level connectivity with a more pragmatic use of statistics and graph theory, we show in this paper that for the Internet, an improved understanding of its physical infrastructure is possible by viewing the physical connectivity as an annotated graph that delivers raw connectivity and bandwidth to the upper layers in the TCP/IP protocol stack, subject to practical constraints (e.g., router technology) and economic considerations (e.g., link costs). More importantly, by relying on data from Abilene, a Tier-1 ISP, and the Rocketfuel project, we provide empirical evidence in support of the proposed approach and its consistency with networking reality. To illustrate its utility, we: 1) show that our approach provides insight into the origin of high variability in measured or inferred router-level maps; 2) demonstrate that it easily accommodates the incorporation of additional objectives of network design (e.g., robustness to router failure); and 3) discuss how it complements ongoing community efforts to reverse-engineer the Internet.”
Notes:
[Doyle-Alderson-Li-Low-2005 (doi)]
John C. Doyle, David L. Alderson, Lun Li, Steven Low, Matthew Roughan, Stanislav Shalunov, Reiko Tanaka, and Walter Willinger
“The "robust yet fragile" nature of the Internet”,
Proceedings of the National Academy of Sciences, vol. 102, no. 41, 2005, pp. 14497–14502
ResiliNets Keywords: list
Keywords: complex network, HOT, Internet topology, network design, scale-free network
Abstract: “The search for unifying properties of complex networks is popular, challenging, and important. For modeling approaches that focus on robustness and fragility as unifying concepts, the Internet is an especially attractive case study, mainly because its applications are ubiquitous and pervasive, and widely available expositions exist at every level of detail. Nevertheless, alternative approaches to modeling the Internet often make extremely different assumptions and derive opposite conclusions about fundamental properties of one and the same system. Fortunately, a detailed understanding of Internet technology combined with a unique ability to measure the network means that these differences can be understood thoroughly and resolved unambiguously. This article aims to make recent results of this process accessible beyond Internet specialists to the broader scientific community and to clarify several sources of basic methodological differences that are relevant beyond either the Internet or the two specific approaches focused on here (i.e., scale-free networks and highly optimized tolerance networks).”
Notes: