Global Networks Must Be Re-Designed
The increasing interdependencies between the world’s technological, socio-economic, and environmental systems have the potential to create global catastrophic risks. We may have to re-design many global networks, concludes Professor Dirk Helbing of ETH Zurich’s Risk Center in this week’s issue of Nature. Otherwise, they could turn into “global time bombs”.
Living in a Hyper-Connected World
Our global networks have generated many benefits and new opportunities. However, they have also established highways for failure propagation, which can ultimately result in man-made disasters. For example, the rapid spread of emerging epidemics today is largely a result of global air traffic, with serious impacts on global health, social welfare, and economic systems.
Helbing’s publication illustrates how cascade effects and complex dynamics amplify the vulnerability of networked systems. For example, just a few long-distance connections can greatly reduce our ability to mitigate the threats posed by global pandemics. Initially beneficial trends, such as globalization, increasing network densities, higher complexity, and an acceleration of institutional decision processes, may ultimately push man-made or human-influenced systems towards systemic instability, Helbing finds. Systemic instability refers to a system that will get out of control sooner or later, even if everybody involved is highly skilled, highly motivated, and behaving properly. Crowd disasters are shocking examples illustrating that many deaths may occur even when everybody tries hard not to hurt anyone.
Our Intuition of Systemic Risks Is Misleading
Networking system components that are well-behaved in isolation may create counter-intuitive emergent system behaviors, which are not well-behaved at all. For example, cooperative behavior might unexpectedly break down as the connectivity of interaction partners grows. “Applying this to the global network of banks, this might actually have caused the financial meltdown in 2008,” believes Helbing.
Globally networked risks are difficult to identify, map, and understand, since there are often no evident, unique cause-effect relationships. Failure rates may change depending on the random path taken by the system, with the consequence that risks increase as cascading failures progress, thereby decreasing the capacity of the system to recover. “In certain cases, cascade effects might reach any size, and the damage might be practically unbounded,” says Helbing. “This is quite disturbing and hard to imagine.” All of these features make strongly coupled, complex systems difficult to predict and control, such that our attempts to manage them go astray.
“Take the financial system,” says Helbing. “The financial crisis hit regulators by surprise.” But back in 2003, the legendary investor Warren Buffett warned of mega-catastrophic risks created by large-scale investments in financial derivatives. It took five years until the “investment time bomb” exploded, causing losses of trillions of dollars to our economy. “The financial architecture is not properly designed,” concludes Helbing. “The system lacks breaking points, as we have in our electrical system.” This allows local problems to spread globally, thereby reaching catastrophic dimensions.
A Global Ticking Time Bomb?
Have we unintentionally created a global time bomb? If
so, what kinds of global catastrophic scenarios might humans face in complex
societies? A collapse of the
world economy or of our information and communication systems? Global
pandemics? Unsustainable growth or environmental change? A global food or
energy crisis? A cultural clash or global-scale conflict? Or will we face a
combination of these contagious phenomena – a scenario that the World Economic
Forum calls the “perfect storm”?
“While analyzing such global risks,” says Helbing, “one must bear in mind that the propagation speed of destructive cascade effects might be slow, but nevertheless hard to stop. It is time to recognize that crowd disasters, conflicts, revolutions, wars, and financial crises are the undesired result of operating socio-economic systems in the wrong parameter range, where systems are unstable.” In the past, these social problems seemed to be puzzling, unrelated, and almost “God-given” phenomena one had to live with. Nowadays, thanks to new complexity science models and large-scale data sets (“Big Data”), one can analyze and understand the underlying mechanisms that let complex systems get out of control.
Disasters should not be considered “bad luck”. They are a result of inappropriate interactions and institutional settings, caused by humans. Even worse, they are often the consequence of a flawed understanding of counter-intuitive system behaviors. “For example, it is surprising that we didn’t have sufficient precautions against a financial crisis and well-elaborated contingency plans,” states Helbing. “Perhaps this is because there should not be any bubbles and crashes according to the predominant theoretical paradigm of efficient markets.” Conventional thinking can cause fateful decisions and the repetition of previous mistakes. “In other words: while we want to do the right thing, we often do the wrong things,” concludes Helbing. This obviously calls for a paradigm shift in our thinking. “For example, we may sanction deviations from social norms to promote social order, but may trigger conflict instead. Or we may increase security measures, but get more terrorism. Or we may try to promote innovation, but suffer economic decline, because innovation requires diversity more than homogenization.”
Global
Networks Must Be Re-Designed
Helbing’s publication explores why today’s risk analysis falls short. “Predictability and controllability are design issues,” stresses Helbing. “And uncertainty, which means the impossibility of determining the likelihood and expected size of damage, is often man-made.” Many systems could be better managed with real-time data. These would allow one to avoid delayed responses and to enhance the transparency, understanding, and adaptive control of systems. However, even all the data in the world cannot compensate for ill-designed systems such as the current financial system. Such systems will sooner or later get out of control, causing catastrophic man-made failures. Therefore, a re-design of such systems is urgently needed.
Helbing’s Nature paper on “Globally Networked Risks” also calls attention to strategies that make systems more resilient, i.e. able to recover from shocks. For example, setting up backup systems (e.g. a parallel financial system), limiting system size and connectivity, building in breaking points to stop cascade effects, or reducing complexity may be used to improve resilience. In the case of financial systems, there is still much work to be done to fully incorporate these principles.
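To make the “breaking points” idea tangible, here is a minimal, hypothetical Python sketch (my own toy construction, not a model from the paper): failing nodes in a chain shed their load to their neighbours, and an optional fuse threshold disconnects a link instead of passing the overload on.

```python
# Toy illustration of a "breaking point" (fuse) halting a cascade.
# Hypothetical construction, not a model from Helbing's paper.
# Nodes in a chain each carry load; when a node fails, half of its load
# shifts to each neighbour. Without a fuse the overload travels down the
# whole chain; with a fuse the overloaded link opens first.

def cascade(loads, capacity, fuse=None):
    failed = {0}                      # node 0 fails initially
    frontier = [0]
    while frontier:
        node = frontier.pop()
        shed = loads[node]
        for nb in (node - 1, node + 1):
            if 0 <= nb < len(loads) and nb not in failed:
                if fuse is not None and shed / 2 > fuse:
                    continue          # fuse trips: link opens, no load passed on
                loads[nb] += shed / 2
                if loads[nb] > capacity:
                    failed.add(nb)
                    frontier.append(nb)
    return len(failed)

loads = [9.0] * 20                    # every node runs close to its capacity
print("failures without fuse:", cascade(loads[:], capacity=10.0))
print("failures with fuse:   ", cascade(loads[:], capacity=10.0, fuse=4.0))
```

Without the fuse, every node in the chain eventually fails; with it, the damage stays local at the cost of deliberately interrupting some connections, which is exactly the trade-off that breaking points introduce.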
Contemporary information and communication technologies (ICT) are also far from failure-proof. They are based on principles that are 30 or more years old and were not designed for today’s use. The explosion of cyber risks is a logical consequence. These include threats to individuals (such as privacy intrusion, identity theft, or manipulation through personalized information), to companies (such as cybercrime), and to societies (such as cyberwar or totalitarian control). To counter this, Helbing recommends an entirely new ICT architecture inspired by principles of decentralized self-organization, as observed in immune systems, ecology, and social systems.
Coming Era of Social Innovation
Socio-inspired technologies built on decentralized mechanisms that create reputation, trust, norms, or culture will be able to generate enormous value. “Facebook, based on the simple principle of social networking, is worth more than 50 billion dollars,” Helbing reminds us. “ICT systems are now becoming artificial social systems. Computers already perform the great majority of the financial transactions that humans carried out in the past.” But if we do not understand socially interactive systems well, coordination failures, breakdowns of cooperation, conflict, cybercrime, or cyberwar may result.
Therefore, a better understanding of the success principles of societies is urgently needed. “For example, when systems become too complex, they cannot be effectively managed top-down,” explains Helbing. “Guided self-organization is a promising alternative to manage complex dynamical systems bottom-up, in a decentralized way.” The underlying idea is to exploit, rather than fight, the inherent tendency of complex systems to self-organize and thereby create a robust, ordered state. For this, it is important to have the right kinds of interactions, adaptive feedback mechanisms, and institutional settings, i.e. to establish proper “rules of the game”. The paper offers the example of an intriguing “self-control” principle, where traffic lights are controlled bottom-up by the vehicle flows rather than top-down by a traffic center.
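As a concrete illustration of the principle, here is a minimal, hypothetical Python sketch. It is not the algorithm from the paper: a single intersection simply serves whichever approach has the longer queue, subject to a minimum green time, so the switching schedule emerges from the traffic itself rather than from a central plan.

```python
import random

# Toy sketch of bottom-up ("self-control") traffic light switching.
# Simplified illustration, not the mechanism from the Nature paper.
# All parameter values below are assumptions chosen for the demo.

ARRIVAL_RATE = {"NS": 0.3, "EW": 0.5}   # arrival probability per step
SERVICE_RATE = 1                        # vehicles discharged per green step
MIN_GREEN = 5                           # steps before a switch is allowed

queues = {"NS": 0, "EW": 0}
green, green_time = "NS", 0

for t in range(100):
    # Stochastic arrivals on both approaches.
    for approach in queues:
        if random.random() < ARRIVAL_RATE[approach]:
            queues[approach] += 1

    # The green approach discharges vehicles.
    queues[green] = max(0, queues[green] - SERVICE_RATE)
    green_time += 1

    # Local switching rule: no central controller, only queue comparison.
    other = "EW" if green == "NS" else "NS"
    if green_time >= MIN_GREEN and queues[other] > queues[green]:
        green, green_time = other, 0

    if t % 10 == 0:
        print(f"t={t:3d}  green={green}  queues={queues}")
```

Because the switching decision uses only locally measured queues, the rule adapts automatically when demand shifts between the two approaches; this adaptivity, rather than the specific rule, is the point of the example.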
Creating and Protecting Social Capital
It is important to recognize that many 21st-century challenges, such as the response to global warming and energy and food problems, have a social component and cannot be solved by technology alone. The key to generating solutions is a Global Systems Science (GSS) that brings together crucial knowledge from the natural, engineering, and social sciences. The goal of this new science is to gain an understanding of global systems and to make “systems science” relevant to global problems. In particular, this will require combining the Earth Systems Sciences with the study of behavioral aspects and social factors.
“One man’s disaster is another man’s opportunity. Therefore, many problems can only be successfully addressed with transparency, accountability, awareness, and collective responsibility,” underlines Helbing. “For example, social capital is important for economic value generation, social well-being, and societal resilience, but it may be damaged or exploited, like our environment. Humans must learn how to quantify and protect social capital. A warning example is the loss of trillions of dollars in the stock markets during the financial crisis.” This crisis was largely caused by a loss of trust.
“It is important to stress that risk insurance today does not consider damage to social capital,” Helbing continues. However, it is known that large-scale disasters have a disproportionate public impact, in part because they destroy social capital. As long as we neglect social capital in risk assessments, we are taking excessive risks.
New Instruments for the 21st Century
Finally, to gain the urgently needed insights, the study suggests building new instruments, as proposed by the FuturICT initiative (http://www.futurict.eu). These comprise a “Planetary Nervous System” (PNS) to measure the state of our planet in real time, also capturing socio-economic trends, social capital, and the “social footprint” of human decisions and actions. These data may be fed into a “Living Earth Simulator” (LES) to study “what if” scenarios. A “policy wind tunnel” or “socio-economic flight simulator” of this kind could provide better, evidence-based advice for decision makers, be they politicians, business leaders, or citizens. It could help us to identify opportunities and alert us to risks or unwanted side effects. Last but not least, a “Global Participatory Platform” (GPP) would open up the above-mentioned tools to everyone and support collaboration, interactive exploration, and crowdsourcing.
This bold vision can be realized, provided that we learn how to design and operate open, value-oriented ICT systems and how to promote a non-malicious and self-determined use of data. It would take a major investment: an Apollo-like project focusing on techno-socio-economic-environmental systems, life on Earth, and everything related to it. Helbing is convinced: “It would be the best investment humanity can make.”
Paper source: http://dx.doi.org/10.1038/nature12047
See also
- “Denial of catastrophic risks”, http://www.sciencemag.org/content/339/6124/1123.full
- World Economic Forum, Global Risks 2011, 2012, and 2013 (WEF, Geneva, Switzerland, 2011, 2012, 2013), downloadable via http://www.weforum.org/issues/global-risks
Supplementary Videos:
- http://vimeo.com/53876434 : Spreading and erosion of cooperation in a social dilemma situation
- http://vimeo.com/53872893 : Cascade spreading is increasingly hard to recover from as failure progresses. The simulation model mimics spatial epidemic spreading with air traffic and healing costs.
For independent opinions, you may contact the following experts:
- A. Laszlo Barabasi barabasi@gmail.com
- Shlomo Havlin havlin@ophir.ph.biu.ac.il
- Hans J. Herrmann hans@ifb.baug.ethz.ch
- Alessandro Vespignani alexves@gmail.com
- Dirk Brockmann brockmann@northwestern.edu
Spreading and erosion of cooperation in a prisoner’s dilemma game. The computer simulations assume the payoff parameters T = 7, R = 6, P = 2, and S = 1 and include success-driven migration. Although cooperation would be profitable for everyone, non-cooperators can achieve a higher payoff than cooperators, which may destabilize cooperation. The graph shows the fraction of cooperative agents, averaged over 100 simulations, as a function of the connection density (the actual number of network links divided by the maximum number of links when all nodes are connected to all others). Initially, an increasing link density enhances cooperation, but once it passes a certain threshold, cooperation erodes. (See http://vimeo.com/53876434 for a related movie.) The computer simulations are based on a circular network with 100 nodes, each connected with its four nearest neighbours; n links are added randomly. 50 nodes are occupied by agents. Blue circles represent cooperation, red circles non-cooperative behaviour, and black dots empty sites. Initially, all agents are non-cooperative. Their network locations and behaviours (cooperation or defection) are updated in a random sequential way in 4 steps: (1) The agent plays two-person prisoner’s dilemma games with its direct neighbours in the network. (2) After the interaction, the agent moves with probability 0.5 up to 4 steps along existing links to the empty node that gives the highest payoff in a fictitious play step, assuming that no one changes behaviour. (3) The agent imitates the behaviour of the neighbour who got the highest payoff in step 1 (if higher than its own). (4) The behaviour is spontaneously changed with a mutation rate of 0.1.
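For readers who want to experiment, below is a condensed, hypothetical Python re-implementation of the model just described. It simplifies two details: migration only considers directly linked empty nodes rather than paths of up to 4 steps, and EXTRA_LINKS stands in for the caption’s parameter n, whose value is an assumption here.

```python
import random

# Condensed sketch of the prisoner's dilemma model described above.
# Simplifications: migration is restricted to directly linked empty nodes,
# and EXTRA_LINKS is an assumed value for the caption's parameter n.

T, R, P, S = 7, 6, 2, 1                  # payoffs from the caption
N, AGENTS = 100, 50
EXTRA_LINKS, MOVE_PROB, MUTATION = 20, 0.5, 0.1

# Circular network: each node linked to its four nearest neighbours.
links = {i: {(i - 2) % N, (i - 1) % N, (i + 1) % N, (i + 2) % N}
         for i in range(N)}
for _ in range(EXTRA_LINKS):             # n randomly added links
    a, b = random.sample(range(N), 2)
    links[a].add(b)
    links[b].add(a)

# Strategy of each occupied node; all agents start as defectors (False).
strategy = {node: False for node in random.sample(range(N), AGENTS)}

def payoff(node, strat):
    """Step (1): total payoff of playing the PD with all occupied neighbours."""
    total = 0
    for nb in links[node]:
        if nb in strategy:
            theirs = strategy[nb]
            if strat and theirs:
                total += R               # mutual cooperation
            elif strat and not theirs:
                total += S               # exploited cooperator
            elif not strat and theirs:
                total += T               # successful defection
            else:
                total += P               # mutual defection
    return total

for step in range(500):
    for node in random.sample(sorted(strategy), len(strategy)):
        strat = strategy[node]
        # Step (2): success-driven migration to the best empty neighbour.
        if random.random() < MOVE_PROB:
            empties = [nb for nb in links[node] if nb not in strategy]
            if empties:
                best = max(empties, key=lambda e: payoff(e, strat))
                if payoff(best, strat) > payoff(node, strat):
                    del strategy[node]
                    strategy[best] = strat
                    node = best
        # Step (3): imitate the most successful neighbour, if better off.
        nbs = [nb for nb in links[node] if nb in strategy]
        if nbs:
            top = max(nbs, key=lambda nb: payoff(nb, strategy[nb]))
            if payoff(top, strategy[top]) > payoff(node, strategy[node]):
                strategy[node] = strategy[top]
        # Step (4): spontaneous mutation of the behaviour.
        if random.random() < MUTATION:
            strategy[node] = not strategy[node]

print("fraction of cooperators:", sum(strategy.values()) / AGENTS)
```

In the spirit of the graph above, one can sweep EXTRA_LINKS to explore how the connection density affects the final fraction of cooperators.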
Cascade spreading is increasingly hard to recover from as failure progresses. The simulation model mimics spatial epidemic spreading with air traffic and healing costs in a two-dimensional 50 × 50 grid with periodic boundary conditions and random shortcut links. The colourful inset depicts an early snapshot of the simulation with N = 2500 nodes. Red nodes are infected, green nodes are healthy. Shortcut links are shown in blue. The connectivity-dependent graph shows the mean value and standard deviation of the fraction i(t)/N of infected nodes over 50 simulation runs. Most nodes have four direct neighbours, while a few of them possess an additional directed random connection to a distant node. The spontaneous infection rate is s = 0.001 per time step; the infection rate by an infected neighbouring node is P = 0.08. Newly infected nodes may infect others or may recover from the next time step onwards. Recovery occurs with a rate q = 0.4 if there is enough budget b > c to bear the healing cost c = 80. The budget needed for recovery is generated by the number of healthy nodes h(t). Hence, if r(t) nodes are recovering at time t, the budget changes according to b(t + 1) = b(t) + h(t) - c·r(t). As soon as the budget is used up, the infection spreads explosively. (See also the movie at http://vimeo.com/53872893 .)
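This model is compact enough to re-implement; here is a hypothetical Python sketch that follows the caption. The initial budget b(0) and the number of shortcut links are not specified above, so the values below are assumptions.

```python
import random

# Sketch of the epidemic-with-healing-costs model from the caption above.
# Assumed (not given in the caption): initial budget and shortcut count.

L = 50                                   # 50 x 50 grid, N = 2500 nodes
N = L * L
S_RATE = 0.001                           # spontaneous infection rate s
P_RATE = 0.08                            # infection rate per infected neighbour
Q_RATE = 0.4                             # recovery rate q
COST = 80                                # healing cost c
N_SHORTCUTS = 100                        # assumed number of directed shortcuts
budget = 50_000.0                        # assumed initial budget b(0)

def grid_neighbours(i):
    """Four nearest neighbours with periodic boundary conditions."""
    x, y = i % L, i // L
    return (((x + 1) % L) + y * L, ((x - 1) % L) + y * L,
            x + ((y + 1) % L) * L, x + ((y - 1) % L) * L)

# Directed long-range shortcuts; incoming[i] lists shortcut sources of node i.
incoming = {i: [] for i in range(N)}
for _ in range(N_SHORTCUTS):
    src, dst = random.randrange(N), random.randrange(N)
    incoming[dst].append(src)

infected = [False] * N

for t in range(300):
    nxt = infected[:]
    recovering = 0
    for i in range(N):
        if infected[i]:
            # Recovery with rate q, only while the budget can bear cost c.
            if budget > COST and random.random() < Q_RATE:
                nxt[i] = False
                recovering += 1
        else:
            # Spontaneous infection ...
            if random.random() < S_RATE:
                nxt[i] = True
                continue
            # ... or contagion from infected grid neighbours and shortcuts.
            for src in list(grid_neighbours(i)) + incoming[i]:
                if infected[src] and random.random() < P_RATE:
                    nxt[i] = True
                    break
    infected = nxt
    healthy = N - sum(infected)
    # Budget dynamics from the caption: b(t+1) = b(t) + h(t) - c * r(t).
    budget += healthy - COST * recovering
    if t % 30 == 0:
        print(f"t={t:3d}  infected={sum(infected):4d}  budget={budget:12.0f}")
```

The sketch reproduces the qualitative mechanism of the caption: while the budget lasts, outbreaks are continually healed; once it is exhausted, recovery stops and the infection takes over.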