Thursday, 28 March 2019

Using The Wisdom Of Crowds To Make Cities Smarter

How could networks of innovative cities contribute to solving humanity’s existential problems?

Given the ongoing digital revolution and today’s sustainability challenges, we have to reinvent the way cities are operated. We propose that organizing societies in a more resilient way requires more decentralized solutions based on digitally assisted self-organization, and that this concept is also compatible with sustainability requirements and stronger democratic participation.

The project by Prof. Dirk Helbing and his Computational Social Science team will investigate whether such a decentralized, participatory approach could compete with a fully centralized approach in terms of efficiency and sustainability, or even outperform it. In particular, this requires figuring out how distributed co-creation processes can be coordinated and raised to a professional level in a scalable way.

The main questions of the project are: How could more participatory smart cities work, and how can they meet the requirements of being more efficient, sustainable and resilient? What are their risks and benefits compared with centralized approaches? What could digital societies look like that fit our culture, based, for example, on values such as freedom, equality and solidarity (liberté, égalité, fraternité), and what performance can be expected from them?

The project brings together two research directions: first, the automation of mobility solutions based on Internet of Things and Machine Learning approaches, as they have been pursued within the “smart cities” paradigm, and, second, novel collaborative approaches, as they have recently been discussed under labels such as participatory resilience, digital democracy, City Olympics, open-source urbanism, and the “socio-ecological finance system” (“Fin4”).




Saturday, 16 March 2019

Fixing Over-Connectivity

The functioning of many socio-technical systems depends on the ability of their subcomponents or nodes to communicate or interact via their connections, but high connectivity may create problems. By removing or deactivating a specific set of nodes, a network can be dismantled into isolated subcomponents, thereby disrupting the (mal)functioning of a system or containing the spread of misinformation or an epidemic. Researchers at the Swiss Federal Institute of Technology ETH Zurich (Ren, Gleinig, Helbing, Antulov-Fantulin) recently published insights on Generalized Network Dismantling in the Proceedings of the National Academy of Sciences of the United States of America (PNAS): https://doi.org/10.1073/pnas.1806108116

In a hyper-connected world, systemic instability based on cascading effects can seriously undermine the functionality of a network. The rapid global spread of rumors and fake news are recent examples, while the spread of epidemics or failure propagation has been a problem for much longer. Many of today’s networks contain highly connected nodes much more frequently than a normal, bell-shaped distribution would suggest. As a consequence, for some of these networks even the variance or mean value of relevant quantities may no longer be well-defined, so unpredictable or uncontrollable behavior may result. For example, it may then be impossible to contain epidemic spreading processes. Similar circumstances may make it impossible to contain the spread of computer viruses or misinformation – a problem that is not only relevant for the rapid increase of cyberthreats, but may also undermine the functionality of markets and societal or political institutions.
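As an illustrative aside (not taken from the paper), the divergence of moments for heavy-tailed degree distributions can be checked numerically: for a power-law degree distribution P(k) ∝ k^(−γ), the partial sums defining the m-th moment keep growing with the degree cutoff whenever γ ≤ m + 1. The sketch below uses γ = 2.5, for which the mean exists but the second moment (and hence the variance) does not.

```python
# Illustrative sketch: for P(k) ~ k^(-gamma) with gamma = 2.5, the
# unnormalized second-moment sum  sum_k k^2 * k^(-gamma)  reduces to
# sum_k k^(-0.5), which grows without bound as the cutoff increases,
# so the variance of the degree is not well-defined.

def second_moment_partial_sum(gamma: float, k_max: int) -> float:
    """Unnormalized partial sum of k^2 * k^(-gamma) up to k_max."""
    return sum(k**2 * k**(-gamma) for k in range(1, k_max + 1))

for k_max in (10**2, 10**4, 10**6):
    s = second_moment_partial_sum(2.5, k_max)
    print(f"k_max={k_max:>8}: partial sum = {s:.1f}")
```

The partial sums grow roughly like 2·√k_max here instead of converging; for a bell-shaped degree distribution the same sums would settle quickly to a finite value.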

For example, finding an optimal subset of nodes whose removal successfully disrupts the functioning of a corrupt or criminal organization is still a great challenge. In principle, dismantling a network into isolated subcomponents might stop the (mal)functioning of a system, i.e. removing or deactivating even a small set of influential nodes may fix the problem. However, this problem belongs to the class of computationally hard problems, for which it is numerically demanding to find the best solution efficiently. The publication by Ren et al. presents a new, approximate dismantling solution.
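The basic idea can be sketched on a hypothetical toy network (this is an illustration, not the paper’s algorithm): removing a single influential hub can split a connected network into small isolated components, measured here by the size of the largest connected component.

```python
# Toy sketch: a hub "h" bridges two otherwise separate triangles.
# Removing the hub dismantles the network into small components.

from collections import deque

def largest_component(adj):
    """Size of the largest connected component, via BFS."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            node = queue.popleft()
            size += 1
            for nb in adj[node] - seen:
                seen.add(nb)
                queue.append(nb)
        best = max(best, size)
    return best

def without(adj, node):
    """Copy of the network with one node (and its links) removed."""
    return {v: nbs - {node} for v, nbs in adj.items() if v != node}

edges = [("a1", "a2"), ("a2", "a3"), ("a3", "a1"),
         ("b1", "b2"), ("b2", "b3"), ("b3", "b1"),
         ("h", "a1"), ("h", "b1")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

print(largest_component(adj))                 # 7: the whole network
print(largest_component(without(adj, "h")))   # 3: two isolated triangles
```

Finding the cheapest such set of nodes for a large network is the hard part; the toy case only shows why small, well-chosen removals can have an outsized effect.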

A typical approach to fighting organized crime and corruption is to identify the underlying organization’s network and then remove its leader. It turns out, however, that removing the higher echelons of such organizations often requires an extremely high effort because of their special protection measures: the removal costs of criminals or corrupt persons largely depend on their position in the network. It has also been found that removing the boss of a criminal or corruption network is often ineffective, as someone else will quickly take over the leadership position and continue running the organization; besides, the transition period is often characterized by an increased level of crime until the power struggle is decided. Therefore, the dismantling problem, which had previously been studied primarily for identical node removal costs, has been generalized to arbitrary, non-uniform removal costs. This class of problems has different kinds of solutions. Specifically, the dismantling procedure does not go for the big nodes first: it is less costly (i.e. more effective) to dismantle the network by initially removing some medium-sized nodes.
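The effect of non-uniform costs can be illustrated with a toy greedy heuristic (an assumption-laden sketch, not the spectral method of Ren et al.): when the boss is expensive to remove, the best reduction in the largest component size per unit cost is achieved by removing a cheaper mid-level node instead.

```python
# Hypothetical criminal network: boss "B" commands lieutenants "L1",
# "L2", who each run a small crew. Removal costs are non-uniform: the
# boss is heavily protected and thus expensive to remove.

from collections import deque

def largest_component(adj):
    """Size of the largest connected component, via BFS."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            node = queue.popleft()
            size += 1
            for nb in adj[node] - seen:
                seen.add(nb)
                queue.append(nb)
        best = max(best, size)
    return best

def without(adj, node):
    return {v: nbs - {node} for v, nbs in adj.items() if v != node}

edges = [("B", "L1"), ("B", "L2"),
         ("L1", "c1"), ("L1", "c2"), ("L1", "c3"),
         ("L2", "d1"), ("L2", "d2"), ("L2", "d3")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

cost = {n: 2.0 for n in adj}  # ordinary members are cheap to remove
cost["B"] = 10.0              # the boss is heavily protected

# Greedy step: pick the node with the largest reduction of the
# largest-component size per unit removal cost.
base = largest_component(adj)
best = max(adj, key=lambda n: (base - largest_component(without(adj, n))) / cost[n])
print(best)  # a lieutenant, not the boss
```

Removing the boss shrinks the largest component from 9 to 5 at cost 10 (ratio 0.5), while removing a lieutenant shrinks it from 9 to 5 at cost 2 (ratio 2.0), so the cost-aware step targets the mid-level node first.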

Ren et al. present a new algorithm to solve the generalized network dismantling problem and apply it to a variety of problems, ranging from crime and corruption networks to epidemic spreading. The new approach is based on a combination of three sophisticated methods and is applicable to large-scale networks with millions of nodes. Understanding the theory behind (generalized) network dismantling opens up further research directions for all scientists interested in designing more robust and resilient systems in the future. It requires the combination of diverse fundamental insights, for example from theoretical computer science, mathematics, statistical physics, and even game theory.

The results of the study are relevant for the robustness and recommended (re)organization of current socio-technical systems under different realistic costs. In particular, the authors point out that the method offers a possible solution for emergencies in which cutting a dysfunctional network into pieces can restore functionality. However, they also warn of potential misuse or dual use: when not applied in appropriate contexts and ways, the dismantling approach may undermine the proper functionality of networks. They therefore stress that related ethical issues must always be sufficiently, appropriately, and transparently addressed when the method is applied.

Visualization of a strategy to potentially reduce the epidemic spreading of a disease.