Modeling Information Diffusion in Online Social Networks with Partial Differential Equations (e-book) by Xu, Kuai

509.93 DKK (637.41 DKK incl. VAT)
Authors: Xu, Kuai (author)
Publisher: Springer
Published: 16 March 2020
Genres: Media studies
Language: English
Format: PDF
Protection: LCP
ISBN: 9783030388522
The book lies at the interface of mathematics, social media analysis, and data science. Its authors aim to introduce a new dynamic modeling approach that uses partial differential equations to describe information diffusion over online social networks. The eigenvalues and eigenvectors of the Laplacian matrix of the underlying social network are used to find communities (clusters) of online users. Once these clusters are embedded in a Euclidean space, the mathematical models, which are reaction-diffusion equations, are developed based on intuitive social distances between clusters within that space. The models are validated with data from major social media platforms such as Twitter. In addition, mathematical analysis of these models reveals insights into information flow on social media. Two applications with geocoded Twitter data are included in the book: one describes the social movement on Twitter during the 2011 Egyptian revolution, and the other predicts influenza prevalence. The new approach advocates a paradigm shift for modeling information diffusion in online social networks and lays the theoretical groundwork for many spatio-temporal modeling problems in the big-data era.
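
For readers unfamiliar with reaction-diffusion models, an illustrative equation of the kind the description refers to is a diffusive logistic model posed over a one-dimensional "social distance" coordinate x derived from the Laplacian-based embedding. The specific form below, with diffusion rate d, time-dependent growth rate r(t), and carrying capacity K, is an assumed example for orientation, not necessarily the exact model developed in the book:

\[
\frac{\partial u}{\partial t} = d\,\frac{\partial^2 u}{\partial x^2} + r(t)\,u\!\left(1 - \frac{u}{K}\right),
\]

where u(x, t) would denote the density of users at social distance x who have shared the information by time t; the diffusion term captures spread between neighboring clusters, while the logistic term captures growth within a cluster.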