One-Hot Graph Encoder Embedding

Date
2022-11-28
Journal Title
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher
IEEE
Abstract
In this paper, we propose a lightning-fast graph embedding method called one-hot graph encoder embedding. It has linear computational complexity and the capacity to process billions of edges within minutes on a standard PC, making it an ideal candidate for huge graph processing. It is applicable to either the adjacency matrix or the graph Laplacian, and can be viewed as a transformation of the spectral embedding. Under random graph models, the graph encoder embedding is approximately normally distributed per vertex, and asymptotically converges to its mean. We showcase three applications: vertex classification, vertex clustering, and graph bootstrap. In every case, the graph encoder embedding exhibits unrivalled computational advantages.
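Below is a minimal NumPy sketch of the idea the abstract describes: form a one-hot label matrix, scale each column by the inverse class size, and multiply it by the adjacency matrix (or graph Laplacian) to obtain a low-dimensional per-vertex embedding. The function name graph_encoder_embedding, the inverse-class-size scaling, and the toy two-block graph are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def graph_encoder_embedding(A, labels):
    # A: (n, n) adjacency matrix (or graph Laplacian); labels: (n,) ints in {0, ..., K-1}.
    # Returns an (n, K) embedding Z = A @ W, where W is the one-hot label matrix
    # with each column scaled by the inverse class size (an assumed normalization).
    n = A.shape[0]
    K = int(labels.max()) + 1
    counts = np.bincount(labels, minlength=K)        # number of vertices per class
    W = np.zeros((n, K))
    W[np.arange(n), labels] = 1.0 / counts[labels]   # scaled one-hot encoding
    return A @ W

# Toy usage: a two-block random graph with 50 vertices per block.
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 50)
P = np.where(labels[:, None] == labels[None, :], 0.3, 0.05)
A = (rng.random((100, 100)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T                       # symmetric, no self-loops
Z = graph_encoder_embedding(A, labels)               # embedding of shape (100, 2)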
Description
© 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This article was originally published in IEEE Transactions on Pattern Analysis and Machine Intelligence. The version of record is available at: https://doi.org/10.1109/TPAMI.2022.3225073
Keywords
Central limit theorem, community detection, graph embedding, one-hot encoding, vertex classification
Citation
C. Shen, Q. Wang and C. E. Priebe, "One-Hot Graph Encoder Embedding," in IEEE Transactions on Pattern Analysis and Machine Intelligence, doi: 10.1109/TPAMI.2022.3225073.