forked from manbo/internal-docs
Add paper summaries
21
papers/Topic5 Graph or heterogeneous graph priors/Intro.txt
Normal file
@@ -0,0 +1,21 @@
Graph representation learning / heterogeneous graphs (maps directly onto your chosen (device, register) structure)
You have fixed the nodes as (device, register) pairs, which naturally forms a bipartite / heterogeneous graph (two entity classes: Device and Register). STOUTER uses a base station graph; here it is better to cite classic heterogeneous-graph modeling work to justify using the bipartite structure as the "spatial prior".
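The (device, register) structure above can be made concrete as a biadjacency matrix folded into one bipartite graph. A minimal NumPy sketch; the device/register names are made up for illustration:

```python
import numpy as np

# Hypothetical (device, register) pairs; names are illustrative only.
pairs = [("plc1", "R100"), ("plc1", "R101"), ("plc2", "R100")]

devices = sorted({d for d, _ in pairs})
registers = sorted({r for _, r in pairs})
d_idx = {d: i for i, d in enumerate(devices)}
r_idx = {r: i for i, r in enumerate(registers)}

# Biadjacency matrix B: rows = devices, cols = registers.
B = np.zeros((len(devices), len(registers)), dtype=int)
for dev, reg in pairs:
    B[d_idx[dev], r_idx[reg]] = 1

# Fold into one square adjacency over the union of nodes (bipartite graph:
# edges only run between the device block and the register block).
n_d, n_r = B.shape
A = np.zeros((n_d + n_r, n_d + n_r), dtype=int)
A[:n_d, n_d:] = B
A[n_d:, :n_d] = B.T
```

Downstream GNN layers can operate on the folded `A` directly, or keep `B` and use bipartite message passing between the two node sets.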
Kipf & Welling. Semi-Supervised Classification with Graph Convolutional Networks (GCN). ICLR 2017.
Use: the baseline GCN; the standard citation for topology representation learning.
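The GCN layer from Kipf & Welling is one line of linear algebra: H' = ReLU(D̂^{-1/2} Â D̂^{-1/2} H W) with self-loops Â = A + I. A minimal NumPy sketch with random weights (a real model learns W, e.g. via a library such as PyTorch Geometric):

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # toy undirected 3-node path graph
H = rng.normal(size=(3, 4))              # node features
W = rng.normal(size=(4, 2))              # weight matrix (learned in practice)

A_hat = A + np.eye(3)                    # add self-loops
D_inv_sqrt = np.diag(A_hat.sum(axis=1) ** -0.5)
# Symmetrically normalized propagation followed by ReLU.
H_out = np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)
```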
Veličković et al. Graph Attention Networks (GAT). ICLR 2018.
Use: attention-based aggregation, suited to cases where different edges deserve different weights (e.g. dependencies between devices vary in strength).
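GAT replaces GCN's fixed normalization with learned attention: logits e_ij = LeakyReLU(aᵀ[Wh_i ∥ Wh_j]) are softmax-normalized over each node's neighbors. A dense single-head NumPy sketch on a 3-node toy graph, with random stand-ins for the learned parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy graph: node 0 is connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
H = rng.normal(size=(3, 4))          # node features
W = rng.normal(size=(4, 2))          # shared linear map
a = rng.normal(size=(4,))            # attention vector over [Wh_i || Wh_j]

Z = H @ W
# Attention logits for every ordered pair (i, j).
E = np.array([[np.concatenate([Z[i], Z[j]]) @ a for j in range(3)]
              for i in range(3)])
E = np.where(E > 0, E, 0.2 * E)      # LeakyReLU (slope 0.2, as in the paper)
E = np.where(A > 0, E, -np.inf)      # only real edges compete in the softmax
alpha = np.exp(E - E.max(axis=1, keepdims=True))
alpha = alpha / alpha.sum(axis=1, keepdims=True)

H_out = alpha @ Z                    # attention-weighted neighbor aggregation
```

The `-np.inf` mask is what makes the softmax run only over true neighbors, so `alpha[1, 2]` comes out exactly zero.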
Xu et al. How Powerful are Graph Neural Networks? (GIN). ICLR 2019.
Use: emphasizes structural expressive power; cite it if you need strong structural discrimination.
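GIN's expressiveness argument rests on sum aggregation: h_i' = MLP((1+ε)h_i + Σ_j h_j) is injective on neighbor multisets, whereas mean aggregation collapses them. A tiny NumPy illustration (identity in place of the MLP):

```python
import numpy as np

# Two neighbor multisets that mean aggregation cannot distinguish:
nbrs_a = np.array([[1.0], [1.0]])   # two neighbors with feature 1.0
nbrs_b = np.array([[1.0]])          # one neighbor with feature 1.0
assert nbrs_a.mean(axis=0) == nbrs_b.mean(axis=0)

def gin_update(h, nbrs, eps=0.1):
    # GIN: (1 + eps) * h_i + sum over neighbors, then an MLP
    # (identity here -- enough to show the sum is injective).
    return (1 + eps) * h + nbrs.sum(axis=0)

h = np.array([0.5])
ha = gin_update(h, nbrs_a)   # sees the two-neighbor multiset
hb = gin_update(h, nbrs_b)   # sees the one-neighbor multiset
```

Mean aggregation maps both neighborhoods to the same value, while the sum keeps them apart; that is the whole point of the paper's WL-test argument.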
Schlichtkrull et al. Modeling Relational Data with Graph Convolutional Networks (R-GCN). ESWC 2018.
Use: relation-typed graph convolution; a good fit for adding multiple edge types to the (device, register) graph: read, write, same-device, process-link, and so on.
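R-GCN keeps one weight matrix per relation type: h_i' = σ(W_0 h_i + Σ_r Σ_{j∈N_r(i)} W_r h_j / c_{i,r}). A NumPy sketch with two illustrative relations ("read", "write"); the weights are random stand-ins for learned parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 4, 3
H = rng.normal(size=(n, d))

# One adjacency matrix per relation type; the labels are illustrative.
A_rel = {
    "read":  np.array([[0, 1, 0, 0],
                       [0, 0, 0, 0],
                       [0, 0, 0, 1],
                       [0, 0, 0, 0]], dtype=float),
    "write": np.array([[0, 0, 1, 0],
                       [0, 0, 0, 0],
                       [0, 0, 0, 0],
                       [1, 0, 0, 0]], dtype=float),
}
W_rel = {r: rng.normal(size=(d, d)) for r in A_rel}   # W_r per relation
W_self = rng.normal(size=(d, d))                      # W_0 for self-loops

out = H @ W_self
for r, A in A_rel.items():
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # c_{i,r}, avoid /0
    out = out + (A / deg) @ H @ W_rel[r]
H_out = np.maximum(out, 0.0)                          # ReLU
```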
Hu et al. Heterogeneous Graph Transformer (HGT). WWW 2020.
Use: a Transformer for heterogeneous graphs; a strong reference if you later fold device type, register type, and function codes into the heterogeneous model.
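A core ingredient of heterogeneous models like HGT is type-specific projection: each node type (device, register) gets its own linear map into a shared hidden space before attention. A minimal sketch of just that projection step; the feature widths and weights are illustrative, not HGT's actual parameterization:

```python
import numpy as np

rng = np.random.default_rng(3)

# Two node types with different raw feature widths (sizes are illustrative).
X_dev = rng.normal(size=(2, 5))   # device nodes
X_reg = rng.normal(size=(3, 7))   # register nodes

# One learned projection per node type into a shared hidden space
# (random stand-ins here; HGT additionally uses type-aware attention
# with per-edge-type matrices).
W_dev = rng.normal(size=(5, 4))
W_reg = rng.normal(size=(7, 4))

H = np.vstack([X_dev @ W_dev, X_reg @ W_reg])   # unified node matrix
```

This is also why heterogeneous models tolerate node types with incompatible raw features: the per-type maps bring everything into one space before message passing.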
Hou et al. GraphMAE: Self-Supervised Masked Graph Autoencoders. KDD 2022.

Use: self-supervised graph pre-training; it matches STOUTER's graph-autoencoder pre-training idea and supports the pipeline of "learn graph embeddings first, then use them for generation".
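GraphMAE's pre-training objective in miniature: mask a subset of node features, encode/decode, and compute the reconstruction loss only on the masked nodes. GraphMAE uses GNN encoders and a scaled cosine error; a single linear map and plain MSE stand in here:

```python
import numpy as np

rng = np.random.default_rng(4)
n, d = 6, 4
X = rng.normal(size=(n, d))       # original node features

mask = np.array([True, False, True, False, True, False])
X_in = X.copy()
X_in[mask] = 0.0                  # masked nodes get a [MASK] token (zeros here)

# Stand-in for the GNN encoder/decoder: one linear map.
W = rng.normal(size=(d, d))
X_rec = X_in @ W

# Loss only on masked nodes (GraphMAE uses scaled cosine error; MSE here).
loss = float(((X_rec[mask] - X[mask]) ** 2).mean())
```

Note that a pure per-node linear map reconstructs the zeroed rows as zeros, so the loss cannot go to zero; a GNN encoder fills masked nodes in from their neighbors, which is exactly why the graph structure matters for this objective.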