DegreeEncoder

class dgl.nn.pytorch.gt.DegreeEncoder(max_degree, embedding_dim, direction='both')

Degree encoder introduced in Do Transformers Really Perform Bad for Graph Representation.
This module is a learnable degree embedding module.

Parameters

  • max_degree (int) - Upper bound of degrees to be encoded. Each degree will be clamped into the range [0, max_degree].
  • embedding_dim (int) - Output dimension of the embedding vectors.
  • direction (str, optional) - Degree direction to encode. Options: in, out, both. With both, degrees in both directions are encoded and their sum is output. Default: both. (A minimal sketch of the three options follows this list.)
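
As a minimal sketch of the three direction options (shapes follow the parameter descriptions above; the degree values here are hypothetical):

import torch as th
from dgl.nn import DegreeEncoder

B, N = 2, 4  # batch size, max number of nodes

# "in" or "out": a single embedding table, input of shape (B, N)
enc_in = DegreeEncoder(max_degree=5, embedding_dim=16, direction="in")
emb = enc_in(th.randint(0, 6, (B, N)))       # -> (B, N, 16)

# "both": two embedding tables whose outputs are summed,
# input of shape (2, B, N) stacking in- and out-degrees
enc_both = DegreeEncoder(max_degree=5, embedding_dim=16, direction="both")
emb = enc_both(th.randint(0, 6, (2, B, N)))  # -> (B, N, 16)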

forward(degrees)

Parameters
degrees (Tensor) - If direction is both, a tensor of shape (2, B, N) stacking the in-degrees and out-degrees of a batched graph with zero padding. Otherwise, the in-degrees or out-degrees of the batched graph with zero padding, a tensor of shape (B, N), where B is the batch size and N is the maximum number of nodes.

Returns
Degree embedding vectors of shape (B, N, d), where d is embedding_dim.

Return type
torch.Tensor
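
A quick check of the clamping behavior noted under max_degree (degree values here are hypothetical): any degree above max_degree is mapped to the same embedding as max_degree itself.

import torch as th
from dgl.nn import DegreeEncoder

enc = DegreeEncoder(max_degree=5, embedding_dim=16, direction="in")
a = enc(th.tensor([[5, 5]]))    # degrees exactly at the upper bound
b = enc(th.tensor([[7, 100]]))  # degrees above the bound are clamped to 5
print(th.allclose(a, b))        # True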

Source code

import torch as th
import torch.nn as nn


class DegreeEncoder(nn.Module):
    def __init__(self, max_degree, embedding_dim, direction="both"):
        super(DegreeEncoder, self).__init__()
        self.direction = direction
        if direction == "both":
            # Separate embedding tables for in-degrees and out-degrees.
            self.encoder1 = nn.Embedding(max_degree + 1, embedding_dim, padding_idx=0)
            self.encoder2 = nn.Embedding(max_degree + 1, embedding_dim, padding_idx=0)
        else:
            self.encoder = nn.Embedding(max_degree + 1, embedding_dim, padding_idx=0)
        self.max_degree = max_degree

    def forward(self, degrees):
        # Degrees above max_degree share the max_degree embedding.
        degrees = th.clamp(degrees, min=0, max=self.max_degree)

        if self.direction == "in":
            assert len(degrees.shape) == 2
            degree_embedding = self.encoder(degrees)
        elif self.direction == "out":
            assert len(degrees.shape) == 2
            degree_embedding = self.encoder(degrees)
        elif self.direction == "both":
            assert len(degrees.shape) == 3 and degrees.shape[0] == 2
            degree_embedding = self.encoder1(degrees[0]) + self.encoder2(degrees[1])
        else:
            raise ValueError(
                f'Supported direction options: "in", "out" and "both", '
                f"but got {self.direction}"
            )
        return degree_embedding
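
Two details of the implementation are worth noting. First, padding_idx=0 makes nn.Embedding initialize index 0 to the zero vector and exclude it from gradient updates, so zero-padded entries in a batched degree tensor contribute a constant all-zero embedding. Second, the "in" and "out" branches are identical; the option only signals which degree tensor the caller is expected to pass.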

Example

Example 1:

import dgl
from dgl.nn import DegreeEncoder
import torch as th
from torch.nn.utils.rnn import pad_sequence

g1 = dgl.graph(([0,0,0,1,1,2,3,3], [1,2,3,0,3,0,0,1]))
g2 = dgl.graph(([0,1], [1,0]))
# Pad the per-graph degree vectors to a common length, giving shape (B, N).
in_degree = pad_sequence([g1.in_degrees(), g2.in_degrees()], batch_first=True)
out_degree = pad_sequence([g1.out_degrees(), g2.out_degrees()], batch_first=True)
print(in_degree.shape)         # torch.Size([2, 4])
degree_encoder = DegreeEncoder(5, 16)
# direction defaults to "both", so stack in- and out-degrees into (2, B, N).
degree_embedding = degree_encoder(th.stack((in_degree, out_degree)))
print(degree_embedding.shape)  # torch.Size([2, 4, 16])
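
Here g1 has 4 nodes and g2 has 2, so the padded degree tensors have B = 2 and N = 4; stacking in- and out-degrees yields the (2, B, N) = (2, 2, 4) input that the default direction="both" expects, and the resulting embedding has shape (B, N, embedding_dim) = (2, 4, 16).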
