GCMC

Reference:

van den Berg et al. “Graph Convolutional Matrix Completion.” in SIGKDD 2018.

Reference code:

https://github.com/riannevdberg/gc-mc

class recbole.model.general_recommender.gcmc.BiDecoder(input_dim, output_dim, drop_prob, device, num_weights=3, act=<function BiDecoder.<lambda>>)[source]

Bases: torch.nn.modules.module.Module

Bi-linear decoder. BiDecoder takes pairs of node embeddings and predicts the respective entries in the adjacency matrix.
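
As a rough illustration only (not RecBole's exact BiDecoder), a bilinear pair scorer with a few shared basis matrices might look like the sketch below; the embedding dimension, initialization, and scoring form are assumptions:

    import torch
    import torch.nn as nn

    class BilinearDecoderSketch(nn.Module):
        """Sketch of a bilinear decoder: each (user, item) pair is scored with
        a sum of bilinear forms u^T Q_k i over a small set of basis matrices."""

        def __init__(self, emb_dim, num_weights=3):
            super().__init__()
            self.basis = nn.ParameterList(
                [nn.Parameter(torch.randn(emb_dim, emb_dim) * 0.01)
                 for _ in range(num_weights)]
            )

        def forward(self, u_emb, i_emb):
            # u_emb, i_emb: [batch_size, emb_dim] embeddings of paired nodes
            scores = [((u_emb @ Q) * i_emb).sum(dim=-1) for Q in self.basis]
            return torch.stack(scores, dim=-1).sum(dim=-1)  # [batch_size]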

forward(u_inputs, i_inputs, users, items=None)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

training: bool
class recbole.model.general_recommender.gcmc.GCMC(config, dataset)[source]

Bases: recbole.model.abstract_recommender.GeneralRecommender

GCMC is a model that incorporates graph autoencoders for recommendation.

Graph autoencoders are comprised of:

1) a graph encoder model \(Z = f(X; A)\), which takes as input an \(N \times D\) feature matrix \(X\) and a graph adjacency matrix \(A\), and produces an \(N \times E\) node embedding matrix \(Z = [z_1^T, ..., z_N^T]^T\);

2) a pairwise decoder model \(\hat A = g(Z)\), which takes pairs of node embeddings \((z_i, z_j)\) and predicts respective entries \(\hat A_{ij}\) in the adjacency matrix.

Note that \(N\) denotes the number of nodes, \(D\) the number of input features, and \(E\) the embedding size.

We implement the model following the original authors, with a pairwise training mode.
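
As a hedged sketch of how the encoder and decoder compose, using the forward signatures documented on this page (the tuple return of the encoder is an assumption, not the exact implementation):

    def gcmc_forward_sketch(encoder, decoder, user_X, item_X, user, item):
        # Sketch only: compose a GcEncoder-like and a BiDecoder-like module,
        # mirroring the forward signatures documented below.
        user_emb, item_emb = encoder(user_X, item_X)    # Z = f(X; A)
        return decoder(user_emb, item_emb, user, item)  # entries of \hat{A} = g(Z)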

calculate_loss(interaction)[source]

Calculate the training loss for a batch data.

Parameters

interaction (Interaction) – Interaction class of the batch.

Returns

Training loss, shape: []

Return type

torch.Tensor
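
For illustration only, a BPR-style pairwise objective over decoder scores is sketched below. The actual loss computed by calculate_loss is not specified on this page, so this is a stand-in for a generic pairwise objective, not necessarily the implemented one:

    import torch.nn.functional as F

    def pairwise_loss_sketch(pos_scores, neg_scores):
        # pos_scores, neg_scores: [batch_size] decoder outputs for positive
        # and sampled negative items of the same users (assumed shapes).
        # Returns a scalar loss, shape [].
        return -F.logsigmoid(pos_scores - neg_scores).mean()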

forward(user_X, item_X, user, item)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

full_sort_predict(interaction)[source]

Full sort prediction function. Given users, calculate the scores between those users and all candidate items.

Parameters

interaction (Interaction) – Interaction class of the batch.

Returns

Predicted scores for given users and all candidate items, shape: [n_batch_users * n_candidate_items]

Return type

torch.Tensor
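
As a hedged sketch of what such full-sort scoring can look like with a bilinear decoder (the basis matrices and shapes below are assumptions, not RecBole's exact implementation):

    def full_sort_scores_sketch(user_emb, all_item_emb, basis):
        # user_emb: [n_batch_users, E], all_item_emb: [n_items, E],
        # basis: list of [E, E] bilinear weight matrices (assumed).
        scores = sum((user_emb @ Q) @ all_item_emb.t() for Q in basis)
        return scores.view(-1)  # [n_batch_users * n_candidate_items]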

get_norm_adj_mat()[source]

Get the normalized interaction matrix of users and items.

Construct the square user-item adjacency matrix from the training data and normalize it symmetrically using the degree matrix \(D\):

\[\hat{A} = D^{-\frac{1}{2}} A D^{-\frac{1}{2}}\]
Returns

Sparse tensor of the normalized interaction matrix.
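
A minimal sketch of this construction, assuming the interactions are available as a SciPy sparse matrix (the helper name and exact steps are illustrative, not the implementation):

    import numpy as np
    import scipy.sparse as sp
    import torch

    def build_norm_adj_sketch(inter_mat, n_users, n_items):
        # inter_mat: scipy.sparse matrix of user-item interactions,
        # shape [n_users, n_items] (assumed input format).
        n_nodes = n_users + n_items
        inter = inter_mat.tocoo()

        # Place the interactions in the upper-right / lower-left blocks of
        # the square (user + item) adjacency matrix A.
        rows = np.concatenate([inter.row, inter.col + n_users])
        cols = np.concatenate([inter.col + n_users, inter.row])
        data = np.ones(len(rows), dtype=np.float32)
        A = sp.coo_matrix((data, (rows, cols)), shape=(n_nodes, n_nodes)).tocsr()

        # Symmetric normalization: A_hat = D^{-1/2} A D^{-1/2}.
        deg = np.asarray(A.sum(axis=1)).flatten()
        with np.errstate(divide="ignore"):
            d_inv_sqrt = np.power(deg, -0.5)
        d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.0
        A_hat = sp.diags(d_inv_sqrt) @ A @ sp.diags(d_inv_sqrt)

        # Return as a torch sparse COO tensor.
        coo = A_hat.tocoo()
        indices = torch.from_numpy(np.vstack((coo.row, coo.col))).long()
        values = torch.from_numpy(coo.data).float()
        return torch.sparse_coo_tensor(indices, values, coo.shape)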

get_sparse_eye_mat(num)[source]

Get the sparse identity (eye) matrix.

Construct the sparse identity matrix to serve as the node feature matrix.

Parameters

num – the number of rows

Returns

Sparse identity tensor used as the node features.
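
A short sketch of building such a sparse identity feature matrix as a torch sparse COO tensor (illustrative only):

    import torch

    def sparse_eye_sketch(num):
        # Sparse identity matrix used as one-hot node features.
        idx = torch.arange(num)
        indices = torch.stack([idx, idx])  # diagonal coordinates
        values = torch.ones(num)
        return torch.sparse_coo_tensor(indices, values, (num, num))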

input_type = 2
predict(interaction)[source]

Predict the scores between users and items.

Parameters

interaction (Interaction) – Interaction class of the batch.

Returns

Predicted scores for given users and items, shape: [batch_size]

Return type

torch.Tensor

training: bool
class recbole.model.general_recommender.gcmc.GcEncoder(accum, num_user, num_item, support, input_dim, gcn_output_dim, dense_output_dim, drop_prob, device, sparse_feature=True, act_dense=<function GcEncoder.<lambda>>, share_user_item_weights=True, bias=False)[source]

Bases: torch.nn.modules.module.Module

Graph Convolutional Encoder. GcEncoder takes as input an \(N \times D\) feature matrix \(X\) and a graph adjacency matrix \(A\), and produces an \(N \times E\) node embedding matrix. Note that \(N\) denotes the number of nodes, \(D\) the number of input features, and \(E\) the embedding size.
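
A minimal sketch of a single propagation step of this kind, assuming the form \(Z = \mathrm{ReLU}(\hat{A} X W)\); the real GcEncoder additionally handles rating-specific supports, accumulation, dropout, and a dense output layer, so this is only an approximation:

    import torch
    import torch.nn as nn

    class GraphConvSketch(nn.Module):
        # Sketch of one propagation step: Z = ReLU(A_hat @ X @ W).
        def __init__(self, input_dim, output_dim):
            super().__init__()
            self.weight = nn.Parameter(torch.empty(input_dim, output_dim))
            nn.init.xavier_uniform_(self.weight)

        def forward(self, A_hat, X):
            # A_hat: sparse [N, N] normalized adjacency; X: dense [N, D] features.
            support = X @ self.weight                    # [N, output_dim]
            return torch.relu(torch.sparse.mm(A_hat, support))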

forward(user_X, item_X)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.

training: bool
recbole.model.general_recommender.gcmc.orthogonal(shape, scale=1.1)[source]

Orthogonal initialization function for weights in class GCMC. Adapted from Lasagne. Reference: Saxe et al., http://arxiv.org/abs/1312.6120
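
A hedged sketch of this style of initialization, taking an orthonormal factor from the SVD of a random Gaussian matrix and rescaling it (Saxe et al. style); details may differ from the actual function:

    import numpy as np

    def orthogonal_sketch(shape, scale=1.1):
        # Draw a random Gaussian matrix, take the orthonormal factor from its
        # SVD that matches the flattened shape, reshape, and rescale.
        flat_shape = (shape[0], int(np.prod(shape[1:])))
        a = np.random.normal(0.0, 1.0, flat_shape)
        u, _, v = np.linalg.svd(a, full_matrices=False)
        q = u if u.shape == flat_shape else v
        return (scale * q.reshape(shape)).astype(np.float32)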