CORE

Reference:

Yupeng Hou, Binbin Hu, Zhiqiang Zhang, Wayne Xin Zhao. “CORE: Simple and Effective Session-based Recommendation within Consistent Representation Space.” In SIGIR 2022.

https://github.com/RUCAIBox/CORE

class recbole.model.sequential_recommender.core.CORE(config, dataset)[source]

Bases: recbole.model.abstract_recommender.SequentialRecommender

CORE is a simple and effective framework that unifies the representation space of the encoding and decoding processes in session-based recommendation.
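
A minimal end-to-end usage sketch via RecBole’s quick-start entry point (the ml-100k dataset and default hyperparameters are illustrative choices, not requirements):

    from recbole.quick_start import run_recbole

    # Train and evaluate CORE on ml-100k with the default configuration.
    # Model-specific hyperparameters (e.g. the softmax temperature) can be
    # overridden through config_dict or a config file.
    run_recbole(model='CORE', dataset='ml-100k')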

static ave_net(item_seq, item_emb)[source]
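
This entry carries no upstream docstring; it implements CORE’s DNN-free encoder, which weights every non-padding item of the session equally (mean pooling). A simplified sketch of the idea (illustrative, not the exact library code):

    import torch

    def ave_net(item_seq, item_emb):
        # item_seq: (batch, seq_len) item ids, 0 = padding
        # item_emb: (batch, seq_len, dim) embeddings of those items
        mask = item_seq.gt(0)                                  # True at real items
        alpha = mask.float() / mask.sum(dim=-1, keepdim=True)  # 1/length each
        return alpha.unsqueeze(-1)                             # (batch, seq_len, 1)
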
calculate_loss(interaction)[source]

Calculate the training loss for a batch data.

Parameters

interaction (Interaction) – Interaction class of the batch.

Returns

Training loss, shape: []

Return type

torch.Tensor
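
Per the paper, the training loss is cross-entropy over cosine similarities between the session representation and all (normalized) item embeddings, scaled by a temperature. A hedged sketch of that computation (the default temperature value is assumed, and the library’s additional dropout steps are omitted):

    import torch
    import torch.nn.functional as F

    def core_loss(seq_output, item_embedding_weight, pos_items, temperature=0.07):
        # seq_output: (batch, dim) L2-normalized session representations
        # item_embedding_weight: (n_items, dim) raw item embedding table
        # pos_items: (batch,) ids of the ground-truth next items
        all_item_emb = F.normalize(item_embedding_weight, dim=-1)
        logits = torch.matmul(seq_output, all_item_emb.transpose(0, 1)) / temperature
        return F.cross_entropy(logits, pos_items)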

forward(item_seq)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
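
For CORE, the forward pass encodes a session as a weighted sum of its input item embeddings, with the weights produced by either ave_net or TransNet, and L2-normalizes the result so that encoding and decoding share one representation space. A simplified sketch (the attribute names item_embedding and net are assumed from the published implementation; dropout is omitted):

    import torch
    import torch.nn.functional as F

    def forward(self, item_seq):
        item_emb = self.item_embedding(item_seq)         # (batch, seq_len, dim)
        alpha = self.net(item_seq, item_emb)             # (batch, seq_len, 1)
        seq_output = torch.sum(alpha * item_emb, dim=1)  # weighted sum of inputs
        return F.normalize(seq_output, dim=-1)           # unit-norm session vector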

full_sort_predict(interaction)[source]

Full sort prediction function. Given users, calculate the scores between users and all candidate items.

Parameters

interaction (Interaction) – Interaction class of the batch.

Returns

Predicted scores for given users and all candidate items, shape: [n_batch_users * n_candidate_items]

Return type

torch.Tensor
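
Because session and item representations live in the same space, full-sort scores are cosine similarities against the whole (normalized) item embedding table. A sketch under those assumptions (self.temperature mirrors the model’s temperature hyperparameter):

    import torch
    import torch.nn.functional as F

    def full_sort_predict(self, interaction):
        item_seq = interaction[self.ITEM_SEQ]
        seq_output = self.forward(item_seq)              # (batch, dim), unit norm
        all_item_emb = F.normalize(self.item_embedding.weight, dim=-1)
        # One row of scores per session, one column per candidate item.
        return torch.matmul(seq_output, all_item_emb.transpose(0, 1)) / self.temperature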

predict(interaction)[source]

Predict the scores between users and items.

Parameters

interaction (Interaction) – Interaction class of the batch.

Returns

Predicted scores for given users and items, shape: [batch_size]

Return type

torch.Tensor
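
This is the pairwise variant of the scoring above: instead of the full item table, only each row’s candidate item is scored. A sketch under the same assumptions:

    import torch
    import torch.nn.functional as F

    def predict(self, interaction):
        item_seq = interaction[self.ITEM_SEQ]
        test_item = interaction[self.ITEM_ID]
        seq_output = self.forward(item_seq)                         # (batch, dim)
        test_item_emb = F.normalize(self.item_embedding(test_item), dim=-1)
        # One cosine score per (session, candidate) pair in the batch.
        return torch.mul(seq_output, test_item_emb).sum(dim=1) / self.temperature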

training: bool
class recbole.model.sequential_recommender.core.TransNet(config, dataset)[source]

Bases: torch.nn.modules.module.Module

forward(item_seq, item_emb)[source]

Defines the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
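
Conceptually, TransNet replaces the uniform weights of ave_net with learned ones: item embeddings plus position embeddings pass through a transformer encoder, a linear head maps each position to a scalar logit, and a softmax over the non-padding positions yields the weights. A simplified sketch (the attribute names position_embedding, trm_encoder, and fn are assumed from the published implementation; LayerNorm and dropout are omitted):

    import torch

    def forward(self, item_seq, item_emb):
        mask = item_seq.gt(0)
        attention_mask = self.get_attention_mask(item_seq)
        # Inject position information before the transformer encoder.
        position_ids = torch.arange(item_seq.size(1), device=item_seq.device)
        input_emb = item_emb + self.position_embedding(position_ids).unsqueeze(0)
        output = self.trm_encoder(input_emb, attention_mask,
                                  output_all_encoded_layers=True)[-1]
        alpha = self.fn(output)                               # (batch, seq_len, 1)
        alpha = alpha.masked_fill(~mask.unsqueeze(-1), -1e9)  # mask padding
        return torch.softmax(alpha, dim=1)                    # weights per position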

get_attention_mask(item_seq, bidirectional=False)[source]

Generate left-to-right uni-directional or bidirectional attention mask for multi-head attention.
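
The mask is additive: 0 at positions a query may attend to and a large negative number elsewhere, with an extra lower-triangular constraint in the uni-directional case. A sketch of the standard construction (illustrative, not the exact library code):

    import torch

    def get_attention_mask(item_seq, bidirectional=False):
        mask = item_seq != 0                       # (batch, seq_len), 0 = padding
        ext = mask.unsqueeze(1).unsqueeze(2)       # (batch, 1, 1, seq_len)
        if not bidirectional:
            # Only attend to positions at or before the query (causal mask).
            ext = torch.tril(ext.expand(-1, -1, item_seq.size(-1), -1))
        # 0.0 where attention is allowed, -10000.0 where it is blocked.
        return torch.where(ext, 0.0, -10000.0)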

training: bool