This repository was archived by the owner on Oct 17, 2023. It is now read-only.
When training the model with mini-batches, the L2 regularization term is not computed over all model parameters; it only uses the parameters corresponding to the user and item embeddings involved in that batch. Is this a deliberate trick in the experiment?
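For concreteness, the contrast the question draws can be sketched as follows. This is a minimal NumPy illustration (all names and shapes are hypothetical, not taken from this repository's code): `batch_l2` penalizes only the embedding rows touched by the current mini-batch, while `full_l2` penalizes every embedding parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, dim = 100, 200, 8
user_emb = rng.normal(size=(n_users, dim))  # hypothetical user embedding table
item_emb = rng.normal(size=(n_items, dim))  # hypothetical item embedding table

def batch_l2(user_idx, item_idx, lam=0.01):
    """L2 term over only the embeddings appearing in this mini-batch."""
    u = user_emb[np.unique(user_idx)]  # deduplicate: each row penalized once
    v = item_emb[np.unique(item_idx)]
    return lam * (np.sum(u ** 2) + np.sum(v ** 2))

def full_l2(lam=0.01):
    """L2 term over all embedding parameters (the alternative)."""
    return lam * (np.sum(user_emb ** 2) + np.sum(item_emb ** 2))

# A small batch touches far fewer rows, so its penalty is much smaller.
users = np.array([1, 5, 5, 9])
items = np.array([3, 3, 7, 42])
print(batch_l2(users, items) < full_l2())
```

In practice this batch-wise form is common in embedding models, since the gradient of the full L2 term with respect to untouched rows would be nonzero and force a dense update of the whole table on every step.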