【PyTorch】Easy-to-use, modular and extendible package of deep-learning based CTR models.
Apache-2.0 License
WeChat official account: 浅梦学习笔记 | WeChat: deepctrbot | Study group: Join | Topic collection
Published by shenweichen over 2 years ago
Published by shenweichen over 3 years ago
- Fix `VarLenSparseFeat` error (#176, #179, #180) @zanshuxun
- Fix `InteractingLayer` in AutoInt (#74) @zanshuxun

Published by shenweichen over 3 years ago
Published by shenweichen almost 4 years ago
- Add `History` callback in `deepctr_torch.callbacks` (example)
Published by shenweichen almost 4 years ago
- Use `float64` in metrics to prevent nan/inf loss when calculating logloss
- DCN: add a new parameter `cross_parameterization` (string, `"vector"` or `"matrix"`), the way to parameterize the cross network; if set to `"matrix"`, the model becomes DCN-M
- Add `EarlyStopping` and `ModelCheckpoint` in `deepctr_torch.callbacks`
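The difference between the two cross-network parameterizations can be sketched in plain NumPy. This is a minimal illustration of the layer equations, not the library's actual `CrossNet` implementation; the function name and signature here are made up for the sketch:

```python
import numpy as np

def cross_layer(x0, xl, w, b, cross_parameterization="vector"):
    """One cross-network layer, sketching the two parameterizations.

    "vector": x_{l+1} = x0 * (xl . w) + b + xl   (w is a d-vector; classic DCN)
    "matrix": x_{l+1} = x0 * (W @ xl + b) + xl   (W is a d x d matrix; DCN-M)
    """
    if cross_parameterization == "vector":
        return x0 * (xl @ w) + b + xl
    if cross_parameterization == "matrix":
        return x0 * (w @ xl + b) + xl
    raise ValueError("cross_parameterization must be 'vector' or 'matrix'")

d = 4
x0 = np.arange(1.0, d + 1)  # toy input embedding [1, 2, 3, 4]

# vector form: d parameters per layer
out_vec = cross_layer(x0, x0, np.ones(d), np.zeros(d), "vector")
# matrix form (DCN-M): d*d parameters per layer, strictly more expressive
out_mat = cross_layer(x0, x0, np.eye(d), np.zeros(d), "matrix")
print(out_vec)  # [11. 22. 33. 44.]
print(out_mat)  # [ 2.  6. 12. 20.]
```

Both forms keep the residual `+ xl` term; `"matrix"` simply replaces the rank-one interaction `x0 * (xl . w)` with a full linear map of `xl`.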
Published by shenweichen about 4 years ago
Fix `RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [128]], which is output 0 of SelectBackward, is at version 2; expected version 1 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True)` reported in:
- https://github.com/shenweichen/DeepCTR-Torch/issues/88
- https://github.com/shenweichen/DeepCTR-Torch/issues/98
- https://github.com/shenweichen/DeepCTR-Torch/issues/90
- https://github.com/shenweichen/DeepCTR-Torch/issues/102
Published by shenweichen over 4 years ago
- Add `DIN` and `DIEN`
- Update `InteractingLayer`
Published by shenweichen over 4 years ago
Refactor feature columns:
- Different features can use different `embedding_dim`
- Add linear part to some models
- Add `SequencePoolingLayer` (API)
- Add `use_double=True` option in `model.fit()` (#15)
- The `embedding_size` parameter of models is removed; now we must set `embedding_dim` (default 4) in `SparseFeat` or `VarLenSparseFeat`

Published by shenweichen about 5 years ago
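The feature-column refactor in the release above replaces one global `embedding_size` with a per-feature `embedding_dim`. A minimal sketch of the idea, using a hypothetical namedtuple stand-in rather than the actual `deepctr_torch.inputs` classes:

```python
# Hypothetical stand-in for the refactored feature-column idea: each
# feature carries its own embedding_dim instead of one global size.
from collections import namedtuple

SparseFeat = namedtuple("SparseFeat", ["name", "vocabulary_size", "embedding_dim"])

feature_columns = [
    SparseFeat("user_id", vocabulary_size=1000, embedding_dim=8),
    SparseFeat("item_id", vocabulary_size=5000, embedding_dim=16),  # a wider embedding
    SparseFeat("gender", vocabulary_size=2, embedding_dim=4),       # the default size
]

# The concatenated embedding width is now the sum of per-feature dims
total_dim = sum(fc.embedding_dim for fc in feature_columns)
print(total_dim)  # 28
```

With a single global `embedding_size`, high-cardinality features like `item_id` and binary features like `gender` were forced to share one width; carrying the dim on each feature column removes that restriction.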
Published by shenweichen about 5 years ago