Probabilistic time series modeling in Python
APACHE-2.0 License
Published by lostella over 2 years ago
Backporting fixes:
Published by lostella over 2 years ago
- `ckpt_path` argument to `PyTorchLightningEstimator`. (#1872)
- `torch.isqf` (#1815)
- `@validated` in `from_hyperparameters`. (#1826)

Published by lostella about 3 years ago
Backporting fixes:
- `test/distribution/test_flows.py` to make `test_flow_invertibility` pass (#1604)

Published by lostella about 3 years ago
Backporting fixes:
Published by Schmedu over 3 years ago
- `Transform.apply` (#1494)
- `unknown` to `0.0.0`. (#1457)
- `time_feature/_base.py` (#1437)
- `gluonts.mx` module (#1592)
- `loader.py` (#1495)

Published by Schmedu over 3 years ago
Backporting fixes:
- `time_feature/_base.py` (#1437)

Published by Schmedu over 3 years ago
Backporting fixes:
- `time_feature/_base.py` (#1437)

Published by lostella over 3 years ago
GluonTS adds improved support for PyTorch-based models, new options for existing models, and general improvements to components and tooling.
This release comes with a few breaking changes (but for good reasons). In particular, models trained and serialized prior to 0.7.0 may not be de-serializable using 0.7.0.
Breaking changes:
- Refactored the `GluonEstimator` abstract class, as well as the `InstanceSplitter` and `InstanceSampler` implementations. You are affected by this change only if you implemented custom models based on `GluonEstimator`. The change makes it easier to define (and understand, in case you're reading the code) how fixed-length instances are to be sampled from the original dataset for training or validation purposes. Furthermore, this change breaks data transformation into more explicit "pre-processing" steps (deterministic ones, e.g. feature engineering) vs. "iteration" steps (possibly random, e.g. random training instance sampling), so that a `cache_data` option is now available in the `train` method to have the pre-processed data cached in memory and iterated more quickly, whenever it fits.
- Split `gluonts.time_features` into distinct types.
- Refactored the `ISSM` types, making it easier to define custom ones, e.g. by having a custom set of seasonality patterns. Related changes to `DeepStateEstimator` enable these customizations when defining a DeepState model.
- Changes to `Trainer`: removed the `input_names` argument from the `__call__` method. Now the provided data loaders are expected to produce batches containing only the fields that the network being trained consumes. This can be easily obtained by transforming the dataset with `SelectFields`.
- Moved MXNet-dependent code under `gluonts.mx`, with some exceptions (`gluonts.model` and `gluonts.nursery`). With the new structure, one is not forced to install MXNet unless they specifically require modules that depend on it.
- Made the `Evaluator` class lighter by moving the evaluation metrics to `gluonts.evaluation.metrics`, instead of having them as static methods of the class.

The release also includes updates covering PyTorch support, distributions, models, and datasets & tooling.
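The pre-processing vs. iteration split behind the `cache_data` option can be sketched in plain Python. This is an illustrative sketch only; the class and helper names below are made up and this is not the GluonTS implementation:

```python
class CachingLoader:
    """Apply deterministic pre-processing once, optionally keep the result
    in memory, and re-iterate it cheaply on every epoch.

    Illustrative sketch of the idea behind the `cache_data` option;
    not GluonTS code."""

    def __init__(self, dataset, preprocess, cache_data=False):
        self.dataset = dataset
        self.preprocess = preprocess
        self.cache_data = cache_data
        self._cache = None

    def __iter__(self):
        if self.cache_data:
            if self._cache is None:
                # Pre-processing runs only on the first pass over the data.
                self._cache = [self.preprocess(entry) for entry in self.dataset]
            return iter(self._cache)
        # Without caching, pre-processing is re-done on every iteration.
        return iter(self.preprocess(entry) for entry in self.dataset)


loader = CachingLoader([1, 2, 3], lambda x: x * 10, cache_data=True)
first_epoch = list(loader)
second_epoch = list(loader)  # served from the in-memory cache
```

Random steps (such as training-instance sampling) would sit outside such a cache, so that they are still re-drawn on every epoch.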
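Likewise, the field filtering that `SelectFields` performs amounts to keeping a fixed set of keys per data entry, so that batches contain only what the network consumes. A minimal sketch, with a made-up helper name rather than the GluonTS API:

```python
def select_fields(entries, fields):
    """Keep only the requested keys from each data entry, mirroring the
    field-filtering role SelectFields plays before batching.

    Illustrative sketch, not the GluonTS implementation."""
    return [{k: entry[k] for k in fields} for entry in entries]


# Hypothetical dataset entries with more fields than the network needs.
dataset = [
    {"target": [1.0, 2.0, 3.0], "feat_static_cat": [0], "item_id": "A"},
]
filtered = select_fields(dataset, ["target", "feat_static_cat"])
```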
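The move of evaluation metrics to `gluonts.evaluation.metrics` means metrics become plain module-level functions rather than static methods of `Evaluator`. Roughly, with illustrative signatures that are not the exact GluonTS ones:

```python
def mse(target, forecast):
    """Mean squared error as a plain module-level function, in the spirit
    of moving metrics out of the Evaluator class (illustrative sketch)."""
    return sum((t - f) ** 2 for t, f in zip(target, forecast)) / len(target)


def abs_error(target, forecast):
    """Sum of absolute errors, likewise as a free function."""
    return sum(abs(t - f) for t, f in zip(target, forecast))


error = mse([2.0, 4.0, 6.0], [2.0, 4.0, 9.0])  # (0 + 0 + 9) / 3 = 3.0
```

Free functions like these can be imported and used individually, without constructing an `Evaluator`.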