Fast and Accurate ML in 3 Lines of Code
APACHE-2.0 License
Published by Innixma over 3 years ago
v0.2.0 introduces numerous optimizations that reduce Tabular average inference time by 4x and average disk usage by 10x compared to v0.1.0, as well as a refactored ImagePredictor API to better align with the other tasks and a 20x inference speedup in Vision tasks. This release contains 42 commits from 9 contributors.
This release is non-breaking when upgrading from v0.1.0, with four exceptions:
- ImagePredictor.predict and ImagePredictor.predict_proba have different output formats.
- TabularPredictor.evaluate and TabularPredictor.evaluate_predictions have different output formats.
- TabularPredictor.fit's hyperparameter_tune_kwargs argument now has a different format.

See the full commit change-log here: https://github.com/awslabs/autogluon/compare/v0.1.0...v0.2.0
Thanks to the 9 contributors that contributed to the v0.2.0 release!
Special thanks to the 3 first-time contributors! @taesup-aws, @ValerioPerrone, @lukemorrill
Full Contributor List (ordered by # of commits):
@Innixma, @zhreshold, @gradientsky, @jwmueller, @mseeger, @sxjscience, @taesup-aws, @ValerioPerrone, @lukemorrill
- Reduced average inference time on the best_quality preset by 4x (and 2x on other presets). @innixma, @gradientsky
- Reduced average disk usage on the best_quality preset by 10x. @innixma
- Added optional scikit-learn-intelex integration, used by the best_quality preset. @innixma (#1022)
  - Install via pip install autogluon.tabular[all,skex] or pip install "scikit-learn-intelex<2021.3". Once installed, AutoGluon will automatically use it.
- Added quantile as a new problem_type to support quantile regression problems. @taesup-aws, @jwmueller (#1005, #1040)
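A minimal sketch of the new quantile problem_type, shown as the keyword arguments one would pass to TabularPredictor. The label name "target" and the quantile levels are placeholder choices; the argument names are taken from the release notes and the v0.2.0 API.

```python
# Sketch: arguments for quantile regression with TabularPredictor (v0.2.0).
# "target" is a placeholder label column name.
predictor_kwargs = {
    "label": "target",                   # numeric column to predict
    "problem_type": "quantile",          # new in v0.2.0
    "quantile_levels": [0.1, 0.5, 0.9],  # predict 10th/50th/90th percentiles
}

# Usage (assuming autogluon.tabular is installed; not run here):
#   from autogluon.tabular import TabularPredictor
#   predictor = TabularPredictor(**predictor_kwargs).fit(train_data)
#   predictor.predict(test_data)  # one prediction column per quantile level
```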
- [Experimental] GPU-accelerated RAPIDS models can be specified via the .fit hyperparameters argument. Refer to the below kaggle kernel for an example or check out the RAPIDS official AutoGluon example.
- Added the ag.early_stop model hyperparameter to configure early stopping. @innixma (#1037)
  - Example: hyperparameters={'XGB': {'ag.early_stop': 500}}
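A sketch of how the ag.early_stop option might be passed through TabularPredictor.fit. The 'XGB' entry and the value 500 mirror the example above; the 'GBM' entry and its value are hypothetical additions, and the semantics of the value (early-stopping patience in rounds) are assumed, not confirmed by the notes.

```python
# Sketch: per-model early stopping via the fit() hyperparameters dict.
hyperparameters = {
    "XGB": {"ag.early_stop": 500},  # example from the release notes
    "GBM": {"ag.early_stop": 200},  # hypothetical: a different patience for LightGBM
}

# Usage (not run here):
#   predictor = TabularPredictor(label="target").fit(
#       train_data, hyperparameters=hyperparameters)
```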
- Improved accuracy when time_limit is small. For time_limit=3600 on datasets with over 100,000 rows, v0.2.0 has a 65% win-rate over v0.1.0. @innixma (#1059, #1084)
- Added extra_metrics argument to .leaderboard. @innixma (#1058)
- Added feature group importance support to .feature_importance. @innixma (#989)
  - Example: predictor.feature_importance(test_data, features=['A', 'B', 'C', ('AB', ['A', 'B'])])
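Unpacking the features argument format from the example above: plain strings request the importance of individual columns, while a (name, [columns]) tuple requests the joint importance of a feature group reported under that name. The column names and test_data are placeholders.

```python
# Sketch: the features argument accepted by .feature_importance in v0.2.0.
features = [
    "A",                 # importance of column A alone
    "B",
    "C",
    ("AB", ["A", "B"]),  # joint importance of columns A and B, reported as "AB"
]

# Usage (not run here):
#   predictor.feature_importance(test_data, features=features)
```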
- [API Breaking] Refactored .evaluate and .evaluate_predictions to be easier to use and share the same code logic. @innixma (#1080)
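The extra_metrics argument to .leaderboard described above can be sketched as a list of additional metric names to score every model on. The specific metric strings chosen here are standard AutoGluon metric names, but they are illustrative picks, not taken from the release notes.

```python
# Sketch: extra metrics for .leaderboard (v0.2.0); names are illustrative.
extra_metrics = ["accuracy", "balanced_accuracy", "log_loss"]

# Usage (not run here; test_data is a placeholder labeled DataFrame):
#   predictor.leaderboard(test_data, extra_metrics=extra_metrics)
```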
- Improved handling when presets is empty. @zhreshold
- [API Breaking] Refactored predict and predict_proba in ImagePredictor to have the same output formats as TabularPredictor and TextPredictor. @zhreshold (#1044)
  - Users must update code that uses predict and predict_proba when switching to v0.2.0.
- Achieved a 20x inference speedup in ImagePredictor. @zhreshold (#1010)
- Fixed a bug in autogluon.mxnet and autogluon.extra that caused a crash on import. @innixma (#1032)

Published by Innixma over 3 years ago
v0.1.0 is our largest release yet, containing 173 commits from 20 contributors over the course of 5 months.
This release is API breaking from past releases, as AutoGluon is now a namespace package. Please refer to our documentation for using v0.1.0. New GitHub issues based on versions earlier than v0.1.0 will not be addressed, and we recommend that all users upgrade to v0.1.0 as soon as possible.
See the full commit change-log here: https://github.com/awslabs/autogluon/compare/v0.0.15...v0.1.0
Try it out yourself in 5 minutes with our Colab Tutorial.
Special thanks to the 20 contributors that contributed to the v0.1.0 release! Contributor List:
@innixma, @gradientsky, @sxjscience, @jwmueller, @zhreshold, @mseeger, @daikikatsuragawa, @Chudbrochil, @adrienatallah, @jonashaag, @songqiang, @larroy, @sackoh, @muhyun, @rschmucker, @aaronkl, @kaixinbaba, @sflender, @jojo19893, @mak-454
- AutoGluon can now be installed modularly, e.g. pip install autogluon.core. For a full list of available submodules, see this link. @gradientsky (#694)
- GPU support: specify ag_args_fit={'num_gpus': 1} in TabularPredictor.fit() to enable. @innixma (#896)
- Added sample_weight support. Tabular can now handle user-defined sample weights for imbalanced datasets. @jwmueller (#942, #962)
- predictor.feature_importance() now returns confidence bounds on importance values. @innixma (#803)
- New predictor.fit_extra() enables the fitting of additional models on top of an already fit TabularPredictor object (docs). @innixma (#768)
- Per-model HPO: specify hyperparameter_tune_kwargs in a model's hyperparameters via 'ag_args': {'hyperparameter_tune_kwargs': hpo_args}. @innixma (#883)

Published by Innixma almost 4 years ago
Published by Innixma about 4 years ago
Published by Innixma over 4 years ago
- Users can now provide the tuning_data argument in TabularPrediction.fit() with test data without the label column to improve data preprocessing and final predictive accuracy on the test data (#551).
- Added fit_weighted_ensemble() function to the TabularPredictor class. Now the user can train additional weighted ensembles post-fit using any subset of the existing trained models (#550).
- Added AG_args_fit argument to enable advanced model training control such as per-model time limit and memory usage (#531).
- Added excluded_model_types argument to TabularPrediction.fit() to enable simplified removal of model types without editing the hyperparameters argument (#543).
- Reworked the feature_types_metadata object and AutoFeatureGenerator (#548).
Published by Innixma over 4 years ago