The PyTorch-based audio source separation toolkit for researchers
MIT License
Published by mpariente about 1 year ago
⬆️ Upgrade ⬆️
Changes documented in #682 :

- Removed `optimizer_idx` in `lr_scheduler_step` in `System`.
- Replaced `torch.symeig` by `torch.linalg.eigh` in `beamforming.py`.
- Replaced `on_*_end` with `on_training_*_end`.
- Replaced `torch.testing.assert_allclose` with `torch.testing.assert_close`.
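For instance, the `torch.symeig` migration can be handled like this (a minimal sketch on a random symmetric matrix, not Asteroid's actual beamforming code):

```python
import torch

torch.manual_seed(0)

# Hypothetical stand-in for a beamforming covariance matrix: any real
# symmetric (or complex Hermitian) matrix works with eigh.
a = torch.randn(4, 4)
cov = a @ a.T

# Old (removed): eigvals, eigvecs = torch.symeig(cov, eigenvectors=True)
eigvals, eigvecs = torch.linalg.eigh(cov)

# Sanity-check the decomposition with the new testing helper.
recon = eigvecs @ torch.diag(eigvals) @ eigvecs.T
torch.testing.assert_close(recon, cov)  # replaces torch.testing.assert_allclose
```

Note that `torch.linalg.eigh` returns eigenvalues in ascending order, whereas `torch.symeig` only did so when asked.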
Happy coding 🙃
Published by mpariente over 1 year ago
Release before moving to asteroid 0.7.x with `torch` and `lightning` upgrades.

- `from_pretrained` (#668)

Many thanks to all the contributors @actuallyaswin, @mystlee, @JunzheJosephZhu, @jbartolewska, @LeonieBorne, @mattiadg, @r-sawata and @zmolikova ! 💪 🤩 🙏
Published by mpariente over 2 years ago
Yes, that's it: only PyTorch-Lightning support, from 1.5.0 to the latest versions. This means changing to the new `Trainer` API, with the new `accelerator`, `strategy` and `devices` arguments.
Thanks to @paulfd for pushing me to do it 💪
Published by mpariente over 2 years ago
Minor patch release before 0.6.0, which will upgrade the pytorch-lightning version.
Thanks to @jc5201, @z-wony, @JorisCos, @ben-freist, @nicocasaisd and @zmolikova for their awesome contributions 🔥 💪 🙏
Published by mpariente almost 3 years ago
`Libri_VAD` dataset 🚀

Thanks to @ldelebec @hihunjin @nobel861017 @ben-freist @r-sawata @osanseviero @JunzheJosephZhu and @JorisCos for their awesome contributions 🔥 💪 🙏
Published by mpariente over 3 years ago
Nothing more, nothing less 🙃
Check the release below for the most recent release notes.
Published by mpariente over 3 years ago
⚠️ This release drops support for PyTorch under 1.8 and restricts PyTorch-Lightning to under 1.3.
The next release (0.5.1) will add support for PyTorch-Lightning 1.3.0, which broke our CI. ⚠️
Thanks to @quancs @r-sawata @popcornell and @faroit for their contributions
Published by mpariente over 3 years ago
⚠️ Warning ⚠️ This is the last release that supports torch<1.8.
From asteroid 0.5.0 onwards, only torch>=1.8.0 will be supported. The main reasons are complex support and the `fft` and `linalg` packages.
Large thanks to the contributors ! 🙃
Published by mpariente over 3 years ago
- Not passing `sample_rate` to `BaseModel` is deprecated, and it will raise an error in a future release.
- `BaseModel` now takes an `in_channels` argument which will be used in `separate` and the `asteroid-infer` CLI.
- `FasNetTAC`, thanks to @popcornell ! 🎉
- Use `huggingface_hub` instead of "our own" code for interfacing with the Hub.

Published by mpariente almost 4 years ago
"Use in Asteroid" button, this is great! 🤩 Huge thanks to the HuggingFace team and @julien-c in particular. 🙏

Thanks to all the contributors: @jonashaag, @popcornell, @julien-c, @iver56, @lubacien, @cliffzhao and the issue creators and bug-reporters 🙏
Published by mpariente almost 4 years ago
- `TorchScript` support for all asteroid models, unit tested for consistency 🚀
- `MelGramFB`, a STFT that matches `torch.stft`, and new hooks for more extensibility!
- Improved `PITLossWrapper` + new `MixITWrapper` and `SinkPITLossWrapper` ⚡
- Works with torch `1.6.0`, `1.7.0` and `torch-nightly`.

Backward-incompatible changes:

- `filters` in `Filterbank` is now a method instead of a property, for `TorchScript` support (#237).
- The `PITLossWrapper` methods `best_perm_from_perm_avg_loss` and `find_best_perm` now return batch indices of the best permutation, to match with the new hungarian algorithm and facilitate outside use of those methods (#243).
- Models saved without the `sample_rate` argument won't be loadable anymore. Use `asteroid-register-sr` to register the sample rate of the model (#285).
- Removed `losses` (#343) and `blocks` (#344) that were deprecated since 0.2.0.
- Removed the `kernel_size` argument from `TDConvNet` (deprecated since v0.2.1) (#368).
- `BaseModel._separate` is deprecated in favour of `BaseModel.forward_wav` (#337).
- `asteroid.filterbanks` has been outsourced to `asteroid-filterbanks` (#346). Use `from asteroid_filterbanks import` instead of `from asteroid.filterbanks import` from now on.
- In `asteroid_filterbanks.transforms`:
  - `take_reim` will be removed.
  - `take_mag` is deprecated in favour of `mag`.
  - `take_cat` is deprecated in favour of `magreim`.
  - `from_mag_and_phase` is deprecated in favour of `from_magphase`.
- `asteroid.complex_nn.as_torch_complex` has been deprecated and will be removed. Use `torch.view_as_complex`, `torch_complex_from_magphase`, `torch_complex_from_reim` or `asteroid_filterbanks.transforms.from_torch_complex` instead (#358).

[src] BC-breaking: Load models without sample_rate (#285)
[src] Remove deprecated losses (#343)
[src] Remove deprecated blocks (#344)
[src] BaseEncoderMaskerDecoder: remove old hooks (#309)
[src] Remove deprecated kernel_size in TDConvNet (#368)
[src&tests] Add sample_rate property (float) in BaseModel. (#274)
[src] Add sample_rate argument to all supported models. (#284)
[src&tests] Automatic resampling in separate + CLI. (#283)
[src & tests] 🎉 TorchScript support 🎉 (#237)
[src & tests] Add Hungarian matcher to solve LSA in PITLossWrapper (#243)
[src&tests] Add jitable_shape and use it in EncMaskDec forward (#288)
[src&tests] Add shape checks to SDR and MSE losses (#299)
[docs] Add loss plot in the FAQ (#314)
[src] New asteroid.show_available_models (#313)
[egs] DAMP-VSEP vocal separation using ConvTasNet (#298)
[docs] DAMP-VSEP in the docs ! (#317)
[src&test] Add Sinkhorn PIT loss (#302)
[src] Add MixITWrapper loss (#320)
[egs] Add MixIT example recipe (#328)
[src] New Filterbank's hooks + add MelGram_FB (#334)
[src] New phase features and transforms (#333)
[src] Better names in asteroid.filterbanks.transforms (#342)
[src] Add asteroid-versions script to print installed versions (#349)
[install] Add conda environment.yml (#354)
[src] Add ebased_vad and deltas (#355)
[src&tests] Make get_metrics robust against metrics failures (#275)
[egs] Don't override print() with pprint (#281)
[src] Refactor BaseEncoderMaskerDecoder.forward (#307)
[src&tests] Refactor DeMask for consistency (#304)
[docs] Replace GettingStarted notebook (#319)
[src] BaseModel takes sample_rate argument (#336)
[src&egs] Transition to asteroid_filterbanks (#346)
[src] Rename _separate to forward_wav (#337)
[docs] Build docs with Python 3.8
[docs] Links to GitHub code from the docs 🎉 (#363)
[CI&hub] TorchHub integration tests (#362)
[egs] Fix #277 DNS Challenge baseline's run.sh
[docs] Fix Reference and Example blocks in docs (#297)
[src] Fix #300: skip connection on good device (#301)
[src] DCUNet: Replace old hooks by new ones (#308)
[src] Fix schedulers serialization (#326)
[src] Improve Filterbank.forward error message (#327)
[egs] Fix: replace DPRNNTasNet with DPTNet (#331)
[src&jit] Fix DCCRN and DCUNet-Large (#276)
[CI] Catch warnings we expect (#351)
[src] Fix #279 OLA support for separate() and asteroid-infer (#305)
[docs] Docs fixes and improvements (#340)
[docs] Fix CLI output in docs (#357)
[src&tests] Fix complex and add tests (#358)
[docs] Fix docstrings (#365)
[src] Fix #360 Correct DCCRN RNN (#364)
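The complex-ops migration above (`torch.view_as_complex` and friends) can be sketched with plain torch; the tensor shapes here are hypothetical, not Asteroid's actual code:

```python
import torch

torch.manual_seed(0)

# A (batch, freq, 2) real tensor with real/imag stacked on the last
# dimension, like the re/im representation used by the filterbanks.
reim = torch.randn(3, 5, 2)

# Native complex view (no copy), then magnitude/phase features.
spec = torch.view_as_complex(reim.contiguous())
mag, phase = spec.abs(), spec.angle()

# Round-trip from magnitude and phase back to the stacked layout.
back = torch.view_as_real(mag * torch.exp(1j * phase))
torch.testing.assert_close(back, reim)
```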
A large thanks to all contributors for this release :
@popcornell @jonashaag @michelolzam @faroit @mhu-coder @JorisCos @groadabike @giorgiacantisani @tachi-hi @SouppuoS @sunits @guiruli08650129 @mcernak @zmolikova & @hbredin 🥰
(ping me if you got forgotten, I'll add you back, and sorry in advance 😉 )
Published by mpariente almost 4 years ago
Since v0.3.4, `pytorch_lightning` released 1.0, which is incompatible. This release only limits lightning's version so that the install is compatible with the source code.
Published by mpariente about 4 years ago
- Fixes between `v0.3.0` and `v0.3.3`, thanks to @groadabike (issues #255, #258).
- `LambdaOverlapAdd` fixes.
- `BaseEncoderMaskerDecoder` (formerly `BaseTasNet`) now has model hooks for easier extensibility (thanks to @jonashaag): `postprocess_encoded`, `postprocess_masks`, `postprocess_masked`, `postprocess_decoded`.
- `DCUNet` (paper) and `DCCRNet` (paper).

Note: the next release (0.4.0) will have some small backward-breaking changes, to support TorchScript and improve our `PITLossWrapper`.
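The hook mechanism can be sketched in plain Python (toy scalar math and an identity-hook skeleton; only the hook names come from the release notes, everything else is hypothetical):

```python
class EncoderMaskerDecoder:
    """Sketch of the hook pattern: forward pipes data through overridable
    postprocess_* hooks; stand-in scalar ops replace the real tensor math."""

    def encode(self, wav):  return wav * 2.0   # stand-in "encoder"
    def mask(self, tf_rep): return 1.5         # stand-in "masker", out of range
    def decode(self, rep):  return rep / 2.0   # stand-in "decoder"

    def forward(self, wav):
        tf_rep = self.postprocess_encoded(self.encode(wav))
        est_masks = self.postprocess_masks(self.mask(tf_rep))
        masked = self.postprocess_masked(est_masks * tf_rep)
        return self.postprocess_decoded(self.decode(masked))

    # Default hooks are identities; subclasses override only what they need.
    def postprocess_encoded(self, x): return x
    def postprocess_masks(self, x): return x
    def postprocess_masked(self, x): return x
    def postprocess_decoded(self, x): return x


class ClampedModel(EncoderMaskerDecoder):
    def postprocess_masks(self, masks):
        # e.g. force masks into [0, 1] without touching the rest of forward.
        return max(0.0, min(1.0, masks))


print(EncoderMaskerDecoder().forward(1.0))  # 1.5
print(ClampedModel().forward(1.0))          # 1.0
```

The point of the design: subclasses customise one stage of the pipeline without re-implementing `forward`.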
[hub] Add tmirzaev's model in the string-retrievable ones.
[src] BaseTasNet -> BaseEncoderMaskerDecoder + add model hooks (#266)
[src & tests] New complex ops + Add DCUNet and DCCRNet (#224)
[src&tests] Improve scheduler's docs + add plot method (#268)
[hub] Add software version section in published models (#261)
[docs] Add issue #250 to FAQ (#260)
[black] Update black to 20.8b1 (#265)
[black] Fix black 20.8b1 update (#267)
[black] Update to 20.8b1 + always lint
[egs] Fix declared unused variables in DeMask (#248)
[docs] Update article citation.
[src] Restore linear activation as default in ConvTasNet and DPRNN (#258)
[src] Fix uncalled optimizer in System without LR schedule (#259)
[src] Fix bug for DPTNetScheduler (#262)
[src] Fix LambdaOverlapAdd and improve docs (#271)
Thanks to our awesome contributors @popcornell @jonashaag @faroit @groadabike !
Published by mpariente about 4 years ago
- Works without an `-e` install, with `register`/`get` logic.
- `asteroid-infer` CLI for easy enhancement/separation.
- `separate` method in model tests (#241).

Published by mpariente about 4 years ago
- `hub`.
- `LambdaOverlapAdd` to easily process long files.
- `Dataset`.
- `Dataset`.
- `black` and more extensive testing.

Note: next releases will be based on `pytorch-lightning>=0.8.0`.

- `mixture_consistency` in `dsp` folder.

Published by mpariente over 4 years ago
- `System` is not imported from `asteroid/__init__.py` anymore. This way, `torch.hub` doesn't need `pytorch-lightning` to load models. Replace your `from asteroid import System` calls by `from asteroid.engine.system import System`.
- Utils were refactored into the `asteroid/utils` folder (#120). Direct imports from `torch_utils` (e.g. `from asteroid.torch_utils import pad_x_to_y`) don't work anymore. Instead, we can use `from asteroid import torch_utils; torch_utils.pad_x_to_y(...)`.
[src & egs] Publishing pretrained models !! (wham/ConvTasNet) (#125)
[src] Add License info on all (but MUSDB) supported datasets (#130)
[src & egs] Kinect-WSJ Dataset and Single channel DC Recipe (#131)
[src] Add licenses info and dataset name for model publishing
[docs] Add getting started notebook
[docs] Add notebook summary table
[egs] Enable pretrained models sharing on LibriMix (#132)
[egs] Enable wham/DPRNN model sharing (#135)
[model_cards] Add message to create model card after publishing
[model_cards] Add ConvTasNet_LibriMix_sepnoisy.md model card (Thanks @JorisCos)
[src & egs] Adding AVSpeech AudioVisual source separation (#127)
[src] Instantiate LibriMix from download with class method (#144)
[src] Add show_available_models in asteroid init
[src & tests] Bidirectional residual RNN (#146)
[src & tests] Support filenames at the input of separate (#154)
[src & hub] Remove System to reduce torch.hub deps (back to #112)
[src & tests & egs] Refactor utils files into folder (#120)
[egs] GPU id defaults to $CUDA_VISIBLE_DEVICES in all recipes (#128)
[egs] set -e in all recipes to exit on errors (#129)
[egs] Remove gpus args in all train.py (--id controls that in run.sh) (#134)
[hub] Change dataset name in LibriMix (fix)
[src] Add targets argument (to stack sources) to MUSDB18 (#143)
[notebooks] Rename examples to notebooks
[src] Enable using Zenodo without api_key argument (set ACCESS_TOKEN env variable)
[src] Deprecate inputs_and_masks.py (#117)
[src] Deprecate PITLossWrapper mode argument (#119)
[src] Fix PMSQE loss (NAN backward + device placement) (#121)
[egs] Fix checkpoint.best_k_models in new PL version (#123)
[egs] Fix: remove shuffle=True in validation Loader (lightning error) (#124)
[egs] Corrections on LibriMix eval and train and evals scripts (#137)
[egs] Fix wavfiles saving in eval.py for enh_single and enh_both tasks (closes #139)
[egs] Fix wavfiles saving in eval.py for enh tasks (estimates)
[egs] Fix #139 : correct squeeze for enhancement tasks (#142)
[egs] Fix librimix run.sh and eval.py (#148)
Published by mpariente over 4 years ago
`asteroid.models` + `hubconf.py`: model definitions without installing Asteroid.

Big thanks to all contributors @popcornell @etzinis @michelolzam @Ariel12321 @faroit @dditter @mdjuamart @sunits 😃
Published by mpariente over 4 years ago
Mainly, this release is to downgrade pytorch-lightning to 0.6.0, as we were having some performance problems with 0.7.1 (#58).
We also incorporate metrics calculation in Asteroid, through `pb_bss_eval`, which is a sub-package of `pb_bss` available on PyPI.
- `BatchNorm` wrapper that handles 2D, 3D and 4D cases, retrievable from the string `bN` (#60).
- `get_metrics` method and strict dependency on `pb_bss_eval` (#57) (#62).

Published by mpariente over 4 years ago
- `argparse` interface with dictionary.
- `perfect_synthesis_window` enables perfect synthesis with a large range of windows for even overlaps.
- `Encoder` and `Decoder` now support an arbitrary number of input dimensions.
- `SingleSrcMultiScaleSpectralLoss` from DDSP (magenta).
- `Encoder` loses its `post_process_inputs` and `apply_mask` methods, which were not really useful. We consider it better that the user applies these methods knowingly.

Big thanks to the contributors on this release @popcornell @sunits @JorisCos
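The idea behind a perfect synthesis window can be sketched with NumPy (a hypothetical re-implementation for even overlaps, not Asteroid's actual code): given an analysis window and a hop that divides its length, scale the window so that the overlap-added analysis-times-synthesis product sums to one.

```python
import numpy as np

def perfect_synthesis_window(analysis_window, hop):
    """Synthesis window s such that overlap-adding w * s at the given hop
    sums to 1 in steady state (assumes hop divides the window length)."""
    n = len(analysis_window)
    denom = np.zeros(n)
    # For each window position, sum the squared analysis window over all
    # hop-shifted frames that overlap that position.
    for shift in range(-n + hop, n, hop):
        idx = np.arange(n) + shift
        valid = (idx >= 0) & (idx < n)
        denom[valid] += analysis_window[idx[valid]] ** 2
    return analysis_window / denom

win_size, hop = 32, 8
analysis = np.hanning(win_size)
synthesis = perfect_synthesis_window(analysis, hop)

# Overlap-add w * s: the steady-state region is exactly 1.
ola = np.zeros(win_size * 4 + win_size)
for start in range(0, win_size * 4, hop):
    ola[start:start + win_size] += analysis * synthesis
print(np.allclose(ola[win_size - hop:win_size * 4], 1.0))  # True
```

Within each hop-residue class, the denominator is the same constant sum of squared window values, so the overlap-added products telescope to one at every steady-state sample.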