An Engine-Agnostic Deep Learning Framework in Java
APACHE-2.0 License
Published by roywei over 3 years ago
DJL v0.10.0 brings the new engines PaddlePaddle 2.0 and TFLite 2.4.1, updates PyTorch to 1.7.1, and introduces several new features:
Supports PaddlePaddle 2.0 engine inference: you can now run predictions using models trained in PaddlePaddle.
Introduces the PaddlePaddle Model Zoo with new models. Please see examples for how to run them.
Upgrades TFLite engine to v2.4.1. You can convert TensorFlow SavedModel to TFLite using this converter.
Introduces DJL Central to easily browse and view models available in DJL’s ModelZoo.
Introduces a generic BERT model in DJL (#105)
Upgrades PyTorch to 1.7.1
SoftmaxCrossEntropyLoss's fromLogit flag now means inputs are un-normalized (#639)

This release is thanks to the following contributors:
Published by stu1130 almost 4 years ago
DJL 0.9.0 brings MXNet inference optimizations, abundant new PyTorch feature support, TensorFlow Windows GPU support, and an experimental DLR engine that supports TVM models.
// Create a sparse COO NDArray: values {3, 4, 5} at coordinates (0,2), (1,0), (1,2)
long[][] indices = {{0, 1, 1}, {2, 0, 2}};
float[] values = {3, 4, 5};
FloatBuffer buf = FloatBuffer.wrap(values);
manager.createCoo(buf, indices, new Shape(2, 4));
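In the COO layout above, coordinates are paired column-wise: indices[0] holds the row of each value and indices[1] its column. A plain-Java sketch (no DJL dependency; CooDemo and toDense are hypothetical names used only for illustration) of how those triples expand into a dense 2x4 matrix:

```java
import java.util.Arrays;

// Hypothetical helper illustrating the COO layout used above:
// values[k] lands at position (indices[0][k], indices[1][k]) in the dense matrix.
public class CooDemo {
    static float[][] toDense(long[][] indices, float[] values, int rows, int cols) {
        float[][] dense = new float[rows][cols];
        for (int k = 0; k < values.length; k++) {
            dense[(int) indices[0][k]][(int) indices[1][k]] = values[k];
        }
        return dense;
    }

    public static void main(String[] args) {
        long[][] indices = {{0, 1, 1}, {2, 0, 2}};
        float[] values = {3, 4, 5};
        // {3, 4, 5} land at (0,2), (1,0), and (1,2); everything else stays 0
        System.out.println(Arrays.deepToString(toDense(indices, values, 2, 4)));
    }
}
```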
// assume your TorchScript model takes model({'input': input_tensor});
// you can tell DJL this by setting the tensor's name
NDArray array = manager.ones(new Shape(2, 2));
array.setName("input1.input");
// saving ExtraFilesMap
Criteria<Image, Classifications> criteria = Criteria.builder()
...
.optOption("extraFiles.dataOpts", "your value") // <- pass in here
...
| Engine | Version |
|---|---|
| PyTorch | 1.7.0 |
| TensorFlow | 2.3.1 |
| fastText | 0.9.2 |
Thank you to the following community members for contributing to this release:
Frank Liu(@frankfliu)
Lanking(@lanking520)
Kimi MA(@kimim)
Lai Wei(@roywei)
Jake Lee(@stu1130)
Zach Kimberg(@zachgk)
0xflotus(@0xflotus)
Joshua(@euromutt)
mpskowron(@mpskowron)
Thomas(@thhart)
DocRozza(@docrozza)
Wai Wang(@waicool20)
Trijeet Modak(@uniquetrij)
Published by roywei about 4 years ago
DJL 0.8.0 is a release closely following 0.7.0 to fix a few key bugs along with some new features.
Thank you to the following community members for contributing to this release:
Dennis Kieselhorst, Frank Liu, Jake Cheng-Che Lee, Lai Wei, Qing Lan, Zach Kimberg, uniquetrij
Published by stu1130 about 4 years ago
DJL 0.7.0 brings SentencePiece for tokenization, GraalVM support for the PyTorch engine, a new set of neural network operators, a BOM module, a Reinforcement Learning interface, and an experimental DJL Serving module.
Conv2d.conv2d(NDArray input, NDArray weight, NDArray bias, Shape stride, Shape padding, Shape dilation, int groups);
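The stride, padding, and dilation parameters above determine the spatial size of the convolution output via the usual formula out = (in + 2*pad - dilation*(kernel - 1) - 1) / stride + 1. A small stand-alone sketch of that arithmetic (ConvShape and convOutSize are hypothetical names, not part of the DJL API):

```java
// Hypothetical helper showing the output-size arithmetic behind
// Conv2d.conv2d's stride, padding, and dilation parameters.
public class ConvShape {
    static long convOutSize(long in, long kernel, long stride, long pad, long dilation) {
        return (in + 2 * pad - dilation * (kernel - 1) - 1) / stride + 1;
    }

    public static void main(String[] args) {
        // 28x28 input, 3x3 kernel, stride 1, padding 1: spatial size is preserved
        System.out.println(ConvShape.convOutSize(28, 3, 1, 1, 1)); // prints 28
        // stride 2 halves the spatial size
        System.out.println(ConvShape.convOutSize(28, 3, 2, 1, 1)); // prints 14
    }
}
```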
<dependency>
<groupId>ai.djl</groupId>
<artifactId>bom</artifactId>
<version>0.7.0</version>
<type>pom</type>
<scope>import</scope>
</dependency>
implementation platform("ai.djl:bom:0.7.0")
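With the BOM imported as above, individual DJL artifacts can omit explicit versions; for example (artifact coordinates assumed from the DJL module layout):

```
implementation platform("ai.djl:bom:0.7.0")
implementation "ai.djl:api"              // version resolved by the BOM
runtimeOnly "ai.djl.mxnet:mxnet-engine"  // version resolved by the BOM
```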
cd serving && ./gradlew run --args="-m https://djl-ai.s3.amazonaws.com/resources/test-models/mlp.tar.gz"
FastTextWordEmbedding
WarmUpTracker
MxPredictor
now doesn’t copy parameters by default; please make sure to use NaiveEngine when running inference in a multi-threaded environment.

Thank you to the following community members for contributing to this release:
Christoph Henkelmann, Frank Liu, Jake Cheng-Che Lee, Jake Lee, Keerthan Vasist, Lai Wei, Qing Lan, Victor Zhu, Zach Kimberg, aksrajvanshi, gstu1130, 蔡舒起
Published by keerthanvasist over 4 years ago
DJL 0.6.0 brings stable Android support, experimental ONNX Runtime inference support, and experimental training support for PyTorch.
Thank you to the following community members for contributing to this release:
Christoph Henkelmann, Frank Liu, Jake Lee, JonTanS, Keerthan Vasist, Lai Wei, Qing, Qing Lan, Victor Zhu, Zach Kimberg, ai4java, aksrajvanshi
Published by roywei over 4 years ago
The DJL 0.5.0 release brings TensorFlow engine inference, initial NLP support, and experimental Android inference with the PyTorch engine.
ai.djl.repository: use ai.djl.api instead.

Published by roywei over 4 years ago
The DJL 0.4.1 release includes an important performance improvement in the MXNet engine:
MxNDManager.newSubManager() now avoids repeatedly calling getFeature(), which makes JNA calls into native code.

Same as the v0.4.0 release:
Published by lanking520 over 4 years ago
DJL 0.4.0 brings PyTorch and TensorFlow 2.0 inference support. Now you can use these engines directly from DJL with minimal code changes.
Note: TensorFlow 2.0 support is currently at the PoC stage; users will have to build from source to use it. We expect to complete the TensorFlow engine in a future release.
There are a few changes in API and ModelZoo packages to adapt to multi-engine support. Please follow our latest examples to update your code base from 0.3.0 to 0.4.0.
Published by zachgk over 4 years ago
This is the v0.3.0 release of DJL
ai.djl.mxnet:mxnet-native-auto
dependency for automatic engine selection and a simpler build/installation process.

DJL is working to further improve the ease of use and correctness of our API. To that end, we have made a number of breaking changes for this release. Here are a few of the areas that had breaking changes:
Published by vrakesh almost 5 years ago
This is the v0.2.1 release of DJL
Published by vrakesh almost 5 years ago
This is the v0.2.0 release for DJL
Key Features
Engines currently supported
Javadocs
The javadocs are available here