Released 2021-06-07.
Major Features and Improvements
* Add `text` module to provide basic dataset loading and preprocessing in NLP scenarios. [#53](https://github.com/tinyms-ai/tinyms/pull/53) [#73](https://github.com/tinyms-ai/tinyms/pull/73)
* Upgrade the version of `mindspore` module dependencies from `v1.1.1` to `v1.2.0`. [#81](https://github.com/tinyms-ai/tinyms/pull/81) [#84](https://github.com/tinyms-ai/tinyms/pull/84)
* Refactor the `Client` and `Server` communication interface in `serving` module. [76](https://github.com/tinyms-ai/tinyms/pull/76)
* Add the `server_path` parameter when starting the `FlaskServer`, along with `host` and `port` parameters. [#77](https://github.com/tinyms-ai/tinyms/pull/77)
* Implement the TinyMS `hub` module to enable loading pre-trained models, including `lenet5_v1`, `resnet50_v1`, `alexnet_v1`, `vgg16_v1`, `mobilenet_v2` and `ssd300_v1`. [#86](https://github.com/tinyms-ai/tinyms/pull/86) [#93](https://github.com/tinyms-ai/tinyms/pull/93)
* Publish the TinyMS Hub contributing guidelines to welcome pre-trained model contributions from the community. [#91](https://github.com/tinyms-ai/tinyms/pull/91)
* Refactor the model network entrypoint method to provide a unified interface. [#85](https://github.com/tinyms-ai/tinyms/pull/85)
Model Park
* Add support for **5** models: `AlexNet`, `DenseNet100`, `VGG16`, `SentimentNet`, `Bert`. [#59](https://github.com/tinyms-ai/tinyms/pull/59) [#89](https://github.com/tinyms-ai/tinyms/pull/89) [#63](https://github.com/tinyms-ai/tinyms/pull/63) [#67](https://github.com/tinyms-ai/tinyms/pull/67)
API Change
* Refactor the `serving` entrypoint functions into `Client` and `Server` class interfaces.
<table>
<tr>
<td style="text-align:center"> v0.1.0 </td> <td style="text-align:center"> v0.2.0 </td>
</tr>
<tr>
<td>

```python
from tinyms.serving import start_server, server_started, list_servables, predict, shutdown

start_server()
if server_started():
    list_servables()
    predict('example.jpg', 'servable_name', dataset_name='mnist')
    shutdown()
```

</td>
<td>

```python
from tinyms.serving import Client, Server

server = Server()
server.start_server()
client = Client()
client.list_servables()
client.predict('example.jpg', 'servable_name', dataset_name='mnist')
server.shutdown()
```

</td>
</tr>
</table>
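The point of the class-based refactor is that configuration such as `host` and `port` now lives on the `Client` and `Server` instances instead of in module-level state. A minimal pure-Python sketch of that pattern (illustrative only; these class bodies are stand-ins, not the TinyMS implementation):

```python
# Generic sketch of the class-based serving pattern (not the real TinyMS code).

class Server:
    """Owns the service configuration and lifecycle."""

    def __init__(self, host="127.0.0.1", port=5000):
        self.host = host
        self.port = port
        self.started = False

    def start_server(self):
        # The real implementation launches a Flask server here.
        self.started = True

    def shutdown(self):
        self.started = False


class Client:
    """Targets one server; every request reuses the stored address."""

    def __init__(self, host="127.0.0.1", port=5000):
        self.base_url = f"http://{host}:{port}"

    def list_servables(self):
        # Stand-in for an HTTP GET against the serving endpoint.
        return [{"name": "servable_name"}]


server = Server(port=8080)
server.start_server()
client = Client(port=8080)
print(client.list_servables())
server.shutdown()
```

Because each `Client` stores its target address at construction time, two clients can talk to two different servers in one process, which the old module-level functions could not express.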
* Add a new `load` interface in the `model` module to support loading a MindIR graph directly for model inference.
<table>
<tr>
<td style="text-align:center"> v0.2.0 </td>
</tr>
<tr>
<td>

```python
>>> import tinyms as ts
>>> import tinyms.layers as layers
>>> from tinyms.model import Model, load
>>>
>>> net = layers.Conv2d(1, 1, kernel_size=3)
>>> model = Model(net)
>>> input = ts.ones([1, 1, 3, 3])
>>> model.export(input, "net", file_format="MINDIR")
...
>>> net = load("net.mindir")
>>> print(net(input))
```

</td>
</tr>
</table>