@@ -295,22 +295,38 @@ Please see [configs](./configs) for the details about model performance and pret
 
 ## What is New
 
-- 2023/6/16
-  1. New version `0.2.2` is released! We upgraded to support `MindSpore` v2.0 while maintaining compatibility with v1.8.
-  2. New models:
-     - [ConvNextV2](configs/convnextv2)
-     - mini of [CoAT](configs/coat)
-     - 1.3 of [MnasNet](configs/mnasnet)
-     - AMP(O3) version of [ShuffleNetV2](configs/shufflenetv2)
-  3. New features:
-     - Gradient Accumulation
-     - DynamicLossScale for customized [TrainStep](mindcv/utils/train_step.py)
-     - OneCycleLR and CyclicLR learning rate schedulers
-     - Refactored logging
-     - Pyramid Feature Extraction
-  4. Bug fixes:
-     - Serving Deployment Tutorial (mobilenet_v3 doesn't work on MindSpore 1.8 with the Ascend backend)
-     - Some broken links on our documentation website.
+- 2024/1/17
+
+  Release `0.3.0` is published. We will drop support for MindSpore 1.x in a future release.
+
+  1. New models:
+     - Y-16GF of [RegNet](configs/regnet)
+     - [SwinTransformerV2](configs/swintransformerv2)
+     - [VOLO](configs/volo)
+     - [CMT](configs/cmt)
+     - [HaloNet](configs/halonet)
+     - [SSD](examples/det/ssd)
+     - [DeepLabV3](examples/seg/deeplabv3)
+     - [CLIP](examples/clip) & [OpenCLIP](examples/open_clip)
+  2. Features:
+     - AsymmetricLoss & JSDCrossEntropy
+     - Augmentations Split
+     - Customized AMP
+  3. Bug fixes:
+     - The classifier weights were not fully removed, so you could encounter an error when passing `num_classes` while creating a pre-trained model.
+  4. Refactoring:
+     - Many model names have been refactored for clarity.
+     - [Script](mindcv/models/vit.py) of `VisionTransformer`.
+     - [Script](train_with_func.py) of mixed (PyNative + jit) mode training.
+  5. Documentation:
+     - A guide on how to extract multiscale features from a backbone.
+     - A guide on how to fine-tune a pre-trained model on a custom dataset.
+  6. BREAKING CHANGES:
+     - We are going to drop support for MindSpore 1.x, as it has reached end of life.
+     - The configuration option `filter_bias_and_bn` will be removed and renamed to `weight_decay_filter`,
+       due to a prolonged misunderstanding of the MindSpore optimizer.
+       We will migrate the existing training recipes, but the signature change of the function `create_optimizer` will be incompatible,
+       and the old training recipes will also be incompatible. See [PR/752](https://github.com/mindspore-lab/mindcv/pull/752) for details.
 
 See [RELEASE](RELEASE.md) for detailed history.
 