# Function Differences with tf.compat.v1.train.exponential_decay


## tf.compat.v1.train.exponential_decay

```python
tf.compat.v1.train.exponential_decay(
    learning_rate,
    global_step,
    decay_steps,
    decay_rate,
    staircase=False,
    name=None
) -> Tensor
```

For more information, see tf.compat.v1.train.exponential_decay.

## mindspore.nn.exponential_decay_lr

```python
mindspore.nn.exponential_decay_lr(
    learning_rate,
    decay_rate,
    total_step,
    step_per_epoch,
    decay_epoch,
    is_stair=False
) -> list[float]
```

For more information, see mindspore.nn.exponential_decay_lr.

## Differences

TensorFlow: computes the learning rate with the exponential decay function.

MindSpore: this API implements basically the same function as TensorFlow.

| Categories | Subcategories | TensorFlow | MindSpore | Differences |
| ---------- | ------------- | ---------- | --------- | ----------- |
| Parameters | Parameter 1 | learning_rate | learning_rate | - |
| | Parameter 2 | global_step | total_step | Same function, different parameter names |
| | Parameter 3 | decay_steps | decay_epoch | Same function, different parameter names |
| | Parameter 4 | decay_rate | decay_rate | - |
| | Parameter 5 | staircase | is_stair | Same function, different parameter names |
| | Parameter 6 | name | - | Not involved |
| | Parameter 7 | - | step_per_epoch | Number of steps per epoch; TensorFlow does not have this parameter |
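Both APIs compute the standard exponential decay formula `decayed_lr = learning_rate * decay_rate ** p`, where `p` is `current_step / decay_steps` in TensorFlow and `current_epoch / decay_epoch` in MindSpore, floored to an integer when `staircase` / `is_stair` is true. A minimal pure-Python sketch of this shared formula (a reference illustration, not part of either API):

```python
import math

def exponential_decay(learning_rate, decay_rate, current_step, decay_steps, staircase=False):
    """Reference implementation of the exponential decay formula:
    learning_rate * decay_rate ** (current_step / decay_steps)."""
    p = current_step / decay_steps
    if staircase:
        p = math.floor(p)  # decay in discrete intervals instead of continuously
    return learning_rate * decay_rate ** p

# Same settings as the staircase examples below: decay over intervals of 2 steps
print([round(exponential_decay(1.0, 0.9, step, 2, staircase=True), 2) for step in range(3)])
# [1.0, 1.0, 0.9]
```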

## Code Example

The two APIs implement the same function and are used in the same way.

```python
# TensorFlow
import tensorflow as tf

learning_rate = 1.0
decay_rate = 0.9
step_per_epoch = 2
epochs = 3
lr = []
for epoch in range(epochs):
    # Returns a callable that yields the decayed rate for this step
    decayed = tf.compat.v1.train.exponential_decay(learning_rate, epoch, step_per_epoch, decay_rate, staircase=True)
    lr.append(round(float(decayed().numpy().item()), 2))
print(lr)
# [1.0, 1.0, 0.9]
```

```python
# MindSpore
import mindspore.nn as nn

learning_rate = 1.0
decay_rate = 0.9
total_step = 3
step_per_epoch = 2
decay_epoch = 1
output = nn.exponential_decay_lr(learning_rate, decay_rate, total_step, step_per_epoch, decay_epoch)
print(output)
# [1.0, 1.0, 0.9]
```