How to pass "step" to ExponentialDecay in GradientTape

I tried to use an optimizers.schedules.ExponentialDecay instance as the learning_rate for the Adam optimizer, but I don't know how to pass "step" to it when training the model with GradientTape.

I am using tensorflow-gpu 2.0-alpha0 and Python 3.6. I have read the doc at https://tensorflow.google.cn/versions/r2.0/api_docs/python/tf/optimizers/schedules/ExponentialDecay, but I still have no idea how to tackle it.

import tensorflow as tf

# Decayed rate: initial_learning_rate * decay_rate ** (step / decay_steps)
initial_learning_rate = 0.1
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000,
    decay_rate=0.96)

optimizer = tf.optimizers.Adam(learning_rate=lr_schedule)

for epoch in range(self.Epoch):
    ...
    ...
    with tf.GradientTape() as tape:
        pred_label = model(images)
        loss = calc_loss(pred_label, ground_label)
    # Backpropagation step - this is where I expect the schedule to decay the
    # learning rate, but I never pass "step" anywhere.
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

# I tried this, but the result doesn't seem right.
# I want to pass "epoch" as "step" to lr_schedule.
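To make it concrete, below is roughly what I have in mind. It is only a sketch of my guess, assuming a schedule can be called directly like lr_schedule(step) and that optimizer.learning_rate can be reassigned between epochs (I am not sure either assumption holds):

# Guess: build Adam with a plain float, then recompute the decayed rate myself
# once per epoch and push it into the optimizer, so "epoch" plays the role of "step".
optimizer = tf.optimizers.Adam(learning_rate=initial_learning_rate)

for epoch in range(self.Epoch):
    # Calling the schedule directly returns the decayed rate as a tensor;
    # .numpy() makes it a plain float before assignment (again, my assumption).
    optimizer.learning_rate = lr_schedule(epoch).numpy()
    for images, ground_label in dataset:  # "dataset" stands in for my input pipeline
        with tf.GradientTape() as tape:
            pred_label = model(images)
            loss = calc_loss(pred_label, ground_label)
        grads = tape.gradient(loss, model.trainable_variables)
        optimizer.apply_gradients(zip(grads, model.trainable_variables))

Or does the optimizer already feed its own iteration count into the schedule on every apply_gradients call, so that I should express decay_steps in batches rather than epochs?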