How to pass "step" to ExponentialDecay in GradientTape

I tried to use an optimizers.schedules.ExponentialDecay instance as the learning_rate for the Adam optimizer, but I don't know how to pass "step" to it when training the model with GradientTape.

I use tensorflow-gpu-2.0-alpha0 and Python 3.6. I read the docs but still have no idea how to tackle it.

initial_learning_rate = 0.1
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=1000,   # placeholder values, just to show the call
    decay_rate=0.96)

optimizer = tf.optimizers.Adam(learning_rate=lr_schedule)

for epoch in range(self.Epoch):
    with tf.GradientTape() as tape:
        pred_label = model(images)
        loss = calc_loss(pred_label, ground_label)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

# I tried this, but the result doesn't seem right.
# I want to pass "epoch" as "step" to lr_schedule
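If I understand the docs correctly, a LearningRateSchedule is itself a callable that maps a step to a learning rate, so it can at least be inspected directly. Here is a minimal sketch of that (the decay_steps/decay_rate values are just examples, not from my real model):

```python
import tensorflow as tf

initial_learning_rate = 0.1
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=1000,  # example value
    decay_rate=0.96)   # example value

# Calling the schedule with a step returns the decayed learning rate:
# initial_learning_rate * decay_rate ** (step / decay_steps)
print(float(lr_schedule(0)))     # 0.1
print(float(lr_schedule(1000)))  # 0.1 * 0.96 = 0.096
```

So presumably the optimizer is calling the schedule with some internal step counter (optimizer.iterations?) on each apply_gradients call, rather than with my epoch variable, which would explain why the result looks off when I expect per-epoch decay.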