sparklyr: ft_vector_assembler and ft_min_max_scaler do nothing but don't fail

I am running sparklyr 1.0.0 and SparkR 2.4.1. I am trying to run the ft_min_max_scaler() example from the sparklyr documentation, but it's not working properly. The functions run without error, yet the output appears identical to the input.

I started trying this example code after my other code using these functions wasn't working properly.

Here is the code:

library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "spark://<address>:7077",
                    spark_home = "C:/Users/paul/Downloads/spark-2.4.1-bin-hadoop2.7",
                    app_name = "sparklyr",
                    config = config)


data(iris)

iris_tbl <- sdf_copy_to(sc, iris, name = "iris_tbl", overwrite = TRUE)

iris_tbl

features <- c("Sepal_Length", "Sepal_Width", "Petal_Length", "Petal_Width")

iris_tbl %>%
  ft_vector_assembler(input_col = features,
                      output_col = "features_temp", uid = "assembler") %>%
  ft_min_max_scaler(input_col = "features_temp",
                    output_col = "features", uid = "scaler")

I would expect the four listed attributes to be scaled to [0, 1], but they are unchanged and all still greater than 1.

Update:

The following code fixed the problem but I still need to figure out how to copy the "features" column back into the dataframe:

iris_tbl <- iris_tbl %>%
  ft_vector_assembler(input_col = features,
                      output_col = "features_temp", uid = "assembler") %>%
  ft_min_max_scaler(input_col = "features_temp",
                    output_col = "features", uid = "scaler") %>%
  collect()

iris_tbl$features
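Note that collect() pulls the data into a local R tibble, so iris_tbl is no longer a Spark DataFrame after the code above. If the goal is to keep everything on the Spark side instead, one option is sdf_separate_column(), sparklyr's helper for splitting a vector column into ordinary numeric columns. The following is a sketch under the same session as above (the "_scaled" column names are just an illustrative choice), not tested against this exact Spark setup:

```r
library(sparklyr)
library(dplyr)

# Assemble and scale on the Spark side, then split the scaled
# vector column back out into one numeric column per feature.
scaled_tbl <- iris_tbl %>%
  ft_vector_assembler(input_col = features,
                      output_col = "features_temp") %>%
  ft_min_max_scaler(input_col = "features_temp",
                    output_col = "features_scaled") %>%
  sdf_separate_column("features_scaled",
                      into = paste0(features, "_scaled"))

scaled_tbl
```

This keeps the result as a tbl_spark, so the scaled columns sit alongside the originals without collecting the data locally.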