How does dropout affect weights and biases?

I applied dropout in my network and it worked, but I can't interpret its effect on the weights and biases. To be more specific, I can't explain why applying dropout versus not applying it makes such a large difference in the mean, std, min, and max of the weights and biases.

What I know is that dropout is a regularization technique applied to the units of a network: during training, it randomly turns units off.
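
For reference, this is a minimal sketch of how I understand (inverted) dropout to work during training; the names `a` and `p_drop` are just illustrative, not from my actual code:

```python
import numpy as np

def dropout(a, p_drop=0.5, training=True):
    """Inverted dropout: randomly zero units and rescale the survivors."""
    if not training or p_drop == 0.0:
        return a  # at inference time, all units stay active
    mask = np.random.rand(*a.shape) > p_drop  # keep each unit with prob 1 - p_drop
    return a * mask / (1.0 - p_drop)          # rescale so the expected activation is unchanged

# example: activations of a hidden layer with 128 units
a = np.random.randn(1, 128)
print(dropout(a, p_drop=0.5))  # roughly half the entries are zeroed
```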

Network model

input(1x784) -> hidden layer(784x128) -> hidden layer(128x256) -> output(256x10)
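
For context, a minimal PyTorch sketch of a model with this shape is below. The dropout probability (0.5) and the placement of the dropout layers after each hidden layer are assumptions for illustration, since they aren't shown above:

```python
import torch
import torch.nn as nn

# Sketch of the architecture above, with dropout after each hidden layer.
# p=0.5 and the Dropout placement are assumed, not taken from my code.
model = nn.Sequential(
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(256, 10),
)

x = torch.randn(1, 784)  # one flattened 28x28 input
model.train()            # dropout active during training
print(model(x).shape)    # torch.Size([1, 10])
```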

The graphs below show the min, max, mean, and std of the weights and biases with and without dropout.

There is a large difference between applying dropout and not applying it, but I don't know why.

[Plots: max, min, mean, and stddev of the weights and biases, with and without dropout]

Could you explain why there is such a difference?