Why does a tensor object get a new memory location after repeating an already-executed operation?

I'm performing a simple addition operation using two tensors.
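The original screenshots are missing, so here is a minimal sketch of the kind of code they likely showed, assuming TensorFlow 2.x with eager execution enabled:

```python
import tensorflow as tf

a = tf.constant([1, 2, 3])
b = tf.constant([4, 5, 6])

# Each call to tf.add builds and returns a brand-new EagerTensor object,
# so the two results live at different memory locations even though
# the values are identical.
c1 = tf.add(a, b)
c2 = tf.add(a, b)

print(id(c1), id(c2))      # two different ids
print(id(c1) == id(c2))    # False
```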




When I run the same addition twice, the resulting tensor is allocated a different memory location (a different `id()`) each time. However, this is not the case with plain Python code:
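For comparison, here is a plain-Python sketch of the behavior I mean (the exact snippet from the lost screenshot may have differed):

```python
# CPython caches small integers (roughly -5..256), so repeating the
# same arithmetic yields the *same* object each time.
x = 3 + 4
y = 3 + 4
print(id(x) == id(y))   # True: both names refer to the cached int 7

# Objects that are not cached behave like the tensors in question:
# each expression produces a fresh object with a fresh id, as long as
# both results are alive at the same time.
p = [1, 2] + [3, 4]
q = [1, 2] + [3, 4]
print(id(p) == id(q))   # False: two distinct list objects
```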


So, is this happening because TensorFlow generates a computational graph for every operation and doesn't persist the results? In other words, when the same operation is executed twice, does it produce two separate computations, which eventually results in two result objects at two different memory locations?
