Does it mean that var2 is another variable with an initialization similar to var1's? Or is var2 an alias for var1 (I tried, and it doesn't seem to be)?
How are var1 and var2 related?
How is a variable constructed when the variable we are getting doesn't really exist?
tf.get_variable(name) creates a new variable called name in the TensorFlow graph (or with a suffix such as _1 appended, if a node called name already exists in the current scope).
In your example, you're creating a python variable called var1.
The name of that variable in the **TensorFlow graph is not** var1, but Variable:0.
Every node you define has its own name, which you can specify yourself or let TensorFlow assign as a default (and always unique) one. You can see that name by accessing the name property of the Python variable (i.e. print(var1.name)).
On your second line, you're defining a Python variable var2 whose name in the TensorFlow graph is var1.
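A minimal sketch of that naming behavior, assuming the TF 1.x API (accessed here through tf.compat.v1 so it also runs under TensorFlow 2):

```python
# Assumption: TF 1.x-style API via tf.compat.v1.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

var1 = tf.Variable(0)               # no explicit name: TensorFlow picks a default
var2 = tf.Variable(0, name="var1")  # explicit graph name "var1"

print(var1.name)  # Variable:0  -- the Python name "var1" is irrelevant to the graph
print(var2.name)  # var1:0
```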
If you instead want to define a variable (node) called var1 in the TensorFlow graph and then get a reference to that node, you cannot simply use tf.get_variable("var1"), because it will create a new, different variable called var1_1.
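You can see that renaming happen directly. A sketch, again assuming the TF 1.x API via tf.compat.v1 (the initializer here is only an illustration; get_variable needs a shape or initial value when it creates a variable):

```python
# Assumption: TF 1.x-style API via tf.compat.v1.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

a = tf.Variable(0, name="var1")  # occupies the graph name "var1"
# "var1" is taken in the graph, so get_variable falls back to "var1_1":
b = tf.get_variable("var1", shape=[], initializer=tf.zeros_initializer())

print(a.name)  # var1:0
print(b.name)  # var1_1:0
```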
If you define a variable with tf.get_variable() using a name that has already been defined in the same variable scope, TensorFlow throws an exception (tf.Variable(), by contrast, silently uniquifies the name). Hence it is convenient to use tf.get_variable() instead of tf.Variable(): inside a variable scope with reuse=True (or tf.AUTO_REUSE), tf.get_variable() returns the existing variable with the same name if it exists, and creates a variable with the specified shape and initializer if it does not.
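A sketch of that sharing pattern, assuming the TF 1.x API via tf.compat.v1 (the scope name "model" and variable name "w" are just placeholders):

```python
# Assumption: TF 1.x-style API via tf.compat.v1; names are illustrative.
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

with tf.variable_scope("model"):
    v = tf.get_variable("w", shape=[2], initializer=tf.zeros_initializer())

# reuse=True: return the SAME variable instead of raising or renaming.
with tf.variable_scope("model", reuse=True):
    v_again = tf.get_variable("w")

print(v is v_again)  # True
print(v.name)        # model/w:0
```

Without reuse=True, the second tf.get_variable("w") in the "model" scope would raise a ValueError instead of returning the existing variable.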