A library for easy and efficient manipulation of tensor networks.
Apache-2.0 License
Minor bug fixes and maintenance.
Published by mganahl over 3 years ago
Bug fixes in BaseMPS class.
Published by mganahl almost 4 years ago
Published by chaserileyroberts almost 4 years ago
Added the NconBuilder class.
Published by chaserileyroberts about 4 years ago
Added the tn.from_topology method.
Added tn.Tensor.
Published by chaserileyroberts over 4 years ago
A minor release that doesn't change user-facing behavior. Many methods were optimized and several minor bugs were fixed.
Published by chaserileyroberts over 4 years ago
DMRG is now supported on the NumPy, JAX, and PyTorch backends. Added the tn.FiniteDMRG class and the tn.FiniteMPO class, along with several prebuilt MPO classes like FiniteTFI and FiniteXXZ. Several of its methods are now jit-compiled.
Added tn_keras layers. These layers are HIGHLY EXPERIMENTAL, so please don't run them in production yet. :)
Added the tn.get_neighbors method.
Added the ZN symmetry group for the symmetric backend.
Published by chaserileyroberts over 4 years ago
Minor release for backend bug fix.
Published by chaserileyroberts over 4 years ago
Added a new symmetric backend. Current symmetries include U1 and Z2, but more will be added in the very near future. New classes include BlockSparseTensor, ChargeArray, Index, U1Charge, Z2Charge, and BaseCharge.
Added tn.replicate_nodes as a cleaner API than tn.copy().
Published by chaserileyroberts over 4 years ago
Added the InfiniteMPS class.
Added +, -, *, and / support for Node.
Added with tn.DefaultBackend(...): support to allow more modular control of the default backends.
Added QuOperator, QuVector and QuAdjointVector.
Updates to the contractors.
Published by chaserileyroberts almost 5 years ago
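The Node arithmetic mentioned above can be sketched as follows (a minimal example on the NumPy backend; exact broadcasting semantics may differ by version):

```python
import numpy as np
import tensornetwork as tn

# Elementwise arithmetic on Nodes operates on the underlying tensors
# and returns new Nodes.
x = tn.Node(np.full((2, 2), 3.0))
y = tn.Node(np.full((2, 2), 2.0))

s = x + y   # elementwise sum
d = x - y   # elementwise difference
p = x * y   # elementwise product
q = x / y   # elementwise division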
The TensorNetwork class has been REMOVED and will raise an error if you try to use it. To upgrade your code, please see our simple upgrade guide.
Added the tn.FiniteMPS class for your standard MPS applications.
Added the tn.reduce_density method for creating reduced density matrices.
Added the ignore_edge_order argument to the contractor methods.
Added the tn.split_edge method.
Added tn.contractors methods for contracting a subset of your nodes.
Added the tn.NodeCollection context manager. This allows you to collect all of your created nodes into a single list or set easily.
Published by chaserileyroberts about 5 years ago
We have switched to using a "Free Node" paradigm. Examples of what this looks like can be found on our latest README.
Users have told us that the TensorNetwork
object is more cumbersome than helpful, so we will be totally removing it in the next release. This release is out to help users upgrade their code. This is a BREAKING CHANGE. We have made a simple tutorial to upgrade your code.
The preferred way to import tensornetwork is now import tensornetwork as tn.
You no longer need a TensorNetwork object. Prefer now to just create your nodes using tn.Node. All operations such as net.split_node can now be called like tn.split_node.
Added the tn.reachable method to get a set of all nodes reachable from another node or set of nodes. This usually can be a drop-in replacement for when you originally had to give a TensorNetwork object in methods like all of the contractors.
Published by chaserileyroberts about 5 years ago
We have changed the default backend to numpy. This is a BREAKING CHANGE. To fix your code, you can simply do tensornetwork.set_default_backend("tensorflow")
at the top of your main file.
Added node[:3] slicing to get the edges for the first 3 axes in a node.
optimal, branch and greedy now require an output_edge_order when there is more than one dangling edge in the network. This is to prevent a user from accidentally depending on a non-deterministic edge order after the network contraction.
CopyNodes can now be created outside of a TensorNetwork, so net.add_node(tensornetwork.CopyNode(...)) is now the preferred way to add a copy node to a network. We will be removing net.add_copy_node in the future.
Added the net.split_node_qr and net.split_node_rq methods that do QR and RQ decompositions of a Node.
Added the net.copy operation that will copy a TensorNetwork. The two networks share the same tensor objects. This is to make taking gradients of nodes relative to the final contracted value much easier.
Added a conj option, net.copy(conj=True). This will copy the TensorNetwork and conjugate all of the tensors in the network. This is useful for calculating things like reduced density matrices.
Added net.save(...) and tensornetwork.load(...) methods for saving and loading a TensorNetwork object.
Published by chaserileyroberts about 5 years ago
Added integration with opt_einsum's deterministic contraction algorithms. See https://optimized-einsum.readthedocs.io/en/latest/path_finding.html for definitions of how these algorithms work.
Added contractors.optimal to find the optimal contraction based on required flops.
Added contractors.branch, which uses branch heuristics to determine which paths to explore.
Added contractors.greedy, which uses a fast greedy heuristic.
Added contractors.auto, which chooses which of the above algorithms to use based on network size.
Added contractors.custom, which allows users to develop their own contraction algorithms.
Published by chaserileyroberts about 5 years ago
Added a PyTorch backend. Use tensornetwork.set_default_backend("pytorch") or TensorNetwork(backend="pytorch") to enable it.
Added edge1 ^ edge2 as an alias for net.connect(edge1, edge2).
Published by chaserileyroberts over 5 years ago
Added the net.switch_backend method for switching a network's backend, e.g. to numpy or jax.
Published by chaserileyroberts over 5 years ago
Added the greedy contraction algorithm. This will greedily contract the lowest cost node pair first.
Added the bucket contraction algorithm. This algorithm is optimized for tensor networks with a lot of copy tensors.
Fixed the naive contraction algorithm. Now it should work even after some edges have been contracted.
Added the @ operator. Doing node1 @ node2 is equal to running net.contract_between(node1, node2).
Added graphviz visualization integration. Simply do tensornetwork.to_graphviz(net) to get a graphviz object that is isomorphic to your network.
Added the net.remove_node(node) method.
Added node.shape and edge.dimension properties.
Published by chaserileyroberts over 5 years ago
Added tensordot2, which compiles ~20% faster than tf.tensordot.