Operations Window
In the Operations window some basic structural operations and tests can be performed on the network.
For all operations, a set of relevant nodes can be selected/deselected either by toggling the checkboxes in the Operations panel, or by clicking on the nodes in the network plot.
Selected nodes are indicated in green. "Measured nodes" typically refers to the sensor information stored in the network structure file.
Invariant modules and immersion
Immersion is the construction of a new network, where a selected set of nodes is maintained and unselected nodes are removed, while the time series of all maintained nodes remain invariant. This network operation is closely related to a so-called Kron reduction, i.e. Gaussian elimination of the unselected nodes.
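For a static network matrix, the Gaussian-elimination step behind immersion can be sketched as follows. This is an illustrative sketch only; the function name `immerse` and the plain-matrix representation are assumptions, not the toolbox API, and dynamic (transfer-function) modules are not covered.

```python
import numpy as np

def immerse(G, keep):
    """Kron-reduce a static network matrix G (w = G w + ...) by
    eliminating the nodes not in `keep`, leaving the kept node
    signals invariant. Hypothetical helper, not the toolbox API."""
    keep = np.asarray(keep)
    remove = np.setdiff1d(np.arange(G.shape[0]), keep)
    Gss = G[np.ix_(keep, keep)]      # kept -> kept
    Gsr = G[np.ix_(keep, remove)]    # removed -> kept
    Grs = G[np.ix_(remove, keep)]    # kept -> removed
    Grr = G[np.ix_(remove, remove)]  # removed -> removed
    # w_r = (I - Grr)^{-1} Grs w_s, substituted into the kept equations:
    return Gss + Gsr @ np.linalg.solve(np.eye(len(remove)) - Grr, Grs)
```

For a chain 0 -> 1 -> 2 with gains 2 and 3, immersing node 1 yields a single module of gain 6 from node 0 to node 2, with the signals at nodes 0 and 2 unchanged.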
The invariant modules test checks which modules remain invariant after immersion of the unselected nodes. This test can be performed separately from, and typically before, the actual immersion.
Parallel path and loop test
A target module (directed network) or subnetwork (undirected network) can be selected either using the dropdown menu in the Parallel Path panel, or by clicking on the module/links in the network plot.
A selected module, with its input node and output node, remains invariant after immersion if the parallel path and loop test is satisfied. This test verifies whether
- Every parallel path from the input to the output passes through a node that is in the set of selected nodes;
- Every loop around the output passes through a node that is in the set of selected nodes;
and presents the result in the plotted graph.
The test is described in A. Dankers, P.M.J. Van den Hof, X. Bombois and P.S.C. Heuberger (2016). Identification of dynamic models in complex networks with prediction error methods - predictor input selection. IEEE Trans. Automatic Control, Vol. 61, no. 4, pp. 937-952, April 2016.
For undirected networks, multiple links can be selected, and a similar test is performed, where the two nodes connected to a link serve as both input and output, and where parallel paths / loops can also pass through the ground nodes.
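For the directed-network version, the two conditions can be checked by searching for paths and loops that avoid the selected nodes. The following is a simplified sketch, not the toolbox implementation; the edge-set representation, node indices, and function name are assumptions.

```python
def parallel_path_loop_test(edges, n, i, j, selected):
    """Check the parallel path and loop condition for module (i -> j):
    every parallel path from i to j and every loop around j must pass
    through a selected node. `edges` is a set of directed (from, to)
    pairs over nodes 0..n-1. Illustrative sketch only."""
    sel = set(selected) - {i, j}   # only intermediate nodes can block
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        if (a, b) != (i, j):       # drop the direct module itself
            adj[a].append(b)

    def reachable(src, dst):
        # DFS that may only pass through unselected intermediate nodes
        stack, seen = [src], set()
        while stack:
            v = stack.pop()
            for w in adj[v]:
                if w == dst:
                    return True
                if w not in seen and w not in sel:
                    seen.add(w)
                    stack.append(w)
        return False

    parallel_ok = not reachable(i, j)   # no unselected parallel path
    loop_ok = not reachable(j, j)       # no unselected loop around j
    return parallel_ok and loop_ok
```

For example, with module (0 -> 1) and a parallel path 0 -> 2 -> 1, the test is satisfied when node 2 is selected and fails when it is not.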
The following operations are only available for directed networks.
Confounding variables
For selected sets of input nodes and output nodes, it is analyzed whether there are confounding variables for the estimation problem from inputs to outputs.
Confounding variables are unmeasured (disturbance) signals that have paths to both an input and an output of the estimation problem.
Detected confounding variables are indicated by their corresponding input and output, while in the network plot the confounding paths are highlighted in red.
If the checkbox "proper confounding variables only" is checked, the analysis is limited to confounding variables that induce a causal effect on the estimated input/output map; confounding variables with an anti-causal effect are then discarded.
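The core of the analysis is a reachability check: a disturbance is confounding if it reaches both an input and an output. The sketch below makes strong simplifying assumptions (one disturbance per node, no blocking by measured nodes, and no proper/anti-causal distinction) and is not the toolbox algorithm.

```python
def confounders(edges, n, inputs, outputs):
    """List nodes whose disturbance reaches at least one input and at
    least one output of the estimation problem, and is therefore a
    candidate confounding variable. Simplified illustrative sketch:
    assumes one disturbance per node and ignores blocking by measured
    nodes as well as the proper (causal) vs anti-causal distinction."""
    adj = {v: [] for v in range(n)}
    for a, b in edges:
        adj[a].append(b)

    def descendants(src):
        # all nodes reachable from src, including src itself
        seen, stack = {src}, [src]
        while stack:
            v = stack.pop()
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        return seen

    found = []
    for k in range(n):
        reach = descendants(k)
        if reach & set(inputs) and reach & set(outputs):
            found.append(k)
    return found
```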
Algebraic loops
Algebraic loops in the network can be made visible. An algebraic loop is a cycle on which no delays occur. Algebraic loops can be complicating factors when identifying single modules with a direct method. Typically their effect can be mitigated by adding external excitation signals.
H.H.M. Weerts, P.M.J. Van den Hof and A.G. Dankers (2016), Identification of dynamic networks operating in the presence of algebraic loops. Proc. 55th IEEE Conference on Decision and Control, 12-14 December 2016, Las Vegas, AZ, pp. 4606-4611.
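As a sketch, an algebraic loop can be detected as a cycle in the subgraph of delay-free modules: keep only the edges whose module has a direct feedthrough term, and search that subgraph for a cycle. The edge-set representation and the function name below are illustrative assumptions.

```python
def has_algebraic_loop(feedthrough_edges, n):
    """Detect an algebraic loop: a cycle on which every module is
    delay-free. `feedthrough_edges` holds only the directed edges
    (i, j) whose module from i to j has a direct feedthrough term.
    Illustrative sketch; cycle detection by depth-first search."""
    adj = {v: [] for v in range(n)}
    for a, b in feedthrough_edges:
        adj[a].append(b)
    WHITE, GREY, BLACK = 0, 1, 2
    color = [WHITE] * n

    def dfs(v):
        color[v] = GREY
        for w in adj[v]:
            if color[w] == GREY:          # back edge: delay-free cycle
                return True
            if color[w] == WHITE and dfs(w):
                return True
        color[v] = BLACK
        return False

    return any(color[v] == WHITE and dfs(v) for v in range(n))
```

For instance, two delay-free modules forming the cycle 0 -> 1 -> 0 constitute an algebraic loop, whereas a delay-free chain 0 -> 1 -> 2 does not.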
Canonical noise model
This operation transforms the network into a network where only the selected nodes have a direct contribution from disturbances, and the unselected nodes are disturbance-free. In this transformation the time series of the selected node variables, as well as the modules in the network matrix, remain invariant; the transformation only changes the noise model.
The transformation is attractive in the analysis of network identifiability, and is described in S. Shi, X. Cheng and P.M.J. Van den Hof (2023), Single module identifiability in linear dynamic networks with partial excitation and measurement. IEEE Trans. Automatic Control, Vol. 68, no. 1, pp. 285-300, January 2023.