Consider a batch of data over a sliding window, collected in a measurement
vector Y and input vector U. As in Chapter 6, the idea of a consistency
test is to apply a linear transformation to a batch of data, AiY + BiU + ci.
The matrices Ai, Bi and the vector ci are chosen so that the norm of the
transformed data is small when there is no change/fault according to
hypothesis Hi, and large when fault Hi has occurred.
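As a minimal sketch of this idea, the example below builds a linear residual r = Ai Y + Bi U + ci and thresholds its norm. The matrices here are hypothetical: they are designed by hand so that the residual vanishes for a toy nominal model Y = H U, whereas in practice Ai, Bi, ci would come from a parity-space or observer design.

```python
import numpy as np

def residual(A_i, B_i, c_i, Y, U):
    """Linear residual r = A_i Y + B_i U + c_i over one data window."""
    return A_i @ Y + B_i @ U + c_i

def consistency_test(r, threshold):
    """Flag hypothesis H_i when the residual norm exceeds the threshold."""
    return np.linalg.norm(r) > threshold

# Toy nominal model Y = H U (noise-free). Choosing A_i = I, B_i = -H,
# c_i = 0 makes the residual exactly zero in the fault-free case.
H = np.array([[1.0, 0.0],
              [0.5, 1.0]])
A_i = np.eye(2)
B_i = -H
c_i = np.zeros(2)

U = np.array([1.0, 2.0])
Y_nominal = H @ U                              # fault-free measurements
Y_faulty = Y_nominal + np.array([0.0, 0.8])    # additive sensor fault

r_nominal = residual(A_i, B_i, c_i, Y_nominal, U)
r_faulty = residual(A_i, B_i, c_i, Y_faulty, U)

print(consistency_test(r_nominal, threshold=0.1))  # no fault flagged
print(consistency_test(r_faulty, threshold=0.1))   # fault flagged
```

In a real design the threshold would be tuned against the noise level, since measurement noise keeps the residual from being exactly zero even without a fault.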