Algebraic Neural Networks: Stability to Deformations
In this work we study the stability of algebraic neural networks (AlgNNs) with commutative algebras, which unify CNNs and GNNs under the umbrella of algebraic signal processing. An AlgNN is a stacked layered structure where each layer consists of an algebra 𝒜, a vector space ℳ, and a homomorphism ρ:𝒜→End(ℳ), where End(ℳ) is the set of endomorphisms of ℳ. Signals in each layer are modeled as elements of ℳ and are processed by elements of End(ℳ) defined according to the structure of 𝒜 via ρ. This framework provides a general scenario that covers several types of neural network architectures in which formal convolution operators are used. We obtain stability conditions with respect to perturbations, defined as distortions of ρ, reaching general results whose particular cases are consistent with recent findings in the literature for CNNs and GNNs. We consider conditions on the domain of the homomorphisms in the algebra that lead to stable operators. Interestingly, we find that these conditions are related to the uniform boundedness of the Fréchet derivative of a function p:End(ℳ)→End(ℳ) that maps the images of the generators of 𝒜 in End(ℳ) into a power series representation that defines the filtering of elements in ℳ. Additionally, our results show that stability is universal across convolutional architectures whose algebraic signal models use the same algebra.
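To make the abstraction concrete, the following minimal sketch (names and the specific graph are illustrative, not taken from the paper) instantiates the GNN special case of an algebraic signal model: 𝒜 is the algebra of polynomials in one generator, ℳ = ℝⁿ, and ρ maps the generator to a graph shift operator S ∈ End(ℳ), so a filter is the matrix p(S) = Σₖ hₖ Sᵏ acting on a signal x ∈ ℳ.

```python
import numpy as np

def algebraic_filter(S, h):
    """Image under rho of the polynomial sum_k h[k] t^k:
    the endomorphism sum_k h[k] S^k, built by Horner-free accumulation."""
    n = S.shape[0]
    H = np.zeros_like(S, dtype=float)
    Sk = np.eye(n)  # S^0
    for hk in h:
        H += hk * Sk
        Sk = Sk @ S  # next power of the shift operator
    return H

# Illustrative choice: shift operator = adjacency matrix of a 4-node cycle graph.
S = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

h = [1.0, 0.5, 0.25]                 # filter taps h_0, h_1, h_2
x = np.array([1.0, 0.0, 0.0, 0.0])   # signal in M = R^4

y = algebraic_filter(S, h) @ x       # filtered signal p(S) x
# y = [1.5, 0.5, 0.5, 0.5]
```

Replacing S by a circulant shift recovers a classical convolutional filter, which is the sense in which CNNs and GNNs share the same underlying algebra and, by the results above, the same stability behavior.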