GLassoformer: A Query-Sparse Transformer for Post-Fault Power Grid Voltage Prediction

01/22/2022

by Yunling Zheng, et al.

We propose GLassoformer, a novel and efficient transformer architecture that leverages group Lasso regularization to reduce the number of queries in the standard self-attention mechanism. Thanks to the sparsified queries, GLassoformer is more computationally efficient than standard transformers. On the power-grid post-fault voltage prediction task, GLassoformer achieves remarkably better prediction accuracy and stability than many existing benchmark algorithms.
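The core idea of query sparsification via group Lasso can be illustrated with a small sketch. Treating each query row as one regularization group, the group Lasso proximal operator shrinks the L2 norm of every query vector and zeroes out whole rows whose norm falls below the threshold; attention then only needs to be computed for the surviving queries. The function names, the threshold `lam`, and the plain softmax attention below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def group_lasso_prox(Q, lam):
    # Row-wise (group) soft-thresholding: the proximal operator of
    # lam * sum_i ||Q_i||_2, where each query row Q_i is one group.
    # Rows with norm <= lam are set exactly to zero.
    norms = np.linalg.norm(Q, axis=1, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return Q * scale

def sparse_query_attention(Q, K, V, lam):
    # Sparsify queries first; rows shrunk to zero are skipped,
    # so the score matrix is computed only for active queries.
    Qs = group_lasso_prox(Q, lam)
    active = np.linalg.norm(Qs, axis=1) > 0
    d = Q.shape[1]
    scores = Qs[active] @ K.T / np.sqrt(d)
    # Numerically stable softmax over the key dimension.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    out = np.zeros((Q.shape[0], V.shape[1]))
    out[active] = weights @ V
    return out, active
```

For example, with `lam = 1.0` a query row `[3, 4]` (norm 5) is shrunk to `[2.4, 3.2]`, while a row `[0.1, 0.1]` is zeroed and its attention computation skipped entirely; this per-row dropout of queries is where the computational savings come from.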
