Deep Declarative Networks: A New Hope

09/11/2019
by Stephen Gould, et al.

We introduce a new class of end-to-end learnable models wherein data processing nodes (or network layers) are defined in terms of desired behavior rather than an explicit forward function. Specifically, the forward function is implicitly defined as the solution to a mathematical optimization problem. Consistent with nomenclature in the programming languages community, we name our models deep declarative networks. Importantly, we show that the class of deep declarative networks subsumes current deep learning models. Moreover, invoking the implicit function theorem, we show how gradients can be back-propagated through declaratively defined data processing nodes, thereby enabling end-to-end learning. We show how these declarative processing nodes can be implemented in the popular PyTorch deep learning software library, allowing declarative and imperative nodes to coexist within the same network. We provide numerous insights and illustrative examples of declarative nodes and demonstrate their application to image and point cloud classification tasks.
