A simple approach to design quantum neural networks and its applications to kernel-learning methods

10/19/2019
by   Changpeng Shao, et al.

We give an explicit, simple method for building quantum neural networks (QNNs) to solve classification problems. Besides the input layer (state preparation) and the output layer (amplitude estimation), the network has one hidden layer, which uses a tensor product of log M two-dimensional rotations to introduce log M weights, where M is the number of training samples. We also give an efficient method to prepare the quantum states of the training samples. The training algorithm of this QNN is easy to carry out on a quantum computer via a quantum-classical hybrid or variational method. The idea is inspired by kernel methods and radial basis function (RBF) networks. In turn, the construction of the QNN yields new findings in the design of RBF networks. As an application, we introduce a quantum-inspired RBF network in which the number of weight parameters is log M. Numerical tests indicate that the performance of this network on classification problems improves as M increases. Since the network uses exponentially fewer parameters, more advanced optimization methods (e.g., Newton's method) can be used to train it. Finally, for the convex optimization problem of training support vector machines, we use a similar idea to reduce the number of variables from M to log M.
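To make the parameter counting concrete, here is a minimal classical sketch of the quantum-inspired RBF idea: n = log2(M) angles are expanded, via the tensor product of n two-dimensional rotations mentioned in the abstract, into M weights, which are then combined with a kernel evaluation against the M training samples. The function names, the choice of a Gaussian kernel, and the parameter gamma are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def tensor_product_weights(thetas):
        """Expand n = log2(M) rotation angles into M weights.

        Assumed parameterization: each weight is a product of
        cos(theta_k) / sin(theta_k) factors selected by the binary
        digits of its index, i.e. the amplitude vector of a tensor
        product of n two-dimensional rotations.
        """
        w = np.array([1.0])
        for theta in thetas:
            w = np.kron(w, np.array([np.cos(theta), np.sin(theta)]))
        return w  # length M = 2 ** len(thetas)

    def rbf_predict(x, X_train, thetas, gamma=1.0):
        """Quantum-inspired RBF output: sum_j w_j * K(x, x_j), with
        only len(thetas) = log2(M) trainable parameters. The Gaussian
        kernel here is an illustrative choice."""
        w = tensor_product_weights(thetas)
        k = np.exp(-gamma * np.sum((X_train - x) ** 2, axis=1))
        return w @ k

    # Toy usage: M = 4 training samples in 2D, so n = 2 angles.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(4, 2))
    thetas = rng.uniform(0, np.pi, size=2)
    print(rbf_predict(np.zeros(2), X, thetas))

Because only the log2(M) angles are trained, second-order optimizers such as Newton's method remain cheap even for large M, which is the advantage the abstract points to.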
