Genetic-algorithm-optimized neural networks for gravitational wave classification
Gravitational-wave detection strategies are based on a signal analysis technique known as matched filtering. Matched filtering is known to be optimal under certain conditions, yet in practice these conditions are only approximately satisfied, and the algorithm is computationally expensive. Despite the success of matched filtering for signal detection, these limitations have motivated recent interest in developing deep convolutional neural networks (CNNs) for signal detection. Designing these networks remains a challenge, as most procedures adopt a trial-and-error strategy to set the hyperparameter values. We propose and develop a new method for hyperparameter optimization based on genetic algorithms (GAs). We compare six different GA variants and explore different choices for the GA-optimized fitness score. We show that the GA can discover high-quality architectures when the initial hyperparameter seed values are far from a good solution, as well as refine networks that are already good. For example, when starting from the architecture proposed by George and Huerta, the network optimized over the 20-dimensional hyperparameter space shows a 78% increase in accuracy on our test problem. Using genetic-algorithm optimization to refine an existing network should be especially useful if the problem context (e.g., statistical properties of the noise, the signal model, etc.) changes and one needs to rebuild a network. In all of our experiments, we find that the GA discovers significantly less complicated networks than the seed network, suggesting it can be used to prune wasteful network structures. While we have restricted our attention to CNN classifiers, GA hyperparameter optimization can be applied within other machine learning settings, including alternative architectures for signal classification, parameter inference, or other tasks.
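To make the overall approach concrete, the sketch below shows the generic shape of a genetic-algorithm search over CNN hyperparameters: a population of candidate architectures is scored by a fitness function, the fittest candidates are kept, and new candidates are produced by crossover and mutation. The hyperparameter names, value ranges, and toy fitness function here are illustrative assumptions for exposition only, not the authors' 20-dimensional search space, fitness scores, or GA variants.

```python
# Minimal sketch of GA-based hyperparameter search for a CNN classifier.
# The search space and fitness function are illustrative placeholders.
import random

SEARCH_SPACE = {
    "num_conv_layers": [1, 2, 3, 4],
    "filters":         [8, 16, 32, 64],
    "kernel_size":     [4, 8, 16, 32],
    "dense_units":     [32, 64, 128, 256],
    "learning_rate":   [1e-2, 1e-3, 1e-4],
}

def random_individual():
    """Sample one candidate architecture (a dict of hyperparameters)."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(ind):
    """Placeholder fitness: in practice, one would train the CNN described by
    `ind` on gravitational-wave training data and return its validation
    accuracy (or another fitness score, e.g. one penalizing false alarms)."""
    return random.random()  # stand-in for validation accuracy

def crossover(a, b):
    """Uniform crossover: each hyperparameter is inherited from either parent."""
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(ind, rate=0.2):
    """With probability `rate`, resample each hyperparameter from its range."""
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(pop_size=20, generations=10, elite=4):
    """Evolve a population of candidate architectures and return the fittest."""
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[:elite]  # keep the fittest candidates (elitism)
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - elite)]
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    print(evolve())
```

In a real application the dominant cost is the training run inside the fitness evaluation, so population size and number of generations are chosen to balance search quality against compute budget; seeding the initial population with an existing architecture (as in the refinement experiments described above) lets the GA start near a known-good solution.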