Sparse Prediction with the k-Support Norm
We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an ℓ_2 penalty. We show that this new k-support norm provides a tighter relaxation than the elastic net and is therefore a good replacement for the Lasso or the elastic net in sparse prediction problems. Through the study of the k-support norm, we also bound the looseness of the elastic net, shedding new light on it and providing justification for its use.
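To make the relaxation concrete, here is a sketch of the definition as it appears in the full paper (not stated in this abstract, so the notation below is supplied for illustration). The k-support norm can be written as an infimal decomposition over k-sparse components:

\[
\|w\|_k^{sp} \;=\; \min\Big\{ \sum_{I \in \mathcal{G}_k} \|v_I\|_2 \;:\; \operatorname{supp}(v_I) \subseteq I,\ \sum_{I \in \mathcal{G}_k} v_I = w \Big\},
\]

where \(\mathcal{G}_k\) denotes the collection of subsets of \(\{1,\dots,d\}\) of cardinality at most \(k\). Equivalently, its unit ball is \(\operatorname{conv}\{ w \in \mathbb{R}^d : \|w\|_0 \le k,\ \|w\|_2 \le 1 \}\), the convex hull of k-sparse vectors with Euclidean norm at most one; for \(k = 1\) it reduces to the ℓ_1 norm and for \(k = d\) to the ℓ_2 norm, which is the sense in which it is the tightest convex relaxation of sparsity combined with an ℓ_2 constraint. In a sparse prediction problem it would typically be used as a regularizer, e.g. \(\min_w L(w) + \lambda \big(\|w\|_k^{sp}\big)^2\), in place of the elastic net penalty.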