Which Factorization Machine Modeling is Better: A Theoretical Answer with Optimal Guarantee

01/30/2019
by   Ming Lin, et al.

The factorization machine (FM) is a popular machine learning model for capturing second-order feature interactions, yet an optimal learning guarantee for FM and its generalized version has not been developed. For a rank-k generalized FM with d-dimensional input, the previously best known sample complexity is O[k^3 d · polylog(kd)] under the Gaussian distribution. This bound is suboptimal compared with the information-theoretic lower bound O(kd). In this work, we tighten this bound toward the optimum and generalize the analysis to sub-Gaussian distributions. We prove that when the input data satisfies the so-called τ-Moment Invertible Property, the sample complexity of the generalized FM can be improved to O[k^2 d · polylog(kd) / τ^2]. When the second-order self-interaction terms are excluded from the generalized FM, the bound can be further improved to O[kd · polylog(kd)], which is optimal up to logarithmic factors. Our analysis also suggests that the positive semi-definite constraint in the conventional FM is redundant: it does not improve the sample complexity, while it makes the model harder to optimize. We evaluate our improved FM model on a real-time, high-precision GPS signal calibration task to validate its superiority.
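For context, below is a minimal sketch of the models the abstract refers to, written in standard FM notation; the symbols w_0, w, v_i, M, and k are conventional in the FM literature and are not defined in the abstract itself.

    % Conventional rank-k FM (Rendle, 2010): pairwise interactions are
    % factorized through latent vectors v_1, ..., v_d in R^k.
    \hat{y}(x) = w_0 + \sum_{i=1}^{d} w_i x_i
               + \sum_{i=1}^{d} \sum_{j=i+1}^{d} \langle v_i, v_j \rangle \, x_i x_j

    % Generalized FM: the interaction matrix M in R^{d x d} is only
    % required to satisfy a rank constraint, with no positive
    % semi-definite requirement on M.
    \hat{y}(x) = w^{\top} x + x^{\top} M x, \qquad \operatorname{rank}(M) \le k

In this notation, excluding the second-order self-interaction terms corresponds to restricting M to have a zero diagonal, which is the variant for which the abstract states the near-optimal O[kd · polylog(kd)] bound.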
