Tuesday, March 07, 2017

Deep Semi-Random Features for Nonlinear Function Approximation

Deep Semi-Random Features for Nonlinear Function Approximation by Kenji Kawaguchi, Bo Xie, Le Song

We propose semi-random features for nonlinear function approximation. A semi-random feature is defined as the product of a random nonlinear switching unit and a linear adjustable unit. The flexibility of semi-random features lies between the fully adjustable units in deep learning and the random features used in kernel methods. We show that semi-random features possess a collection of nice theoretical properties despite the non-convex nature of their learning problem. In experiments, we show that semi-random features can match the performance of neural networks by using slightly more units, and that they outperform random features by using significantly fewer units. Semi-random features provide an interesting data point between kernel methods and neural networks to advance our understanding of the challenge of nonlinear function approximation, and they open up new avenues to tackle that challenge further.
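The construction is easy to state concretely. Below is a minimal NumPy sketch of one layer of semi-random units (an illustration, not the authors' code), assuming the switching unit is a fixed random 0/1 threshold gate and only the linear weights W are trained:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_units = 10, 256                          # input dimension, number of semi-random units

# Assumed setup: R is drawn once and frozen, W is the only learned parameter.
R = rng.standard_normal((d, n_units))         # random switching weights (never trained)
W = 0.01 * rng.standard_normal((d, n_units))  # adjustable linear weights (trained)

def semi_random_layer(X, R, W):
    """Each unit is a random nonlinear switch (fixed) times a linear adjustable response."""
    gate = (X @ R > 0).astype(X.dtype)        # random switching unit: 1 if r^T x > 0, else 0
    return gate * (X @ W)                     # product with the linear adjustable unit w^T x

X = rng.standard_normal((5, d))
phi = semi_random_layer(X, R, W)              # feature matrix of shape (5, n_units)
```

Note that, because the gates are fixed and do not depend on W, a single such layer is linear in its trainable weights; it is the stacking of several layers that makes the overall learning problem non-convex.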




Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
