@jonpsy
Last active June 7, 2020 13:28
Random Fourier Features comparison between shogun and sklearn
@karlnapf

In shogun, you can compare against the Gaussian kernel computed by shogun itself; there is no need to implement your own.

@karlnapf

And the same goes for sklearn, actually.

@karlnapf

Because this still doesn't test whether the parametrization is the same, you know. The results have to match when passing the same log_width to GaussianKernel and to the RFF embedding.
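
For reference, a minimal sketch of what that check looks like on the sklearn side; the shogun side would be analogous, using shogun's own GaussianKernel. The gamma value, data sizes, and number of components here are illustrative, not the ones used in the gist's notebook.

```python
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
gamma = 0.5  # the parametrization must be the same on both sides

# Exact Gaussian kernel, computed by the library itself rather than by hand.
K_exact = rbf_kernel(X, gamma=gamma)

# Random Fourier Feature approximation of the same kernel.
Z = RBFSampler(gamma=gamma, n_components=2000, random_state=0).fit_transform(X)
K_approx = Z @ Z.T

# With a matching parametrization the two should agree up to Monte Carlo error.
print(np.abs(K_exact - K_approx).max())
```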

@jonpsy commented May 10, 2020

About the bandwidth: I've seen np.log(100) in our ipynb notebooks, so I went with that (for example, Gaussian Kernel in Classification.ipynb).
Next, here's what `get_width` returns: `std::exp(m_log_width * 2.0) * 2.0`.
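
So with log_width = np.log(100), the width that `get_width` returns works out as in the sketch below. The mapping to sklearn's gamma assumes shogun's GaussianKernel computes exp(-||x - y||^2 / width), so treat that last line as an assumption to be double-checked against the shogun docs.

```python
import numpy as np

log_width = np.log(100)                # value used in the notebooks
width = np.exp(log_width * 2.0) * 2.0  # what get_width() returns: 2 * 100**2 = 20000
gamma = 1.0 / width                    # assumed sklearn equivalent of exp(-gamma * ||x - y||^2)

print(width, gamma)  # roughly 20000.0 and 5e-05
```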

@karlnapf

That is probably for high-dimensional data then?
OK, thanks for the formula, it seems fine then. Still, you want to compare against shogun's kernel, not your own.

@jonpsy commented May 10, 2020

Will be updated, rest assured.

@jonpsy commented Jun 6, 2020

Looks like it's done!

@karlnapf commented Jun 7, 2020

Looks good now :)
