saimn / gist:bc2190e6e5b7a03284ae1d2ac0eec37b
Created March 14, 2018 17:23 — forked from tspriggs/gist:989b57a495ab2831a021fb376e18fabc
A custom model to fit a 2D Moffat model to data, equate that to F_OIII_xy, and then pass it to a custom 1D Gaussian equation to get a model spectrum. The idea is to have x, y, and l (wavelength) as inputs; the amplitude is then fixed, since it is calculated from the Moffat flux. When run, it normally raises mismatched-dimension errors: wavelength is a list of length 271, and then x…
import numpy as np

def Moffat_3d_test(x, y, l, mean, stddev, Gauss_bkg, Gauss_grad,
                   Moffat_amplitude, x_0, y_0, gamma, alpha, Moffat_bkg):
    # 2D Moffat profile: total [OIII] line flux at each spaxel (x, y).
    # x and y are now explicit inputs (the original body used undefined
    # x_fit/y_fit names, one source of the mismatched-dimension errors).
    rr_gg = ((x - x_0)**2 + (y - y_0)**2) / gamma**2
    F_OIII_xy = Moffat_amplitude * (1 + rr_gg)**(-alpha) + Moffat_bkg
    # Prep for Gauss 1D: add the instrumental LSF width in quadrature
    # (std_MUSE is assumed to be defined at module level).
    Gauss_std = np.sqrt(stddev**2 + std_MUSE**2)
    # Amplitude such that the Gaussian integrates to F_OIII_xy.
    A_OIII_xy = F_OIII_xy / (np.sqrt(2 * np.pi) * Gauss_std)
    check_1.append(A_OIII_xy)  # debug list, assumed defined at module level
    model_spectra = []
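    # --- Hypothetical continuation (the original gist is truncated here) ---
    # A sketch of the apparent intent, not the author's code: build one
    # Gaussian emission-line spectrum per spaxel by broadcasting the 1D
    # wavelength array l against each spaxel's scalar amplitude, with a
    # linear continuum set by Gauss_bkg and Gauss_grad.
    for amp in np.ravel(A_OIII_xy):
        spectrum = (amp * np.exp(-0.5 * ((l - mean) / Gauss_std)**2)
                    + Gauss_bkg + Gauss_grad * l)
        model_spectra.append(spectrum)
    return np.array(model_spectra)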