View RoundingError.ipynb
View loop_rooargset.py
# make a PyROOT RooAbsCollection iterable with a plain Python generator
def loop_iterator(iterator):
    obj = iterator.Next()
    while obj:
        yield obj
        obj = iterator.Next()

def iter_collection(rooAbsCollection):
    iterator = rooAbsCollection.createIterator()
    return loop_iterator(iterator)
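A minimal usage sketch (not part of the gist), assuming PyROOT with RooFit is available; `args` is a throwaway `RooArgSet` built here only for illustration:

```python
import ROOT

args = ROOT.RooArgSet(ROOT.RooRealVar('x', 'x', 1.0), ROOT.RooRealVar('y', 'y', 2.0))
for var in iter_collection(args):
    print(var.GetName(), var.getVal())
```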
View newton optimization.py
import numpy as np
import tensorflow as tf

# Newton's optimization method for a multivariate function in tensorflow
def cons(x):
    return tf.constant(x, dtype=tf.float32)

def compute_hessian(fn, vars):
    mat = []
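The gist preview is truncated here. A sketch, under the assumption that the rest of `compute_hessian` builds the Hessian from nested `tf.gradients` calls (TF1 graph mode), of how the function could be completed; `cons(0.0)` is reused to pad entries where `tf.gradients` returns `None`:

```python
import tensorflow as tf

def cons(x):
    return tf.constant(x, dtype=tf.float32)

def compute_hessian(fn, vars):
    mat = []
    for v1 in vars:
        # second derivative of fn: first w.r.t. v2, then w.r.t. v1
        row = [tf.gradients(tf.gradients(fn, v2)[0], v1)[0] for v2 in vars]
        # tf.gradients returns None when there is no dependence, so pad those entries with zeros
        row = [cons(0.0) if g is None else g for g in row]
        mat.append(tf.stack(row))
    return tf.stack(mat)  # (len(vars), len(vars)) Hessian matrix
```

With the Hessian assembled, a Newton update would then move the variables by the solution of $H\,\Delta = \nabla f$, e.g. via `tf.matrix_solve` on the Hessian and the gradient vector.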
View 4Stefano.ipynb
View safe_roofit.py
import ROOT

# patch RooWorkspace.factory so that an invalid factory string raises
# a ValueError instead of silently returning a null pointer
def safe_factory(func):
    def wrapper(self, *args):
        result = func(self, *args)
        if not result:
            raise ValueError('invalid factory input "%s"' % args)
        return result
    return wrapper

ROOT.RooWorkspace.factory = safe_factory(ROOT.RooWorkspace.factory)
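A usage sketch, assuming PyROOT with RooFit; with the patched `factory`, a malformed factory string (here a deliberate typo in the pdf class name) raises immediately instead of handing back a null pointer:

```python
w = ROOT.RooWorkspace('w')
w.factory('Gaussian::sig(x[0, 10], mu[5, 0, 10], sigma[1, 0.1, 5])')  # valid: returns the pdf
w.factory('Gausian::bad(x, mu, sigma)')                                # invalid: raises ValueError
```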
View CartPole-v0.py
# from http://kvfrans.com/simple-algoritms-for-solving-cartpole/
import gym
from gym import wrappers
import numpy as np

env = gym.make('CartPole-v0')

def run_episode(env, parameters):
    observation = env.reset()
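The preview stops inside `run_episode`. A sketch of how the episode loop continues in the linked post, assuming the old `gym` step API (4-tuple return) and a linear policy that picks the action from the sign of a weighted sum of the observation, followed by the post's random-search loop over weight vectors:

```python
def run_episode(env, parameters):
    observation = env.reset()
    totalreward = 0
    for _ in range(200):
        # linear policy: push right when the weighted observation sum is positive
        action = 0 if np.matmul(parameters, observation) < 0 else 1
        observation, reward, done, info = env.step(action)
        totalreward += reward
        if done:
            break
    return totalreward

# random search: keep the best of many randomly drawn weight vectors in [-1, 1]
bestparams, bestreward = None, 0
for _ in range(10000):
    parameters = np.random.rand(4) * 2 - 1
    reward = run_episode(env, parameters)
    if reward > bestreward:
        bestreward, bestparams = reward, parameters
        if reward == 200:  # CartPole-v0 caps an episode at 200 steps
            break
```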
View pisa
# Search for exotic physics at the LHC with two-body resonances
## E/gamma energy calibration
* multivariate regression optimized on MC to calibrate the energy of electrons and of converted / unconverted photons
* intercalibration of the calorimeter layers taken from 2012, with an additional uncertainty
* energy scale and resolution corrections validated with 13 TeV $Z\to ee$ events
* For $E > 100$–$200$ GeV the resolution is dominated by the constant term, $c = 0.6\%$–$1.5\%$ (see the parametrisation below)
* Energy-scale uncertainty of 0.4%–2% for the diphoton analysis
* Preliminary photon energy resolution at 300 GeV: $\pm 80\%$–$100\%$
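For reference (not on the original slide), the bullets above refer to the standard calorimeter energy-resolution parametrisation, in which the constant term dominates at high energy:

$$\frac{\sigma_E}{E} = \frac{a}{\sqrt{E}} \oplus \frac{b}{E} \oplus c,$$

with $a$ the stochastic (sampling) term, $b$ the noise term, and $c$ the constant term; for $E \gtrsim 100$ GeV the first two terms shrink and $c$ sets the resolution.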
View UncorrelatedCategories.ipynb
View ttest_cpp.ipynb
View chi2_updated
The data have been binned so that every bin contains more than 10 events, and for every bin the integral of the post-fit S+B pdf has been computed ($E_i$).
The table reports the value of the Pearson $\chi^2 = \sum_i (E_i - O_i)^2 / E_i$ together with the number of bins of $m_{\gamma\gamma}$.
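A minimal sketch of the Pearson-$\chi^2$ computation described above, assuming `observed` holds the per-bin event counts $O_i$ and `expected` the per-bin integrals $E_i$ of the post-fit S+B pdf (both hypothetical names):

```python
import numpy as np

def pearson_chi2(observed, expected):
    """Pearson chi2 = sum_i (E_i - O_i)^2 / E_i over the m_yy bins."""
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return float(np.sum((expected - observed) ** 2 / expected))
```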
Note that the $\chi^2$ takes into account only the physical pdf, not the product of constraint terms. In fact, the number of degrees of freedom is difficult to compute: we have 5 NPs for the background (4 "$\alpha$" parameters + the normalization) plus all the NPs for the statistical fluctuations (100+). Since these parameters are constrained, each of them does not count as a full $-1$ in the sum of the degrees of freedom, but as something between $0$ and $-1$. To evaluate their contribution we can imagine adding the constraint pdf to the computation of the $\chi^2$: this means adding one "bin", subtracting one degree of freedom, and adding to the $\chi^2$ a contribution that can (?) be evaluated as
$$-2\log \mathrm{pdf}(x \mid x_\mathrm{true}) + 2\log \mathrm{pdf}(x_\mathrm{true} \mid x_\mathrm{true})$$
for each constrained NP, where $x$ is the fitted value of the parameter and $x_\mathrm{true}$ its nominal value.
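As a cross-check of the expression above (my own step, not in the original note): for a Gaussian constraint of width $\sigma$ centred on $x_\mathrm{true}$, the normalisation terms cancel and the contribution reduces to the squared pull,

$$-2\log \mathrm{pdf}(x \mid x_\mathrm{true}) + 2\log \mathrm{pdf}(x_\mathrm{true} \mid x_\mathrm{true}) = \left(\frac{x - x_\mathrm{true}}{\sigma}\right)^2,$$

which is at most $\mathcal{O}(1)$ per well-behaved constrained NP.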