@christophergandrud
Created February 4, 2021 09:14
Find the Cinelli and Hazlett (2020) robustness value for a generalised linear model.
"""
robustness_value(;fit::StatsModels.TableRegressionModel, treatment::String = "D", q::Int64 = 1)
Find the Cinelli and Hazlett (2020) robustness value for a generalised linear model.
The response will be a in the range 0 and 1. Values closer to 0 indicate that
the conclusions from the original model are highly subject to omitted variable
bias. Values closer to 1 indicate that the original model is less likely to be
caused by omitted variable bias.
"""
function robustness_value(; fit::StatsModels.TableRegressionModel,
                          treatment::String = "D", q::Int64 = 1)
    # Extract the coefficient table from the fitted model
    m = coeftable(fit)

    # Locate the treatment row and the test-statistic column
    # (labelled "t" for linear models; some GLM families report "z" instead)
    treat_position = findfirst(x -> x == treatment, m.rownms)
    t_value_position = findfirst(x -> x == "t", m.colnms)
    t_value = m.cols[t_value_position][treat_position]

    # Partial Cohen's f of the treatment, scaled by q
    df = dof_residual(fit)
    fq = abs(t_value / sqrt(df)) * q

    # Robustness value: RV_q = 1/2 * (sqrt(fq^4 + 4fq^2) - fq^2)
    1/2 * (sqrt(fq^4 + 4 * fq^2) - fq^2)
end
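
A minimal usage sketch, assuming GLM.jl and DataFrames.jl are available. The simulated data frame and the column names (treatment D, covariate x, outcome y) are illustrative assumptions, not part of the original gist.

# Illustrative example only: simulate a confounded treatment, fit a linear
# model, and compute the robustness value for the treatment coefficient.
using GLM, DataFrames, Random

Random.seed!(1234)
n = 500
x = randn(n)                     # observed covariate
D = randn(n) .+ 0.5 .* x         # treatment partly driven by x
y = 1.0 .+ 2.0 .* D .+ x .+ randn(n)

dat = DataFrame(y = y, D = D, x = x)
model = lm(@formula(y ~ D + x), dat)   # returns a StatsModels.TableRegressionModel

rv = robustness_value(fit = model, treatment = "D", q = 1)
println("Robustness value: ", round(rv, digits = 3))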