@KhanradCoder
Created September 12, 2023 01:20
Rederive Newton's law of universal gravitation with PySR
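The script below generates synthetic data following F = x0 * x1 / x2**2, i.e. Newton's law with G set to 1 and the three input columns standing in for the two masses and the separation, plus roughly 1% multiplicative Gaussian noise. PySR is then asked to recover the inverse-square form from the samples alone.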
import numpy as np
from pysr import PySRRegressor

# Synthetic data: F = x0 * x1 / x2**2 with multiplicative noise.
num_samples = 1000
noise = np.random.normal(1, 0.01, size=num_samples)  # mean 1, std 0.01 (or 0.001)
input_vars = np.random.randn(num_samples, 3)  # columns play the roles of m1, m2, r
F = noise * (input_vars[:, 0] * input_vars[:, 1]) / (input_vars[:, 2] ** 2)

model = PySRRegressor(
    procs=4,
    niterations=100,
    binary_operators=["*", "+", "-", "/"],
    unary_operators=["square", "cube", "exp"],
    nested_constraints={
        "square": {"square": 1, "cube": 1, "exp": 0},
        "cube": {"square": 1, "cube": 1, "exp": 0},
        "exp": {"square": 1, "cube": 1, "exp": 0},
    },
    # ^ Nesting constraints on operators. For example,
    # "square(exp(x))" is not allowed, since "square": {"exp": 0}.
    complexity_of_operators={"/": 2, "exp": 3},
    # ^ Custom complexity of particular operators.
    precision=64,
    # ^ Higher precision calculations.
    loss="loss(prediction, target) = sum((target .- prediction).^2) / length(target)",
    # ^ Custom objective in Julia syntax: mean squared error.
)

model.fit(input_vars, F)
print(model)
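After fitting, the discovered equations can be inspected directly. The lines below are a minimal sketch assuming the usual PySRRegressor accessors (model.equations_, model.sympy(), model.predict()); exact attribute names can differ between PySR versions.

# Sketch: inspect the fitted model (accessor names may vary by PySR version).
best_eq = model.sympy()  # best equation as a SymPy expression
print(best_eq)           # expect something close to (x0*x1)/x2**2

# Full Pareto front of discovered equations.
print(model.equations_[["complexity", "loss", "equation"]])

# Compare predictions of the selected equation against the noisy targets.
pred = model.predict(input_vars)
print("MSE:", np.mean((pred - F) ** 2))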