@nmayorov
Last active September 19, 2021 02:52
Large scale bundle adjustment in scipy
@TungTNguyen
Copy link

Thanks for sharing!
You should add that the notebook is for Python 3, so others can know beforehand :-)
Best

@bvanderj
Copy link

This is a really accessible tutorial on how Python can be used to solve large-scale, sparse optimization problems.

One shortcoming I seem to have run into with scipy.optimize.least_squares, and bundle adjustment in particular, is that I don't see a clear way to apply weights to the minimization. Yes, I can constrain the bounds of the parameters, but in some bundle adjustment applications it's necessary to weight the observation itself (e.g. a very well-known pixel) rather than just constrain the parameter. All this would involve is the pre-multiplication of the Jacobian and the residual vector by a weight matrix. However, I've tried various ways and have not achieved the desired result.

Any ideas?
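One approach that should work: since least_squares minimizes sum(r_i**2), scaling each residual by sqrt(w_i) inside the residual function minimizes sum(w_i * r_i**2), and the (finite-difference) Jacobian is then weighted consistently for free. A minimal sketch on a toy linear fit — the data and names here are illustrative, not from the notebook:

```python
import numpy as np
from scipy.optimize import least_squares

# Toy problem: fit y ~ a*x + b to noisy data (illustrative, not BAL data).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

# Per-observation weights w_i; e.g. trust the first half of the
# observations ten times more. For known noise sigma_i, a natural
# choice is w_i = 1 / sigma_i**2.
w = np.ones_like(x)
w[: x.size // 2] = 10.0
sqrt_w = np.sqrt(w)

def residuals(params):
    a, b = params
    # Pre-multiplying the residuals by sqrt(W) makes least_squares
    # minimize sum(w_i * r_i**2); the Jacobian of the scaled residuals
    # is sqrt(W) @ J, so no separate Jacobian weighting is needed.
    return sqrt_w * (a * x + b - y)

res = least_squares(residuals, x0=[1.0, 0.0])
```

With an analytic `jac` callback the same scaling applies: return `sqrt_w[:, None] * J`. For a full covariance matrix (correlated observations) the scalar `sqrt_w` becomes a matrix square root, e.g. the Cholesky factor of the inverse covariance.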

@JepsonNomad
Copy link

@TungTNguyen the code works for Python 2.7.13 if these two changes are made:

Change line 2 in[4] to: urllib.urlretrieve(URL, FILE_NAME)
Change line 2 in[5] to: with bz2.BZ2File(file_name, "r") as file:

Tested on a mid-2012 MacBook Pro running macOS 10.13.6.
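The two changes above can also be folded into a version-agnostic form, since Python 3 moved urlretrieve into urllib.request. A sketch (the helper name is hypothetical, and the URL/file names stand in for the notebook's BAL dataset variables):

```python
import bz2

try:
    # Python 3: urlretrieve lives in urllib.request.
    from urllib.request import urlretrieve
except ImportError:
    # Python 2: same function, directly in urllib.
    from urllib import urlretrieve

def fetch_problem(url, file_name):
    """Download a BAL problem file and open the bz2 archive for reading.

    Hypothetical helper; mirrors the download/decompress cells of the
    notebook in a form that runs on both Python 2 and 3.
    """
    urlretrieve(url, file_name)
    return bz2.BZ2File(file_name, "r")
```

Opening the archive with `bz2.BZ2File(file_name, "r")` works on both versions, whereas the text-mode variants differ between them.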
