@devernay
Forked from bl0up/gsoc2016.md
Created August 11, 2021 14:25

Structured light with sinusoidal patterns and phase unwrapping

This project was about adding structured light methods based on sinusoidal patterns, along with phase unwrapping, to OpenCV. Our work was based on the following reference papers:

  • [1] Cong, Pengyu, et al. "Accurate dynamic 3D sensing with Fourier-assisted phase shifting." IEEE Journal of Selected Topics in Signal Processing 9.3 (2015): 396-408.
  • [2] Lei, Hai, et al. "A novel algorithm based on histogram processing of reliability for two-dimensional phase unwrapping." Optik-International Journal for Light and Electron Optics 126.18 (2015): 1640-1644.

The code can be found here, and some data were also added here. Examples of wrapped phase maps, stored as YAML files, can be found in opencv_extra/cv/testdata. They can be used with the sample created for the phase unwrapping module.


Links to contributions


What has been done

Fourier-assisted phase-shifting

Fourier-assisted phase-shifting (FAPS) is a combination of two other structured light methods that rely on the computation of phase maps:

  • Fourier transform profilometry (FTP)
  • Three-step phase-shifting profilometry (PSP)

FAPS takes advantage of FTP and PSP to allow precise reconstruction of dynamic scenes.

In our work, care was taken to let users choose any of the three methods.
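For reference, with the usual phase shifts of -2pi/3, 0 and +2pi/3, the three-step approach recovers the wrapped phase in closed form from the three captured images. A minimal numpy sketch of that formula, independent of the OpenCV module's API:

```python
import numpy as np

def psp_wrapped_phase(i1, i2, i3):
    """Wrapped phase from three fringe images shifted by -2pi/3, 0, +2pi/3.

    Standard three-step formula: phi = atan2(sqrt(3)*(I1 - I3), 2*I2 - I1 - I3),
    with the result in (-pi, pi].
    """
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic check: build three shifted fringe images and recover the phase.
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 200)       # ground-truth phase
a, b = 0.5, 0.4                                          # offset and modulation
imgs = [a + b * np.cos(phi + d) for d in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3)]
rec = psp_wrapped_phase(*imgs)
print(np.max(np.abs(rec - phi)))  # close to zero
```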

Generate patterns

The structured light module included in OpenCV can generate three sinusoidal patterns. The following parameters are easily set by users:

  • Pattern width and height, corresponding to the projector's resolution
  • Phase shift between two consecutive patterns (usually 2pi/3)
  • Pattern direction (horizontal or vertical)
  • Number of periods along the chosen direction
  • Presence of cross markers in the patterns (cross markers are used to get absolute phase maps, see reference paper [1] for details)
  • Number of pixels between two consecutive markers on a row/column
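The generation itself reduces to sampling a shifted cosine along the chosen direction and tiling it across the other one. A minimal numpy sketch; the function name and parameters below are illustrative, not the module's actual interface:

```python
import numpy as np

def make_sinusoidal_patterns(width, height, n_periods=10, horizontal=False):
    """Three 8-bit sinusoidal fringe patterns with a 2*pi/3 phase shift.

    Illustrative sketch only (not the OpenCV structured light API).
    Horizontal patterns vary along the vertical axis; vertical patterns
    vary along the horizontal axis.
    """
    axis = np.arange(height if horizontal else width)
    phase = 2.0 * np.pi * n_periods * axis / axis.size
    patterns = []
    for shift in (-2 * np.pi / 3, 0.0, 2 * np.pi / 3):
        profile = 0.5 + 0.5 * np.cos(phase + shift)          # values in [0, 1]
        line = (255.0 * profile).astype(np.uint8)
        if horizontal:
            img = np.tile(line[:, None], (1, width))          # constant rows
        else:
            img = np.tile(line[None, :], (height, 1))         # constant columns
        patterns.append(img)
    return patterns

pats = make_sinusoidal_patterns(1024, 768, n_periods=20)
print(pats[0].shape)  # (768, 1024)
```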

The following images show two examples of patterns generated with the structured light module.

(Figures: pattern one; pattern two)

Compute relative phase maps

Once the patterns have been generated, they can be projected on the scene to compute relative phase maps. FTP only needs one pattern to compute the phase while PSP and FAPS require three of them. The computed phase maps are wrapped and need to be unwrapped with the phase unwrapping module.
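For FTP, the wrapped phase of a single fringe image can be obtained by keeping only the spectral band around the fringe carrier frequency and taking the angle of the inverse transform. A hedged 1-D numpy sketch of that idea (illustrative, not the OpenCV implementation):

```python
import numpy as np

def ftp_wrapped_phase(row, carrier):
    """1-D Fourier transform profilometry sketch.

    Keep only the positive frequencies near the fringe carrier; the angle of
    the inverse transform is then the wrapped phase (carrier included).
    """
    n = row.size
    spec = np.fft.fft(row - row.mean())
    freqs = np.fft.fftfreq(n, d=1.0 / n)             # cycles per signal length
    band = np.abs(freqs - carrier) < carrier / 2     # band-pass around +carrier
    analytic = np.fft.ifft(spec * band)
    return np.angle(analytic)

# Synthetic fringe: a 20-cycle carrier with a smooth phase distortion.
x = np.linspace(0.0, 1.0, 512, endpoint=False)
distortion = 0.8 * np.sin(2 * np.pi * x)
row = 0.5 + 0.4 * np.cos(2 * np.pi * 20 * x + distortion)
wrapped = ftp_wrapped_phase(row, carrier=20)
```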

The following pictures show an example of a projected pattern, a wrapped phase map and the same map after phase unwrapping. They were generated with the "capsinpattern" example available in the structured light module.

(Figures: projected pattern; wrapped phase map; unwrapped phase map)

Phase unwrapping

Phase unwrapping has been implemented as an independent module, since the problem is not limited to structured light. The chosen algorithm is of the quality-guided type: it computes the quality (or reliability) of each pixel in the wrapped phase map and follows a path going from the highest-quality pixel to the lowest in order to remove the 2pi discontinuities. By doing so, error propagation is confined to low-quality regions, as shown in the following images. The first one shows a wrapped phase map containing noise. The second one shows the corresponding reliability map, where white pixels are unreliable. Finally, the last image shows the unwrapped result. These results can be reproduced with the "unwrap" example in the phase unwrapping module and the "wrappedNoisyPeaks.yml" file that was added here.

(Figures: wrapped peaks; reliability map; unwrapped peaks)
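The quality-guided strategy can be sketched with a priority queue: start from the most reliable pixel and grow the unwrapped region, always processing the best remaining frontier pixel first. This is a simplified illustration of the idea, not the histogram-based implementation of [2]:

```python
import heapq
import numpy as np

def quality_guided_unwrap(wrapped, quality):
    """Quality-guided 2-D phase unwrapping (illustrative sketch).

    Grows the unwrapped region from the highest-quality pixel, so that
    unwrapping errors stay confined to low-quality areas.
    """
    h, w = wrapped.shape
    unwrapped = np.zeros_like(wrapped)
    done = np.zeros((h, w), dtype=bool)
    start = np.unravel_index(np.argmax(quality), (h, w))
    unwrapped[start] = wrapped[start]
    done[start] = True
    heap = []

    def push_neighbours(y, x):
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not done[ny, nx]:
                # Max-heap on quality (heapq is a min-heap, hence the minus).
                heapq.heappush(heap, (-quality[ny, nx], ny, nx, y, x))

    push_neighbours(*start)
    while heap:
        _, y, x, py, px = heapq.heappop(heap)
        if done[y, x]:
            continue
        # Remove the 2*pi jump relative to the already-unwrapped neighbour.
        diff = wrapped[y, x] - wrapped[py, px]
        diff -= 2 * np.pi * np.round(diff / (2 * np.pi))
        unwrapped[y, x] = unwrapped[py, px] + diff
        done[y, x] = True
        push_neighbours(y, x)
    return unwrapped

# Demo: a wrapped linear ramp unwraps back to a smooth ramp.
yy, xx = np.mgrid[0:48, 0:48]
true = 0.25 * (xx + yy)
result = quality_guided_unwrap(np.angle(np.exp(1j * true)), np.ones_like(true))
```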

What has yet to be done

Fourier-assisted phase-shifting

Pro-cam correspondences

Phase maps are meant to be used to find projector-camera (pro-cam) correspondences, but we are still facing an ambiguity because the patterns vary in only one direction. Indeed, if horizontal patterns are used, all the columns in the projector phase map will be the same, as shown in the following picture:

(Figure: projector phase map)

To solve the problem, [1] used epipolar geometry and manually aligned the camera and the projector so that the epipolar lines are vertical. By doing so, they can search for corresponding pixels along the columns of the phase maps. Manual alignment is not user-friendly, however, and makes the overall method hard to use.

Several alternatives have been tested without any luck so far:

  • We performed pro-cam calibration as described here. The idea was to use the calibration results to rectify the phase maps and get vertical epipolar lines, but the rectification gave poor results: the projector view (on the left) is almost completely dark.
  • We tried to compute the [fundamental matrix](http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html#findfundamentalmat) in order to find the epipolar line equations and look for corresponding pixels along non-vertical lines. The data points needed by the method were generated with a 7x5 chessboard projected on a plane, as shown in this image. The plane orientation was changed about ten times and, each time, 35 corner coordinates were found. These 10x35 points, detected by the camera and known in advance for the projector, were fed to findFundamentalMat. This seemed wrong, though, because one point in the projector had multiple matches in the camera: the projector coordinates were the same ten times over, while the camera coordinates differed for each of the ten plane orientations. Such a configuration is highly unlikely to be compatible with the algorithm, and indeed the resulting fundamental matrix was wrong, containing mostly zeros. To avoid the multiple-matches problem, we then kept only one chessboard corner per view, meaning that ten points were used instead of 350, but the fundamental matrix was wrong again.
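For intuition about why such data breaks the estimation: the classic normalized 8-point algorithm solves a homogeneous linear system with one row per correspondence, and repeated or coplanar points make that system degenerate. A numpy sketch (not OpenCV's findFundamentalMat) that recovers F from well-spread, non-coplanar synthetic points:

```python
import numpy as np

def normalized_eight_point(p1, p2):
    """Normalized 8-point fundamental matrix estimate (illustrative sketch)."""
    def normalize(p):
        c = p.mean(axis=0)
        s = np.sqrt(2.0) / np.mean(np.linalg.norm(p - c, axis=1))
        t = np.array([[s, 0.0, -s * c[0]], [0.0, s, -s * c[1]], [0.0, 0.0, 1.0]])
        return np.column_stack([p, np.ones(len(p))]) @ t.T, t
    x1, t1 = normalize(p1)
    x2, t2 = normalize(p2)
    # One row per correspondence: x2' * F * x1 = 0 is linear in the entries of F.
    a = np.column_stack([x2[:, [0]] * x1, x2[:, [1]] * x1, x1])
    f = np.linalg.svd(a)[2][-1].reshape(3, 3)      # null vector of A
    u, s, vt = np.linalg.svd(f)
    f = u @ np.diag([s[0], s[1], 0.0]) @ vt        # enforce rank 2
    f = t2.T @ f @ t1                              # undo the normalization
    return f / np.linalg.norm(f)

# Synthetic check: non-coplanar 3-D points seen by two translated cameras.
rng = np.random.default_rng(0)
pts3 = rng.uniform(-1.0, 1.0, (20, 3)) + np.array([0.0, 0.0, 5.0])
k = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
def project(pts, t):
    uvw = (pts + t) @ k.T
    return uvw[:, :2] / uvw[:, 2:]
p1 = project(pts3, np.zeros(3))
p2 = project(pts3, np.array([0.5, 0.1, 0.0]))
f = normalized_eight_point(p1, p2)
h1 = np.column_stack([p1, np.ones(20)])
h2 = np.column_stack([p2, np.ones(20)])
resid = np.abs(np.sum(h2 * (h1 @ f.T), axis=1))    # epipolar residuals
```

With repeated projector coordinates or points lying on a single plane, the rows of this system no longer determine a unique null vector, which is consistent with the failures described above.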

In both cases (i.e. pro-cam calibration and the fundamental matrix from 10 points), it is possible that the bad results were caused by a lack of variability in the data. Some tests will be carried out to verify this hypothesis, and one is nearing completion: the idea is to project textured images on a non-planar scene and try to detect matching features with ORB. By doing so, we avoid a degenerate configuration (i.e. coplanar points) and should also create enough variability in the data.

From relative to absolute phase maps

Phase unwrapping results depend on the pixel chosen as the starting point. An example is shown in the following image, where the unwrapped phase is vertically shifted. Such a phase map is said to be relative.

(Figure: shift)

This situation makes it impossible to find corresponding values between the two devices. To solve the problem, the authors of [1] inserted cross markers in the patterns to create phase references. Here again, they rely on the fact that the epipolar lines are vertical to find corresponding markers. Therefore, solving the previous problem should directly solve this one too.
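Once a marker supplies the absolute phase at a single pixel, converting a relative unwrapped map amounts to adding the right multiple of 2*pi. A minimal numpy sketch, with illustrative names:

```python
import numpy as np

def to_absolute(relative, marker_yx, marker_abs_phase):
    """Shift a relative unwrapped phase map so that the marker pixel takes
    its known absolute phase (illustrative sketch, names are hypothetical)."""
    offset = marker_abs_phase - relative[marker_yx]
    # The offset should be an integer number of 2*pi periods; round to be safe.
    k = np.round(offset / (2 * np.pi))
    return relative + 2 * np.pi * k

true = np.linspace(0.0, 8 * np.pi, 100).reshape(10, 10)
relative = true - 4 * np.pi                  # unwrapping started two periods off
absolute = to_absolute(relative, (3, 4), true[3, 4])
print(np.max(np.abs(absolute - true)))       # ~0
```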
