Discussion of parametric curves; particularly classification and implicitization
\documentclass[letterpaper,10pt]{article}
\usepackage[margin=1in]{geometry}
\usepackage{amsmath,amsthm,amssymb,tikz,pgfplots,csquotes}
\usetikzlibrary{decorations.markings}
\usepackage{graphicx} %% For including images
\usepackage{fancyvrb} %% For Verbatim environment
% http://stackoverflow.com/a/11331659/1068170
\usepackage{upquote}
\usepackage{hyperref}
\hypersetup{colorlinks=true,urlcolor=blue}
\usepackage{embedfile}
\embedfile{\jobname.tex}
\embedfile{make_images_for_tex.py}
\embedfile{stationary_pt_test.py}
\embedfile{writeup_image00.png}
\usepackage{fancyhdr}
\pagestyle{fancy}
\lhead{Danny Hermes}
\rhead{Implicitizing Coordinate-Wise Polynomial Curves}
\renewcommand{\headrulewidth}{0pt}
\renewcommand{\qed}{\(\blacksquare\)}
\begin{document}
\textbf{NOTE:} To extract the \LaTeX\ source and other sources files from
this PDF file, execute:
\begin{Verbatim}[commandchars=\\\{\}]
pdftk \jobname.pdf unpack_files output .
\end{Verbatim}
This is an algebraic problem concerning parametric curves that are used
as sides of curved mesh elements with known coordinates.
\tableofcontents
\section{Motivation} Consider the points
\[(x, y) = (0, 5), (1, 9), (4, 14)\]
along the curved edge in Figure~\ref{fig:curved-element-p-is-1}.
Given these three constraints, we'd like to fit a parabola to the points.
However, on closer inspection, a parabola is determined by a focus (two
degrees of freedom) and a directrix line (two more), so three point
constraints leave the system underdetermined: a unique parabola cannot
be found in general.
\begin{figure}
\centering
\includegraphics[scale=0.3]{writeup_image00.png}
\caption{Curved Mesh Element}
\label{fig:curved-element-p-is-1}
\end{figure}
\subsection{Enforced Uniqueness}
Instead of fitting a parabola in space, we fit each coordinate as
a parabola with respect to a shared parameter \(s\):
\begin{align*}
x &= A_2 s^2 + A_1 s + A_0 \\
y &= B_2 s^2 + B_1 s + B_0
\end{align*}
Since we think of our points geometrically as the beginning, midpoint
and end of the edge, we choose \(s = 0, \frac{1}{2}, 1\). The
coefficients of the \(x\)-parameterization then satisfy
\[\left[ \begin{array}{c c c} 1 & 0 & 0 \\ 1 & \frac{1}{2} & \frac{1}{4} \\
1 & 1 & 1 \end{array}\right]
\left[ \begin{array}{c} A_0 \\ A_1 \\ A_2 \end{array}\right] =
\left[ \begin{array}{c} 0 \\ 1 \\ 4 \end{array}\right]\]
We recognize a Vandermonde matrix, which is invertible since we chose
distinct \(s\)-values, so the system has a unique solution.
Thus by parameterizing \(x\) and \(y\) coordinate-wise as quadratics in
\(s\) (which need not trace a parabola), \fbox{we guarantee a unique set of coefficients}
that fits our three data points.
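%% A minimal numpy sketch of this fit (the array names are illustrative):
%% import numpy as np
%% V = np.array([
%%     [1.0, 0.0, 0.0],
%%     [1.0, 0.5, 0.25],
%%     [1.0, 1.0, 1.0],
%% ])
%% points = np.array([
%%     [0.0, 5.0],
%%     [1.0, 9.0],
%%     [4.0, 14.0],
%% ])
%% # Columns of the solution are (A0, A1, A2) and (B0, B1, B2).
%% coeffs = np.linalg.solve(V, points)
%% # coeffs == [[0, 5], [0, 7], [4, 2]], i.e. x = 4 s**2, y = 2 s**2 + 7 s + 5.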
\section{Example}
\subsection{Solving the System from the Motivating Example}
To find the parameters of our curve \(\gamma(s)\) we solve the system
\[\left[ \begin{array}{c c c} 1 & 0 & 0 \\ 1 & \frac{1}{2} & \frac{1}{4} \\
1 & 1 & 1 \end{array}\right]
\left[ \begin{array}{c c} A_0 & B_0 \\ A_1 & B_1 \\ A_2 & B_2 \end{array}\right] =
\left[ \begin{array}{c c} x_0 & y_0 \\ x_1 & y_1 \\ x_2 & y_2 \end{array}\right]\]
This can be solved as
\[\left[ \begin{array}{c c} A_0 & B_0 \\ A_1 & B_1 \\ A_2 & B_2 \end{array}\right] =
\left[ \begin{array}{c c c} 1 & 0 & 0 \\ -3 & 4 & -1 \\
2 & -4 & 2 \end{array}\right]
\left[ \begin{array}{c c} x_0 & y_0 \\ x_1 & y_1 \\ x_2 & y_2 \end{array}\right]\]
So with our curve, we find
\[x = 4 s^2, \quad y = 2 s^2 + 7 s + 5.\]
\subsection{Classification} Our goal was a parabola, but we don't yet know
what curve we have. To find out, we ``implicitize'' the curve (in
plain terms, we eliminate the parameter \(s\)).
Luckily, the \textbf{resultant}, an object used quite frequently in elimination
theory, can lend us a hand. The polynomial
\[f(x, y) = \operatorname{Res}_s\left(x(s) - x, y(s) - y\right)\]
is equal to zero precisely when \textbf{there exists} \(s\)\footnote{in the
algebraic closure of \(\mathbf{R}\), i.e. \(\mathbf{C}\)}
such that
\[x(s) - x = y(s) - y = 0.\]
Thus \(f(x, y) = 0\) gives a level curve in \(\mathbf{R}^2\) that
contains\footnote{We say ``contains'' since limiting to \(s \in \mathbf{R}\)
may not cover all points in \(\mathbf{R}^2\) with \(f(x, y) = 0\)}
\[\gamma(s) = (x(s), y(s)).\]
Computing the resultant (via the determinant of a Sylvester matrix) gives
\[f(x, y) = \det\left[ \begin{array}{c c c c}
4 & 0 & -x & 0 \\
0 & 4 & 0 & -x \\
2 & 7 & 5 - y & 0 \\
0 & 2 & 7 & 5 - y
\end{array}\right] = 4(x^2 - 4xy + 4y^2 - 29x - 40y + 100)\]
In general, implicit curves may be difficult to classify, but this one is a
conic section. Conics are \textbf{fully classified} by
the discriminant \(B^2 - 4AC\) of their quadratic part; here
\(A = 4\), \(B = -16\), \(C = 16\), so the discriminant
\[(-16)^2 - 4(4)(16) = 0\]
tells us our curve is actually a \textbf{parabola}.
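%% A minimal sympy sketch of this classification (symbols local to the snippet):
%% import sympy
%% x, y, s = sympy.symbols('x, y, s')
%% f = sympy.resultant(4 * s**2 - x, 2 * s**2 + 7 * s + 5 - y, s)
%% f.expand()  # 4*x**2 - 16*x*y + 16*y**2 - 116*x - 160*y + 400
%% # Discriminant B**2 - 4*A*C of the quadratic part:
%% (-16)**2 - 4 * 4 * 16  # 0, so the conic is a parabola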
\section{Degree Two: General Case}
The general question of classifying
\begin{align*}
x &= x(s) = A_2 s^2 + A_1 s + A_0 \\
y &= y(s) = B_2 s^2 + B_1 s + B_0
\end{align*}
can again be done by using the resultant to eliminate \(s\). Before,
\(f(x, y) \in \mathbf{R}\left[x, y\right]\). But now our coefficients
will come from the ring
\(R = \mathbf{R}\left[A_2, A_1, A_0, B_2, B_1, B_0\right]\), i.e.
\(f \in R\left[x, y\right]\).
\subsection{Coefficients Computation}
We want to compute the coefficients of
\[f(x, y) = \operatorname{Res}_s\left(x(s) - x, y(s) - y\right) =
\det(M)\]
for the Sylvester Matrix
\[M = M(x, y) = \left[ \begin{array}{c c c c}
A_2 & A_1 & A_0 - x & 0 \\
0 & A_2 & A_1 & A_0 - x \\
B_2 & B_1 & B_0 - y & 0 \\
0 & B_2 & B_1 & B_0 - y
\end{array}\right].\]
This also gives us a quick way of evaluating \(f(x, y)\) at a known
\((x, y)\) without computing the coefficients of \(f\) itself.
To compute those coefficients, one can write each column
containing \(x\) or \(y\) as a sum of three vectors and use the
multilinearity of the determinant in its columns.
Writing \(M(0, 0)\) as columns
\[M(0, 0) = \left[ \begin{array}{c c c c}
v_1 & v_2 & v_3 & v_4 \end{array}\right]\]
we have
\[\det\left(M(x, y)\right) = \det\left(v_1, v_2, v_3 - x b_1 - y b_3,
v_4 - x b_2 - y b_4\right)\]
where the \(b_j\) are the canonical basis vectors of
\(\mathbf{R}^4\).
We can expand each of the \(3\) terms in the third and
fourth components (via multilinearity) to get a total of \(9\)
determinants. Grouping terms we find:
%% A2, A1, A0, B2, B1, B0 = sympy.symbols('A2, A1, A0, B2, B1, B0')
%% v1 = sympy.Matrix([[A2, 0, B2, 0]]).T
%% v2 = sympy.Matrix([[A1, A2, B1, B2]]).T
%% v3 = sympy.Matrix([[A0, A1, B0, B1]]).T
%% v4 = sympy.Matrix([[0, A0, 0, B0]]).T
%% def cols_to_mat(*cols):
%%     n = len(cols)
%%     M = sympy.zeros(n)
%%     for i, col in enumerate(cols):
%%         M[:, i] = col
%%     return M
%% b1 = sympy.Matrix([[1, 0, 0, 0]]).T
%% b2 = sympy.Matrix([[0, 1, 0, 0]]).T
%% b3 = sympy.Matrix([[0, 0, 1, 0]]).T
%% b4 = sympy.Matrix([[0, 0, 0, 1]]).T
%% cols_to_mat(v1, v2, v3, v4).det()
%% cols_to_mat(v1, v2, -b1, v4).det() + cols_to_mat(v1, v2, v3, -b2).det()
%% cols_to_mat(v1, v2, -b3, v4).det() + cols_to_mat(v1, v2, v3, -b4).det()
%% cols_to_mat(v1, v2, -b1, -b2).det()
%% cols_to_mat(v1, v2, -b3, -b2).det() + cols_to_mat(v1, v2, -b1, -b4).det()
%% cols_to_mat(v1, v2, -b3, -b4).det()
\begin{align*}
\text{Const}\left(f(x, y)\right) &= \det\left(v_1, v_2, v_3, v_4\right) \\
&= \left(A_2 B_0 - A_0 B_2\right)^2 - \left(A_1 B_0 - A_0 B_1\right)\left(
A_2 B_1 - A_1 B_2\right) \\
\text{Coeff}_{x}\left(f(x, y)\right) &=
\det\left(v_1, v_2, -b_1, v_4\right) + \det\left(v_1, v_2, v_3, -b_2\right) \\
&= - A_2 B_1^2 + A_1 B_1 B_2 + 2 B_0 A_2 B_2 - 2 A_0 B_2^2 \\
\text{Coeff}_{y}\left(f(x, y)\right) &=
\det\left(v_1, v_2, -b_3, v_4\right) + \det\left(v_1, v_2, v_3, -b_4\right) \\
&= - B_2 A_1^2 + B_1 A_1 A_2 + 2 A_0 A_2 B_2 - 2 B_0 A_2^2 \\
\text{Coeff}_{x^2}\left(f(x, y)\right) &=
\det\left(v_1, v_2, -b_1, -b_2\right) \\
&= B_2^2 \\
\text{Coeff}_{xy}\left(f(x, y)\right) &=
\det\left(v_1, v_2, -b_3, -b_2\right) + \det\left(v_1, v_2, -b_1, -b_4\right) \\
&= -2 A_2 B_2 \\
\text{Coeff}_{y^2}\left(f(x, y)\right) &=
\det\left(v_1, v_2, -b_3, -b_4\right) \\
&= A_2^2
\end{align*}
In particular, the discriminant is always
\[\left(-2 A_2 B_2\right)^2 - 4\left(B_2^2\right)\left(A_2^2\right) = 0.\]
Hence a degree two parameterization \textbf{always} produces a
\textbf{parabola} (or a degenerate conic with zero
discriminant)\footnote{If \(A_2 = B_2 = 0\) the parameterization
effectively has degree one.}.
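%% The general claim can be checked with a short sympy sketch (symbol names
%% are illustrative):
%% import sympy
%% x, y, s = sympy.symbols('x, y, s')
%% A0, A1, A2, B0, B1, B2 = sympy.symbols('A0, A1, A2, B0, B1, B2')
%% f = sympy.resultant(A2 * s**2 + A1 * s + A0 - x,
%%                     B2 * s**2 + B1 * s + B0 - y, s)
%% poly = sympy.Poly(f, x, y)
%% cxx = poly.coeff_monomial(x**2)   # B2**2
%% cxy = poly.coeff_monomial(x * y)  # -2*A2*B2
%% cyy = poly.coeff_monomial(y**2)   # A2**2
%% (cxy**2 - 4 * cxx * cyy).expand()  # 0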
\section{Degree Three Examples}\label{degree3}
If we were to compute the general coefficients for degree three,
we would have more terms than can fit on this page. Instead, we
cover four examples which showcase the different types of stationary
points that an implicitized curve can exhibit.
%% See stationary_pt_test.py, case3()
\subsection{Cubic}\label{degree3-cubic}
Given
\begin{align*}
x = x(s) &= s^3 + s^2 + s + 1 \\
y = y(s) &= 2s^3 + 2s^2 + 1
\end{align*}
we have
\begin{align*}
f(x, y) &= \det\left[ \begin{array}{c c c c c c}
1 & 1 & 1 & 1 - x & 0 & 0 \\
0 & 1 & 1 & 1 & 1 - x & 0 \\
0 & 0 & 1 & 1 & 1 & 1 - x \\
2 & 2 & 0 & 1 - y & 0 & 0 \\
0 & 2 & 2 & 0 & 1 - y & 0 \\
0 & 0 & 2 & 2 & 0 & 1 - y
\end{array}\right] \\
&= 8 x^3 - 12 x^2 y + 6 x y^2 - y^3 - 4 x^2 + 4 x y - y^2 - 2 x - 3 y + 5 \\
&= -(y - 2x)^3 - (y - 2x)^2 - 2x - 3y + 5.
\end{align*}
To find the stationary points where
\[f = f_x = f_y = 0\]
we first find where \(f_x = f_y = 0\). Using the resultant to eliminate
\(x\) gives
\[\operatorname{Res}_x\left(f_x, f_y\right) = 9216,\]
a nonzero constant, so there are no solutions.
Hence \(f(x, y)\) has no stationary points. Via the coordinate transformation
\[x_1 = y - 2x, \quad y_1 = - 4y\]
our curve becomes
\[0 = -f = x_1^3 + x_1^2 - x_1 + 4y - 5 \Longrightarrow
y_1 = x_1^3 + x_1^2 - x_1 - 5.\]
Thus we take that cubic and transform it
\[\left[ \begin{array}{c c} -2 & 1 \\ 0 & -4 \end{array}\right]
\left[ \begin{array}{c} x \\ y \end{array}\right] =
\left[ \begin{array}{c} x_1 \\ y_1 \end{array}\right]
\Longleftrightarrow
-\frac{1}{8} \left[ \begin{array}{c c} 4 & 1 \\ 0 & 2 \end{array}\right]
\left[ \begin{array}{c} x_1 \\ y_1 \end{array}\right] =
\left[ \begin{array}{c} x \\ y \end{array}\right]\]
via two rotations and a stretch (via the SVD):
\[-\frac{1}{8} \left[ \begin{array}{c c}
4 & 1 \\ 0 & 2 \end{array}\right] \approx
\left[ \begin{array}{c c}
0.9889 & -0.1487 \\
0.1487 & 0.9889\end{array}\right]
\left[ \begin{array}{c c}
0.5199 & 0 \\
0 & 0.2404\end{array}\right]
\left[ \begin{array}{c c}
-0.9510 & 0.3092 \\
-0.3092 & -0.9510
\end{array}\right]^T.\]
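%% The SVD factors can be reproduced numerically; numpy may flip the signs of
%% corresponding singular-vector pairs:
%% import numpy as np
%% M = -np.array([[4.0, 1.0], [0.0, 2.0]]) / 8.0
%% U, sigma, Vt = np.linalg.svd(M)
%% # sigma is approximately [0.5199, 0.2404] and
%% # U.dot(np.diag(sigma)).dot(Vt) reconstructs M.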
%% See stationary_pt_test.py, case4()
\subsection{Isolated Point}\label{degree3-isolated}
Given
\begin{align*}
x = x(s) &= s^3 + s^2 + s + 1 \\
y = y(s) &= 2s^3 + 3s^2 + 1
\end{align*}
we can show
\begin{align*}
f(x, y) &= 8 x^3 - 12 x^2 y + 6 x y^2 - y^3 + 9 x^2 - 2 y^2 - 24 x + 16 \\
&= (2x - y)^3 + 9 x^2 - 2 y^2 - 24 x + 16.
\end{align*}
To find stationary points, we again use the resultants:
\begin{align*}
\operatorname{Res}_x\left(f_x, f_y\right) &= 144(y + 12)(y + 48) \\
\operatorname{Res}_y\left(f_x, f_y\right) &= 36(x + 4)(x + 20)
\end{align*}
Of these four choices (two each for \(x, y\)), only
\(f(-20, -48) = 0\).
Solving
\begin{align*}
x(s) = x &= -20 \Longrightarrow s \in \left\{-3, 1 \pm i \sqrt{6}
\right\} \\
y(s) = y &= -48 \Longrightarrow s \in \left\{-\frac{7}{2},
1 \pm i \sqrt{6} \right\}
\end{align*}
so we see
\[(x, y) = (-20, -48) \Longleftrightarrow s = 1 \pm i \sqrt{6}.\]
In particular, this point occurs on the level curve \(f(x, y) = 0\) as an
\textbf{isolated point}, but does not occur on our real
parameterized curve \(\gamma(s) = (x(s), y(s))\).
Notice that the Hessian at this point is
\[H = \left[ \begin{array}{c c}
f_{xx} & f_{xy} \\
f_{yx} & f_{yy}
\end{array}\right] = \left[ \begin{array}{c c}
210 & -96 \\
-96 & 44
\end{array}\right]\]
with \(\det(H) = 24 > 0\) and \(f_{xx} = 210 > 0\), so \(H\) is positive
definite, which indicates that \((-20, -48)\) is a relative
minimum of \(z = f(x, y)\).
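%% A minimal sympy check of the roots and the Hessian (symbols local to the
%% snippet):
%% import sympy
%% x, y, s = sympy.symbols('x, y, s')
%% x_s = s**3 + s**2 + s + 1
%% y_s = 2 * s**3 + 3 * s**2 + 1
%% sympy.solve(x_s + 20, s)  # roots -3 and 1 +/- sqrt(6)*I
%% sympy.solve(y_s + 48, s)  # roots -7/2 and 1 +/- sqrt(6)*I
%% f = (2 * x - y)**3 + 9 * x**2 - 2 * y**2 - 24 * x + 16
%% H = sympy.Matrix([[f.diff(x, 2), f.diff(x, y)],
%%                   [f.diff(x, y), f.diff(y, 2)]]).subs({x: -20, y: -48})
%% H, H.det()  # Matrix([[210, -96], [-96, 44]]), 24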
%% See stationary_pt_test.py, case8()
\subsection{Cusp}\label{degree3-cusp}
Given
\begin{align*}
x = x(s) &= 2s^3 - 3s^2 - 12s \\
y = y(s) &= -s^3 + 2s^2 + 4s
\end{align*}
we can show
\begin{align*}
f(x, y) &= -x^3 - 6 x^2 y - 12 x y^2 - 8 y^3
+ 4 x^2 + 24 x y + 33 y^2 + 16 x + 48 y \\
&= -(x + 2y)^3 + 4 x^2 + 24 x y + 33 y^2 + 16 x + 48 y.
\end{align*}
To find stationary points, we again use the resultants:
\begin{align*}
\operatorname{Res}_x\left(f_x, f_y\right) &= 36(y - 8)^2 \\
\operatorname{Res}_y\left(f_x, f_y\right) &= 144(x + 20)^2
\end{align*}
The lone candidate \((-20, 8)\) also satisfies \(f(-20, 8) = 0\).
Solving
\begin{align*}
x(s) = x &= -20 \Longrightarrow s \in \left\{-\frac{5}{2}, 2, 2
\right\} \\
y(s) = y &= 8 \Longrightarrow s \in \left\{-2, 2, 2\right\}
\end{align*}
so we see
\[(x, y) = (-20, 8) \Longleftrightarrow
s = 2 \text{ (double root)}.\]
Since \(x'(s) = 6(s - 2)(s + 1)\) and \(y'(s) =
-(s - 2)(3s + 2)\), on our path \(\gamma(s)\) we have
\[\left.\frac{d\gamma}{ds}\right|_{s=2} = (0, 0).\]
Hence no tangent can be defined at this point, so it is
a \textbf{cusp}.
The Hessian at this point is
\[H = \left[ \begin{array}{c c}
32 & 72 \\
72 & 162
\end{array}\right] =
\left[ \begin{array}{c c}
4 & -9 \\
9 & 4
\end{array}\right]
\left[ \begin{array}{c c}
194 & \\
& 0
\end{array}\right]
\left[ \begin{array}{c c}
4 & -9 \\
9 & 4
\end{array}\right]^{-1}.\]
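%% A minimal sympy check of the cusp (symbols local to the snippet):
%% import sympy
%% s = sympy.Symbol('s')
%% x_s = 2 * s**3 - 3 * s**2 - 12 * s
%% y_s = -s**3 + 2 * s**2 + 4 * s
%% x_s.diff(s).factor(), y_s.diff(s).factor()  # both vanish at s = 2
%% x_s.diff(s).subs({s: 2}), y_s.diff(s).subs({s: 2})  # (0, 0)
%% sympy.Matrix([[32, 72], [72, 162]]).eigenvals()  # {194: 1, 0: 1}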
%% See stationary_pt_test.py, case6()
\subsection{Self-Crossing}\label{degree3-self-crossing}
Given
\begin{align*}
x = x(s) &= -2s^3 - 2s^2 \\
y = y(s) &= -s^3 + s^2 + s + 1
\end{align*}
we can show
\begin{align*}
f(x, y) &= -x^3 + 6 x^2 y - 12 x y^2 + 8 y^3 -
20 x^2 + 24 x y - 32 y^2 - 16 x + 40 y - 16 \\
&= (2y - x)^3 - 20 x^2 + 24 x y - 32 y^2 - 16 x + 40 y - 16
\end{align*}
To find stationary points, we again use the resultants:
\begin{align*}
\operatorname{Res}_x\left(f_x, f_y\right) &= 192(8y - 11)(96y - 97) \\
\operatorname{Res}_y\left(f_x, f_y\right) &= 3072(4x + 1)(48x + 7)
\end{align*}
Of these four choices (two each for \(x, y\)), only
\(f\left(-\frac{1}{4}, \frac{11}{8}\right) = 0\).
Solving
\begin{align*}
x(s) = x &= -\frac{1}{4} \Longrightarrow s \in \left\{-\frac{1}{2},
\frac{-1 \pm \sqrt{5}}{4}\right\} \\
y(s) = y &= \frac{11}{8} \Longrightarrow s \in \left\{\frac{3}{2},
\frac{-1 \pm \sqrt{5}}{4}\right\}
\end{align*}
so we see
\[(x, y) = \left(-\frac{1}{4}, \frac{11}{8}\right)
\Longleftrightarrow s = \frac{-1 \pm \sqrt{5}}{4}.\]
Since we have two valid real inputs which produce the same point,
this point must be a \textbf{self-crossing} for \(\gamma(s)\).
At these points we have
\[\left.\frac{d\gamma}{ds}\right|_{s=\frac{-1 \pm \sqrt{5}}{4}} = \left(
\frac{-5 \mp \sqrt{5}}{4}, \frac{-5 \pm 7 \sqrt{5}}{8}\right) \Longrightarrow
\left.\frac{dy}{dx}\right|_{s=\frac{-1 \pm \sqrt{5}}{4}} = \frac{3}{2}
\mp \sqrt{5}.\]
So we see two things. First, at the stationary point the tangent
slope along \(\gamma(s)\) is still well defined (one slope for each
branch). Second, we cannot solve uniquely for \(s\) in
\(\gamma(s) = \left(-\frac{1}{4}, \frac{11}{8}\right)\); this is
exactly the failure of the implicit function theorem at this point.
The Hessian at this point is
\[H = \left[ \begin{array}{c c}
-22 & -12 \\
-12 & 8
\end{array}\right]\]
with \(\det(H) = -320\), hence it is indefinite and
\(\left(-\frac{1}{4}, \frac{11}{8}\right)\) must be a
\textbf{saddle point} on the graph of \(z = f(x, y)\).
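%% A minimal sympy check of the crossing parameters and slopes (names are
%% illustrative):
%% import sympy
%% s = sympy.Symbol('s')
%% x_s = -2 * s**3 - 2 * s**2
%% y_s = -s**3 + s**2 + s + 1
%% common = sympy.gcd(4 * x_s + 1, 8 * y_s - 11)  # proportional to 4*s**2 + 2*s - 1
%% crossing = sympy.solve(common, s)              # the two values (-1 +/- sqrt(5))/4
%% dxds, dyds = x_s.diff(s), y_s.diff(s)
%% [float(dyds.subs({s: r}) / dxds.subs({s: r})) for r in crossing]
%% # approximately {-0.7361, 3.7361}, i.e. 3/2 -/+ sqrt(5)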
%% FUN 3D PLOT
%% x, y, s, f_symb = sympy.symbols('x, y, s, f')
%% x_s, y_s, f, solns = case6(x, y, s, f_symb)
%% s_vals = np.linspace(-1, 0.5, 1000)
%% x_vals = eval(str(x_s), {}, {'s': s_vals})
%% y_vals = eval(str(y_s), {}, {'s': s_vals})
%% plt.plot(x_vals, y_vals)
%% plt.show()
%% from mpl_toolkits.mplot3d import Axes3D
%% X = np.linspace(-0.8, 0.1, 201)
%% Y = np.linspace(0.8, 2.0, 201)
%% X, Y = np.meshgrid(X, Y)
%% Z = eval(str(f), {}, {'x': X, 'y': Y})
%% fig = plt.figure()
%% ax = fig.gca(projection='3d')
%% from matplotlib import cm
%% surf = ax.plot_surface(X, Y, Z,
%%                        rstride=1, cstride=1, cmap=cm.coolwarm,
%%                        linewidth=0, antialiased=False)
%% Z2 = np.zeros(X.shape)
%% surf2 = ax.plot_surface(X, Y, Z2,
%%                         rstride=1, cstride=1, cmap=cm.coolwarm,
%%                         linewidth=0, antialiased=False)
%% ax.set_zlim(-8.2, 6.1)
%% ax.view_init(elev=0, azim=30)
%% plt.show()
\section{Classifying Without Implicitizing}
In the general case we have
\begin{align*}
x(s) &= A_d s^d + \cdots \\
y(s) &= B_e s^e + \cdots
\end{align*}
where the degrees \(d, e\) are non-negative integers. We assume that
one of \(d, e\) is greater than \(1\); if both \(d, e < 2\) we simply
have lines or points, which are easy to classify (lines have
no stationary points).
Consider a stationary
point \((x_0, y_0)\) (assuming one exists). By definition of the
resultant we have
\[F_1(x) = f(x, y_0) =
(-1)^{de} B_e^d \prod_{\sigma \, : \, y\left(\sigma\right) = y_0}
\left[x\left(\sigma\right) - x\right]\]
where the \(\sigma\) are roots (with multiplicity) of \(y(s) - y_0 = 0\).
We know that for our stationary point
\[F_1(x_0) = f(x_0, y_0) = 0, \quad F_1'(x_0) = f_x(x_0, y_0) = 0\]
hence \(F_1(x)\) has (at least) a double\footnote{WLOG assume
\(e \geq 2\) since we could just swap \(x, y\) if need be} root at \(x = x_0\).
This means that (at least) two roots (counting multiplicity) of
\(y(s) - y_0\) produce
\[x\left(\sigma\right) = x_0, \; x\left(\sigma'\right) = x_0, \;
y\left(\sigma\right) = y_0, \; y\left(\sigma'\right) = y_0.\]
In other words, the stationary points occur
at places where the parameterization \(\gamma(s) = (x(s), y(s))\) is
non-unique. This makes sense; we know that stationary points are places
where the implicit function theorem fails for both \(x\) and \(y\)
since \(\nabla f = 0\).
In order to find values \(s \in \mathbf{C}\) where \(\gamma(s)\) is
non-unique we define
\[X(s, t) = \frac{x(s) - x(t)}{s - t}, \quad
Y(s, t) = \frac{y(s) - y(t)}{s - t}.\]
Setting \(X(s, t) = Y(s, t) = 0\) gives
non-trivial\footnote{Solving \(x(s) - x(t) = 0\) for \(t\) with \(s\) fixed gives
roots \(t\) that are functions of \(s\). In our definition of
\(X(s, t)\) we divided by \(s - t\) to eliminate the trivial
root function \(r(s) = s\). Taylor's
Theorem says
\(x(s) = x(t) + (s - t) x'(t) + (s - t)^2 x''(t) / 2 + \cdots\) so
unless \(x'(t) \equiv 0 \Longleftrightarrow x(t) \equiv C\) we know that
\((s - t)^2\) does not
divide \(x(s) - x(t)\) for all values of \(t\) hence
the root function \(r(s) = s\) occurs exactly once
among the roots of \(x(s) - x(t)\).}
pairs \((s, t)\) where \(x(s) = x(t)\) and \(y(s) = y(t)\). By construction
we have the symmetry \(X(s, t) = X(t, s)\) and \(Y(s, t) = Y(t, s)\).
Due to the symmetry, the polynomials
\[I(s) = I_1(s) = \operatorname{Res}_t\left(X(s, t), Y(s, t)\right) \quad
\text{and} \quad I_2(t) = \operatorname{Res}_s\left(X(s, t), Y(s, t)\right)\]
must be identical (up to the name of the variable). So the roots of either
one locate the parameter values at which stationary points can occur (or
show that none exist).
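%% This construction fits in a small sympy helper; the function name below is
%% just illustrative:
%% import sympy
%% def stationary_parameter_poly(x_s, y_s, s):
%%     """Return I(s) = Res_t(X(s, t), Y(s, t)) for polynomials x_s, y_s in s."""
%%     t = sympy.Symbol('t')
%%     X = sympy.cancel((x_s - x_s.subs({s: t})) / (s - t))
%%     Y = sympy.cancel((y_s - y_s.subs({s: t})) / (s - t))
%%     return sympy.resultant(X, Y, t)
%% # For example, the cusp example below:
%% s = sympy.Symbol('s')
%% stationary_parameter_poly(2 * s**3 - 3 * s**2 - 12 * s,
%%                           -s**3 + 2 * s**2 + 4 * s, s).factor()  # (s - 2)**2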
\subsection{Cubic}
Re-visiting the example from~\ref{degree3-cubic}, we have
\begin{align*}
X(s, t) &= (s^2 + s t + t^2) + (s + t) + 1 \\
Y(s, t) &= 2 (s^2 + s t + t^2) + 2(s + t) \\
\operatorname{Res}_t\left(X(s, t), Y(s, t)\right) &= 4
\end{align*}
This confirms the absence of a stationary point.
However, if we convert to homogeneous coordinates, setting \(s = S/U\),
\(t = T/U\) and clearing denominators, we get
\begin{align*}
x(S, T, U) &= (S^2 + S T + T^2) + (S + T) U + U^2 \\
y(S, T, U) &= 2 (S^2 + S T + T^2) + 2(S + T) U \\
\operatorname{Res}_S\left(x, y\right) &= 4 U^4 \\
\operatorname{Res}_T\left(x, y\right) &= 4 U^4 \\
\operatorname{Res}_U\left(x, y\right) &= 4\left(S^2 + ST + T^2\right)^2
\end{align*}
This has two solutions (both points at infinity)
\(\left(1:\omega:0\right)\) and \(\left(\omega:1:0\right)\)
where \(\omega\) is one of the roots of
\[\omega^2 + \omega + 1 = 0.\]
\subsection{Isolated Point}
Re-visiting the example from~\ref{degree3-isolated}, we have
\begin{align*}
X(s, t) &= (s^2 + s t + t^2) + (s + t) + 1 \\
Y(s, t) &= 2 (s^2 + s t + t^2) + 3 (s + t) \\
\operatorname{Res}_t\left(X(s, t), Y(s, t)\right) &= (s - 1)^2 + 6
\end{align*}
This confirms the stationary point at \(s = 1 \pm i \sqrt{6}\)
and
\[X\left(1 \pm i \sqrt{6}, t\right) = (t + 3)\left(t -
\left[1 \mp i \sqrt{6}\right]\right), \qquad
Y\left(1 \pm i \sqrt{6}, t\right) = (2t + 7)\left(t -
\left[1 \mp i \sqrt{6}\right]\right).\]
\subsection{Cusp}
Re-visiting the example from~\ref{degree3-cusp}, we have
\begin{align*}
X(s, t) &= 2 (s^2 + s t + t^2) - 3 (s + t) - 12 \\
Y(s, t) &= -(s^2 + s t + t^2) + 2 (s + t) + 4 \\
\operatorname{Res}_t\left(X(s, t), Y(s, t)\right) &= (s - 2)^2
\end{align*}
This confirms the stationary point at \(s = 2\) and
\[X(2, t) = (t - 2)(2t + 5), \qquad Y(2, t) = -(t - 2)(t + 2).\]
\subsection{Self-Crossing}
Re-visiting the example from~\ref{degree3-self-crossing}, we have
\begin{align*}
X(s, t) &= -2 (s^2 + s t + t^2) - 2(s + t) \\
Y(s, t) &= -(s^2 + s t + t^2) + (s + t) + 1 \\
\operatorname{Res}_t\left(X(s, t), Y(s, t)\right) &= (4 s + 1)^2 - 5
\end{align*}
This confirms the stationary point at \(s = \frac{-1 \pm \sqrt{5}}{4}\)
and
\[X\left(\frac{-1 \pm \sqrt{5}}{4}, t\right) = -(2t + 1)\left(t -
\left[\frac{-1 \mp \sqrt{5}}{4}\right]\right), \qquad
Y\left(\frac{-1 \pm \sqrt{5}}{4}, t\right) = -\left(t - \frac{3}{2}\right)\left(t -
\left[\frac{-1 \mp \sqrt{5}}{4}\right]\right).\]
\subsection{Disjoint Pieces of Implicit Curve}
Since the polynomials \(x(s), y(s)\) are continuous, the image
\(\gamma\left(\mathbf{R}\right)\) is a connected curve. We'd like to be
able to show that the only points that can be produced by
\(s \in \mathbf{C} - \mathbf{R}\) are \textbf{isolated points}.
In order to have \textbf{another} curve \textbf{disjoint} from
\(\gamma\left(\mathbf{R}\right)\), we'd need
a set of inputs \(s \in \mathbf{C} - \mathbf{R}\) which produce that
curve. We'd like to show that for such \(s = a + bi\) with \(b \neq 0\), there
are finitely many pairs \((a, b)\) such that
\[\operatorname{Im}\left(x(a + bi)\right) =
\operatorname{Im}\left(y(a + bi)\right) = 0.\]
Unfortunately,
this is easily shown to be false. The curve \(x(s) = s^2, y(s) = s^4\)
implicitizes to \(f(x, y) = \left(y - x^2\right)^2\), yet
\(\gamma\left(\mathbf{R}\right)\) is only the right half of
the parabola \(y = x^2\) while
\(\gamma\left(i \mathbf{R}\right)\) is the left half. In
this example, however, \(f(x, y)\) is not square-free, so
\(f = 0 \Longrightarrow f_x = f_y = 0\)
and every point of the curve is a stationary point. To see this another
way, note that \(X(s, t) = s + t\) divides
\(Y(s, t) = X(s, t) (s^2 + t^2)\), which forces
\(I(s) \equiv 0\).
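%% A minimal sympy check of this degenerate example:
%% import sympy
%% s, t, x, y = sympy.symbols('s, t, x, y')
%% sympy.resultant(s**2 - x, s**4 - y, s).factor()  # equals (y - x**2)**2
%% X = sympy.cancel((s**2 - t**2) / (s - t))        # s + t
%% Y = sympy.cancel((s**4 - t**4) / (s - t))        # equals (s + t)*(s**2 + t**2)
%% sympy.resultant(X, Y, t)                         # 0, so I(s) is identically zero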
From the Wikipedia
\href{https://en.wikipedia.org/wiki/Algebraic_curve#Singular_points}{page}
on algebraic curves:
\begin{displayquote}
The singular points of a curve defined by a polynomial
\(f(x, y)\) are the solutions of the system of equations:
\[f(x, y) = f_x(x, y) = f_y(x, y) = 0.\]
This implies that the number of singular points is finite as
long as \(f(x, y)\) is square-free.
\end{displayquote}
The page goes on to count the number of possible singular points and
says B\'{e}zout's theorem justifies this argument, but the theorem
can only be applied when \(f(x, y)\) is square-free.
Now suppose we can factor
\[f(x, y) = A(x, y) B(x, y)\]
with \(\gcd\left(A, B\right) = 1\), and let
\(g(x, y) = \gcd\left(f, f_x\right)\).
Since \(g \mid f\), we know \(g(x, y) = A_g(x, y) B_g(x, y)\) for some
\(A_g \mid A\) and \(B_g \mid B\) (since polynomial rings are UFDs). Since
\(\gcd(B, A_g) = 1\) and \(A_g \mid A\) we must have
\[0 \equiv f_x \equiv A_x B + A B_x \equiv A_x B \bmod{A_g} \Longrightarrow
A_x \equiv 0 \bmod{A_g}.\]
Thus we see that \(A_g \mid \gcd\left(A, A_x\right)\) and
\(B_g \mid \gcd\left(B, B_x\right)\). Since
\(\gcd\left(A, A_x\right) \mid f_x\) as well we see that
\(\gcd\left(A, A_x\right) \mid A_g\) hence
\[\gcd\left(f, f_x\right) = \gcd\left(A, A_x\right)
\gcd\left(B, B_x\right).\]
When \(f(x, y) = p(x, y)^n\) is a power of an irreducible, we may take
\(B(x, y) = 1\). Since \(f_x = n p^{n - 1} p_x\) we see that
\(\gcd(f, f_x) = p^{n - 1} \cdot \gcd(p, p_x) = p^{n - 1}\) (since
\(\deg_x p_x < \deg_x p\) and \(p\) is irreducible, \(\gcd(p, p_x) = 1\)).
Thus, when we write \(f\) as a product of powers of irreducibles
\[f = p_1^{n_1} p_2^{n_2} \cdots p_k^{n_k} \Longrightarrow
\gcd(f, f_x) = p_1^{n_1 - 1} p_2^{n_2 - 1} \cdots p_k^{n_k - 1}.\]
In particular, if \(f\) is square-free (i.e. \(n_1 = n_2 = \cdots = n_k = 1\))
then \(\boxed{\gcd(f, f_x) = 1}\).
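%% A minimal sympy check of the gcd statement on two earlier examples:
%% import sympy
%% x, y = sympy.symbols('x, y')
%% f1 = (2 * x - y)**3 + 9 * x**2 - 2 * y**2 - 24 * x + 16  # square-free
%% sympy.gcd(f1, f1.diff(x))  # 1
%% f2 = (y - x**2)**2         # not square-free
%% sympy.gcd(f2, f2.diff(x))  # y - x**2 (up to sign)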
\section{Parameterizing an Algebraic Curve}
An algebraic curve admits a rational parameterization if and only if
it has genus zero (reference needed).
\subsection{Isolated Point} Re-(re-)visiting the example
from~\ref{degree3-isolated}
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> f = (2 * x - y)**3 + 9 * x**2 - 2 * y**2 - 24 * x + 16
%% >>> sympy.resultant(f.diff(x), f.diff(y), y).factor()
%% 36*(x + 4)*(x + 20)
%% >>> sympy.resultant(f.diff(x), f.diff(y), x).factor()
%% 144*(y + 12)*(y + 48)
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: -4, y: -12})
%% Matrix([[32, 0, 0, 114]])
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: -4, y: -48})
%% Matrix([[59648, 9504, -4608, 978]])
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: -20, y: -12})
%% Matrix([[-18144, 4320, -2304, -654]])
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: -20, y: -48})
%% Matrix([[0, 0, 0, 210]])
\[f(x, y) = (2x - y)^3 + 9 x^2 - 2 y^2 - 24 x + 16\]
we have already seen a unique singular point
\[f(-20, -48) = f_x(-20, -48) = f_y(-20, -48) = 0.\]
Since \(f_{xx}(-20, -48) = 210\), this is only a double point,
hence the genus for our degree \(d = 3\) curve is
\[g = \binom{d - 1}{2} - \binom{2}{2} = 0.\]
\subsection{Cusp} Re-(re-)visiting the example from~\ref{degree3-cusp}
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> f = -(x + 2 * y)**3 + 4 * x**2 + 24 * x * y + 33 * y**2 + 16 * x + 48 * y
%% >>> sympy.resultant(f.diff(x), f.diff(y), y).factor()
%% 144*(x + 20)**2
%% >>> sympy.resultant(f.diff(x), f.diff(y), x).factor()
%% 36*(y - 8)**2
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: -20, y: 8})
%% Matrix([[0, 0, 0, 32]])
\[f(x, y) = -(x + 2y)^3 + 4 x^2 + 24 x y + 33 y^2 + 16 x + 48 y\]
we have already seen a unique singular point
\[f(-20, 8) = f_x(-20, 8) = f_y(-20, 8) = 0.\]
Since \(f_{xx}(-20, 8) = 32\), this is only a double point,
hence the genus for our degree \(d = 3\) curve is
\[g = \binom{d - 1}{2} - \binom{2}{2} = 0.\]
\subsection{Self-Crossing} Re-(re-)visiting the example
from~\ref{degree3-self-crossing}
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> f = (2 * y - x)**3 - 20 * x**2 + 24 * x * y - 32 * y**2 - 16 * x + 40 * y - 16
%% >>> sympy.resultant(f.diff(x), f.diff(y), y).factor()
%% 3072*(4*x + 1)*(48*x + 7)
%% >>> sympy.resultant(f.diff(x), f.diff(y), x).factor()
%% 192*(8*y - 11)*(96*y - 97)
%% >>> x1 = sympy.Rational(-1, 4)
%% >>> x2 = sympy.Rational(-7, 48)
%% >>> y1 = sympy.Rational(11, 8)
%% >>> y2 = sympy.Rational(97, 96)
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: x1, y: y2})
%% Matrix([[15925/110592, 2135/768, 35/128, -211/8]])
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: x2, y: y1})
%% Matrix([[-13325/110592, -595/256, -455/384, -181/8]])
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: x2, y: y2})
%% Matrix([[125/432, 0, 0, -27]])
%% >>> sympy.Matrix([[f, f.diff(x), f.diff(y), f.diff(x, 2)]]).subs({x: x1, y: y1})
%% Matrix([[0, 0, 0, -22]])
\[f(x, y) = (2y - x)^3 - 20 x^2 + 24 x y - 32 y^2 - 16 x + 40 y - 16\]
we have already seen a unique singular point
\[f\left(-\frac{1}{4}, \frac{11}{8}\right) =
f_x\left(-\frac{1}{4}, \frac{11}{8}\right) =
f_y\left(-\frac{1}{4}, \frac{11}{8}\right) = 0.\]
Since \(f_{xx}\left(-\frac{1}{4}, \frac{11}{8}\right) = -22 \neq 0\), this is
only a double point,
hence the genus for our degree \(d = 3\) curve is
\[g = \binom{d - 1}{2} - \binom{2}{2} = 0.\]
\subsection{Cubic} Re-(re-)visiting the example
from~\ref{degree3-cubic}
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> f = -(y - 2 * x)**3 - (y - 2 * x)**2 - 2 * x - 3 * y + 5
%% >>> sympy.resultant(f.diff(x), f.diff(y), y).factor()
%% 576
%% >>> X, Y, Z = sympy.symbols('X, Y, Z')
%% >>> g = (f.subs({x: X / Z, y: Y / Z}) * Z**3).factor()
%% >>> g_XY = sympy.resultant(g.diff(X), g.diff(Y), Z).factor()
%% >>> g_XY
%% 576*(2*X - Y)**4
%% >>> M = sympy.Matrix([[g, g.diff(X), g.diff(Y), g.diff(Z)]])
%% >>> sympy.Matrix([map(sympy.factor, M.subs({Y: 2 * X}))])
%% Matrix([[-Z**2*(8*X - 5*Z), -2*Z**2, -3*Z**2, -Z*(16*X - 15*Z)]])
%% >>> M.subs({Y: 2 * X, Z: 0})
%% Matrix([[0, 0, 0, 0]])
%% >>> g.diff(Z, 2).subs({Y: 2 * X, Z: 0})
%% -16*X
\[f(x, y) = -(y - 2x)^3 - (y - 2x)^2 - 2x - 3y + 5\]
we have already seen there is no affine singular point since
\[\operatorname{Res}_x\left(f_x, f_y\right) = 9216 \neq 0.\]
This would seem to indicate that our degree \(d = 3\) curve has
genus \(g = \binom{d - 1}{2} > 0\). However, we \textbf{know}
that \(f(x, y)\) can be parameterized (we constructed it by
\textbf{starting} from a parameterization).
This is because the double point is a point at infinity. To
see this, we convert to a homogeneous
\[g(X, Y, Z) = Z^3 f\left(\frac{X}{Z}, \frac{Y}{Z}\right) =
-(Y - 2X)^3 - (Y - 2X)^2 Z - 2X Z^2 - 3 Y Z^2 + 5 Z^3.\]
Noting that
\[\operatorname{Res}_Z\left(g_X, g_Y\right) = 576 (2 X - Y)^4\]
we search for our double points among
\begin{align*}
g(X, 2X, Z) &= Z^2 (5 Z - 8 X) \\
g_X(X, 2X, Z) &= -2 Z^2 \\
g_Y(X, 2X, Z) &= -3 Z^2 \\
g_Z(X, 2X, Z) &= Z (15 Z - 16 X)
\end{align*}
and we see that \(Z = 0\). So, our double point is
\((1:2:0)\). Luckily, it is not a triple point since
\[g_{ZZ}(1, 2, 0) = -16.\]
Thus, our double point gives a genus
\[g = \binom{d - 1}{2} - \binom{2}{2} = 0.\]
\subsection{Non-Example: Failure} All the previous examples
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> f = x**3 - x - y**2
%% >>> X, Y, Z = sympy.symbols('X, Y, Z')
%% >>> g = (f.subs({x: X / Z, y: Y / Z}) * Z**3).factor()
%% >>> g_XY = sympy.resultant(g.diff(X), g.diff(Y), Z).factor()
%% >>> g_XY
%% 12*X**2*Y**2
%% >>> g_XZ = sympy.resultant(g.diff(X), g.diff(Z), Z).factor()
%% >>> g_XZ
%% 12*X**4 - Y**4
%% >>> M = sympy.Matrix([[g, g.diff(X), g.diff(Y), g.diff(Z)]])
%% >>> M.subs({X: 0, Y: 0, Z: 1})
%% Matrix([[0, -1, 0, 0]])
we considered were from \textbf{existing} parameterizations.
Instead, consider an elliptic curve:
\[f(x, y) = x^3 - x - y^2.\]
To make sure we don't miss a singular point consider:
\[g(X, Y, Z) = Z^3 f\left(\frac{X}{Z}, \frac{Y}{Z}\right) =
X^3 - X Z^2 - Y^2 Z.\]
Noting that
\[\operatorname{Res}_Z\left(g_X, g_Y\right) = 12 X^2 Y^2 \quad
\text{and} \quad
\operatorname{Res}_Z\left(g_X, g_Z\right) = 12 X^4 - Y^4\]
we must have \(X = Y = 0\) at any singular point, i.e. the only candidate
is \((0:0:1)\). However
\[g(0, 0, 1) = g_Y(0, 0, 1) = g_Z(0, 0, 1) = 0 \quad \text{but}
\quad g_X(0, 0, 1) = -1.\]
Hence there are no singular points and the curve has
genus
\[g = \binom{3 - 1}{2} = 1.\]
\subsection{Generic Cubic} Consider
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> A, B, C, D = sympy.symbols('A, B, C, D')
%% >>> f = A * x**3 + B * x**2 + C * x + D - y
%% >>> X, Y, Z = sympy.symbols('X, Y, Z')
%% >>> g = (f.subs({x: X / Z, y: Y / Z}) * Z**3).factor()
%% >>> g_XY = sympy.resultant(g.diff(X), g.diff(Y), Z).factor()
%% >>> g_XY
%% 9*A**2*X**4
%% >>> M = sympy.Matrix([[g, g.diff(X), g.diff(Y), g.diff(Z)]])
%% >>> sympy.Matrix([map(sympy.factor, M.subs({X: 0}))])
%% Matrix([[Z**2*(D*Z - Y), C*Z**2, -Z**2, Z*(3*D*Z - 2*Y)]])
%% >>> g.diff(Z, 2).subs({X: 0, Y: 1, Z: 0})
%% -2
\[f(x, y) = A x^3 + B x^2 + C x + D - y \Longleftrightarrow
g(X, Y, Z) = A X^3 + B X^2 Z + C X Z^2 + D Z^3 - Y Z^2.\]
Since
\[\operatorname{Res}_Z\left(g_X, g_Y\right) = 9 A^2 X^4\]
we must have \(X = 0\) (assuming \(A \neq 0\)), which gives
\begin{align*}
g(0, Y, Z) &= Z^2 (D Z - Y) \\
g_X(0, Y, Z) &= C Z^2 \\
g_Y(0, Y, Z) &= -Z^2 \\
g_Z(0, Y, Z) &= Z (3 D Z - 2 Y).
\end{align*}
Hence this always has a singular point at
\((0:1:0)\). It is a double (not triple) point since
\(g_{ZZ}(0, 1, 0) = -2\),
so the genus is
\[g = \binom{3 - 1}{2} - \binom{2}{2} = 0.\]
This can be confirmed by the simple parameterization
\[x(s) = s, \quad y(s) = A s^3 + B s^2 + C s + D.\]
\subsection{Generic Quartic} Consider
%% >>> import sympy
%% >>> x, y = sympy.symbols('x, y')
%% >>> A, B, C, D, E = sympy.symbols('A, B, C, D, E')
%% >>> f = A * x**4 + B * x**3 + C * x**2 + D * x + E - y
%% >>> X, Y, Z = sympy.symbols('X, Y, Z')
%% >>> g = (f.subs({x: X / Z, y: Y / Z}) * Z**4).factor()
%% >>> g_XY = sympy.resultant(g.diff(X), g.diff(Y), Z).factor()
%% >>> g_XY
%% 64*A**3*X**9
%% >>> M = sympy.Matrix([[g, g.diff(X), g.diff(Y), g.diff(Z)]])
%% >>> sympy.Matrix([map(sympy.factor, M.subs({X: 0}))])
%% Matrix([[Z**3*(E*Z - Y), D*Z**3, -Z**3, Z**2*(4*E*Z - 3*Y)]])
%% >>> import itertools
%% >>> for v1, v2 in itertools.combinations_with_replacement([X, Y, Z], 2):
%% ...     val = g.diff(v1).diff(v2).subs({X: 0, Y: 1, Z: 0})
%% ...     print((v1, v2, val))
%% ...
%% (X, X, 0)
%% (X, Y, 0)
%% (X, Z, 0)
%% (Y, Y, 0)
%% (Y, Z, 0)
%% (Z, Z, 0)
%% >>> g.diff(Z, 3).subs({X: 0, Y: 1, Z: 0})
%% -6
\begin{align*}
f(x, y) &= A x^4 + B x^3 + C x^2 + D x + E - y \\
\Longleftrightarrow g(X, Y, Z) &=
A X^4 + B X^3 Z + C X^2 Z^2 + D X Z^3 + E Z^4 - Y Z^3.
\end{align*}
Since
\[\operatorname{Res}_Z\left(g_X, g_Y\right) = 64 A^3 X^9\]
we must have \(X = 0\) (assuming \(A \neq 0\)), which gives
\[g_Y(0, Y, Z) = -Z^3\]
so \(Z = 0\) as well; hence (as with the generic cubic) the only singular
point is at
\((0:1:0)\). It is a triple (not quadruple) point since
\[g_{XX} = g_{XY} = g_{YY} = g_{YZ} = g_{ZZ} = g_{ZX} = 0
\quad \text{at} \quad (0:1:0)\]
and \(g_{ZZZ}(0, 1, 0) = -6\).
Hence the genus for our degree \(d = 4\) curve is
\[g = \binom{d - 1}{2} - \binom{3}{2} = 0.\]
This can be confirmed by the simple parameterization
\[x(s) = s, \quad y(s) = A s^4 + B s^3 + C s^2 + D s + E.\]
\end{document}
import matplotlib.pyplot as plt
import numpy as np
import seaborn


seaborn.set_palette('husl')
IMAGE_TEMPLATE = 'writeup_image%02d.png'


def image0():
    filename = IMAGE_TEMPLATE % (0,)
    num_points = 200
    node_points = np.array([
        [0, 5.0],
        [1.0, 9.0],
        [4.0, 14.0],
    ])
    specific_s = np.array([
        [1, 0, 0],
        [1, 0.5, 0.25],
        [1, 1, 1],
    ])
    coeffs = np.linalg.solve(specific_s, node_points)
    s_values = np.linspace(0, 1, num_points)
    stacked_s = s_values[:, np.newaxis]**np.array([[0, 1, 2]])
    x_y_vals = stacked_s.dot(coeffs)
    x_values = x_y_vals[:, 0]
    y_values = x_y_vals[:, 1]
    curved_edge, = plt.plot(x_values, y_values)
    edge_color = curved_edge.get_color()
    # Horizontal edge
    x_start = np.min(node_points[:, 0])
    x_stop = np.max(node_points[:, 0])
    y_start = np.min(node_points[:, 1])
    plt.plot(np.linspace(x_start, x_stop, num_points),
             y_start * np.ones(num_points), color=edge_color)
    y_stop = np.max(node_points[:, 1])
    # Vertical edge
    plt.plot(x_stop * np.ones(num_points),
             np.linspace(y_start, y_stop, num_points), color=edge_color)
    marked_x = node_points[:, 0]
    marked_y = node_points[:, 1]
    horizontal_midpt = 0.5 * (x_start + x_stop)
    vertical_midpt = 0.5 * (y_start + y_stop)
    marked_x = np.append(marked_x, [x_stop, horizontal_midpt, x_stop])
    marked_y = np.append(marked_y, [y_start, y_start, vertical_midpt])
    plt.plot(marked_x, marked_y, color='black',
             marker='o', linestyle='None')
    plt.axis('equal')
    x_width = x_stop - x_start
    y_width = y_stop - y_start
    plt.xlim(horizontal_midpt - 0.6 * x_width,
             horizontal_midpt + 0.6 * x_width)
    plt.ylim(vertical_midpt - 0.6 * y_width,
             vertical_midpt + 0.6 * y_width)
    plt.axis('off')
    plt.savefig(filename)
import matplotlib.pyplot as plt
import numpy as np
import sympy
def case1(x, y, s, f_symb):
    # CASE 1: ISOLATED POINT (-20, 12) AT s = (-1 +/- SQRT(-35)) / 6
    # y1**2 = (6*x2 - 35)*(12*x2 + 35)**2
    # x = 5*x2/2 - y1/3 - 305/24
    # y = -3*x2/2 + y1/3 + 61/8
    x_s = - 9 * s**3 + 18 * s**2 - 2 * s + 1
    y_s = 9 * s**3 - 9 * s**2 + 5 * s
    f = (x**3 + 3 * x**2 * y + 3 * x * y**2 + y**3 - 5 * x**2 -
         37 * x * y - 41 * y**2 + 52 * x + 52 * y - 48)
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    # f(-20, 12) = 0
    # f_x(-20 , 12 ) = f_y(-20 , 12 ) = 0
    # f_x(-20 , 13/4) = f_y(-20 , 13/4) = 0
    # f_x(-65/12, 12 ) = f_y(-65/12, 12 ) = 0
    # f_x(-65/12, 13/4) = f_y(-65/12, 13/4) = 0
    # sympy.resultant(f, f.diff(y), y).factor()
    # sympy.solve(sympy.resultant(f, f.diff(y), y).factor())
    # sympy.resultant(f, f.diff(x), x).factor()
    # sympy.solve(sympy.resultant(f, f.diff(x), x).factor())
    return x_s, y_s, f, solns
def case1_numerical():
    x_verts = [-20, 5 - np.sqrt(4000.0/243), 5 + np.sqrt(4000.0/243)]
    y_verts = np.linspace(-10, 80, 500)
    ones_verts = np.ones(y_verts.shape)
    for x_vert in x_verts:
        curr_x = ones_verts * x_vert
        plt.plot(curr_x, y_verts, color='blue')
    x_for_y_vert = np.linspace(-40, 20, ones_verts.size)
    y_vert = 12  # No real roots in 243*y**2 - 486*y + 275
    curr_y = y_vert * ones_verts
    plt.plot(x_for_y_vert, curr_y, color='red')
    min_x = -25
    max_x = 10
    num_pts = int((max_x - min_x) * 100)
    x_values = np.linspace(-25, 10, num_pts + 1)
    scatter_pts = []
    for x in x_values:
        coeffs = [
            1,
            3 * x - 41,
            3 * x**2 - 37 * x + 52,
            x**3 - 5 * x**2 + 52 * x - 48,
        ]
        roots = np.roots(coeffs)
        roots = np.real_if_close(roots)
        # root, = roots[np.isreal(roots)]
        real_roots = roots[np.isreal(roots)]
        if len(real_roots) < 1:
            raise ValueError('Expected at least one real')
        # print x, len(real_roots)
        for root in real_roots:
            scatter_pts.append((x, root.real))
        # root = real_roots[0]
        # root = root.real
        # y_values.append(root)
    # y_values = np.array(y_values)
    scatter_pts = np.array(scatter_pts)
    plt.scatter(scatter_pts[:, 0], scatter_pts[:, 1], s=3)
    plt.axis('equal')
    plt.show()
def case2(x, y, s, f_symb):
    # CASE 2: SELF-INTERSECTION (-5399/1296, 41149/2592) AT s = (37 +/- SQRT(2229)) / 72
    x_s = - 36 * s**3 + 36 * s**2 + 7 * s - 4
    y_s = - 18 * s**3 + 72 * s**2 - 52 * s + 7
    f = (2 * x**3 - 12 * x**2 * y + 24 * x * y**2 -
         16 * y**3 + 1428 * x**2 + 660 * x * y +
         744 * y**2 - 6320 * x - 5393 * y - 16689) / 2
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    # FX = f.diff(x).subs({x: x_s, y: y_s}).factor()
    # FY = f.diff(y).subs({x: x_s, y: y_s}).factor()
    # EXTRA = (FY / x_s.diff(s)).factor()
    # CHECK: EXTRA.coeff(s**2) > 0
    # CHECK: EXTRA + (FX / y_s.diff(s)).factor() == 0
    return x_s, y_s, f, solns
def case3(x, y, s, f_symb):
    # CASE 3: NO STATIONARY POINT (VALID CUBIC)
    x_s = s**3 + s**2 + s + 1
    y_s = 2 * s**3 + 2 * s**2 + 1
    f = (y - 2 * x)**3 + (y - 2 * x)**2 + 2 * x + 3 * y - 5
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case4(x, y, s, f_symb):
    # CASE 4: ISOLATED POINT (-20, -48) AT s = 1 +/- SQRT(-6)
    # f(-20 + h, -48 + k)
    # = 0 + 0 * h + 0 * k - 105 * h**2 + 96 * h * k - 22 * k**2 -
    #   8 * h**3 + 12 * h**2 * k - 6 * h * k**2 + k**3
    # = kp**3 - 22 * kp**2 + 8 * h * kp - h**2
    # WHERE kp = k - 2 * h
    x_s = s**3 + s**2 + s + 1
    y_s = 2 * s**3 + 3 * s**2 + 1
    f = (y - 2 * x)**3 - 9 * x**2 + 2 * y**2 + 24 * x - 16
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case5(x, y, s, f_symb):
    # CASE 5: ISOLATED POINT (307/125, 392/125) AT s = (-1 +/- SQRT(-3)) / 5
    x_s = 4 * s**3 + 5 * s**2 + 2 * s + 3
    y_s = - s**3 + 5 * s**2 + 2 * s + 4
    f = ((x + 4 * y)**3
         - 832 * x**2 + 344 * x * y - 937 * y**2
         + 2333 * x + 2332 * y - 4834)
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case6(x, y, s, f_symb):
    # CASE 6: SELF-INTERSECTION (-1/4, 11/8) AT s = (-1 +/- SQRT(5)) / 4
    # s = 1/2 --> (x, y) = (-3/4, 13/8)
    # --> GRAD(f) = (-5, -14), dGAMMA/ds = (-7/2, 5/4)
    # --> DIR_DERIV(f, ( 5/sqrt(221), 14/sqrt(221) ) ) = -sqrt(221)
    # --> DIR_DERIV(f, (-5/sqrt(221), -14/sqrt(221) ) ) = sqrt(221)
    x_s = -2 * s**3 - 2 * s**2
    y_s = - s**3 + s**2 + s + 1
    f = ((x - 2 * y)**3 +
         20 * x**2 - 24 * x * y + 32 * y**2 +
         16 * x - 40 * y + 16)
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case7(x, y, s, f_symb):
    # CASE 7: CUSP (0, 0) AT s = 0
    x_s = s**2
    y_s = s**3
    f = y**2 - x**3
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case8(x, y, s, f_symb):
    # CASE 8: CUSP (-20, 8) AT s = 2
    # x'(s) = 6(s + 1)(s - 2)
    # y'(s) = -(s - 2)(3s + 2)
    x_s = 2 * s**3 - 3 * s**2 - 12 * s
    y_s = - s**3 + 2 * s**2 + 4 * s
    f = (x + 2 * y)**3 - 4 * x**2 - 24 * x * y - 33 * y**2 - 16 * x - 48 * y
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case8_plot():
    s_vals = np.linspace(2 - 3, 2 + 1.5, 1000)
    x_vals = 2 * s_vals**3 - 3 * s_vals**2 - 12 * s_vals
    y_vals = - s_vals**3 + 2 * s_vals**2 + 4 * s_vals
    plt.plot(x_vals, y_vals)
    plt.show()
def case9(x, y, s, f_symb):
    # CASE 9: CUSP (7, -4) AT s = -1
    # x'(s) = 6(s + 1)(s - 2)
    # y'(s) = (s + 1)(7 - 3s)
    x_s = 2 * s**3 - 3 * s**2 - 12 * s
    y_s = - s**3 + 2 * s**2 + 7 * s
    f = (x + 2 * y)**3 - 22 * x**2 - 78 * x * y - 69 * y**2 - 7 * x - 12 * y
    # CHECK: f.subs({x: x_s, y: y_s}).expand()
    fx = f.diff(x)
    fy = f.diff(y)
    res_x = sympy.resultant(fx, fy, y)
    res_y = sympy.resultant(fx, fy, x)
    solns = sympy.solve([res_x, res_y])
    for soln in solns:
        f_val = f.subs(soln)
        soln[f_symb] = f_val
    return x_s, y_s, f, solns
def case9_plot():
    s_vals = np.linspace(-1 - 2, -1 + 3.5, 1000)
    x_vals = 2 * s_vals**3 - 3 * s_vals**2 - 12 * s_vals
    y_vals = - s_vals**3 + 2 * s_vals**2 + 7 * s_vals
    plt.plot(x_vals, y_vals)
    plt.show()
def main():
    x, y, s, f_symb = sympy.symbols('x, y, s, f')
    x_s, y_s, f, solns = case1(x, y, s, f_symb)
    for soln in solns:
        print(soln)
    # g = 2 * s**2 + 1
    # h = -s**3 + s**2 + 2 * s
    # SH = sympy.resultant(g - x, h - y, s)
    # sympy.resultant(g - x, g.diff(s), s)
    # sympy.resultant(h - y, h.diff(s), s)
    # sympy.Poly(SH, x)
    # sympy.Poly(SH, y)
    # SH = 8y^2 + (-x^3) + ...
    # sympy.solve((h - sympy.Rational(9, 8)).factor())