Let $n_0$ be a value such that $f(n) \ge 0$ and $g(n) \ge 0$ for all $n \ge n_0$; such an $n_0$ exists because both functions are asymptotically nonnegative. Let $c_1 = 1/2$ and $c_2 = 4$. Let $i(n) = \max\{f(n), g(n)\}$ and let
$h(n) = f(n) + g(n)$.
We will show that $0 \le c_1 i(n) \le h(n) \le c_2 i(n)$ for all $n \ge n_0$. This establishes $h(n) = Θ(i(n))$,
the reverse of what was asked, but the symmetry property of Θ-notation then gives $i(n) = Θ(h(n))$.
We address the right-hand side of the inequality first. Suppose, for contradiction, that $c_2 i(k) \lt h(k)$ for some $k \ge n_0$. Then
$$
\begin{align}
4i(k) &\lt h(k) \\
4 \max \{ f(k) , g(k) \} &\lt f(k) + g(k)
\end{align}
$$
We consider each case of the $\max$. Suppose $f(k) \ge g(k)$. Then:
$$
\begin{align}
4 f(k) &\lt f(k) + g(k) \\
3 f(k) &\lt g(k) \\
f(k) &\lt g(k)/3
\end{align}
$$
Since $g(k) \ge 0$, we know $g(k)/3 \le g(k)$, so $f(k) \lt g(k)$. This contradicts our supposition that $f(k) \ge g(k)$,
so $f(k)$ cannot be the maximum. Suppose instead that $g(k) \ge f(k)$. Then:
$$
\begin{align}
4 g(k) &\lt f(k) + g(k) \\
3 g(k) &\lt f(k) \\
g(k) &\lt f(k)/3
\end{align}
$$
Since $f(k) \ge 0$, we know $f(k)/3 \le f(k)$, so $g(k) \lt f(k)$. But this contradicts our supposition that $g(k) \ge f(k)$.
Both cases lead to a contradiction, so there is no $k \ge n_0$ such that $c_2 i(k) \lt h(k)$. Therefore $h(n) \le c_2 i(n)$ for all $n \ge n_0$.
We now address the left-hand side of the inequality, $c_1 i(n) \le h(n)$. Suppose, for contradiction, that $c_1 i(k) \gt h(k)$ for some $k \ge n_0$.
Then:
$$
\begin{align}
1/2 i(k) &\gt h(k) \\
1/2 \max \{ f(k) , g(k) \} &\gt f(k) + g(k)
\end{align}
$$
Let us consider each possible value of $\max\{f(k), g(k)\}$. Suppose $f(k) \ge g(k)$. Then:
$$
\begin{align}
1/2 f(k) &\gt f(k) + g(k) \\
-1/2 f(k) &\gt g(k)
\end{align}
$$
We know $-1/2 f(k) \le 0$ and $g(k) \ge 0$, so $-1/2 f(k) \le g(k)$. But this contradicts the derived
inequality $-1/2 f(k) \gt g(k)$, so $f(k)$ cannot be the maximum. Suppose instead
that $g(k) \ge f(k)$. Then:
$$
\begin{align}
1/2 g(k) &\gt f(k) + g(k) \\
-1/2 g(k) &\gt f(k)
\end{align}
$$
We know $-1/2 g(k) \le 0$ and $f(k) \ge 0$, so $-1/2 g(k) \le f(k)$. But this contradicts the derived
inequality $-1/2 g(k) \gt f(k)$, so $g(k)$ cannot be the maximum either. Both cases again lead to a
contradiction, so there is no $k \ge n_0$ such that $c_1 i(k) \gt h(k)$. Therefore $c_1 i(n) \le h(n)$ for all
$n \ge n_0$.
Finally, $c_1 i(n) \ge 0$ for all $n \ge n_0$, since $i(n)$ is the maximum of two nonnegative values.
Therefore, $0 \le c_1 i(n) \le h(n) \le c_2 i(n)$ for all $n \ge n_0$, with $c_1 = 1/2$ and $c_2 = 4$. So
$f(n) + g(n) = Θ(\max\{f(n), g(n)\})$, and thus, by the symmetry property, $\max\{f(n), g(n)\} = Θ(f(n) + g(n))$.
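The bounds can be spot-checked numerically. This is only an illustrative sanity check over a finite range, not part of the proof, and the sample function pairs below are arbitrary choices of asymptotically nonnegative functions:

```python
# Sanity check (illustrative only): verify 0 <= (1/2)*i(n) <= h(n) <= 4*i(n),
# where i(n) = max{f(n), g(n)} and h(n) = f(n) + g(n), for sample
# nonnegative functions over a finite range of n.

def check_bounds(f, g, n_range):
    c1, c2 = 0.5, 4
    for n in n_range:
        i_n = max(f(n), g(n))
        h_n = f(n) + g(n)
        assert 0 <= c1 * i_n <= h_n <= c2 * i_n, f"bound fails at n = {n}"
    return True

# Two arbitrary sample pairs (assumptions, not from the text):
assert check_bounds(lambda n: n**2, lambda n: 10 * n, range(1, 1000))
assert check_bounds(lambda n: n, lambda n: 2**n, range(1, 60))
```

In fact the tighter constants $c_1 = 1$ and $c_2 = 2$ also work for any nonnegative $f$ and $g$; the looser constants above simply match the proof.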
Let algorithm A's running time be $f(n)$. To talk about A's running time is to talk about $f(n)$. To say that $f(n)$ is "at least $O(n^2)$" is to say that there exist a constant $c \gt 0$ and an $n_0$ such that $0 \le f(n) \le c n^2$ for all $n \ge n_0$.
But this is an upper bound: it says that $c n^2$ eventually outgrows $f(n)$. Claiming a running time is "at least"
a quantity that is only bounded from above conveys no lower-bound information whatsoever. The only thing the
statement guarantees $f(n)$ exceeds is $0$, which is redundant, because we already assume every algorithm's
running time is asymptotically nonnegative.
So we have said that the running time of algorithm A is at least nothing. "Gee, thanks," someone would reply.
We claim $2^{n+1} = O(2^n)$. This requires constants $c \gt 0$ and $n_0$ such that $0 \le 2^{n+1} \le c\, 2^n$ for all $n \ge n_0$. Dividing through by $2^n \gt 0$:
$$
\begin{align}
2^{n+1} &\le c 2^n \\
2 &\le c
\end{align}
$$
So any $c \ge 2$ works; with $c = 3$ and $n_0 = 1$, the inequality holds for all $n \ge 1$.
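A quick numeric spot check of this bound (illustrative only, over a finite range of $n$):

```python
# With c = 3, 2^(n+1) <= c * 2^n for every tested n >= 1,
# matching the algebraic condition 2 <= c.
c = 3
assert all(2**(n + 1) <= c * 2**n for n in range(1, 200))
```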
It is not the case that $2^{2n} = O(2^n)$. That would require $2^{2n} \le c\, 2^n$ for all sufficiently large $n$,
which simplifies to $2^n \le c$. But $c$ is a constant with respect to $n$ while $2^n$ grows without bound,
so for any chosen $c$ the inequality fails once $n$ is large enough.
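The failure can likewise be illustrated numerically: for any candidate constant $c$, a large enough $n$ defeats the inequality. The candidate values below are arbitrary choices:

```python
# Illustration: for each candidate constant c, find an n where
# 2^(2n) <= c * 2^n fails. Dividing by 2^n, failure means exactly 2^n > c.
for c in [10, 10**6, 10**100]:
    n = 1
    while 2**(2 * n) <= c * 2**n:
        n += 1
    assert 2**n > c  # the inequality fails exactly when 2^n exceeds c
```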
We first address the "only if" direction. Suppose $f(n) = Θ(g(n))$. Then we have:
$$
\begin{align}
0 \le c_1 g(n) \le f(n) \le c_2 g(n)
\end{align}
$$
for some constants $c_1 \gt 0$ and $c_2 \gt 0$, for all $n \ge n_0$, where $n_0$ is some sufficiently large value of $n$.
The inequality $c_1 g(n) \le f(n)$ gives $f(n) = Ω(g(n))$ with $c = c_1$ and the same $n_0$; the inequality
$f(n) \le c_2 g(n)$ gives $f(n) = O(g(n))$ with $c = c_2$ and the same $n_0$. The "only if" direction is proven.
We now address the "if" direction. Suppose $f(n) = O(g(n))$ and $f(n) = Ω(g(n))$. The first gives us:
$$
\begin{align}
0 \le f(n) \le c g(n)
\end{align}
$$
for all $n \ge n_j$, where $n_j$ is the threshold value ($n_0$) witnessing the O-bound.
Let $c_2 = c$ from this inequality. From the Ω-notation, we also have:
$$
\begin{align}
0 \le c g(n) \le f(n)
\end{align}
$$
for all $n \ge n_k$, where $n_k$ is the threshold value ($n_0$) witnessing the Ω-bound. Let $c_1 = c$ from this
inequality.
If $n_k \ge n_j$, then the inequality from the O-notation still holds for all $n \ge n_k$. So we have that
$$
\begin{align}
0 \le c_1 g(n) \le f(n) \le c_2 g(n)
\end{align}
$$
for all $n \ge n_k$. Suppose instead that $n_j \ge n_k$. Then the inequality from the Ω-notation still holds for all $n \ge n_j$,
and we have:
$$
\begin{align}
0 \le c_1 g(n) \le f(n) \le c_2 g(n)
\end{align}
$$
for all $n \ge n_j$. So regardless of which of $n_j$ and $n_k$ is larger, we have $f(n) = Θ(g(n))$ with $n_0 = \max\{n_j, n_k\}$. Therefore $f(n) = Θ(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = Ω(g(n))$.
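As an illustration of how the two witnesses combine, here is a numeric spot check; the functions and constants below are assumptions made for the example, not taken from the proof:

```python
# Example: f(n) = 3n^2 + 5n and g(n) = n^2.
# O-witness: f(n) <= 4*g(n) for n >= 5 (since 5n <= n^2 when n >= 5).
# Omega-witness: 3*g(n) <= f(n) for n >= 1.
# The combined Theta-witness uses n0 = max{5, 1} = 5.
f = lambda n: 3 * n**2 + 5 * n
g = lambda n: n**2
c1, c2 = 3, 4
n0 = max(5, 1)

assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
```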
We handle the "only if" direction first. Suppose $f(n) = Θ(g(n))$ in all cases. Then by Theorem 3.1, $f(n)$ is $O(g(n))$ in all cases and $Ω(g(n))$ in all cases, too. This handles the "only if" direction.
Now suppose $f(n) = O(g(n))$ in the worst case and $f(n) = Ω(g(n))$ in the best case. Is the best case $O(g(n))$? It must be: by definition the best case costs no more than the worst case, and the worst case is already $O(g(n))$. So the best case is also $O(g(n))$, and $f(n) = O(g(n))$ in all cases. Is the worst case $Ω(g(n))$? It must be: by definition the worst case costs at least as much as the best case, and the best case is already $Ω(g(n))$. So the worst case is also $Ω(g(n))$, and $f(n) = Ω(g(n))$ in all cases.
So $f(n) = Θ(g(n))$. This takes care of the "if" direction.
Therefore $f(n) = Θ(g(n))$ if and only if $f(n) = O(g(n))$ in the worst case and $f(n) = Ω(g(n))$ in the best case.
Suppose, for contradiction, that some $f(n)$ satisfies both $f(n) = o(g(n))$ and $f(n) = ω(g(n))$. From o-notation, for every $c \gt 0$ there exists some $n_0$ such that $0 \le f(n) \lt c g(n)$ for all $n \ge n_0$. From ω-notation,
for every $c \gt 0$ there exists some $n_0$ such that $0 \le c g(n) \lt f(n)$ for all $n \ge n_0$.
Fix any constant $c_1 \gt 0$ and apply both definitions with $c = c_1$.
Let $n_1$ be the threshold from the o-notation inequality for $c_1$, and let $n_2$ be the threshold from the
ω-notation inequality for $c_1$. Either $n_1 \ge n_2$ or $n_2 \ge n_1$. Suppose $n_1 \ge n_2$. Then the ω-notation's
inequality still holds for all $n \ge n_1$, and for all $n \ge n_1$ we have $f(n) \lt c_1 g(n)$ from the o-notation
and $c_1 g(n) \lt f(n)$ from the ω-notation.
This is a contradiction. Suppose instead that $n_2 \ge n_1$. Then the o-notation's inequality still holds for all $n \ge n_2$,
and again $f(n) \lt c_1 g(n)$ and $c_1 g(n) \lt f(n)$ for all $n \ge n_2$. This too is a contradiction.
So whichever of $n_1$ and $n_2$ is larger, no such $f(n)$ can exist, and $o(g(n)) \cap ω(g(n))$ is the empty set.
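A small numeric illustration (not a proof; the sample functions are arbitrary): fixing $c_1 = 1$, no $n$ can satisfy both strict inequalities at once.

```python
# With c1 = 1, o(g) would eventually require f(n) < c1*g(n), while
# omega(g) would require c1*g(n) < f(n). No n satisfies both at once.
f = lambda n: n
g = lambda n: n * n
c1 = 1
conflicts = [n for n in range(1, 1000)
             if f(n) < c1 * g(n) and c1 * g(n) < f(n)]
assert conflicts == []
```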
$Ω(g(n,m))$ is the set of all functions $f(n,m)$ such that there exist positive constants $c$, $n_0$, and $m_0$ such that
$0 \le c g(n,m) \le f(n,m)$ for all $n \ge n_0$ or $m \ge m_0$.
$Θ(g(n,m))$ is the set of all functions $f(n,m)$ for which there exist positive constants $c_1$, $c_2$, $n_0$, and $m_0$ such that
$0 \le c_1 g(n,m) \le f(n,m) \le c_2 g(n,m)$ for all $n \ge n_0$ or $m \ge m_0$.
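To make the two-variable definition concrete, here is a finite-grid membership check for one assumed example; the functions and witnesses below are illustrative choices, not part of the definitions above.

```python
# Claimed (illustrative): f(n, m) = 3nm + n is in Omega(nm) with witnesses
# c = 3, n0 = 1, m0 = 1. The definition quantifies over all pairs with
# n >= n0 OR m >= m0, so the grid check skips only pairs failing both.
f = lambda n, m: 3 * n * m + n
g = lambda n, m: n * m
c, n0, m0 = 3, 1, 1

assert all(0 <= c * g(n, m) <= f(n, m)
           for n in range(0, 40) for m in range(0, 40)
           if n >= n0 or m >= m0)
```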