Why do data structures like splay trees or dynamic arrays offer good amortized guarantees even though individual operations can be expensive? Explain the idea of amortized analysis in this context, and connect it to practical considerations such as the memory hierarchy and cache-friendliness.
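As a concrete starting point, here is a minimal sketch in TypeScript of the classic capacity-doubling strategy (the class name DynArray and the copies counter are illustrative, not any particular library's API): a single push may copy the whole buffer, yet n pushes perform fewer than 2n copies in total, which is exactly the amortized O(1) bound the question asks about. The contiguous backing array is also what makes the structure cache-friendly in practice.

```ts
// Sketch of a growable array with capacity doubling.
class DynArray<T> {
  private buf: (T | undefined)[];
  private len = 0;
  public copies = 0; // counts element moves caused by resizing

  constructor(initialCapacity = 1) {
    this.buf = new Array<T | undefined>(initialCapacity);
  }

  push(x: T): void {
    if (this.len === this.buf.length) {
      // Doubling keeps the total number of copies over n pushes below 2n,
      // so each push costs O(1) amortized even though one push may be O(n).
      const bigger = new Array<T | undefined>(this.buf.length * 2);
      for (let i = 0; i < this.len; i++) {
        bigger[i] = this.buf[i];
        this.copies++;
      }
      this.buf = bigger;
    }
    this.buf[this.len++] = x;
  }

  get(i: number): T | undefined {
    return this.buf[i];
  }
}

// n pushes: total copies stay under 2n, i.e. O(1) amortized per push.
const a = new DynArray<number>();
const n = 1_000_000;
for (let i = 0; i < n; i++) a.push(i);
console.log(`pushes=${n}, copies=${a.copies}, copies/push=${(a.copies / n).toFixed(2)}`);
```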
Explain the “lens laws” (GetPut, PutGet, PutPut) in functional programming, using examples in a language other than Haskell. Why are these laws important, and how do they help in reasoning about programs that manipulate nested data structures?
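For reference, the three laws can be checked directly against a minimal hand-rolled lens in TypeScript (no lens library is used; the Lens interface, cityLens, and the example records are ad hoc names for this sketch):

```ts
// A hand-rolled lens: a getter and an immutable setter for one field.
interface Lens<S, A> {
  get: (s: S) => A;
  set: (s: S, a: A) => S;
}

interface Address { city: string; zip: string; }
interface Person { name: string; address: Address; }

const cityLens: Lens<Person, string> = {
  get: (p) => p.address.city,
  set: (p, city) => ({ ...p, address: { ...p.address, city } }),
};

const alice: Person = { name: "Alice", address: { city: "Oslo", zip: "0150" } };

// Structural equality for these simple records.
const eq = (a: unknown, b: unknown) => JSON.stringify(a) === JSON.stringify(b);

// GetPut: putting back what you got changes nothing.
console.log(eq(cityLens.set(alice, cityLens.get(alice)), alice)); // true

// PutGet: you get back exactly what you put.
console.log(cityLens.get(cityLens.set(alice, "Bergen")) === "Bergen"); // true

// PutPut: the second put wins; the first is irrelevant.
console.log(eq(
  cityLens.set(cityLens.set(alice, "Bergen"), "Tromsø"),
  cityLens.set(alice, "Tromsø"),
)); // true
```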
Matching problems such as bipartite matching or stable marriage are fundamental in algorithms. Explain how randomized approaches can improve the performance or robustness of matching algorithms. Why might randomization help in avoiding worst-case behavior, and what provable guarantees can still be made?
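One concrete instance is the RANKING algorithm of Karp, Vazirani and Vazirani for online bipartite matching, sketched below in TypeScript (the graph encoding and function names are assumptions of this sketch): offline vertices receive a uniformly random rank, and each arriving online vertex is matched to its lowest-ranked free neighbor. The random ranks are what lift the expected competitive ratio to 1 - 1/e, whereas every deterministic online rule can be forced down to 1/2 by an adversarial arrival order.

```ts
// RANKING for online bipartite matching: rank offline vertices randomly,
// then match each arriving online vertex to its free neighbor of lowest rank.
function ranking(
  numOffline: number,
  onlineNeighbors: number[][], // onlineNeighbors[i] = offline vertices adjacent to online vertex i
): Array<number | null> {
  // Uniformly random permutation of offline vertices (Fisher-Yates).
  const rank = Array.from({ length: numOffline }, (_, i) => i);
  for (let i = numOffline - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [rank[i], rank[j]] = [rank[j], rank[i]];
  }

  const matchedOffline = new Array<boolean>(numOffline).fill(false);
  const matchOfOnline: Array<number | null> = [];

  for (const nbrs of onlineNeighbors) {
    // Pick the unmatched neighbor with the smallest random rank.
    let best: number | null = null;
    for (const v of nbrs) {
      if (!matchedOffline[v] && (best === null || rank[v] < rank[best])) best = v;
    }
    if (best !== null) matchedOffline[best] = true;
    matchOfOnline.push(best);
  }
  return matchOfOnline;
}

// Tiny example: 3 online vertices arrive one by one.
console.log(ranking(3, [[0, 1], [0], [1, 2]]));
```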