Lower Bounds and Upper Bounds of a Problem

Da (Derek) Kuang
3 min read · Mar 4, 2020

Tonight, I feel like I finally figured out the main idea of a problem’s lower and upper bounds. I will try to explain those concepts here as a reference for my future self.

Problems and Cases

First, we have to set up some concepts before analyzing any problem, because “problem” is quite an abstract concept (even though it might be one of your most familiar English words).

From top to bottom: first, we meet all kinds of problems every day. By problem, I mean a general description of a task to be done (for example, sorting the poker cards in my hand); you know what the input is (n poker cards, where n could be any integer) and the expected result (the cards placed in ascending order). Here n is the size of the problem.

The size of the problem (in our example, the number of cards) makes the problem a general description of a series of situations (where I have two cards in hand, three cards, etc.). Every time you choose a specific size n, you get an instance of the problem. Note that the “instance” is still an abstract description. For example, even though you know five cards are in your hand, there are still many possible combinations of those cards. They could be in some random order, but, without any surprise, they could also happen to be in ascending order already (lucky day, huh?). Therefore, for each instance, we call each specific input a case. Each instance could spawn many distinct cases.

Let’s go over those concepts again with a concrete example. You have a stack of poker cards on the desk, and you are asked to take some cards from the stack and sort them. In other words, you get a problem. Then you may ask, “How many cards shall I take?” You are asking for an instance of the problem. You take five cards as told. Looking at the cards in your hand, you are looking at a particular case of the problem.
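The same three concepts can be sketched in a few lines of code. This is just an illustration (the names `sort_cards`, `n`, and `case` are mine, not from any library): the function is the problem, the fixed size is the instance, and one concrete input is a case.

```python
def sort_cards(cards):
    """The *problem*: given any list of card values, return them in ascending order."""
    return sorted(cards)

# An *instance*: fix the size n = 5. This is still abstract --
# many different inputs share this size.
n = 5

# A *case*: one concrete input of that size.
case = [7, 2, 9, 4, 11]

print(sort_cards(case))  # [2, 4, 7, 9, 11]
```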

Asymptotic of Algorithms

Now you know what a problem is and what a case is; I feel like they are the two most confusing words in lower-bound analysis. The second part of the discussion is the asymptotic efficiency of algorithms. The method you use to solve a problem is an algorithm, and each algorithm has its own best case(s) and worst case(s) under each instance.
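As a quick sketch of best and worst cases (insertion sort is my example here, not the article's), we can count element comparisons for two different cases of the same instance size n: an already-sorted input and a reversed one.

```python
def insertion_sort(a):
    """Insertion sort; returns (sorted copy, number of element comparisons)."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

n = 8
best = list(range(n))          # already ascending: the best case, n - 1 comparisons
worst = list(range(n, 0, -1))  # descending: the worst case, n(n-1)/2 comparisons
print(insertion_sort(best)[1])   # 7
print(insertion_sort(worst)[1])  # 28
```

Same algorithm, same instance (n = 8), but the case determines whether the cost grows like n or like n².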

I will not cover the definitions of the asymptotic symbols here, since they are covered in almost every introductory algorithms course.

Feel free to use my notes as a reference if you would like to refresh those concepts: https://github.com/wryVotary/analysis-of-algorithms

Bounds of Problem

For each instance size n, an algorithm has a best case and a worst case. Most of the time, when looking at a certain algorithm, the worst cases are more interesting to us, since we want to know its limitations. Similarly, looking at a problem, we would like to know the bounds on the worst-case running time over all algorithms, under each instance of the problem. To be more specific,

  • The upper bound of a problem tells us how much time is enough to solve it.
  • The lower bound of a problem tells us how much time is needed, at the very least, to solve it.
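Sorting is the classic example: in the comparison model its upper bound is O(n log n), witnessed by merge sort, and its lower bound is Ω(n log n). Here is a hedged sketch (my own code, counting element comparisons explicitly) showing the upper-bound side — the count for merge sort never exceeds n⌈log₂ n⌉:

```python
import math
import random

def merge_sort(a):
    """Return (sorted copy of a, number of element comparisons made)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, c_left = merge_sort(a[:mid])
    right, c_right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    comparisons = c_left + c_right
    while i < len(left) and j < len(right):
        comparisons += 1  # one element comparison per merge step
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons

n = 1024
data = random.sample(range(n), n)
result, comparisons = merge_sort(data)
assert result == sorted(data)
# The comparison count stays within the n * ceil(log2 n) upper bound.
print(comparisons, "<=", n * math.ceil(math.log2(n)))
```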

We believe that the upper and lower bounds can unveil the essence of the problem, especially the lower bound. Think about it: if you have a funky and crazy-fast algorithm, how do you argue that no other algorithm can be better? If your advisor gives you a problem, how do you argue that the reason you are not able to solve it is not your laziness (sometimes it is! at least for me xD) but the problem itself? We use the lower bound of the problem.

To prove the bounds of a problem,

  • Upper bound: To prove O(f(n)) for the problem, we need to show that there is an algorithm that solves the problem and takes no more than f(n) time on each input of length n.
  • Lower bound: To prove Ω(g(n)) for the problem, we need to show that for each algorithm A and each length n, there is an input on which A takes at least g(n) time.
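To see the "for each algorithm, there is an input" pattern in action, consider a problem simpler than sorting: finding the maximum of n numbers requires Ω(n) time, because any algorithm that skips even one element can be fooled by an adversary who hides the maximum in the unread position. The sketch below uses a hypothetical flawed algorithm, `lazy_max` (my own name, purely for illustration):

```python
def lazy_max(a):
    """A hypothetical flawed algorithm: saves time by never reading the last element."""
    return max(a[:-1])

# The adversary's move: given the algorithm, construct an input where the
# unread position holds the true maximum.
fooling_input = [3, 1, 4, 99]
print(lazy_max(fooling_input))  # 4 -- wrong answer
print(max(fooling_input))       # 99 -- the true maximum

# Since every *correct* algorithm must read all n elements, and each read
# takes at least constant time, the problem has an Omega(n) lower bound.
```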
