What would be the Big O or Theta of a loop that runs forever? Just curious, was thinking about it today. Could you even bound it?

Since the loop never terminates, the program doesn't constitute an algorithm, so asking for its complexity is a moot point. Informally you could say it is `O(∞)`, but that isn't a meaningful bound.

Remember that when we say an algorithm is, e.g., O(n^2), we mean its running time is bounded above by a constant multiple of n^2 (for the particular meaning of "bound" implied by big-O notation). Usually n is some measure of the **size of the input**.
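To make that concrete, here's a minimal sketch (names are my own, not from any particular library) that counts the basic operations of a doubly nested loop, the textbook O(n^2) shape:

```python
def count_ops(n):
    """Count the basic operations performed by a doubly nested loop over n items."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1  # one unit of work per inner iteration
    return ops

# The count grows as the square of the input size:
# doubling n quadruples the work.
print(count_ops(10))  # 100
print(count_ops(20))  # 400
```

Here the bound is expressible precisely because the loop always terminates and the work is a function of n; neither holds for an infinite loop.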

A `while true` loop with an empty body has no input and runs forever, so trying to work out a big-O bound for it is akin to trying to divide infinity by zero.

If you mean "what is the big-O complexity of an algorithm that receives some input of size n and might loop forever?", then your algorithm doesn't even **decide** the problem (it can fail to terminate for some inputs), so there's no way to put an upper bound on its runtime. It's not O(anything).
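A hedged illustration of that last point: the Collatz iteration is not proven to terminate for every positive integer, so even though it halts for every input anyone has ever tried, no big-O bound on its running time can honestly be stated.

```python
def collatz_steps(n):
    """Iterate n -> n/2 (if even) or 3n+1 (if odd) until reaching 1.

    Termination for all positive integers is an open conjecture, so no
    upper bound (big-O or otherwise) is known for this loop in general.
    """
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(16))  # 4  (16 -> 8 -> 4 -> 2 -> 1)
print(collatz_steps(6))   # 8
```

For any *particular* input you can count the steps, but without a termination proof there is no function of n that bounds them all, which is exactly why "might loop forever" rules out a big-O bound.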
