This question already has an answer here:

- What does O(log n) mean exactly?

Big O notation such as "O(1)" can describe the following code:

```
// O(1): fixed number of iterations, independent of input size
for (int i = 0; i < 10; i++) {
    // do stuff
    a[i] = INT;
}

// O(n): iterations grow linearly with n
for (int i = 0; i < n; i++) {
    // do stuff
    a[i] = INT;
}

// O(n^2): nested loops, n * n iterations
for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        // do stuff
        a[i][j] = INT;
    }
}
```

- What code can O(log(n)) describe?

Another question:

- What solutions are there for "Big O problems" (what to do when an algorithm receives a lot of data as input)?

Classic example:

```
while (x > 0) {
    x /= 2;
}
```

This will be:

```
Iteration | x
----------|--------
0         | x
1         | x/2
2         | x/4
...       | ...
k         | x/2^k
```

The loop ends after k iterations, when x has been halved down to 1: `2^k = x` → applying log to both sides → `k = log(x)`, so the loop runs O(log(x)) times.

For O(n log n), have a look at any code that uses a divide-and-conquer strategy. Examples: merge sort and quicksort (merge sort's running time is O(n log n) in all cases; for quicksort, O(n log n) is the *expected* running time).

Binary search is an example of O(log(n)): http://en.wikipedia.org/wiki/Binary_search_algorithm

By definition, log(n) (here log with base 2, but the base really doesn't matter for Big O) is the number of times you have to multiply 2 by itself to get n. So an O(log(n)) code example is:

```
i = 1
while (i < n) {
    i = i * 2
    // maybe doing some O(1) work here
}
```

In real code, you can meet O(log(n)) in binary search, balanced binary search trees, many recursive algorithms, and priority queues.

It might be worth emphasizing that the lower-complexity algorithms you described are subsets of the higher-complexity ones. In other words,

```
for (int i = 0; i < 10; i++) {
    // do stuff
    a[i] = INT;
}
```

is in O(1), but also in O(n), O(n²), and, if you wanted to be clever, O(log(n)). Why? Because all constant-time algorithms are asymptotically bounded by some linear, quadratic, etc. function.
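This follows directly from the definition of Big O; a sketch of the argument for the constant-time case:

```
f ∈ O(g)  ⇔  there exist c > 0 and n0 such that f(n) ≤ c·g(n) for all n ≥ n0

If f(n) = k (a constant), then for all n ≥ 2:

    k  ≤  k·log2(n)  ≤  k·n  ≤  k·n^2

so f ∈ O(log n) ⊆ O(n) ⊆ O(n^2), taking c = k and n0 = 2.
```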

What solutions are there for "Big O problems" (what to do, when getting a lot of data as an input)?

This question doesn't make a lot of sense to me. "A lot of data" is quite arbitrary. Still, keep in mind that Big O isn't the only measure of time complexity; apart from measuring worst-case time complexity, we can also examine best-case and average-case, though these can be a bit trickier to calculate.

In the case of binary search, you are trying to find the maximum number of iterations, and therefore the maximum number of times the search space can be split in half. This is accomplished by dividing the size of the search space, n, by 2 repeatedly until you get to 1.

Let's give the number of times you need to divide n by 2 the label x. Since dividing by 2 x times is equivalent to dividing by 2^x, you end up having to solve this equation:

`n/2^x = 1`, which becomes `n = 2^x`

Taking the logarithm of both sides gives `x = log(n)`, so the Big O of binary search is O(log(n)).

To reiterate: x is the number of times you can split a space of size n in half before it is narrowed down to size 1.

http://www.quora.com/How-would-you-explain-O-log-n-in-algorithms-to-1st-year-undergrad-student
