I am trying to find the Big O of stooge sort. From Wikipedia:

```
algorithm stoogesort(array L, i = 0, j = length(L)-1)
    if L[j] < L[i] then
        L[i] ↔ L[j]
    if j - i > 1 then
        t = (j - i + 1)/3
        stoogesort(L, i  , j-t)
        stoogesort(L, i+t, j  )
        stoogesort(L, i  , j-t)
    return L
```
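The pseudocode above translates almost line for line into Python; here is a minimal runnable sketch (the in-place mutation and default arguments are my own choices, not part of the Wikipedia pseudocode):

```python
def stoogesort(L, i=0, j=None):
    """Stooge sort, transcribed from the pseudocode above; sorts L in place."""
    if j is None:
        j = len(L) - 1
    if L[j] < L[i]:
        L[i], L[j] = L[j], L[i]      # swap the endpoints if out of order
    if j - i > 1:
        t = (j - i + 1) // 3         # a third of the current range
        stoogesort(L, i, j - t)      # sort the first two thirds
        stoogesort(L, i + t, j)      # sort the last two thirds
        stoogesort(L, i, j - t)      # sort the first two thirds again
    return L
```

For example, `stoogesort([5, 3, 1, 4, 2])` returns `[1, 2, 3, 4, 5]`.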

I am bad at performance analysis, so I drew the recursion tree.

I believe it looks like this:

- height: `log(n)`
- work on level 0: `n` (or do I start counting from level 1?)
- work on level 1: `2n`
- work on level 2: `4n`
- work on level 3: `8n`
- work on level `log(n)`: `(2^log(n)) * n = O(n^2)`

I know `2^log2(n) = n`, but what does `2^log3(n)` actually give?

So is it `O(n^2 * log(n))`? That is far from Wikipedia's `O(n^(log 3 / log 1.5))`...

...

The size of the problem at level k is (2/3)^{k} * n. The size at the lowest level is 1, so setting (2/3)^{k} * n = 1 gives a depth of k = log_{1.5} n (rearrange to (3/2)^{k} = n and take logs base 1.5).
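The depth formula is easy to check numerically; a quick sketch (the choice of n = 100 is arbitrary):

```python
import math

# Subproblem size at level k is (2/3)**k * n; solving (2/3)**k * n = 1,
# i.e. (3/2)**k = n, gives k = log base 1.5 of n.
n = 100
depth = math.log(n, 1.5)
print(round(depth, 2))   # about 11.36 levels for n = 100
```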

The number of invocations at level k is 3^{k}. At level k = log_{1.5} n, this is 3^{log_{1.5} n} = ((1.5)^{log_{1.5} 3})^{log_{1.5} n} = ((1.5)^{log_{1.5} n})^{log_{1.5} 3} = n^{log_{1.5} 3} = n^{log 3 / log 1.5}.

Since the work at each level increases geometrically, the work at the leaves dominates.

If we define T(n) as the running time (with n = j - i + 1), we have:

T(n) = 3*T(2n/3) + O(1)

You can solve that using the Master Theorem, and the answer is Θ(n^{log_{1.5} 3}) = Θ(n^{log 3 / log 1.5}).

You can prove it by induction on n, too.

Using a recursion tree works too:

k = number of levels = log_{1.5} n

ans = 1 + 3 + 3^2 + ... + 3^k = Θ(3^k) = Θ(3^{log_{1.5} n}) = Θ(n^{log_{1.5} 3})
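You can confirm the Θ(3^k) call count empirically by counting invocations without doing any sorting; a small sketch (the helper `count_calls` is my own, not from any answer above):

```python
def count_calls(n):
    """Count stoogesort invocations on a range of size n (element values are irrelevant)."""
    calls = 0
    def rec(i, j):
        nonlocal calls
        calls += 1
        if j - i > 1:
            t = (j - i + 1) // 3
            rec(i, j - t)       # each call spawns three subcalls
            rec(i + t, j)       # on ranges two thirds as large
            rec(i, j - t)
    rec(0, n - 1)
    return calls

# Growing n by a factor of 3/2 roughly triples the number of calls.
print(count_calls(3), count_calls(6), count_calls(9))
```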

t(n) = 3t(2n/3) + Θ(1)

h = 1 + log_{3/2} n is the number of levels.

It is easy to calculate: at every level i we have total work 3^i * c, where c is some constant, so

t(n) = sum from i=0 to h of c*3^i

which is a simple geometric progression:

t(n) = c*(3^{h+1} - 1)/2 = Θ(3^h) = Θ(3^{log_{3/2} n}) = Θ(n^{log_{3/2} 3})

You can use the Master Theorem to find this answer. From the algorithm, the recurrence relation is:

T(n) = 3*T(2n/3) + 1

Applying the theorem:

f(n) = 1 = O(n^{c}) with c = 0; a = 3, b = 3/2, so log_{3/2}(3) =~ 2.71

Since c < log_{3/2}(3), we are in Case 1 of the theorem, so:

T(n) = Θ(n^{log_{3/2}(3)}) =~ Θ(n^{2.71})
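The exponent from Case 1 can be evaluated directly; a one-liner sketch:

```python
import math

# log base 3/2 of 3, the exponent given by Case 1 of the Master Theorem
exponent = math.log(3, 1.5)
print(round(exponent, 4))   # 2.7095
```

This matches Wikipedia's `O(n^(log 3 / log 1.5))`.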
