I am having trouble understanding the time complexity `O(sqrt(B))`, given that `B` is an integer.

For example, if I have a function...

```
int GetResult(int A, int B)
{
}
```

...and this function has a time complexity of `O(sqrt(B))`, what exactly does that mean?

Sorry if this is a little vague... I'm not really sure how else to explain it.

The time complexity is an indicator of how a function's runtime scales with the amount of its input data.

Given n data items for a function:

- O(n) means the function simply passes over each data item "once", so doubling the input amount will double its duration.
- O(n^2) means the function has, for example, two nested loops over the data, so double the input amount and you wait 4 times as long.
- O(log n) means the function only needs logarithmic time, e.g. when you give 10 times more input, the function will only take one "step" longer.
- O(sqrt(n)) thus means that when you give 4 times the input, the function will only take twice the time.
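As a rough sketch (not from the original question), an O(sqrt(n)) running time typically comes from a loop whose counter is compared squared against n:

```cpp
#include <cassert>

// Counts the iterations of a loop that runs while i*i <= n.
// The loop body executes about sqrt(n) times, so this is O(sqrt(n)).
long stepsSqrt(long n) {
    long steps = 0;
    for (long i = 1; i * i <= n; ++i)
        ++steps;
    return steps;
}
```

Quadrupling n from 100 to 400 only doubles the step count from 10 to 20, matching the scaling rule above.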

The Big-O notation only states how a function scales, not how long it actually takes. For instance, Big-O notation ignores constant factors: a function that iterates 4 times over some data (4 loops in sequence) is O(4n), which is equal to O(n).
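A minimal sketch of that constant-factor point (the function name is my own, not from the question): four sequential passes over the data do 4n units of work, yet the complexity class is still O(n).

```cpp
#include <cassert>
#include <vector>

// Four separate passes over the data: 4n additions in total.
// That is O(4n), which Big-O collapses to O(n).
long fourPasses(const std::vector<int>& data) {
    long sum = 0;
    for (int pass = 0; pass < 4; ++pass)   // constant number of passes
        for (int x : data)                 // each pass is O(n)
            sum += x;
    return sum;
}
```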

This fact also shows why O(log_{10} n) is equal to O(log_{2} n): log_{10} n = (log_{2} n) / (log_{2} 10). As (log_{2} 10) is a constant factor, it can be omitted in Big-O notation. Thus you can choose whichever logarithm base you like; it makes no difference to the Big-O complexity.

When you have two inputs, say lists A and B, you use two variables for their sizes, say n and m respectively. A function with complexity O(n^2 * log m) behaves as follows:

- doubling list A results in much slower execution (4x the duration), but
- doubling list B only adds "one more iteration" over A's O(n^2) processing, since log(2m) = log m + 1 (i.e., it will only take n^2 * (some constant factor) longer).
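One way such a two-variable complexity arises (this example is my own, assuming B is kept sorted): nested loops over A combined with a binary search in B give O(n^2 * log m).

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// For every ordered pair of elements in A (n^2 pairs), do a binary
// search in the sorted list B (log m per search): O(n^2 * log m) overall.
long countPairSumsInB(const std::vector<int>& A,
                      const std::vector<int>& Bsorted) {
    long count = 0;
    for (int a1 : A)                                  // n iterations
        for (int a2 : A)                              // n iterations
            if (std::binary_search(Bsorted.begin(),   // log m per lookup
                                   Bsorted.end(), a1 + a2))
                ++count;
    return count;
}
```

Doubling A quadruples the number of pairs searched; doubling B only adds one extra comparison per binary search.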

The answer to your question depends on how B itself grows as a function of some input size n:

If B = O(n^4), then O(sqrt(B)) = O(n^2)

If B = O(n^2), then O(sqrt(B)) = O(n)

If B = O(n), then O(sqrt(B)) = O(n^(1/2))
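To make this concrete for the original question: a classic function whose running time is O(sqrt(B)) in the *value* of its integer argument is trial-division primality testing. This is only a sketch of one such function, not the asker's actual `GetResult`:

```cpp
#include <cassert>

// Trial division: tests divisors only up to sqrt(B), so the loop runs
// about sqrt(B) times -- the running time is O(sqrt(B)) in the value of B.
bool isPrime(int B) {
    if (B < 2) return false;
    for (int d = 2; d * d <= B; ++d)  // stops once d exceeds sqrt(B)
        if (B % d == 0)
            return false;
    return true;
}
```

So "O(sqrt(B))" simply means the work grows with the square root of the integer's value: a B four times larger takes only about twice as long.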
