time_complexities
Let's understand time complexities better with some examples.

O(1)
----
O(1) means a constant number of operations, no matter how large the input is. For example, in the array a = [1,3,4,5,6], looking up a[3] = 5 (0-based indexing) is a single operation.
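A minimal sketch of this in Python (the variable names are just illustrative):

```python
# O(1): indexing into a list takes the same time regardless of list size
a = [1, 3, 4, 5, 6]

x = a[3]  # a single operation; x is 5 (0-based indexing)
print(x)
```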
O(log n)
--------
Example: check whether an element is present in a sorted array, like
[1,2,3,4,5,6,9,10]
Because the array is sorted, we can use binary search to look for 9: take the middle index, (low + high)/2, and compare 9 with the middle element. If 9 is smaller, move to the left half; if it is larger, move to the right half. Here 9 is larger, so we go right, and we repeat until we find it (or the range is empty). Each step halves the search space, which is why the time complexity is O(log n).
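The steps above can be sketched as a small binary search function (the function name is illustrative):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent. O(log n)."""
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2      # middle of the current range
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1            # target is larger: search the right half
        else:
            high = mid - 1           # target is smaller: search the left half
    return -1                        # range became empty: not present

print(binary_search([1, 2, 3, 4, 5, 6, 9, 10], 9))  # -> 6
```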
O(n)
----
O(n) means iterating over all the elements. Example: check whether an element is present in an unsorted array, like
[1,7,8,10,4,5]
To check whether 2 is in the array, we may have to examine every element (linear search), so the worst case is O(n).
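A minimal linear search sketch (the function name is illustrative):

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent. O(n) worst case."""
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1  # checked every element without finding target

print(linear_search([1, 7, 8, 10, 4, 5], 2))  # -> -1 (had to scan all 6 elements)
```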
O(n log n)
----------
Now let's take
[1, 4, 2, 3, 6, 7, 9]
We have two common sorting techniques:
1. Quick sort
2. Merge sort
Sorting the array with either technique gives a sorted array in O(n log n) time (for quick sort this is the average case; its worst case is O(n^2)).
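As a sketch, here is merge sort: split the array in half, sort each half recursively, then merge the two sorted halves (log n levels of splitting, O(n) merging per level):

```python
def merge_sort(arr):
    """Sort arr with merge sort. O(n log n)."""
    if len(arr) <= 1:
        return arr                       # base case: already sorted
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])         # sort left half
    right = merge_sort(arr[mid:])        # sort right half
    # merge the two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])              # leftover elements are already sorted
    merged.extend(right[j:])
    return merged

print(merge_sort([1, 4, 2, 3, 6, 7, 9]))  # -> [1, 2, 3, 4, 6, 7, 9]
```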
O(n^2)
------
Using two nested for loops, each running n times:
for i in range(n):
    for j in range(n):
        ...  # any O(1) work here runs n * n = n^2 times
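As a concrete runnable example of the nested-loop pattern (this duplicate check is my own illustration, not from the notes):

```python
def has_duplicate(arr):
    """Compare every pair of elements with two nested loops: O(n^2)."""
    n = len(arr)
    for i in range(n):
        for j in range(n):
            if i != j and arr[i] == arr[j]:
                return True          # found two positions with the same value
    return False                     # all n * n pairs checked

print(has_duplicate([1, 7, 8, 10, 4, 5]))  # -> False
print(has_duplicate([1, 7, 8, 1, 4, 5]))   # -> True
```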
O(n^3)
------
Using three nested for loops:
for i in range(n):
    for j in range(n):
        for k in range(n):
            ...  # any O(1) work here runs n^3 times
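A runnable illustration of the triple-loop pattern (counting triplets that sum to zero is my own example, not from the notes):

```python
def count_zero_triplets(arr):
    """Examine all n^3 index triples with three nested loops: O(n^3)."""
    n = len(arr)
    count = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                # count each unordered triple once via i < j < k
                if i < j < k and arr[i] + arr[j] + arr[k] == 0:
                    count += 1
    return count

print(count_zero_triplets([-1, 0, 1, 2, -2]))  # -> 2
```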
O(2^n) and O(n!)
----------------
We don't discuss these here, and we avoid algorithms with these complexities, since they grow too fast to be practical for anything but very small inputs.