Why the FASTEST Sorting Algorithm can(t) be O(N)!
- published 4 Jul 2024
- What is the fastest sorting algorithm known to man? We find out by attempting to find an O(n) time-complexity sorting algorithm. So far, sorting research has found O(n*log(n)) to be the best.
We discuss concepts of time complexity, decision-making, and how the best algorithm is bounded by the permutation tree of an array. A final trick is used to reduce complexity to O(N)!
N! Approximation: en.wikipedia.org/wiki/Stirlin...
log(N) video: • Why does log(N) appear...
00:00 Introduction
00:19 Prerequisites
00:40 Sorting Algorithms Comparison
01:32 Building a Permutation Tree
04:35 Finding the best permutation
05:45 Time Complexity Analysis
07:21 Proof through Contradiction
08:08 Parallel Sorting!
09:25 Thank you!
System Design Video Course:
interviewready.io
Along with video lectures, this course has architecture diagrams, capacity planning, API contracts, and evaluation tests. It's a complete package.
Sources:
hackernoon.com/timsort-the-fa...
stackoverflow.com/questions/3...
stackoverflow.com/questions/2...
#sorting #algorithm #sortingalgorithm
You said that to express a number we require at least log(n) operations. But we don't include this in the time complexity analysis of bubble sort; it's still just O(N squared). Why? Shouldn't it be more than N squared?
I think this part of the video is the most misunderstood bit, and I take responsibility for not communicating my thoughts accurately:
A number takes logN bits to represent. When we choose a position for it to be placed, the index will take logN bits to represent. Each bit requires a decision to be made -> 0 or 1. Hence we make logN decisions for deciding a single position amongst N positions.
This is how I came to the series log(N) + log(N-1) + log(N-2) + ... + log(1)
Bubble sort goes through the numbers ahead of it. Each comparison is assumed to be an O(1) operation. This makes sense because the computer hardware can compare 32/64 bits in a single operation, and we rarely sort numbers larger than that.
If you sort an array of arbitrarily large numbers, the complexity of bubble sort will be O(N^2* log(N)). The same as bubble sorting strings of size L: O(N^2 * L).
If L is small enough or constant, the complexity reduces to O(N^2), since the factor of L is ignored.
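The bits-per-decision idea above is easy to sanity-check; a quick Python sketch (helper name mine, not from the video):

```python
import math

def bits_needed(n_choices: int) -> int:
    """Bits (i.e. binary decisions) needed to name one of n_choices positions."""
    return max(1, math.ceil(math.log2(n_choices)))

# Picking one index out of 1024 positions takes 10 binary decisions;
# 1000 positions still need 10, since 2**9 = 512 < 1000 <= 2**10.
print(bits_needed(1024))  # 10
print(bits_needed(1000))  # 10
```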
@@gkcs Makes sense. Thanks!
Gaurav Sen why does a number take LogN bits to represent? If bits are only 0 and 1 won’t a number like 255 need 8 bits (11111111)? But log(255) base ten is 2 so only 2 bits to represent?
@@noasmr46 I think he meant log base 2 of N
@@noasmr46 A bit is a digit in the binary world. A decimal digit is a digit in the decimal world. You are mixing the two.
Binary? log(N) to the base 2.
Decimal? log(N) to the base 10.
Check out: czcams.com/video/Xe9aq1WLpjU/video.html
This is what clickbait looks like for computer science students.
souravk229 lmao 😂
😂😂😂😂
It worked 😂
He got me 😪
O(n) hahahaha
Actually, the Faith Sort's time complexity is O(1) - you just take the input array, put all your faith in it being already sorted, and return it. Worst case scenario whatever god you believe in takes care of the rest.
bruh....
😂😂😂😂😂
😂😂😂😂
🤣🤣
😂😂
There is a sorting algorithm working in O(n) called eliminationsort. You eliminate every element that is out of order. By the end, you have a sorted list.
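The joke above actually describes a real (lossy) algorithm, sometimes called "dropsort"; a minimal sketch, assuming we are allowed to discard elements:

```python
def elimination_sort(arr):
    """Single O(n) pass: keep an element only if it doesn't break the order."""
    kept = []
    for x in arr:
        if not kept or x >= kept[-1]:
            kept.append(x)
        # elements smaller than the last kept one are "eliminated"
    return kept

print(elimination_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [3, 4, 5, 9]
```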
The comments today on this video are awesome :D
This comment is great :D
I optimize your algorithm by deleting the whole list
Also knowns as HitlerSort
@@gkcs so is it true that elimination sort takes O(N) time?
Ever heard of Schrodinger's + Heisenberg's sort? 🤩
The array is already sorted as long as.......
You don't look at the array 😂
Brilliant XD
lul
smart!
Best comment
It is both sorted and unsorted at the same time until you look at it to be precise😂
Cool one though!
The trick to faster sorting is to always buy your data pre-sorted!
Hahaha!
This is where he could have talked about best case for Insertion Sort.
It's a joke, but that's pretty much what high-performance programs try to do: push all extraneous calculations to before you actually need to run the program
Assumption sort. Assume it's sorted, return.
@@maciejmalewicz9123 ----- I go one better: I don't even have to assume it is sorted. I just declare it sorted already, in whatever order the list is showing. Hey, why bother to sort at all? You get whatever you see. By the time we've spent discussing all these optimal sorting techniques, the brute-force program would have run the job many times over, right? :-)
pregnant woman: (hits blunt) it wont affect my child.
the child: let's use bubble sort.
Actually, at the extremely low level of an L1 CPU cache (very small lists, very small data), bubble sort takes the least amount of real time. It's faster (in memory) to access and compare cached nearby elements than it is to go get elements from RAM and put them in the cache.
Nah, he be using bogosort.
@@dr_davinci delete sort, better
@@tusharsingh2439 no its worst sort
@@JeffHykin no, for small lists, insertion sort works best. As a matter of fact quicksort is also supposed to use insertion sort once partitions are small enough
"O(n) is not possible"
Do you want to learn about our lord and savior Sleep Sort?
Hahahaha 😝
i stand by the word of Gravity sort
in all seriousness, O(n) and even O(log(n)) time are possible with parallel sorting networks.
though they can only sort a fixed n, if you need a really fast sort for n
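For a concrete (if tiny) example of such a network, here is the standard 5-comparator network for n = 4; comparators at the same depth touch disjoint wires, so in hardware they run simultaneously, giving depth 3 rather than 5 sequential steps (Python sketch, run sequentially here):

```python
def sort4(a):
    """Fixed sorting network for exactly 4 elements (5 comparators, depth 3)."""
    a = list(a)
    for i, j in [(0, 1), (2, 3),   # depth 1: both can run in parallel
                 (0, 2), (1, 3),   # depth 2: both can run in parallel
                 (1, 2)]:          # depth 3
        if a[i] > a[j]:            # compare-exchange
            a[i], a[j] = a[j], a[i]
    return a

print(sort4([4, 1, 3, 2]))  # [1, 2, 3, 4]
```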
Radix sort I think is O(n). It works really well if you drop your deck of FORTRAN cards and you need to get them back in order.
Random Sort is able to do it in O(n). It just doesn't happen very often. ;)
C'mon, you know you deserved this comment for that title. :P
Hahaha!
I thought it is in O(1)
@@ivosu4936 You need to verify the guess too 😛
@@gkcs or you can just roll with it, cause in one of manny parallel universes it is sorted
You might already know this, but time complexity has to be evaluated considering all cases, not just the best case
youtube guy explaining fastest sort algo:
me simply using .sort() method in python : KODER!!!
That .sort() in Python uses the Timsort algorithm, developed by Tim Peters, with worst case n*log(n) and best case O(n).
@@ItachiUchiha-ub2iu yes that's similar to merge sort I Guess
O(1) for sure m8
Good content but bad clickbait ):
The fastest sorting algorithm should have been O(1): By pointing to an empty list.
An empty list is always sorted.
Hahaha!
or a list with only one element ;)
lol!
We are talking about the worst case here, Don't even write a program it will be O(0) 😂
I totally and absolutely love the creative treatment.
This video always brings up a smile.
What's more, the explanation is clear to a non-computer science student like me.
Great work Gaurav Sen.
Thanks!! I really enjoyed the way you explained the concept.
For dozens of years of my life as a programmer, I've mostly used the Sort() method on almost all C# collections. I never cared how it works. You have explained the subject very well, mate. Thanks
My senior who works at an MNC suggested your channel for system design videos, and trust me when I say that this channel has great, great content for budding freshers.
A big fan of yours 🙌🏻
Hey everyone! This video is meant to explain why O(n) is impossible for a sorting algorithm.
I try to keep metadata as relevant as possible, but I have to acknowledge market realities. The titles have to be catchy and we all love drama. Thanks for all your feedback. Cheers!
Gaurav, I didn't get that part about God at 5:18; I think you mixed up two N's.
First you said that if N is a single number, you need log(N) time to say it to God, but after some time you mixed that log(N) with the original N, i.e. the number of elements in the array. Isn't it?
Yes, but they are the same in this case. When we choose an element to put into the sorted array, we are effectively telling God an index from the original array. This index can take values from 0 to N-1, which is N choices.
So we need to say a number up to N, which takes log(N) time. And this number N is equal to the number of elements in the array.
Once an element is removed, we have N-1 choices, and God can now be told any number from 0 to N-2 to place in the second position. This requires log(N-1) time. And so on.
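Summing those per-position decision costs is the same as taking the log of the number of permutations; a quick numeric check (the value of n is mine, chosen for illustration):

```python
import math

n = 1000
# log2(N) + log2(N-1) + ... + log2(1) = log2(N!)
decisions = sum(math.log2(k) for k in range(1, n + 1))
assert math.isclose(decisions, math.log2(math.factorial(n)))
print(decisions)           # ~8529 total binary decisions
print(n * math.log2(n))    # ~9966, the n*log2(n) upper envelope
```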
@@gkcs Ok, so just like insertion sort but rather than iterating for that min/max value god will help us , nice!!
Yup!
Hey, can you please explain, or share a source for, why radix sort is not of order n? And what about bucket sort?
Haha, laughed at the start looking at so many YOUs. Simple and neat explanation. I've been watching your videos lately and guess what? You have a subscriber ;)
Thanks to your videos. You're one of the reasons I'm placed today, that too within 15 days since the inception of placement season in my college. I used to suck at coding, now I can safely say that I'm amongst the top 20 coders of my class(PS: my college(VJTI,Mumbai) has very good coders ) . I am placed in Accolite (which recruits only 60 engineers all over India , I was amongst the 6 people they chose out of 300 students in a pool campus process,went through 1 written coding round , 3 gruesome tech rounds and one HR round ). I went till the last round of Morgan Stanley (top 5,but they chose only 2 sadly) and your system design videos helped me a lot through the process.
Now I've started competitive programming and I'm enjoying it a lot.
From a person who didn't get a single internship last year, to being amongst the first students to be placed this year. You've inspired me.
Thank you!
Keep up the good work! :)
I am super impressed Janhavi! Congrats! 😍
Hi Gaurav. I really like your content, and this video is no exception. I understand the marketing realities present on YouTube force one to make "clickable" titles. I really don't want to see smart channels like the one you run devolve into the Buzzfeed of YouTube. Granted, this is a far cry from the typical clickbait, but I just don't find those tactics to be genuine.
I wouldn't take the time to comment on other channels employing these same tactics, I just feel like the quality of content you usually produce is above the clickbait tactics. Hope you take this as constructive criticism, and not as a personal attack.
I will, thanks for the feedback! 😁
He has mentioned that in the comment section. Don't watch it if you don't find it useful. Gaurav, keep it up, I loved this video. "Why do we learn this?" 😎😎😂😂😂
This was a click-bait.
Yeah click-bait, but good video
Thank you very much Gaurav, I was surfing for this and you gave the best concise explanation. Cheers to your efforts behind this!
Thank you!
Nicely explained, but that God part is a little confusing. Instead of that, it would be better to say:
1) Comparisons will be at most the height of your decision tree
2) The number of leaf nodes of this tree would be n!, i.e. the number of permutations possible
3) And h >= log(number of leaf nodes)
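Written out, with the standard lower bound on log(n!), the three steps above give exactly the video's conclusion:

```latex
h \;\ge\; \log_2(\#\text{leaves}) \;\ge\; \log_2(n!) \;=\; \sum_{k=1}^{n} \log_2 k \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n)
```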
I never thought of this concept before. Thank you for making such a good video and sharing
Do you understand this concept? OK, then tell me: which is the fastest sorting algorithm?
@@poosaipandiv4947 Counting Sort.
I did not understand the part where you started saying "you can represent a number with log n +1"
What does it have to do with the permutation?
Brilliant explanation of the derivation of fastest algo for sorting
I love the content! Thank you!
I came here for System Designs ! But omg now I have started learning logarithms, such a wonderful video ... especially where u prove fast sort cannot exist!
Thank you!
This guy makes most precise videos..
You are amazing bro I wanna learn programming because of your content and I’ve been active with user interface design and you changed my life I hope you know that
More correctly, I would say "the best run time of comparison-based sorting algorithms is n*log(n)". Integer-based sorting algorithms such as counting, radix & bucket sort run in order N asymptotically. But not quite O(n): the best known is O(n*sqrt(log log n)), and it's a hot research area at the moment; we can't claim it's impossible without a proof.
Where can I find any info on this? What is the name of the concept algorithm?
Best intro ever in an Algorithms video!
😎
Well, this was really interactive.
Editing was better this time. I liked your idea. :)
I subscribed immediately. I am glad that YouTube recommended this. I have seen your video with Anshika Gupta where you were saying "Agar 8 semester me nahi padha to kya padha". You are too sarcastic. Really liked the way you teach. 🙏 Thank you, Gaurav Sir
Ohh .... That was really cool 😍
Can you discuss the multiprocessor part to reduce the complexity to O(logN) again ?
I'll try in one of the future videos 😋
@@gkcs If we use N processors (likewise, you said log(n) processors achieve O(N)), can't we achieve O(log(n)) like this? Why do we need infinite processors then? Or similarly, with n*log(n) processors, can't we achieve O(1) time to sort the array (which is obviously impractical)? Then why, even using infinite processors, do we still only achieve O(log(n)) time?
Thanks
@@shubhygups That's not practical lol unless you have 4 items to sort. Then bogo sort isn't even that bad.
Well, any comparison sort, where one compares the elements to each other, cannot be done in less than O(n*log(n)) time!
A non-comparison sort like bucket sort can run in O(n).
(The special conditions required for bucket sort are:
1) The values to be sorted are evenly distributed in some range min to max.
2) It is possible to divide the range into N equal parts, each of size k.
3) Given a value, it is possible to tell which part of the range it is in.)
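Under those three conditions, a bucket sort sketch looks like this (a rough illustration with made-up parameter names; average O(n) when each bucket ends up holding O(1) items):

```python
def bucket_sort(values, lo, hi, n_buckets):
    """Distribute values into equal-width buckets over [lo, hi), then concatenate."""
    buckets = [[] for _ in range(n_buckets)]
    width = (hi - lo) / n_buckets
    for v in values:
        # condition 3: computing which part of the range v falls in is O(1)
        idx = min(int((v - lo) / width), n_buckets - 1)
        buckets[idx].append(v)
    out = []
    for b in buckets:
        out.extend(sorted(b))  # tiny buckets -> near-constant work each
    return out

data = [0.42, 0.07, 0.91, 0.33, 0.58]
print(bucket_sort(data, 0.0, 1.0, 5))  # [0.07, 0.33, 0.42, 0.58, 0.91]
```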
For me, I loved the explanation cos I'm kinda new to sorting algorithms. Thanks for sharing
Wooaah! Keep growing man!!
great video bro
Thanks!
Me: Trying to print output correctly.
GS: Today we'll sort in O(n). :D
Tricked me into understanding something more fundamentally. Not even mad.
Great explanation !
Thanks!
Best video 👍 thanks for sharing your knowledge
This was a great video
On a completely unrelated side note, you look like Kanan Gill and Biswa Kalyan Rath had a child.
Hahahaha
I liked the old saas-bahu "No no no" part.... btw Great work buddy!
Hehehhuhuhuhahahhaha! Thanks!
Good video got some insights on this minimal time complexity
Great video, man! Keep it up!
That was me when I learnt sorting yrs ago lol “ why are we learning this?”
Gaurav Sen found a friend good in video editing XD
One of the Gaurav Sen's in the video helped me :P
😆
@@gkcs very soon we will ask you to teach us video editing.
No he learnt shadow clone jutsu from naruto. :P
Nice, sir, the way you've presented the technique: learning with fun
Nice video Gaurav, but I did not understand the transition from the usage of log10 for digit counting to log2 usage when unravelling the factorial. Would the digit count example be applicable for base 2 numbers?
Or is the O(n log2 n) the minimum time complexity because you have to do 2 choices for each item (either stay put or move in relation with its compared item)?
Thanks!
The base doesn't matter in order complexity analysis. log10 n = (log2 n)/(log2 10).
1 / (log2 10) becomes a constant factor.
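That constant-factor point is easy to check numerically (a two-line sanity check, not from the video):

```python
import math

n = 1_000_000
# log10(n) is just log2(n) scaled by the constant 1/log2(10)
assert math.isclose(math.log10(n), math.log2(n) / math.log2(10))
print(math.log2(10))  # ~3.3219, the constant factor that big-O hides
```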
This is high level punch for cs students. I will prove you wrong just because of this video. Jai mahishmathi
Hahaha!
State Space Search To Prove This! Awesome 🤘
The first few seconds cracked me up.... Very nice video
Oh my god! So clear and easy! Thank you very much!!
I can make an Unsorting algorithm in O(1) 😜 Great video Gaurav!
Your proof is valid; however, it only applies to comparison-based sorting, since those algorithms rely on the relative ordering of elements.
Non-comparison-based sorts are built on the idea that you can place a piece of data with no knowledge of any other piece of data except the one you are ordering.
It is this fact that permits non-comparison-based sorting to have a better worst-case time than O(n log n).
The limit for a non-comparison-based sort is O(n), since you would have to access each data member at *LEAST* once during the sort.
Proof:
Let X(n) be an unordered data set of n items, and let Y(n) represent the ordered data set of X(n)
Let H(v) be an ideal hash function which maps a member of X(n), v, to its corresponding index in Y(n)
If O(H(v)) = O(1), then the number of steps needed to create Y(n) would be:
S = O(H(X(1))) + O(H(X(2))) + ... + O(H(X(n)))
  = O(1) + O(1) + ... + O(1)    [n terms]
  = n * O(1)
  = O(n)
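For small integer keys, counting sort is a concrete realization of such an H: the value itself tells you where it belongs, no comparisons needed (a sketch under the assumption that the key range k is O(n)):

```python
def counting_sort(arr, max_key):
    """O(n + k): each element's destination is computed from its value alone."""
    counts = [0] * (max_key + 1)
    for v in arr:                      # one O(n) pass to tally keys
        counts[v] += 1
    out = []
    for key, c in enumerate(counts):   # one O(k) pass to emit them in order
        out.extend([key] * c)
    return out

print(counting_sort([3, 1, 4, 1, 5], max_key=5))  # [1, 1, 3, 4, 5]
```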
yasss
That's correct, thank you for pointing it out :)
Thanks Homer Calcalavecchia sir.
What about collisions in the hash function? Those can end up O(n) instead of O(1), so we need a completely collision-proof hash function 👍👍👍 To achieve this, and I don't think one exists (not sure)
@@khushitshah979 he already stated .." 'Ideal' hash function"
great video..keep sharing your knowledge:)
Though it's a clickbait he proved that there's a mathematical limit to how fast we can sort with some help from wikipedia article, I'd say he's smart.
Why are we learning this hehehe😂😂😂thug life
Srsly yooo... what the heck... I had lots of expectations for fast sort in the beginning. Anyway, that was really an amazing way to prove that O(n) is not possible while sorting.
It's possible, just get a quantum computer; you can do it in O(sqrt(n)) time
I think radix sort is O(n), though that is the running time in a mechanical sorting machine if you discount the gathering of the cards and resetting the machine to run the next digit. Though if you have 100,000 cards sequentially numbered then put out of order, I think there should be 500,000 comparisons and five gatherings.
Absolutely loved your video! 🔥
You forgot to maintain an important point. The lower bound is Ω(nlogn) for comparison sorts. It can be less than that for other classes of sorting algorithms.
Sir. Love this video.
Pimp daddy max!
Radix sort can practically do it in O(n)
He just proved mathematically that Radix sort does not exist! At 8:04.
Who are you going to believe, him or your lying eyes?
One "simple" enhancement to a sorting algorithm that only grabs the largest (or smallest) number each pass is to grab both. For example. suppose we had ten scrambled numbers to sort between 1 and 10 (such as 7,4,6,1,8,10,5,3,9,2). The "hi-lo" sort algorithm would make a pass thru all 10 numbers and would record the position of the lowest and highest numbers (position 4 = 1, position 6 = 10). It would then put the 7 in temp memory (since we need to move the 1 there) and would also store the 2 in temporary memory (since we need to move the 10 there). The new array (after the first pass) would be 1,4,6,7,8,2,5,3,9,10. Now we can sort the subarray (4,6,7,8,2,5,3,9) since we know the 1 and 10 are already in the proper positions. I am calling this the hi-lo sort. It is a good start to try to improve on such as maybe try picking off the 2 highest and the 2 lowest numbers each pass.
I realize this is somewhat similar to the cocktail shaker sort except that the direction can always be low to high index in the array of numbers, thus perhaps making it slightly easier to implement.
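The "hi-lo" idea above reads as a double-ended selection sort; a sketch of one possible implementation (still O(n^2) comparisons overall, just roughly half the passes):

```python
def hi_lo_sort(a):
    """Each pass moves the min of the unsorted middle to the left end
    and the max to the right end, shrinking the middle from both sides."""
    a = list(a)
    lo, hi = 0, len(a) - 1
    while lo < hi:
        i_min = min(range(lo, hi + 1), key=a.__getitem__)
        a[lo], a[i_min] = a[i_min], a[lo]
        i_max = max(range(lo, hi + 1), key=a.__getitem__)  # search after the min swap
        a[hi], a[i_max] = a[i_max], a[hi]
        lo += 1
        hi -= 1
    return a

# The comment's example: after the first pass the 1 and 10 are in place.
print(hi_lo_sort([7, 4, 6, 1, 8, 10, 5, 3, 9, 2]))
```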
Nice proof, never thought in this direction. You explain like Biswa Kalyan Rath, so it was more fun.
Hahaha, I get that a lot 😛
Yes, many different algorithms can all give O(n) performance in the "best case" scenario, as listed here: en.wikipedia.org/wiki/Sorting_algorithm
But most of the time this is not achievable, so the best we can get is average performance, which is usually worse than O(n).
Hi Gaurav, I like your explanation. Thanks for such a knowledge transfer. It was really interesting.
Broo first time I see your video
Really enjoyed and well understand the concept.
I think this is the first video which is watched without skipping.
Thank you!
Woaah. You got me with the title. I guess python's "list.sort()" or quite famously known as "timsort" is faster than any other sorting module present in any language. Is it so?
I am taking a video on Timsort next 😋
Java has opted for timsort as well
Awesome intro and Amazing lecture sir !
Highly informative , especially the beginning and the repetitive edits near the end😁
Thanks YouTube for the recommendation.
Saw this JIT!! Nice video explaining why O(n) sorting doesn't exist.
P.S: The introduction was next level swag!! Keep such things more as it makes it more interesting.
Yey!
If you define a constant value for the log(n) variable it will be n log C, no matter how big the C is, the time complexity will always be O(n)
h3h3h3
Thats basically how counting sort and radix sort works also
do you store the index of the unsorted array into the sorted array and just replace the index from the sorted array with the position of the unsorted array?
Hey bro damn nice😂😂.....seriously love such videos
The fastest pairwise comparison algorithm is O(N*log(N)) and the proof is similar to what you've shown. Basically, a comparison can be enough information to reject about half of remaining possible permutations of numbers.
Radix sort (using counting and indexes, not buckets) is the fastest in practice for tens of thousands or more elements and leaves other sorting algorithms in the dust. It IS almost linear in practice, since an array of 32 bit numbers requires just 4 passes, while 64 bit numbers require 8 passes. It is also pretty cache-friendly. Mathematically, n! can be less than or equal to a^b for some a,b
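That byte-at-a-time pass structure can be sketched as follows (a bucket-list version for clarity; the counting/index variant the comment describes avoids the intermediate lists):

```python
def radix_sort_u32(arr):
    """LSD radix sort for unsigned 32-bit ints: exactly 4 stable passes, one per byte."""
    for shift in (0, 8, 16, 24):
        buckets = [[] for _ in range(256)]
        for v in arr:
            buckets[(v >> shift) & 0xFF].append(v)   # stable: order kept inside a bucket
        arr = [v for b in buckets for v in b]
    return arr

print(radix_sort_u32([0xCAFE, 7, 0xBEEF, 42]))  # [7, 42, 48879, 51966]
```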
So it's something like: just as we can't travel at O(speed of light), we can't sort an array with complexity less than O(n*log(n)) 😁😁
I made a supposedly "clickbait" video on this. So I took another dig at it 😛
@@gkcs hahaha u r simply amazing !!
Very well done
Nice work
Very interesting gaurav Bhai !
7:30 nice prank
This past weekend, just to keep myself sharp, I benchmarked a bubble sort, a quick sort and a radix sort on varying numbers of random integers. The bubble sort did as badly as expected. But on 1 million elements, the radix sort was 5 times faster than the quick sort. Essentially you're saying that I can't do ... what I did. Now, you didn't say "fastest comparison sorting algorithm." Arguably the radix sort is in a different class in that it's not comparing the values directly against the others. But that's not the point: the data ends up sorted, and in 1/5 the time of what you're saying is the fastest theoretically possible method.
Somebody said that it couldn’t be done
But he with a chuckle replied
That “maybe it couldn’t,” but he would be one
Who wouldn’t say so till he’d tried.
So he buckled right in with the trace of a grin
On his face. If he worried he hid it.
He started to sing as he tackled the thing
That couldn’t be done, and he did it!
I get a little weary of people who say things can't be done. I've already done so many things that can't be done.
Very True @Argle
@Gaurav your videos are great, but maybe this one needs a little review because, as @Argle mentioned, Counting Sort and Radix Sort are on a different model of computation, i.e. the direct-access-array model, in contrast with the comparison model of computation.
You mentioned in your video that Counting Sort and Radix Sort can effectively take N*log(N) time due to a big key range, but representing numbers in tuple format to minimize the range of keys, and then sorting them using the direct-access-array paradigm, will effectively make it O(N), in my understanding!
Multiple passes of sorting tuples is a constant factor because we can know how many passes we will require for the given array.
I think this topic/claim demands an in-depth follow up video, what do you think?
Lecture from MIT 6.006 would definitely help here!
czcams.com/video/yndgIDO0zQQ/video.html
Awesome video! Thank you!
is it possible to use quantum computing to create a better sorting algorithm?
Did i just watch a 9 min video just to get trolled at the end? ☹️
it was just 13th second and you earned my like, well nice content, thanks for sharing
Thank you!
Wow..! though I am from "VLSI Physical Design" background I like "Algorithms" a lot..Really you are doing a great job bro..
All the best bro...
Thanks Lalith! 😁
It's not a clickbait he really sorted our concepts very fast
The true O(1) sort is Intelligent Design Sort: assume the list was created by an almighty creator and is therefore already perfect as it is. Done.
Me : don't even know how to use while loop properly
Watching ** fastest sorting method**
I couldn't understand at 5:25.
When we say "3" for the 1st position, it takes log(3) time, not log(n) time. Here 3 can be any number and need not always have n digits. What am I missing?
Is it that when we say the 1st number, we are implicitly referring to the whole list of n numbers along with it?
At 7:00, we can only do that if the base of the log is 2. But in this case we took it as 10, right? So how is that even possible? Correct me if I'm wrong.
You may achieve O(n) time if you do not rely on a comparison based sort. Radix and bucket sort are both O(n) in time complexity. Technically they are
O(n * log(n)). There is an important distinction to make though. With merge sort and quick sort, the base of the logarithm is 2, while the base of the big O notation described previously is base 10. This is important because in big O analysis, constant factors are essentially ignored. Therefore the log(n), read "log base 10 of n", is ignored in the runtime analysis, giving you a runtime of O(n).
This is a short explanation, but please do not listen to this man. Runtime analysis always refers to a single processor, ignoring cock speed and core count. This is because these factors will improve all performance across the board, and runtime analysis, specifically big O runtime analysis, is interested in the worst-case running time of an algorithm.
Top comment material.
Gotta love that "cock speed" and the big O. Otherwise a good comment :).
log(n), no matter the base, is not a constant, therefore NOT ignored.
The actual base of the log IS irrelevant, since logA(n) = C*logB(n) for some constant C depending on the values of A and B (e.g. ln(a) = 2.303*log10(a)).
When you have polynomial complexity, lower powers than the highest in the polynomial are irrelevant. eg N^3+N^2 boils down to N^3, which is why we speak of O(nlog(n)) rather than O(nlog(n)+n+C).
Under no circumstance can you say O(nlog(n)) boils down to O(n)
0:07 am I racist or are they all the same guy?
XDD
Hey, that's racist! 😂
Hey you are doing well. And I really like this thing. Can you make a video which can explain time complexity and space complexity both?
For the 1st element to be sorted, it takes log(n) time? How? How is it related to the number of digits when we are talking about an array here?
O(n) can be achieved.....a random number generator is created and seeded randomly, using that, the full set of the sort can be unraveled in order based on the seed cycle. Destroy all universes that are wrong, and we're done.
That's interesting!
Of course, only if we can destroy all the universes in O(n) or lesser...
Didn't get it could please explain it more specifically if possible with a short example
Next up, do the slowest sorting algorithm ever. Bogosort is slow, but there are slower ones! I've personally designed what I believe is the slowest optimal (without doing unnecessary steps) sorting algorithm ever, QuantumSort:
1. Check if data is sorted and if it is goto 3.
2. Goto 1.
3. Return data and exit.
How does it work? Simple: over time, random quantum events (such as cosmic rays) will modify the data in RAM until it's sorted.
Interesting. Won't the heat death of the universe happen first?
Gaurav Sen Most likely yes
Does Bogo sort have a potential of O(1) in the best case scenario or is it still O(n)?
I just wrote a sorting suite to compare the fastest algorithms. Insertion sort is fast only if you avoid one extra comparison; each comparison matters. Quicksort performed well; for reverse-sorted input it was slowest, but that is easy to work around to make it the fastest sort. This added one additional comparison in each iteration, but overall performance is excellent.
That Thug Life start was really amusing!