Note: I want to cover a wide breadth of Data Structures & Algorithms topics this semester, so I’m only going to touch on concepts that I either a) believe are essential to understanding the subject, or b) struggled to find accessible resources for on the Internet. Once all the essentials have been covered, I’ll return to do articles on more nuanced topics (such as Sororitree Rush).
Linked list operations in this article are limited to ones that are used when implementing stack and queue data structures. Check out this great post on linked lists for a more robust description.
Have you ever wondered about how the Back and Forward buttons on your browser work? Or how your favorite clothing website determines your place in line during an online sale?
Stacks and queues, that’s how. Your browser uses a stack data structure to implement the changes in state as you toggle the buttons on the upper left corner. That clothing website uses a queue data structure to stick your access request in line, and only lets you into the store when someone else comes out.
Stacks and queues are separate data structures, but have similar operations and are implemented in similar ways.
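To make the browser and clothing-store examples concrete, here's a minimal sketch (my own toy model, not any browser's actual implementation): the Back button pops pages off a stack, and the store's waiting line is a plain first-in-first-out queue.

```python
from collections import deque

# Toy model of browser history: two stacks (back and forward).
back_stack, forward_stack = [], []

def visit(page, current):
    """Navigate to a new page: the old page goes onto the back stack."""
    if current is not None:
        back_stack.append(current)
    forward_stack.clear()  # a fresh visit wipes the forward history
    return page

def go_back(current):
    """Pop the most recent page off the back stack (last in, first out)."""
    forward_stack.append(current)
    return back_stack.pop()

current = visit("home", None)
current = visit("products", current)
current = go_back(current)
print(current)  # "home"

# Toy model of the store's waiting line: a queue (first in, first out).
line = deque(["alice", "bob", "carol"])
line.append("dana")    # you join at the back of the line
print(line.popleft())  # "alice", who arrived first, gets in first
```

Notice the two access patterns: the stack undoes the *most recent* action, while the queue serves the *oldest* request first. That difference is basically the whole distinction between the two structures.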
Earlier today, I was doing a coding challenge that asked me to sort an array of integers in better than O(n log n) time.
I wondered briefly whether or not it was a trick question. The sorting algorithms I knew were all O(n log n) algorithms — could there be something faster?
It turns out that there was. Counting Sort, one of the more sophisticated sorting algorithms I’ve seen, runs in linear time, and it sorts elements in a really cool way.
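Here's a minimal sketch of the idea (my own illustrative version): instead of comparing elements to each other, you count how many times each value appears, then rebuild the array in order. That takes O(n + k) time, where k is the range of values, which is why it can beat the O(n log n) bound that applies to comparison-based sorts.

```python
def counting_sort(nums):
    """Sort integers by counting occurrences of each value.

    Runs in O(n + k) time, where k is the range of values.
    No element-to-element comparisons happen at all.
    """
    if not nums:
        return []
    lo, hi = min(nums), max(nums)
    counts = [0] * (hi - lo + 1)
    for n in nums:
        counts[n - lo] += 1  # tally each value
    result = []
    for offset, count in enumerate(counts):
        result.extend([lo + offset] * count)  # emit values in order
    return result

print(counting_sort([4, 2, 2, 8, 3, 3, 1]))  # [1, 2, 2, 3, 3, 4, 8]
```

The catch is that k has to be reasonably small: sorting ten integers scattered between 0 and a billion would need a billion-slot counts array.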
Trees — they’re all around us in real life, providing us with fresh oxygen, giving us comfortable places to sit under and read, and acting as cool backgrounds to pose by.
(Okay, maybe that last one’s a me thing.)
As it turns out, trees are also all around us in the digital world. They’re the underlying data structures behind most file systems, library catalogs, 3D video games, network routers, music compression algorithms, and more. We just can’t see them, so we often don’t know that they’re there.
When was the last time that you sorted something?
Maybe you accidentally dropped your freshly printed Parisian Art History final paper all over the floor and had to put it back in order according to the page numbers on the upper right-hand corners. Maybe you were making door signs for all the members of your sorority, and decided to put them in alphabetical order. The last time I sorted something by hand, I was organizing all of the dresses in my closet by color.
Take a step back and recall your last sorting memory. How would you explain your process to an intelligent alien? (“Umm, I just started picking things up and putting them in order” won’t cut it, sorry not sorry).
When you really take the time to think about it, sorting becomes pretty complex and hard to put into words. And there are so many ways to do it.
Welcome to the world of sorting algorithms.
When I first started studying algorithms, I had an incredibly hard time because of two things.
One, I didn’t have a very strong mathematical foundation, and would struggle to understand the reasoning behind certain algorithms. Two, I was just plain bored. “Implement insertion sort and time your algorithm on input arrays of size 10,000…” Ugh! I approached my assignments with the same sort of dread that one normally reserves for seeing disagreeable relatives during the holidays: Yeah, sure, I’ll bang out this code as quickly as possible and retreat to my room to watch Rick and Morty the first chance I get.
With an approach like that, studying algorithms was painful. Even worse, the knowledge didn’t stick. I couldn’t remember a damn thing after I’d handed in an assignment. Insertion sort? How was that different from Merge Sort, or Selection Sort, or, God forbid, BogoSort?
I know what you’re thinking — “Big-O Time Complexity” sounds like the punchline of some nerdy that’s-what-she-said joke.
(I’m That Kid who snickered loudly at the term “Big-O notation” when it was first introduced sophomore year, so I won’t judge if you do the same.)
Big-O is actually one way to measure the amount of resources needed for an algorithm to run (also known as an algorithm’s complexity). “Resources” usually refers to time or space in memory. For simplicity’s sake, I’m going to be writing about time complexity only, so for the duration of this post, you can think of Big-O as a way to measure the amount of time an algorithm needs to run, depending on the size of its input.
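As a quick taste of what that means in practice, here's a toy example of my own (not from any particular course): two ways to check a list for duplicates, one quadratic and one linear. Both give the same answer, but their running times grow very differently as the input gets larger — and Big-O is the vocabulary for describing exactly that difference.

```python
def has_duplicates_quadratic(items):
    """Compare every pair of elements: roughly n * n steps, so O(n^2)."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """One pass, remembering what we've seen: roughly n steps, so O(n)."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

print(has_duplicates_quadratic([1, 2, 3, 2]))  # True
print(has_duplicates_linear([1, 2, 3, 4]))     # False
```

On a ten-element list you'd never notice the difference; on a ten-million-element list, the quadratic version could take hours while the linear one finishes in a blink.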
If this all sounds hella confusing but also slightly interesting, good! Keep reading. Information sticks way better when you’re curious.
When I walked into Algorithms class earlier this semester, my first thought was “Oh God.”
I had not been a fan of Discrete Mathematics, the precursor to the course, and the thought of having to learn these data structures and algorithms — and implement them programmatically — made me want to hide under a large rock and never come back out again.
Then I dove into the class, and realized that algorithms are really cool, and understood everything right away, and got a great job in Silicon Valley, and saved up money over the years and bought a nice house on the water where I lived happily ever after with my 50 cats.