
CST 370

CST 370 - Design and Analysis of Algorithms

Students learn important data structures in computer science and acquire fundamental algorithm design techniques to obtain efficient solutions to computing problems from various disciplines. Topics include the analysis of algorithm efficiency, hashing, heaps, graphs, trees, sorting and searching, brute force, divide-and-conquer, decrease-and-conquer, transform-and-conquer, dynamic programming, and greedy techniques.

Prerequisite(s): CST 238 and MATH 170 with a C- or better

Typically Offered: Fall, Spring

Units: 4

My Experience in CST-370 - Design and Analysis of Algorithms:

The class was both informative and invaluable.

I got some very useful, educational experience experimenting with algorithms in Java, and a little in C++, too.

Here are links to my journals; I put some explanations in them as I came across things worth noting.

Links:

As I said in my journal a few weeks back, Algorithms II was a spectacular class, and amazingly well taught. At first I thought it was a little silly how many of the lecturers discuss algorithms as if they were standing inside them, but the lectures, exercises, and quizzes combined gave us real knowledge of a whole array of algorithms I had only known vaguely until now. The lectures teach the concepts in a way you can really understand and walk away with.

One analogy that seemed strange at first was saying "I like it" whenever a conditional is satisfied during an algorithm's steps. Silly as it sounds, it is a great way to internalize how the algorithm moves. Take the quick sort lecture as an example (I think that is the one I am remembering): just think "I like it" when the comparison succeeds, whichever element turns out to be the greater one, and that little mnemonic keeps your brain moving past the comparison itself so you can fly through the algorithm mentally, see it working in your head, and then make it work in the lab and get the quiz questions about it right. I have to give the Algorithms II class an A+ for teaching algorithms so well; the level of understanding the lectures, labs, and homework provide surprised me.
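
To keep that mnemonic handy, here is a minimal quicksort sketch in Java with a Lomuto-style partition. This is my own reconstruction from memory, not the class's exact code, and the names are mine; the "I like it" comment marks the conditional the lecture was describing.

    public class QuickSortSketch {
        // Sort a[lo..hi] in place with quicksort.
        static void quickSort(int[] a, int lo, int hi) {
            if (lo >= hi) return;              // zero or one element: already sorted
            int p = partition(a, lo, hi);      // put the pivot at its final index
            quickSort(a, lo, p - 1);           // sort the left side
            quickSort(a, p + 1, hi);           // sort the right side
        }

        // Lomuto-style partition: the last element is the pivot.
        static int partition(int[] a, int lo, int hi) {
            int pivot = a[hi];
            int i = lo;                        // next slot for an element we "like"
            for (int j = lo; j < hi; j++) {
                if (a[j] < pivot) {            // "I like it": it belongs left of the pivot
                    int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
                    i++;
                }
            }
            int tmp = a[i]; a[i] = a[hi]; a[hi] = tmp;  // drop the pivot into place
            return i;
        }

        public static void main(String[] args) {
            int[] a = {5, 2, 9, 1, 7};
            quickSort(a, 0, a.length - 1);
            System.out.println(java.util.Arrays.toString(a));  // [1, 2, 5, 7, 9]
        }
    }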

As interesting as the class was, there weren't a lot of graphically exciting screenshots to take, so I just grabbed two from Repl.it. The class homeworks were about passing input/output programming tests on Repl.it: you get the assignment, read the specs, watch the class lecture, do the reading, and put all the clues together to write a small program in either Java or C++. Each program uses a custom algorithm we design ourselves, built on concepts from the class (like the pivot from quicksort) and tied to the famous algorithms, but shaped to meet the specifications.

Algorithms II reminded us of several algorithms I really like; they're all interesting, and some of my favorites are:

  • Quick sort - my latest favorite after studying the class video on how it works; I understand it far better than I think I would have without that video. The instructor clearly understands it deeply and conveyed it in a way that let me think about it in simple terms.
  • Merge sort - conceptually very interesting; I used the idea the other day to multiply numbers together in a spreadsheet (Gnumeric), in a way where I could see each factor get folded into the product in tree form.
  • Bubble sort - very easy to understand and remember for quick reuse.
  • Selection sort - also very easy to understand, a little longer and more complex than bubble sort, but it's in-place and also great to remember for quick reuse. It's still O(n^2), but it's useful for small data sets.
  • Heaps - very interesting and useful, and something I don't think I would have thought of on my own.
  • Depth-first search - the specific implementation they teach us is really neat and simple, so you can implement it without objects or anything complicated (see the sketch after this list).
  • Breadth-first search
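
Here is the kind of object-free depth-first search I mean, sketched in Java from memory; the adjacency matrix, the visit-order array, and the example graph are all my own stand-ins, not the class's exact handout.

    public class DfsSketch {
        static int[][] adj;      // adjacency matrix: adj[u][v] == 1 means an edge between u and v
        static int[] visited;    // 0 = unvisited, otherwise the order the vertex was reached
        static int counter = 0;  // running visit counter

        static void dfs(int u) {
            visited[u] = ++counter;              // mark u with its visit order
            for (int v = 0; v < adj.length; v++) {
                if (adj[u][v] == 1 && visited[v] == 0) {
                    dfs(v);                      // recurse into the next unvisited neighbor
                }
            }
        }

        public static void main(String[] args) {
            adj = new int[][] {
                {0, 1, 1, 0},
                {1, 0, 0, 1},
                {1, 0, 0, 0},
                {0, 1, 0, 0}
            };
            visited = new int[adj.length];
            dfs(0);
            System.out.println(java.util.Arrays.toString(visited));  // visit order: [1, 2, 4, 3]
        }
    }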

In discrete math (a different class, but what I am about to say is super relevant to learning algorithms) I learned that recursive formulas can be algebraically manipulated into non-recursive, closed-form versions, which can turn the efficiency upside down. An example is the Fibonacci value at n. The common-sense way is to write the Fibonacci definition, which is recursive, and evaluate it at n; or you can use (((1+sqrt(5))/2)^n - ((1-sqrt(5))/2)^n)/sqrt(5), and if you don't think that's right, I can prove it with induction.

Algorithms II reminded us of this and conveyed a quick version of induction as a mental mnemonic, so we can remember how it works and keep it with us. Switching from recursion to a simple equation in the case of the Fibonacci series means going from O(2^n) to constant time, which could mean the difference of seconds, minutes, hours, or even days or years, depending on how far out you push the recursive version. The non-recursive version is literally a tiny handful of computations (listed above, with all the sqrt(5)'s) that takes the processor a split instant no matter how large you pick n to be (within a normal range, anyway); it is effectively instantaneous versus potentially years of computation if you write it recursively. Which explains why it was a good idea for me to formally learn programming: before I learned all this, I would have written Fibonacci recursively. Now I know better, a lot better. The flip side is that I also think it pays to have decades of personal programming experience under your belt, because I can program at a rate you wouldn't believe; I can fly when programming because of how comfortable I am with tools I've literally been using for decades.
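
To make that concrete, here is a small Java sketch comparing the naive recursive Fibonacci with the closed-form version above. The class name and the timing printout are mine, and note that the double-precision closed form only stays exact up to roughly n = 70.

    public class FibCompare {
        // Naive recursion: roughly O(2^n) work, exactly the version I would have written before.
        static long fibRecursive(int n) {
            if (n < 2) return n;
            return fibRecursive(n - 1) + fibRecursive(n - 2);
        }

        // Closed form (Binet): a tiny handful of floating-point operations, no matter how big n is.
        static long fibClosedForm(int n) {
            double sqrt5 = Math.sqrt(5.0);
            double phi = (1.0 + sqrt5) / 2.0;
            double psi = (1.0 - sqrt5) / 2.0;
            return Math.round((Math.pow(phi, n) - Math.pow(psi, n)) / sqrt5);
        }

        public static void main(String[] args) {
            int n = 40;  // already slow for the recursive version, instant for the closed form
            long t0 = System.nanoTime();
            long slow = fibRecursive(n);
            long t1 = System.nanoTime();
            long fast = fibClosedForm(n);
            long t2 = System.nanoTime();
            System.out.println("recursive:   " + slow + " in " + (t1 - t0) / 1_000_000 + " ms");
            System.out.println("closed form: " + fast + " in " + (t2 - t1) / 1_000_000 + " ms");
        }
    }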

The Algorithms II class was invaluable, especially for how clearly it was taught. I really feel like I gained genuine understanding of a lot of algorithms because of the clarity of the teaching and the plentiful resources for digging deeper into how these algorithms work and when we should remember to use them in our programs.

One thing I will definitely remember to practice: you can replace a recursive definition with a closed-form formula, and prove the formula correct with induction.

The Algorithms II lectures may appear ordinary on the surface, but I'm here to say they are not: they are extraordinarily clear and enlightening. I think the reason is that the professor who made them understands the algorithms very deeply himself (this has been true for several if not most of my classes, perhaps all), so his explanations hit home. At first they may seem simplistic, and maybe that is part of it. It's hard to explain; all I know is that I've studied algorithms before, but when I heard these lectures, everything suddenly made a hundred times more sense than it ever had.

I can't pick a single favorite algorithm; I like several. Radix sort is really interesting, along with heaps, Floyd-Warshall (dynamic programming), depth-first and breadth-first search, binary search, bubble sort for its simplicity, selection sort, merge sort, and quick sort.

I got an idea recently. It's not related to the class, but I'm not writing a journal today, so I might as well put it here. The idea is similar in concept to self-documenting programs: self-providing, officially-integrated, manual-built-in, override-system programs. Tell me that is not the best idea you ever heard. It's a framework for programs, perhaps all kinds of programs, that intends from step zero to provide its own manual backup system for a smooth, very smooth, transition into disaster mode. Nobody wants any disasters, but preparedness is key to keeping the effects of a disaster from compounding. This is insurance for the future. It could come with its own system for re-entering all the manual data back into the computer system when the computers come back up.

I'm not trying to scare anyone, so put on your warm fuzzy hat, grab a cup of coffee, and hear me out a second. What I'm talking about is casual preparedness to protect humanity in case of disaster. Remember: fuzzy hat, coffee. Okay. Say a solar flare wipes everything. Keep your fuzzy hat on; this is hypothetical. The system I'm inventing here protects us from that: fuzzy hat or not, we'd have the papers showing what was on our computers. Wipe everything, go ahead, sun, Russians, whatever, we would be prepared. These new systems, this new framework for new systems, would provide a physical wood-pulp paper printout of all database files and a mechanism for scanning everything back in if and when we're ready. Say there's a database so important we really do not want to lose it: the system would print five copies and send each to a different location around the world. The whole world would have to explode before we lost the info. I guess we might want to consider time too, like the Egyptians; perhaps a paper-to-stone machine for the really important stuff, but that's a different invention. I will call this "automanual" for now. The domain name's taken, but they don't seem to know what to do with it; the page is blank, so, perhaps....

List of algorithms and techniques I practiced during the class:
  • Heaps
  • Floyd-Warshall (dynamic programming) - see the sketch after this list
  • Depth-first and breadth-first search
  • Binary search
  • Bubble sort
  • Selection sort
  • Merge sort
  • Quick sort, with custom partitioning
  • and more
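
Since Floyd-Warshall came up as the dynamic programming example, here is a minimal sketch of it in Java as I remember the idea; the example graph, the INF stand-in, and the names are my own, not the class's code.

    public class FloydWarshallSketch {
        static final int INF = 1_000_000;  // stand-in for "no edge" (small enough to add safely)

        // dist starts as the weighted adjacency matrix and ends as all-pairs shortest distances.
        static void floydWarshall(int[][] dist) {
            int n = dist.length;
            for (int k = 0; k < n; k++)            // allow vertex k as an intermediate stop
                for (int i = 0; i < n; i++)
                    for (int j = 0; j < n; j++)
                        if (dist[i][k] + dist[k][j] < dist[i][j])
                            dist[i][j] = dist[i][k] + dist[k][j];
        }

        public static void main(String[] args) {
            int[][] dist = {
                {0,   3,   INF, 7},
                {8,   0,   2,   INF},
                {5,   INF, 0,   1},
                {2,   INF, INF, 0}
            };
            floydWarshall(dist);
            for (int[] row : dist) System.out.println(java.util.Arrays.toString(row));
        }
    }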

Thank you, CSUMB, for the very insightful and very educational class! The material is extremely broadly applicable and useful, as well as extremely well presented. The class ended in mid-2022.

(Image: octo-skull-eyeball)

I did not draw that but I found it painted (in a secret spot) (somewhere .... in the world)
