
Computer Science: Faster, Smarter, More Robust

My wife and I like to play video games together cooperatively. Generally, she plays the exploratory parts, and I play the parts that require quick reflexes.

Recently, we've been enjoying a game called iNinja for the GameCube. (Thanks Ben Bangert!) As we neared the end of the game, we reached a level called "Egg Shell Skull" that seemed impossibly hard. We each spent a couple hours trying to beat it, but to no avail. It required 100% accuracy, very fast reflexes, and a little bit of multitasking. I could "program myself" to have fast reflexes with high accuracy, but every time I mixed in the multitasking, my reflexes went out the window.

Finally, I had the idea of playing the level together. I used the left side of the controller which involved ducking with 100% accuracy and very fast reflexes, while she focused on the right side of the controller which involved planning, combos, and monitoring a timer. After several tries, we finally beat the level :)

Since this is a purely technical blog, you might have guessed that I'm not trying to focus on how great a gamer I am ;) I'm trying to point out something. In a real-time system, you don't always have enough clock cycles for the best algorithm possible. That's doubly true for "coding in wetware". I was able to program my brain to have high accuracy and low latency, but only if the task was very, very simple. There's just not much decision making you can do in low latency situations. This is backed up by what little I know about the amygdala and the neo-cortex.

There's another case of this pattern that I find enlightening. I saw a talk from an air traffic controller, and he was explaining that there are three systems that are used simultaneously to help planes avoid hitting each other.

The lowest-level system is very stupid and robust; it kicks in when two planes are about to collide, and it's in charge of implementing the correct "duck!" algorithm. The next system is in charge of avoiding a "loss of separation" within the next 5-7 minutes or so; it's more complicated, but it doesn't have to worry about wind, etc. The highest-level system is in charge of avoiding a loss of separation within the next 12 minutes; it's extremely complicated and has to account for things like wind. Naturally, these separate systems have separate code bases.
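The layered idea above can be sketched in code. This is purely illustrative, not real avionics software; the thresholds, class names, and responses are all my own assumptions. The point is only that the layer with the shortest time horizon is the simplest, and it always takes priority when time runs out:

```python
# A minimal sketch of layered control: three independent systems with
# different time horizons. All thresholds and names are illustrative.

from dataclasses import dataclass

@dataclass
class Situation:
    minutes_to_conflict: float
    wind_speed_knots: float = 0.0

def reflex_layer(s: Situation) -> str:
    # Dead simple and robust: the "duck!" algorithm.
    return "evasive maneuver now"

def tactical_layer(s: Situation) -> str:
    # Avoid loss of separation in the 5-7 minute window.
    # More complicated, but can ignore wind.
    return "adjust heading to restore separation"

def strategic_layer(s: Situation) -> str:
    # Plan roughly 12 minutes out; must model wind and other factors.
    return f"reroute, compensating for {s.wind_speed_knots}-knot wind"

def avoid_collision(s: Situation) -> str:
    # The applicable layer with the shortest horizon wins.
    if s.minutes_to_conflict < 1:
        return reflex_layer(s)
    elif s.minutes_to_conflict < 7:
        return tactical_layer(s)
    else:
        return strategic_layer(s)
```

Notice that the simplest layer does no planning at all, just as I couldn't do any real decision making when I was "programmed" for low latency on the controller.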

In the future, I wonder how often we'll see cases like this. Sometimes, neither the simple, fast, and stupid solution nor the elegant but complex solution will do the trick. Sometimes you need both--just like the human brain needs both.


Anonymous said…
Very nice entry. Interesting and an inspiration for a current little programming project of mine.
Cheers, Jan
writeson said…
Really like this entry, though it completely reinforces my belief that driving and talking on a cell phone are in a mutually exclusive domain. I've also read articles about the programs on the space shuttle being kind of like this; complicated, but short and only responsible for this range of five minutes before another program takes over.
Anonymous said…
It brings about a zen-like realization... some things like these are the ones that you think about, know about, yet when you read it in words, it does bring out something.
jjinux said…
Glad you liked it guys :)
