
Books: Out of their Minds: The Lives and Discoveries of 15 Great Computer Scientists

I just finished reading Out of their Minds: The Lives and Discoveries of 15 Great Computer Scientists. In short, I liked it. I wouldn't say I liked it as much as, say, Coders at Work, but I'm glad I read it. What I liked most about it was that it contains biographies of really early computer science pioneers such as Ada Lovelace, John von Neumann, John Backus, and John McCarthy. I know computer history after 1970 really well, but this book contains a lot of stuff from before 1970.

I jotted down a few interesting tidbits while I was reading the book. However, since I read the Kindle version of the book, I only have percentages, not page numbers. Anyway, I hope you're as entertained by some of these as I was.

John Backus, who led the team that created Fortran at IBM, flunked out of college [2%]. He had a metal plate installed in his head [3%]. He disliked calculus but liked algebra [3%] (just like me!). These days, Backus is a proponent of functional programming [7%].

John von Neumann, who helped establish the fundamentals of computer architecture, thought that creating a programming language (i.e. Fortran) was a waste of time since programming wasn't a big problem [4%].

Ada Lovelace, who was the first programmer (not to mention, a girl!), was a gambler, an alcoholic, and a cocaine addict. She died of cancer at the age of 36. She is credited with inventing loops and subroutines [5%].

Fortran only had globals. Algol, which is considered an ancestor of C, added locals, thus permitting recursion [6%].

McCarthy, who designed Lisp, was born in 1927 to Communist party activists. He had an Irish Catholic father and a Lithuanian Jewish mother [8%]. McCarthy is the reason Algol had recursion [11%]. (I didn't know that C got recursion because of Lisp.)

Alan Kay, who did pioneering work on object-oriented programming and helped create Smalltalk, got thrown out of school for protesting the Jewish quota [15%].

Edsger W. Dijkstra, who did influential work on a lot of early computer science problems such as concurrency, did very well in school and wanted to turn programming into a respectable discipline [21%].

Fred Brooks, who wrote "The Mythical Man-Month", wrote this about iterative development:
In "The Mythical Man-Month" I said build one and throw it away. But that isn't what I say anymore. Now I say, build a minimal thing--get it out in the field and start getting feedback, and then add function to it incrementally. The waterfall model of specify, build, test is just plain wrong for software. The interaction with the user is crucial to developing the specification. You have to develop the specification as you build and test.


Sam Rushing said…
Yeah Fred Brooks! 100% agree.
BTW, if you haven't read "Soul of a New Machine", now would be a good time.
jjinux said…
I need to read that. It's on my TODO list, and it's gotten 3 recommendations from my friends.
