Code Review

No, not that kind of code review. This is a review of the book, "Code - The Hidden Language of Computer Hardware and Software."

Quite simply, this is the best book I've read in years. (Disregard the fact that I can count on one hand the number of books I've read in the past few years -- technical references don't count!)

I'd like to discuss my incentive for buying the book; but, before I begin, let me share a story of what happened today. This week, I'm in a developer training class on campus that is geared for experienced developers who are new at Microsoft. It's really great -- I've enjoyed the interaction with so many different developers and the content has been invaluable. If it weren't confidential, I could fill days of blog material with some of the great stuff we've talked about.

During one of the breaks, a conversation started about different languages. It was noted that the bulk of the people in the class, and the bulk of the course materials, were focused on some variety of managed code (generally C#). The conversation, for whatever reason, drifted into how many university CS programs have switched entirely to Java (and in some cases .NET). Although most people seem to love the elegance and simplicity of C#, the prospect of learning only a managed platform seemed disconcerting. Would this lead to lazy programming? What happened to the days of understanding memory and resource management?

It's not that we should abandon programming roots; in fact, it could be argued that computers have reached an advanced enough state that it has become impossible to teach everything. University programs need to balance education with career preparedness -- with more and more jobs requiring C# or Java, it makes sense that these are the languages of choice in university programs. Is it really necessary to learn assembly as opposed to learning more about design patterns, for example?

So what does any of this have to do with this book? As I've worked with higher and higher level languages, I've actually missed the early days. I took a few EE classes some 15 years ago and really loved it -- but haven't touched it since.

While I was at the bookstore picking up a debugging book for .NET, I happened upon "Code." I flipped through it, and recognized the good ol' AND, OR, NAND, and NOR logic gates in some simple circuits. I decided to buy it.

You don't have to be an uber geek to get into this book, but it does get fairly deep. I sat and stared at many of the logic diagrams to "get it." A few times, I could see where the diagrams were going -- such as the R-S flip-flop or the 8-bit adder. The beauty is that Petzold offers an incredible narrative that begins with the simplest of childhood problems: using a flashlight to communicate. It's no surprise that an "A" (1 flash) and a "Z" (26 flashes) wouldn't work, so Petzold quickly moves into Morse code, Braille, and binary numbers.

The transition into the telegraph and the first logic gates is clear and exciting. Undoubtedly, some of the content will be review for many people, although it may be mixed with things you never knew or that have been in the cobwebs of your brain (like mine) for the better part of two decades. It was exciting to relearn the problems that carry bits introduce in a simple adder, or how nine's complement solves subtraction problems.
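Those carry problems are concrete enough to sketch in code. Below is a rough C rendering (my own, not from the book) of the full adder Petzold builds from relays: the sum bit is the XOR of the inputs, and a carry pops out whenever at least two inputs are set. Chaining eight of them gives the 8-bit ripple-carry adder.

```c
/* A 1-bit full adder expressed with the same AND/OR/XOR logic the
 * book builds from relays: sum = a XOR b XOR carry_in, and the
 * carry out fires whenever at least two of the three inputs are 1. */
typedef struct { int sum; int carry; } AdderResult;

AdderResult full_adder(int a, int b, int carry_in) {
    AdderResult r;
    r.sum   = a ^ b ^ carry_in;
    r.carry = (a & b) | (a & carry_in) | (b & carry_in);
    return r;
}

/* An 8-bit ripple-carry adder: eight full adders chained together,
 * each one's carry feeding the next. The final carry is discarded,
 * so the result wraps around modulo 256. */
unsigned char add8(unsigned char a, unsigned char b) {
    unsigned char sum = 0;
    int carry = 0;
    for (int i = 0; i < 8; i++) {
        AdderResult r = full_adder((a >> i) & 1, (b >> i) & 1, carry);
        sum |= (unsigned char)(r.sum << i);
        carry = r.carry;
    }
    return sum;
}
```

Adding 200 and 100 this way yields 44, since the ninth carry bit has nowhere to go -- exactly the overflow behavior the book motivates.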

It's at this point that Petzold really starts getting into simple computers, assembly language, RAM, and other fun topics. And it's not just a history lesson: if you're anything like me, you probably know the difference between an int and a uint (one of many examples that come to mind). You'll recognize, of course, that the uint buys you that extra bit of storage (in effect doubling the number of available values) as long as you don't need to worry about negative numbers. But why and how is it implemented?
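As a quick refresher (my sketch, not the book's), the answer is the sign bit: under two's complement, the top bit of a signed byte contributes -128 instead of +128, which is exactly why an 8-bit int covers -128..127 while an 8-bit uint covers 0..255.

```c
/* Interpret the low 8 bits of a value as a signed byte. Under two's
 * complement, a set high bit contributes -128 rather than +128, so
 * the same bit pattern 0xFF reads as 255 unsigned but -1 signed. */
int as_signed8(unsigned int bits) {
    bits &= 0xFF;
    return (bits & 0x80) ? (int)bits - 256 : (int)bits;
}
```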

Similar to the discussion that happened today about losing touch with the low level code, I had forgotten about two's complement (for binary) or ten's complement for decimal. It was exciting to learn and relearn some of these concepts.
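For anyone equally rusty, the mechanics fit in a couple of lines; here is my own C sketch of the trick. Negation is "invert all the bits and add one," and subtraction then becomes addition of the complement with the top carry thrown away -- the binary analogue of the ten's-complement shortcut for decimal.

```c
/* Two's complement negation: flip every bit, then add one.
 * Working in unsigned char keeps the arithmetic modulo 256. */
unsigned char negate8(unsigned char x) {
    return (unsigned char)(~x + 1);
}

/* Subtraction via the complement: a - b == a + (-b), with the carry
 * out of the top bit silently discarded -- no subtractor circuit needed. */
unsigned char sub8(unsigned char a, unsigned char b) {
    return (unsigned char)(a + negate8(b));
}
```

So sub8(9, 4) gives 5, while sub8(4, 9) gives 251 -- the byte pattern that reads as -5 when interpreted as signed.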

If you just came out of an EE program, this book will likely be too simplistic. For CS vets, this book may be one of the best reads you've had in a long time if you're a bit rusty on the logic fundamentals, and a must read if you don't know an AND gate from an OR gate.

Comments (1)

James Byrd
8/26/2005 9:02:20 AM #

I'm sure I'd enjoy this book too, given your description. Petzold has been in this business a long time and is responsible for some of the most classic reads in the industry. His Programming Windows title is in something like its 5th edition now and was considered a "must have" back when I was programming for Windows 3.1.



As for how much a modern developer needs to know about memory management and the underlying machine architecture, well, I'm not really sure. In C and C++, you had to manage memory allocation yourself, which was often a source of bugs and memory leaks. That all got much better in VB and now C#, since you generally work with references, not pointers. But memory leaks still occur in .NET applications. Having references and garbage collection doesn't stop you from creating memory leaks; it just makes leaks due to orphaned object allocations less likely.



It seems to me that many developers, particularly the most technology-driven ones, are in love with arcana. Getting excited about the details of two's-complement is a good example. I wrote an article several years ago called Using Bitwise Operators in VB (http://www.logicalexpressions.com/vbtip02.htm), which was an excursion into the details of how to work with bit flags in VB numeric variables. I had a great time explaining how bitwise operators work, and I still get feedback on that article today. However, for many developers, it was probably too much information. If you want a flag, you store a Y or N!
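Just to illustrate the pattern the article describes, here's a rough sketch in C rather than VB (the flag names are made up for the example). Each option claims one bit, so a single integer carries many independent yes/no values, and the bitwise operators set, clear, and test them.

```c
/* Hypothetical style flags, one bit each, packed into one integer. */
enum {
    FLAG_BOLD      = 1 << 0,   /* binary 001 */
    FLAG_ITALIC    = 1 << 1,   /* binary 010 */
    FLAG_UNDERLINE = 1 << 2    /* binary 100 */
};

/* OR sets a bit, AND with the complement clears it, AND tests it. */
unsigned set_flag(unsigned flags, unsigned f)   { return flags | f;  }
unsigned clear_flag(unsigned flags, unsigned f) { return flags & ~f; }
int      has_flag(unsigned flags, unsigned f)   { return (flags & f) != 0; }
```

One integer column instead of a pile of Y/N fields -- though I'll grant that for many applications, the Y/N field is the more readable choice.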



I suspect that most business application developers don't really need to know much about machine architecture. I don't often think about it either. However, I have to say that my knowledge of machine architecture does influence the way I code. Having prior experience with environments that were sensitive to memory allocation bugs makes you more cautious about how you build applications that perform a lot of object allocation and manipulation, even if you carefully remove rooted references and can reasonably expect garbage collection to clean up your mess.



Anyway, your interest in this book earns you a power geek badge. Wear it with pride! I may just have to pick up a copy myself.
