
Code: The Hidden Language of Computer Hardware and Software


What do flashlights, the British invasion, black cats, and seesaws have to do with computers? In CODE, they show us the ingenious ways we manipulate language and invent new means of communicating with each other. And through CODE, we see how this ingenuity and our very human compulsion to communicate have driven the technological innovations of the past two centuries. Using everyday objects and familiar language systems such as Braille and Morse code, author Charles Petzold weaves an illuminating narrative for anyone who’s ever wondered about the secret inner life of computers and other smart machines. It’s a cleverly illustrated and eminently comprehensible story—and along the way, you’ll discover you’ve gained a real context for understanding today’s world of PCs, digital media, and the Internet. No matter what your level of technical savvy, CODE will charm you—and perhaps even awaken the technophile within.



30 reviews for Code: The Hidden Language of Computer Hardware and Software

  1. 4 out of 5

    Craig

    I'll be honest. I only read this book because it was quoted as a must-read by Joel Spolsky in a Stack Exchange answer about how to go about learning programming (and finding out if you want to or should be a programmer). I was a little hesitant due to the year of release: at some 11 years old, that's a long time in the tech world. Ultimately, though, that doesn't matter. I defy any developer/programmer/system builder to read this book and not blitz through it, lapping it up. Yes, if you've done some schooling in computing or computer science you may be familiar with much of the content, but you'll surely find things you've either not thought about before in much depth or that just weren't explained in quite the elegant way that Petzold does. For me, whether it was due to age, experience, or just maturity through both, I found it filled gaps in my memory and indeed gaps in student course material. Petzold opens up the world of computing through a concise, linear storytelling format. Starting with a basis in Morse code and Braille, then moving through the telegraph system, barcodes, Boolean logic, circuits with memory, von Neumann machines, and the addition of peripherals, I/O devices, and GUI interfaces, we just about catch up to the modern era with talk of HTTP and the World Wide Web, having pretty much built the systems (or simplified versions of them) under discussion in the incremental circuit and system diagrams along the way.
Admittedly there are some rather 'of their time' phrases and facts that raise a smile (low resolutions, high costs for 'small' HD storage sizes, consumer use of cassette tapes), but this is all still valid information when taken in the context of the time of writing. If you are a developer/programmer you're not going to go into work having had an epiphany about how to do things better, but you may have a newfound respect for what you're doing and the many, many ingenious shoulders you are standing upon.

  2. 4 out of 5

    Naessens

    My opinion on this book is really divided: on the one hand I enjoyed some chapters; on the other, I could hardly restrain myself from flipping past others. Basically, this book designs and builds a basic computer, introducing in each chapter a concept or technology used inside computers. It was written between 1987 and 1999, so one shouldn't expect any description of the newest technologies. It starts really slowly in the first chapters, but then things get more and more complicated. One of the things that bothers me about this book is the difference in complexity between chapters. Some chapters can be easily understood by a junior school or high school student, while some of the later chapters bring back bad memories of electronic circuits from my engineering school years. For example, a whole chapter is dedicated to explaining how to communicate with your neighbour using a flashlight, and another chapter tackles the same issue with light bulbs and electrical wires, whereas all the gates and all the flip-flops are dealt with in a single chapter. I admit I have never been either fond of or good at electrokinetics, and I confess I didn't try to understand how all the electronic circuits of these later chapters work. I guess these chapters mostly interest hard-core computer enthusiasts, but don't they already know this stuff? Besides, a few chapters are a little boring: a whole chapter describing every opcode of the Intel 8080, come on!
Does the decimal system really deserve a whole chapter? In my opinion, decimal and alternative number systems should have been presented in a single chapter instead of two. Moreover, the huge difference in complexity leads to some contradictions. The binary number system is so well described that a high school student can easily understand it, and binary addition and subtraction are covered in great detail, but multiplication is done with a simple, inefficient loop! In my opinion, it would have been opportune to present at least a more efficient version based on the binary representation of the multiplier, as well as to introduce exponentiation by squaring (a.k.a. square-and-multiply or binary exponentiation). Additionally, I think Charles Petzold tries to explain each part in so much detail, so that readers with less technical knowledge can understand, that in the end these readers get lost or confused by the sheer volume of detail anyway, while a few technical references are missing. For instance, both the von Neumann and Harvard architectures are described, but I don't recall the terms themselves being mentioned. Nevertheless, I really liked it when the author gave historical anecdotes or references. The chapters I enjoyed most are the ones where Charles Petzold gives readers some background history to introduce a concept or technology (for instance, Morse's and Braille's codes, Bell's telephone, the invention of telegraph relays, and the evolution of transistors, chips, and programming languages). In the end, I find it a bit ironic that most of the interesting chapters of this book are the least technical ones. Moreover, given the large difference in the knowledge required to understand each chapter, I don't think anyone is likely to understand, or find interesting, every chapter.
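The more efficient multiplication the reviewer asks for can be sketched in a few lines. This is a generic illustration (the function names are mine, not the book's): shift-and-add multiplication examines the multiplier's binary digits one at a time, and square-and-multiply applies the same idea to exponents.

```python
def shift_and_add_multiply(a, b):
    """Multiply two non-negative integers using only shifts and adds,
    walking the multiplier b one binary digit at a time."""
    result = 0
    while b:
        if b & 1:          # low bit of the multiplier is 1:
            result += a    # add the appropriately shifted multiplicand
        a <<= 1            # shift multiplicand left (multiply by 2)
        b >>= 1            # move to the next binary digit
    return result          # e.g. shift_and_add_multiply(13, 11) == 143

def square_and_multiply(base, exponent):
    """Exponentiation by squaring: O(log n) multiplications
    instead of a simple loop's O(n)."""
    result = 1
    while exponent:
        if exponent & 1:
            result *= base
        base *= base
        exponent >>= 1
    return result          # e.g. square_and_multiply(3, 10) == 59049
```

Both loops run once per bit of the second argument, which is exactly the improvement over the naive repeated-addition loop the reviewer objects to.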

  3. 4 out of 5

    Cardinal Biggles

    Raise your hand if you think metaphors and analogies should be used sparingly. I'll raise my hand with you. This book is for us. After reading this book, I can see behind the pixels on my computer screen. I know what I'm really looking at. So many layers of abstraction are removed by learning about how logic gates can be arranged as processors and RAM, how code is simply a representation of those microscopic switches being flipped, and how pixels are simply a graphical interpretation of the state of particular switches. Moreover, I also have a little bit of an understanding of the historical evolutions these inventions and conventions went through: not just how computers work, but why they work that way and how they came to be. The book was tougher to grasp than I thought it would be (I do not have an extensive background in electronics or programming). Although it started off easily, it became progressively more complicated except for the last chapter or two. Of course, this was to be expected, as the book began with the basic building blocks of a computer, and built progressively more complicated systems from those initial components. However, the problem wasn't really a result of the subject matter, but of the writing style, which seemed to grow more terse in later chapters.
I was left with the impression that the author felt he was running out of space, which I'm sure he was; it must be difficult to keep a book with such a vast scope to a manageable size and prevent it from turning into a reference manual. I would characterize this book as grueling, but that might be because I was obstinate in making sure I fully understood every detail of every page. There were a few pages that I had to pore over repeatedly until I had a eureka moment. A few more explanatory sentences here and there would have alleviated this, but ultimately, drawing my own conclusions was very rewarding. The book seemed to recover from its gradually adopted terseness with an appreciated but sudden reference to the first chapter in the very last sentence. Someone less focused and more inclined to skim might find this book to be a bit lighter reading, but it still only took me a few days to read the whole thing. I was surprised to see that the book did not really cover how transistors work at the electron level, which leaves what I consider to be a major gap in any understanding of how modern computers based on integrated circuits work. The text says that transistors are functionally equivalent to electromechanical relays or vacuum tubes and work similarly, but hardly any more than that. This missing knowledge is something that would have been appreciated and wouldn't have taken up much space. It seems like an especially glaring omission when juxtaposed with the inclusion of a few pages on EBCDIC, an obsolete alternative to ASCII text codes descended from paper punch cards. Despite these minor gripes, this is a really great book, and I highly recommend it to anyone who has the interest and persistence to get through it. It teaches and ties together many mathematical and electrical concepts, and the payoff for the reader is a new perspective on computing.
Despite being first published in 1999, it hardly seems dated at all, probably because it's really a history book and most of the computing history it covers happened in the 1980s and earlier. All computing history after that is basically just increasingly complex variations on those simpler foundations. A sequel would be welcome. P.S. I think I've discovered a typo in the assembly code program on page 322. It seems to me that there should be an additional "AND A,0Fh" after the four lines of "RRC" and before the first "CALL NibbleToAscii" line. If I'm wrong, would anyone mind explaining why? And if I'm correct, would anyone mind giving me peace of mind by confirming this? Thanks! :)

  4. 4 out of 5

    Mike

    “Electricity is like nothing else in this universe, and we must confront it on its own terms.” That sentence, casually buried near the beginning of the book, exemplifies the engineer's muse: a striving to become aware of the inhuman and how it operates, and to find a means of creating a socket for human enterprise, something to extend the fallible chassis of our flesh. The first two-thirds or so of this book follows a double track. One track covers the ways in which meaning may be encoded into messages; the other weaves repetitions of a relatively simple device — the telegraph relay — into machines that marshal electricity into the forms of logic and memory. These two tracks eventually coincide at the device we know as a computer. Though it would be impossible to build a computer from telegraph relays, the machines we use today perform the same tricks with electricity that were possible in the 19th century. The last third of the book is more concerned with the makeup of, and successive improvements in the implementation of, the devices that embody the marriage of electricity and meaning. For someone like me, accustomed to the elves of the internet bringing me regular helpings of news, porn, and status updates from the virtual smörgåsbord, it was interesting to see how computers have been made so much easier to use since the era of assembly code and text terminals.
Regarding electricity, that prime mover of the information age, it has struck me that electricity is the stuff minerals dream with, and that we may have subjected an alien order to the vagaries of our desire without being prepared to one day pay the price. We live, all of us, in an era of debt, making allowances even for a future of submerged cities and massive conflicts fostered by drought. When it finally comes time to pay off our mineral deficit, will it be our dreams — that which makes us human — that are ultimately forfeit?

  5. 5 out of 5

    Yevgeniy Brikman

    Every single person in tech should read this book. Or if you're just interested in tech. Or if you just want a basic appreciation of one of the most important technologies in human history—the computer. This book contains the best, most accessible explanation I've seen of how computers work, from hardware to software. The author manages to cover a huge range of topics—electricity, circuits, relays, binary, logic, gates, microprocessors, code, and much more—while doing a remarkable job of gradually building up your mental model using lots of analogies, diagrams, and examples, so just about everyone should be able to understand the majority of the book, and gain a deep appreciation of what's really happening every time you use your laptop or smartphone or read this review online. I wish I had this book back in high school and college. I've been coding for 20 years and I still found a vast array of insights in the book. Some of the topics I knew already, and this book helped me appreciate them more; others, I knew poorly, and now understand with better clarity; still others were totally new. A small sampling of the insights: * Current is the number of electrons flowing past a point per second. Voltage is a measure of potential energy. The resistance is how much the substance through which electricity is flowing resists the passage of those electrons.
The water/pipes analogy is great: current is similar to the amount of water flowing through a pipe; voltage is similar to the water pressure; resistance is similar to the width of the pipe. I took an E&M physics course in college and while I learned all the current/voltage/etc equations, I never got this simple, intuitive understanding of what it actually means! * We use base 10 because we have 10 fingers; a "digit," after all, is just a finger (so obvious when you actually take a second to think about it!). Had we been born with 8 fingers, like most cartoon characters, we'd probably use base 8 math. Computers use base 2 because building circuitry based on two states—the presence or absence of voltage (on and off, 1 or 0)—is much easier than circuitry based on ten states. * The notation we use in math is essential. It's not about looking pretty or not, but actually making the math easier or harder. For example, addition and subtraction is easy in Roman numerals but multiplication and division are much harder. Arabic numerals make multiplication and division much easier, especially as they introduce a 0. Sometimes in math, you switch to different coordinate systems or different geometries to make solving a problem easier. So it's no surprise that different programming languages would have the same properties: while any language can, in theory, solve the same problems as any other, in practice, some languages make certain problems much easier than others. * This book does a superb job of showing how logic gates (AND, OR, etc) can be built from simple physical circuits—e.g., from relays, which are much easier to imagine and think about than, for example, transistors—and how easy it is to do math with simple logic gates. 
I remember learning this back in college, but it still amazes me every time I see it, and with the crystal-clear examples in this book, I found myself smiling when I could picture a simple physical circuit of relays that could do arithmetic just by entering numbers with switches and passing some electricity through the system (e.g., to add, you have a sum and a carry, where the sum is an XOR and the carry is an AND). * The explanation of circuits that can "remember" (e.g., the memory in your computer) was superb and something I don't remember learning at all in college (how ironic). I love the idea that circuits with memory (e.g., latches) work based on a feedback mechanism: the output of the circuit is fed back into the same circuit, so if it gets into one state (e.g., on, because electricity is flowing through it), that feedback mechanism keeps it in that state (e.g., by continuing the flow of electricity through it), effectively "remembering" the value. And all of this is possible because it takes a finite amount of time for electricity to travel through a circuit and for that circuit to switch state. * The opcodes in a CPU consist of an operation to perform (e.g., load) and an address. You can write assembly code to express the opcodes, but each assembly instruction is just a human-friendly way to represent an exactly equivalent binary string (e.g., 32 or 64 binary digits in modern CPUs). You can enter these opcodes manually (e.g., via switches on a board that control "on" and "off") and each instruction becomes a high or low voltage. These high and low voltages pass through the physical circuitry of the CPU, which consists of logic gates.
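The sum-is-XOR, carry-is-AND observation above is exactly a half adder. A rough sketch (the naming is mine, not circuitry from the book) of how such gates chain into a multi-bit adder:

```python
def half_adder(a, b):
    """Add two bits: the sum is XOR, the carry is AND."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Chain two half adders, plus an OR to propagate the carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def ripple_carry_add(x, y, width=8):
    """Add two integers bit by bit, the way a relay adder would:
    each stage's carry feeds the next stage."""
    result, carry = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # e.g. ripple_carry_add(100, 55) == 155
```

Each `full_adder` stage here stands in for a handful of relays; chaining eight of them adds two bytes, which is the whole trick the review is smiling about.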
Based purely on the layout of these logic gates, voltage comes out the "other end," triggering new actions: e.g., they may result in low and high voltages in a memory chip that then "remembers" the information (store) or returns information that was previously "remembered" (load); they may result in low and high voltages being passed to a video adapter that, based on the layout of its own logic gates, results in an image being drawn on a screen; or they may result in low and high voltages being fed back into the CPU itself, resulting in it reading another opcode (e.g., perhaps from ROM or a hard drive, rather than physical switches), and repeating the whole process again. This is my lame attempt at describing, end-to-end, how software affects hardware and results in something happening in the real world, solely based on the "physical layout" of a bunch of circuits with electricity passing through them. I think there is something magical about the fact that the "shape" of an object is what makes it possible to send emails, watch movies, listen to music, and browse the Internet. But then again, the "shape" of DNA molecules, plus the laws of physics, is what makes all of life possible too! And, of course, you can't help but wonder what sort of "opcodes" and "logic gates" are used in your brain, as your very consciousness consists entirely of electricity passing through the physical "shape" of your neurons and the connections between them. There are a few places the book seems to go into a little too much detail—e.g., going over all the opcodes of a specific Intel CPU—and a few places where it seems to skip over all the important details—e.g., the final chapter on modern software and the web—but overall, I have not found another book anywhere that provides as complete of a picture of how a computer works. Given the ubiquity of computers today, I'd recommend this book to just about everyone. 
It'll make you appreciate just how simple computers really are—and how that simplicity can be used to create something truly magical. As always, I've saved a few of my favorite quotes from the book: A computer processor does moronically simple things—it moves a byte from memory to register, adds a byte to another byte, moves the result back to memory. The only reason anything substantial gets completed is that these operations occur very quickly. To quote Robert Noyce, “After you become reconciled to the nanosecond, computer operations are conceptually fairly simple.” The first person to write the first assembler had to hand-assemble the program, of course. A person who writes a new (perhaps improved) assembler for the same computer can write it in assembly language and then use the first assembler to assemble it. Once the new assembler is assembled, it can assemble itself.
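The end-to-end fetch-decode-execute story in this review can be condensed into a toy interpreter. This is a sketch under my own assumptions (made-up opcodes, not the book's Intel 8080): a "program" of (operation, address) pairs driving an accumulator register and a memory array.

```python
def run(program, memory):
    """A toy fetch-decode-execute loop: fetch an instruction at the
    program counter, decode its operation, execute it, repeat."""
    acc, pc = 0, 0               # accumulator and program counter
    while True:
        op, addr = program[pc]   # fetch and decode
        pc += 1
        if op == "LOAD":         # memory -> register
            acc = memory[addr]
        elif op == "ADD":        # add a byte to the accumulator
            acc += memory[addr]
        elif op == "STORE":      # register -> memory
            memory[addr] = acc
        elif op == "HALT":
            return memory

# Compute memory[2] = memory[0] + memory[1]
mem = [7, 35, 0]
run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", 0)], mem)
# mem is now [7, 35, 42]
```

A real CPU does the dispatch with logic gates rather than `if` statements, but the "moronically simple" move-add-store cycle quoted above is the same.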

  6. 5 out of 5

    Alex Palcuie

    If you work with computers and didn't read this book, you are lame.

  7. 4 out of 5

    Igor Ljubuncic

    This is a great book. Surprisingly interesting. While the subject matter is not new to me - far from it - the way the author goes about telling the story of how modern computers came to life is exciting, engaging and fun. He starts with Morse code and Braille, talks about the principles of mathematics and information, explains the critical concept of switches, and finally moves into the world of circuit boards and binary data, culminating in the ALU. After that, he discusses the idea of the analytical and computational engines and machines developed through the late 19th and early 20th century, before we finally start seeing the modern computer around the 1940s, with Turing and von Neumann laying down the foundations of what we know and use today. The book is really cool because it's also a nostalgic trip down memory lane. Charles mentions the famous Bell Labs, the legendary Shannon, Ritchie, Noyce, Moore, UNIX, the C language, and other people and concepts without which we would not be sitting here, writing reviews on Goodreads. Or we might, but the fundamentals of our computing devices would be completely different. Computers sound like magic, but the thing is, they are a culmination of 150 years of electrical progress, 200 years of data/information progress, and about 350 years of math progress. The first boards, the first programs, the first assembler and the first compiler were all written by hand.
Control signals are still essentially the same, and if you look at a typical x86 Intel processor, the legacy support for machine instructions goes back to the first microprocessor. The problem is, when you condense centuries of hard work into a cool, whirring appliance, it does feel like magic. The author wrote the book in the late 80s and then revised it in the late 90s, so some of the stuff may look quaint to us, like the mention of floppy disks, VGA displays and such. But he also shows uncanny foresight around information exchange overall, because the information principles are universal, and he correctly predicted that Moore's Law would taper off around 2015. He also cheated a little. He described the flip-flop as a perpetuum mobile, which can be sort of excused, and he skimmed over the concepts of oscillators and transistors (and did not mention capacitors), but then those are fairly complex, and I guess it's not really possible to cover them without going deep into the fields of physics and electrical engineering. Excusable, because the book is compelling and delightful. Even if you have a PhD in Physics from a top university, have done computer science all your life, can rap in ASM and can name all LoTR characters by heart, this is still a good read. Do not feel like you'd be high-schooling yourself with silly analogies. Far from it. This is a splendid combo of history, technology, mathematics, information, and nostalgia. Highly recommended, x49 x67 x6F x72

  8. 4 out of 5

    Lynn

    I have been an IT professional for 20 years, but I never knew what the switches on the front panel of the Altair computer were for. I do now. In fact, because of this book, I know many things about how a computer really works that I never did before. I think this book is great for anyone, except electrical engineers, who would be bored. Having some background in computers probably makes this book easier to get through, but Petzold assumes nothing and starts from scratch. He does a good job of making potentially dry subjects fairly interesting. I think an update to this book would be great, because the discussion of 1999 capacity and pricing makes the book feel dated. Also, the last chapter seemed rushed and not as well focused as the rest of the book. So, if you want to know how any computer really works, read this book.

  9. 5 out of 5

    Jan Martinek

    What a ride! A book about computers “without pictures of trains carrying a cargo of zeroes and ones” — the absolute no-nonsense book on the internals of the computer. From circuits with a battery, switch and bulb, to logic gates, to a thorough description of the Intel 8080. A great way to fill blanks in my computer knowledge. The book takes the approach of constructing the computer “on the paper and in our minds” — that's great when you're at least a little familiar with the topic, maybe not so much when trying to discover completely unknown territory (but the author goes to great lengths to walk through everything step by step — e.g. the various gates, binary subtraction, memory handling, etc.). In a way, this is a perfect book on the topic. If you know a better one, I want to read it.

  10. 5 out of 5

    Miranda Sikorsky

    It is a great book; it demystified some ideas I had about software architecture.

  11. 5 out of 5

    Jule

    I LOVE this book. I regard myself as an innocent computer illiterate. And Petzold helps me walk inside an electrical circuit, a telephone, a telegraph, an adding machine, a computer, and understand the basics behind their design, of what is going on inside. I start to get the math, the logic behind all this technology that has become pretty much the center of my life today. And I should understand the logic behind the center of my life, right? What is so good about this book: it is written in a simple language anyone can understand. It uses examples that are entertaining and amusing, like explaining an electrical circuit with AND, OR, NOR and NAND gates to pick your favourite kitty from a bunch of neutered, unneutered, black, white, brown, tan, male and female cats in their various combinations. Also, he interlinks the historical evolution with the logic and development of technology as we use it today, so you get a pretty rounded picture of the whole thing. Love it!
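The kitty-picking circuit mentioned above translates directly into Boolean operators. The exact condition below is illustrative, not the book's actual expression, but it shows the shape: AND and OR gates combine simple yes/no attributes into a single selection signal (NAND and NOR would just be these with a NOT wrapped around them).

```python
def pick_cat(neutered, black, white, tan, female):
    """An AND/OR gate network as a Boolean expression (hypothetical
    criteria): select a neutered cat that is white, or a neutered
    tan female, or any black cat at all."""
    return (neutered and (white or (tan and female))) or black

# A neutered white male qualifies; a black cat always qualifies;
# an unneutered white female does not.
```

Each `and`/`or` here corresponds to one gate, which is the point of Petzold's example: wiring switches in series gives AND, in parallel gives OR.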

  12. 5 out of 5

    Damon

    This book basically tries to take you from the very basics of how to encode information, such as how binary is used to represent complex information, to understanding how a computer uses information like this to perform intricate operations. The route between those two points is the interesting part, and there were some parts that I found really illuminating and important. For example, I didn't understand hexadecimal numbers (or indeed what base 4, base 8, etc. numbers meant) before I read this book. Similarly, I knew a fair amount about how various electrical gates work, but not how by pairing multiple gates together you eventually get to RAM, a CPU, etc. It did lose me at times, however, and I zoned out a bit when Petzold was talking about the way in which math calculations are carried out using gates and binary information. I probably should have paid more attention, because this is fundamental to understanding how higher-level systems work. I really enjoyed the explanation of how certain chipsets were important, especially the 8080 and the 6800, and then the creation of assembly language and compilers. Most striking to me was the realisation that modern computing is essentially a brute force operation. We are using the same switches that were invented 150 years ago or so, but now they are gigantically faster, smaller and on an exponentially more massive scale.
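
The point about number bases that this reviewer credits to the book can be shown in a few lines of Python: hex, octal and binary are all the same positional place-value idea, just with different digit counts. A minimal sketch (not code from the book):

```python
# A place-value interpreter for any base up to 16: each digit multiplies
# the running value by the base, which is all "base N" really means.

DIGITS = "0123456789ABCDEF"

def to_decimal(digits: str, base: int) -> int:
    value = 0
    for d in digits:
        value = value * base + DIGITS.index(d)
    return value

print(to_decimal("FF", 16))        # 255
print(to_decimal("377", 8))        # 255
print(to_decimal("11111111", 2))   # 255
```

The same quantity rendered in three bases makes the equivalence concrete; Python's built-in `int("FF", 16)` does the same conversion.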

  13. 4 out of 5

    Alex Telfar

    Very close to my ideal book. Starts from understandable foundations and builds from there. Charles doesn't try to explain through high-level metaphors (that do a poor job of capturing the truth -- I am frustrated after picking up another apparently interesting physics book only to find it contains no math); rather, he slowly builds on simple examples. And while it does get pretty complex, Charles doesn't avoid it. For a while I have been frustrated about my understanding of computers. I understood how bits can encode information, what the von Neumann architecture was and some of its flaws, how programming languages are compiled to assembly/machine code, what transistors are and how to make logical circuits. But I could never really link them together. I am still a little hazy, and I think I will have to go over a couple of chapters from no. 17 onward (automation, buses, OS) just to cement and clarify, but understanding feels close. More thoughts to come on my blog. Just drafting atm.

  14. 5 out of 5

    Baq

    Wow. I wish I had had this book back when I was taking my first Computer Architecture course in college! It carries you along from the very fundamentals of both codes (like Braille) and electric circuits in the telegraph days all the way to the web, in a way that even a layperson could understand, with plenty of verbal and diagrammatic explanation. It does at points get pretty deep into the weeds, but I really appreciated the author's efforts to provide such an exhaustive dive into how computers work (and I regained much of my awe at these machines we take so for granted nowadays). The final chapter was a rushed overview of the web and felt almost like an afterthought after the thoroughness of the rest of the book, but I didn't ding the author on it -- there's plenty of great writing about how the web works that you can read elsewhere. Read this book to gain a deeper understanding and appreciation for the birth of the modern digital age. Thank you, Charles Petzold!

  15. 4 out of 5

    Laura Marelic

    This book is the perfect depth for novices, but also for people who are “in tech” and don’t really understand how it all works (like me). I can now look around at all the electronics in my house and feel like I know what’s fundamentally going on. Knowledge is empowering! The last chapter of the book felt a bit rushed and ended abruptly, but maybe that’s just my wanting the book to go on longer/end at present day. Overall, I loved it and will surely be recommending it to anyone who asks how computers work. 👩🏻‍💻🤖👾 Oh, also I am simultaneously reading The Innovators (Isaacson) on audio and the two books pair very nicely. It was great to read about the tech in Code and then the story of who’s behind it in The Innovators. I recommend this pairing!

  16. 4 out of 5

    Rik Eberhardt

    In brief: be prepared to skim through at least 25% of this book! If I had this book in a seminar freshman year, I might have completed the Computer Science program. In a very fun manner, this book presents 3 years of introductory CS curricula: discrete structures, algorithms, logic gates, ... After reading this during two cross-country flights, I better understand (and remember) classes I took 10 years ago. Almost makes me want to try again (*almost*).

  17. 4 out of 5

    Imi

    This book has really taught me a lot, despite the fact that many of the later chapters lost me somewhat; it felt like it became much more complicated and hard to follow after the earlier chapters, which were great, slowly paced and well explained. While Petzold does assume the reader is starting from scratch, I think it would be easier to follow later on if you had some background in computers/technology. As it was, I had to bombard my dad (an electronic engineer) with questions to even make it to the end of some chapters, but then I haven't attended regular maths/science classes since about age 14, so maybe it's not surprising that I'm missing some of the needed background information. It is outdated, having been written in 1999, but I guess the history, which Petzold follows nearly chronologically, hasn't changed, and the early history is necessary to understand what has come since this book was written. Having said that, the last chapter (on the 'graphical revolution') was strangely rushed and an updated edition would do it some good, I think. Even if I couldn't grasp all of the technical detail, the majority of this book was extremely eye-opening and I have definitely come away from it with a new-found respect for these devices that we now use day-to-day. Even while using this laptop to complete a supposedly "simple" task such as writing this review, I am fascinated by how much work has gone on behind the scenes to allow me to do this. It's fairly awe-inspiring, the more you think about it.

  18. 4 out of 5

    Carlos Martinez

    Such a fun and interesting book. Petzold goes back to the very basics to explain how to build a computer (of sorts) from the ground up. First he explains binary (via Morse code and Braille), then he introduces relays and switches, then gates and Boolean logic, and before you know it you're building an electronic counting machine. He continues with a potted history of transistors, microchips, RAM, ROM, character encoding and all sorts of other fun stuff. I skipped over some pages, because I don't actually need to know the full set of opcodes for a 1970s CPU, no matter how significant they are to computing history. The only obvious 'flaw' is that the book has aged a bit. Written in 2000, it just about manages a mention of the internet/HTTP/TCP-IP and modems, but not wifi, cloud computing, touchscreen devices, and the brave new world of machine learning. Personally I don't think that detracts from the book at all - the really interesting stuff runs from around 1870 to 1970. Definitely recommended for those that didn't study (or don't remember much) computer science.

  19. 4 out of 5

    Alisa Mansurova

    Just finished reading my b-day gift, 'Code' by Charles Petzold - probably the best engineering book I've ever read. By saying 'engineering', I mean it. Unlike other computer science books, 'Code' teaches how computers work in a nutshell. It leads you from the very basics like Morse and Braille codes to Boolean algebra and various numeric systems, from simple tiny electric circuits that light a lamp to a primitive adding machine (built from relays, hehe), up to the history of the development and enhancement of computers in the 20th century. There's not much programming or CS (apart from some machine code and assembly language examples). Still, the purpose of the book, as I mentioned, is rather to explain the nature of computer codes and hardware at a very low level. Written in 1999, the book is still relevant nowadays (well, there are funny moments regarding computers' capacity and performance, and probably some other stuff, but those don't matter much). Highly recommended for those (like myself...) who work closely with computers but lack the engineering education to feel comfortable with the magic going on around them when they write code.

  20. 5 out of 5

    Mark Seemann

    Since I loved Charles Petzold's The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine, I wondered if he'd written other books about the foundations of computer science. Code seemed like an obvious candidate. This book explains, in as much detail as you could possibly hope, and then some, how a computer works. Since I've been a professional software developer for about two decades, the title of the book, Code, gave me the impression that it'd mostly be about the logic of software - something that I already know intimately. The first chapters seemed to meet my expectations with their introductions to binary and hexadecimal numbers, Boolean logic, and the like. Soon, though, I was pleasantly surprised that the book was teaching me something I didn't know: how a computer actually works. It starts by explaining how one can construct logic gates from relays, and then builds on those physical creations to explain how logic gates can be used to add numbers, how RAM works, and so on! Like The Annotated Turing, this part of the book met me at exactly the level I needed. So technical, with no shortcuts, and nothing swept under the rug, that I felt I deeply understood how things work - yet still thrilling and engaging. Whenever I found myself with a question like ...but how about..? or ...but what if..?, I soon learned to read on with confidence, because I consistently found answers to my questions two or three paragraphs further on. The final part of the book, again, moves into territory that should be familiar to any programmer, such as ASCII, high-level programming languages, graphical user interfaces, and such, and that, unfortunately, bored me somewhat. Thus, the overall reading experience was uneven, which is why I only give it four stars. Would someone who's not a professional programmer rate it higher? I don't know. I could imagine that for some, the explanation of logic gates, adders, latches, etc. made from relays would be too technical.
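
The progression this reviewer describes -- gates combined into circuits that add numbers -- can be sketched in software. Below is a minimal Python simulation (my own sketch, not the book's relay diagrams): gates as tiny functions, composed into a half adder, a full adder, and finally an 8-bit ripple-carry adder.

```python
# Gates as functions, then an adder built from them -- the same
# bottom-up construction the book performs with relays.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    return XOR(a, b), AND(a, b)          # (sum, carry)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)                # (sum, carry_out)

def add_8bit(x, y):
    # Ripple-carry: chain eight full adders, least significant bit first.
    carry, result = 0, 0
    for i in range(8):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_8bit(100, 55))  # 155
```

Each function call here stands in for a physical gate; the surprise the book delivers is that nothing more than this, repeated at scale, is doing your arithmetic.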

  21. 5 out of 5

    Geoff Rich

    I really enjoyed most of this book. The slow unfolding of how computers are built and actually work was extremely fascinating - from simple lightbulb circuits to logic gates to RAM to keyboards and monitors. Unfortunately, parts of this book seem quite dated (most anything discussing "contemporary" technology, i.e. 1990s computers) and the final chapter on the graphical revolution goes through way too much, way too fast to be of any use. A few chapters were tempting to skim. For example, Petzold includes 25 pages on the machine code instructions of an Intel 8080 microprocessor - did we really need all that detail? The majority of the book, however, is great - I had never really delved into logic gates and circuitry, so it was truly eye-opening even if I couldn't fully understand some parts. I think if I had read this when it was released it would have managed to eke out 5 stars from me. From a 2017 viewpoint, however, it only manages 4. I'd still recommend it to anyone curious how computers work, down to the nitty-gritty of ones and zeroes. Most of it should be accessible to a layperson, though I may be blinded by my own CS experience. Even if there are parts you don't understand, you'll come out of it with a greater understanding and appreciation of the technology you use daily.

  22. 4 out of 5

    Trevan Hetzel

    With a desire to learn how the high-level code (HTML, CSS, JavaScript, etc.) I write on a daily basis actually makes its way through the magical land that is a computer and returns pleasantries to a human being behind the screen, I sat down with this "Code" book. The book is very intriguing from the start, beginning with the earliest forms of code (Morse, Braille, etc.). Petzold spends a long time laying down the basic blocks of electrical engineering before progressing to how bits flow through a circuit board and control things. I'll admit that I got very confused at times as to how a computer works, but Petzold gives you all the information you need. It's just a matter of how much time you're willing to spend re-reading and studying each piece of information he gives (there's a LOT to take in). If you have a background in electrical engineering, this book would probably make a lot more sense to you than it did to me. But, nonetheless, it will sit on my shelf awaiting the time I start playing with Arduinos and hacking on things. At that point, the book will REALLY come in handy!

  23. 4 out of 5

    Ieva Gr

    The book reminds me of the courses that students usually have during the first year of university. It provides a general overview of how computers function, starting from the workings of an electrical circuit and building up to various logical elements of gradually increasing complexity. It also discusses some relevant historical moments, as a typical professor in a typical lecture would do, and ends with a broad overview of personal computers as they were in 1999. The summary on the back of the book says “No matter what your level of technical savvy, CODE will charm you – and perhaps even awaken the technophile in you”. That didn’t happen at all. At some of the more detailed and complex parts, reading the book felt like wading through a swamp waist-deep. But I think it really helped me to gain some overview of the history of computers and a better understanding of things I sort of kind of heard of but never bothered to read about.

  24. 5 out of 5

    K.C.

    This was a wonderful non-fiction read, especially the first 15 or so chapters. Chapter 17 ("Automation"), however, was where I began to feel a bit in over my head. While that chapter was fairly thorough, when I got to later chapters and realized I couldn't quite grok what was going on in these chips, it was hard for me to tell whether I was holding myself back by not fully understanding the concepts of Chapter 17, or if Petzold was simply glossing over some of the details that might have clued me in. It was probably a combination of both. While I did enjoy the later chapters as well, much of it felt rushed compared to the earlier, slower pace of the book. Recommended for anyone who would really like to understand the basic concepts behind computer technology, but doesn't want to go back to graduate school.

  25. 5 out of 5

    Eva

    This book is quite incredible. You start with Braille and simple light switches, make your way to oscillators, flip-flops and multiplexers, and suddenly you understand how computer hardware works. And that's coming from someone who already thought they "sorta" understood how it worked. I didn't really. Now I do. Best bottom-up education ever.
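
The flip-flops this reviewer mentions are the book's memory element: two cross-coupled gates whose feedback remembers one bit. A minimal Python sketch of a NOR-based SR latch (an idealized simulation of my own, iterated until the outputs settle, not the book's circuit):

```python
# A NOR-based SR latch: two cross-coupled NOR gates. Feedback between
# them stores one bit -- set it, and it stays set after the input drops.

def NOR(a, b):
    return 0 if (a or b) else 1

def sr_latch(s, r, q=0, qn=1):
    # Iterate a few times so the cross-coupled outputs settle.
    for _ in range(4):
        q, qn = NOR(r, qn), NOR(s, q)
    return q, qn

q, qn = sr_latch(s=1, r=0)              # set -> q becomes 1
q, qn = sr_latch(s=0, r=0, q=q, qn=qn)  # hold -> q is still 1 (memory!)
print(q, qn)                            # 1 0
q, qn = sr_latch(s=0, r=1, q=q, qn=qn)  # reset -> q drops to 0
print(q, qn)                            # 0 1
```

The "hold" step is the punchline: with both inputs off, the circuit's state depends only on what happened to it before, which is exactly what makes RAM possible.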

  26. 5 out of 5

    Travis Johnson

    I really, really truly love this book. The beginning is slightly slow, but after the 1/3 mark or so, I couldn't put it down (literally. Hello, 5am.) I probably learned more about architecture from this book than from the quarter in my Architecture & OS class at university.

  27. 5 out of 5

    Randall Hunt

    Definitely one of the greats. If not already, it soon will be, a staple of computer science literature. It's both a narrative history of Computer Science and a brilliant introduction to systems and programming. This book should be a pre-requisite for introductory CS classes.

  28. 5 out of 5

    Ondrej Urban

    One - in this case, one as the Queen would use it - cannot really talk about this book without comparing it to But How Do It Know? - The Basic Principles of Computers for Everyone, since they cover a lot of the same ground (and one has read the other one first). Code's mission in life is to help the reader understand the basic principles behind computer design and convince them that it's not actually that tricky - that your great-grandparents could have built one themselves. Naturally, there is a lot of overlap with But How Do It Know, since there don't seem to be better ways to design a basic RAM. Starting with looking for ways to transfer information, Code goes from flashlights to the telegraph to relays to logic gates, doing a better job than the other book, which introduces the NAND gate as a basic black box and goes from there. Point to Code; reading about that is super enlightening and exciting! The middle part of both books is similar, spent building basic computer parts out of logic gates. I'd maybe lean towards Code doing a bit of a better job reminding the reader of the basics, but the other book prevails in its focus and overall better teaching approach, explaining every single thing done, while computer parts seem to start randomly appearing at some point in Code (buses and registers being two examples). 
In the final part, Code goes a bit overboard with talking about not-so-basic stuff that, at this level, necessarily happens at a bit of a high level. Disregarding its having been written in the late 90s and talking about how DVDs will at some point take over from CDs as the main software distribution channel, I had a feeling that the author could have stopped a bit sooner, or perhaps expanded the chapter on compilers a bit more. In any case, when it comes to overall impression, the How book wins thanks to its laser-tight focus on building a computer and nothing else, while Code seems more open-ended, and talking about particular technologies specific to a given time period makes it feel a bit aged. That said, Code is a great book to read to refresh your knowledge of basic computer design, one that can fill in a lot of basics and round out the picture. Highly recommended!

  29. 4 out of 5

    Angelos

    A very nice introduction to what makes computers tick. It's detailed enough to give you a sense of how things work, yet not so overly complicated as to intimidate you. I really liked the gradual introduction of concepts of increasing complexity, where each builds on the one before it. I feel like I've learned a lot by reading this book, especially since we had no relevant computer architecture courses in college. That said, I have a couple of complaints. One is that I feel the author covers the initial, simple concepts like Morse code, binary numbers, Braille etc. in excruciating detail, yet is quick to cover complex concepts and areas as the book progresses on digital circuits, CPUs etc. Ideally I'd like fewer details on the initial concepts and a better, more detailed explanation of later ones. The second complaint, which is to be expected, is that the book was written in 1999. Although still highly relevant when it comes to computer architecture, it contains a lot of references that feel a bit dated, especially in the later chapters that cover multimedia (CDs, DVDs), GUIs, the WWW, etc. So, I highly recommend this book to anyone interested in how computers are built, from the ground up. I find it's a good fit even for CompSci students/graduates who want to fill in their knowledge gaps, like me. That said, this book is not an easy/quick read. It's pretty technical, so prepare to put in some time to grok some concepts if you really want to understand how things work.

  30. 5 out of 5

    Andrew

    Although there are a few chapters that I probably gained very little from (I’m looking at you, flip-flops!), the majority of them were incredibly useful in taking what seems a complicated subject matter and simplifying it in ways that make it accessible. When I was near the end I circled back to the chapters on binary and hexadecimal, and they definitely helped me with the conversions. I also really liked how the author tied the use of a flashlight for on and off to Morse code, and on to the computer age we are in currently. I would love for this book to be updated now that fiber optics and Blu-ray are prevalent and we are moving more towards cloud computing. I’ll be using this as a reference for years to come. The downside is that this type of book really needs some sort of practicals, because when there are just random pages of assembly code and hexadecimal reference codes for a chipset that I am sure is no longer in any of my devices, it doesn’t add any sort of value to the reader.
