You never forget the first girl you kissed – in my case, it was Suzie (or maybe Carol) – just as you remember your first computer, in my case the IBM 650 at Brown University in 1957. (Just to set the record straight, that was a year BEFORE Don Knuth saw his first computer, also an IBM 650. Didn’t take him long to catch up to and pass me.) A bright-eyed, shy 16-year-old freshman, I had enrolled in an Applied Mathematics course. For the first half of the semester, we learned to solve difference equations, learned the rudiments of Numerical Analysis, and learned how to use a Friden desk calculator. (I’m sure I was assigned one of the ones that some nice lady at Bletchley Park used to crack the Enigma code.) But then, aah – I learned all there was to know about programming in only half a semester (you have to remember that, after all, Brown IS an Ivy League institution). :-)

Thomas Watson, Jr., the president of IBM, was Brown ’37, and made sure his alma mater got the newest and best of the IBM computer line. The 650 was the only computer to use a bi-quinary number system. A digit consisted of two parts – a 0/5 designator and a 0/1/2/3/4 designator. The mind reels trying to figure out how that was all implemented in hardware after learning Boolean logic and the 1s and 0s of current computer circuitry. The memory consisted of a rotating magnetic drum with a capacity of up to 4000 “words” of 10 digits or 5 characters each. An instruction was fetched from memory by a read head, executed in some fixed (for each instruction type) time, and then the next instruction was fetched. Remember, however, that while the CPU was executing the instruction, the drum kept rotating under the read head, and it may just have passed the location where the next instruction to be executed lived, forcing us to wait for an (almost) complete revolution to find out what to do next. No problem, right? We just make sure to figure out where the read head would be after the CPU executed, say, a 15 command (which added a number to the lower accumulator) and store the next instruction right where the read head would be at the completion of the add instruction. Great, but that bookkeeping becomes ridiculous – keeping track of which locations have been used, remembering how long an add takes, and, oh yes, that its code is 15. So along comes SOAP (Symbolic Optimizing Assembly Program) to do all the grunt work. It figures out where the next instruction should be placed for optimal fetch time, knows that AL means add to lower accumulator (= op code 15), and even figures out optimal placement of data, as that also has to be fetched to be operated on.
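The two-part digit and the drum juggling can be sketched in a few lines of (obviously anachronistic) Python. This is a toy of my own: the 50-word band size and the simple word-time arithmetic in `optimal_next_address` are simplifying assumptions for illustration, not a faithful model of the real 650 hardware:

```python
def to_biquinary(d):
    """Split a decimal digit 0-9 into its bi-quinary pair:
    a 'bi' designator (0 or 5) and a 'quinary' designator (0-4)."""
    if not 0 <= d <= 9:
        raise ValueError("digit must be 0-9")
    return (5 if d >= 5 else 0, d % 5)

def from_biquinary(bi, quinary):
    """Recombine the two designators into a decimal digit."""
    return bi + quinary

def optimal_next_address(current, exec_word_times, band_size=50):
    """Toy version of SOAP's trick: place the next instruction where
    the read head will be once execution finishes, i.e. the current
    address advanced by the execution time (measured in word times),
    wrapping around the drum band."""
    return (current + exec_word_times) % band_size
```

So `to_biquinary(7)` gives `(5, 2)` – the 5-designator fires and the quinary part counts 2 more – and if an add takes 5 word times, the instruction after address 48 belongs at address 3, not 49.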

I thought that was as good as it got in programming, until some guy (actually a team led by John Backus at IBM) went and invented Fortran. This is the language in which I wrote the programs for the data reduction for my Honors Thesis in Physics (no easy Computer Science majors then). Lots more about Fortran and FORTRAN (don’t get me started on that controversy) later.

 “This book is affectionately dedicated to the Type 650 computer once installed at Case Institute of Technology, with whom I have spent many pleasant evenings” – Donald Knuth (1968) – Dedication to The Art of Computer Programming

Yep, I had a problem here (and I may change my mind back again) as to what to call this. I made the (in retrospect) obvious decision to title the post with what I was writing about. That being said, I’ll still talk about computation later, but I think in a separate topic covering things like logic, complexity, etc.

So we have a nice clean definition of what we are going to talk about – computers. So, what is a computer? Whoops, problems already. Loosely, and somewhat circularly, we can say it is anything that computes. So is a pencil and paper, accompanied by the human holding the pencil, a computer? That’s stupid, you say, but that is exactly what the (mostly women) workers at Bletchley Park did, and exactly what they were called. That’s unsatisfactory to me, and it’s my blog, so that ain’t happenin’. (Much, much more about that and them later, but if you want to read ahead, see the references – no extra credit on the final grade, though.) So is it something mechanical/electrical that aids in mathematics? In one sense that’s too narrow – we probably want to include any “machine” that processes information. In another sense it’s too broad – is an abacus a computer? Again, just a thingie with a carbon-based I/O system. Or even the Egyptians with 3 sticks tied together in lengths of 3, 4, and 5, so they could run around creating right angles? I think not.

I’m taking the coward’s way out here – to paraphrase  U.S. Supreme Court Justice Potter Stewart re pornography. “I know it when I see it.” Suffice it to say, we will see all kinds of embodiments of “computers”, some of which will make you say. “Huh?”, and post a scathing comment. To those of you who feel like doing that all I can say is. “Bring it on!!!” I thrive on controversy.

So let’s get started: what was the first “computer” (I will henceforth drop the quotes, with the understanding that there could (will) be some disagreement)? Let me propose the Jacquard loom (~1800). Another “Huh?” moment. I chose this because it is the first device I know of whose behavior was “programmable” – that is, it didn’t just stand there waiting for someone to tell it what to do, step by step. Note the distinction here: someone told it what to do ahead of time. An abacus needs fingers (and nimble ones, at that) to do any computing. Here, in the case of the loom, instructions were prepared off-line, loaded into the machine, and the “GO” button pushed. In its simplest manifestation, the pattern to be woven was encoded into a set of punched cards. Without a full Weaving 101 class, the basic idea was that the cards told the loom what to do with the various threads being woven – whether to go over or under, or some such. But, as Martin Davis points out, that’s how a player piano works, and no one pretends that it’s any sort of a computer, so we’re back to square one. As a side note, Napoleon I stuck his two cents in here, as he did with so many scientific ventures.
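To make the card idea concrete, here is a toy sketch in Python – my own invention, with the mechanics of real Jacquard hooks waved away. A “card” is just a row of hole/no-hole values, and a hole means the corresponding warp thread is raised:

```python
# Toy model only: real Jacquard cards controlled individual warp
# hooks with far more mechanical subtlety than this.

def weave_pattern(cards):
    """'Run' a stack of cards: each card yields one row of cloth.
    '#' marks a raised warp thread (hole punched), '.' a lowered one."""
    return ["".join("#" if hole else "." for hole in card) for card in cards]

# Four hypothetical cards encoding a simple diamond-ish pattern:
cards = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]
for row in weave_pattern(cards):
    print(row)
```

The point is exactly the one above: the behavior lives in the cards, prepared off-line, not in anyone’s fingers at run time.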

Next up, Charles Babbage’s Analytic Engine.

Weave, weave, weave me the sunshine out of the falling rain

Weave me the hope of a new tomorrow and fill my cup again. – Peter Yarrow (1998?) – Around the Campfire


Bletchley Park – The Secret Lives of Codebreakers  by Sinclair McKay

Colossus: The secrets of Bletchley Park’s code-breaking computers by B. Jack Copeland

Jacquard’s Loom – Jacquard’s Web: How a hand-loom led to the birth of the information age by James Essinger

Any discussion of mathematics and the history thereof must obviously start with a discussion of what numbers are – or must it? On the surface, it would seem that all mathematics is based on numbers, and more specifically the (positive) integers: 1, 2, … Negative numbers and even zero come much later in the game, and the ellipsis (pedantry alert – that’s those 3 dots meaning etc.) – wow, what a can of worms that opened in the late 19th century. But I digress – why did I say, “or must it?”

For those of us paying attention in High School, Algebra (after the hassle of trying to get word problems formulated as equations) seemed to consist of just moving things around from one side of an equals sign to the other: adding this number to both sides, dividing both sides by something, and, for the more advanced of us, taking square (and yes, even cube) roots – in short, just following some rules guaranteed not to “change” anything, in some fundamental meaning of change. The rules we used were the ones that kept the validity of the equation intact, so that if what we started with was true, what we ended with was true.
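Those validity-preserving moves can even be written down mechanically. A deliberately pedantic sketch in Python (my own toy, not from any textbook), solving a·x + b = c:

```python
def solve_linear(a, b, c):
    """Solve a*x + b = c using exactly the two 'sacred' moves:
    subtract the same quantity from both sides, then divide both
    sides by the same nonzero quantity."""
    if a == 0:
        raise ValueError("a must be nonzero")
    # a*x + b = c   ->   a*x = c - b     (subtract b from both sides)
    rhs = c - b
    # a*x = c - b   ->   x = (c - b)/a   (divide both sides by a)
    return rhs / a
```

Each step keeps the equation true, which is the whole game.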

Then (or before, depending on your particular school district) came Geometry, which was much different. Here we drew pretty (or in my case, not so pretty) pictures and proved theorems about these little pictures. You know, this picture can hold twice as many squares as that one. Or if all the angles of one triangle are the same as the angles of another, the triangles kinda look like each other, or some such. Completely different stuff. Or is it? Again, we are starting with something true, whether axioms/postulates (Geometry) or a formulation of an existing condition for a word problem (Algebra), applying some rules to symbols (pay attention: SYMBOLS, not necessarily numbers), and coming up with a new theorem, or a statement of the form x=7 in Algebra. Similarly, we can view Trigonometry and its crazily named functions the same way; all we are really doing is manipulating symbols (not necessarily numbers) according to some rules that, in some sense, “work”, meaning that if we start with true stuff, we end up with true stuff. And so on, through Calculus, and areas of mathematics that none of us, save the people working in that field, know even exist. Cobordant Manifolds and Thurston’s Geometrization Conjecture? Really, mathematicians?

So it seems that all we are doing, in essence, is manipulating symbols according to some set of rules. It all makes sense to us because we “know” what the symbols mean. So when we see 2+1=3, it makes sense; we know it’s true. But wait – if we write 3=2+1 (same guy, just rewritten), can’t we take that as a definition of 3? That is, “3” is the number we get when we add 1 to 2. Or 3 is the number after 2, the successor to 2. (Just a bit more of this and I hope I’ll have made my point. Hang in there.) Please recall here what a function is. Simply, you put something into the function and get something out. So if I say f(x) = x², when I put 5 in I get 25, or f(5) = 25. Back to 3=2+1. Let me define S(x) to be the successor function, so that S(x)=x+1; then I have S(2)=2+1=3. Here’s the real point – this makes sense because we have brought in our experience of what numbers are. We could just as consistently use a successor function to define the next letter in the Latin (or any other) alphabet, with proper finagling. But, regardless, we would still be deriving true statements. And S(x) might have nothing to do with any of these semantic interpretations.
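A quick sketch of the point in Python (my own toy formulation): the same successor machinery works on numbers or, with the wrap-around finagling mentioned above, on letters:

```python
import string

def S(n):
    """The successor function on integers: S(n) = n + 1."""
    return n + 1

def S_letter(c):
    """The same successor idea on a different universe of symbols -
    the upper-case Latin alphabet, wrapping 'Z' back around to 'A'
    (the 'proper finagling')."""
    letters = string.ascii_uppercase
    return letters[(letters.index(c) + 1) % len(letters)]
```

`S(2)` gives 3, and `S_letter("B")` gives `"C"` – the syntax is identical; only the semantics we read into the symbols differ.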

The takeaway here, expressed in a somewhat roundabout way, is that mathematics exists independent of the semantics we apply to our symbols. We are only interested in the syntax – the rules for doing things the right way. We will see a lot more discussion of this when we get to logic in the late 1800s and early 1900s and people like Gottlob Frege, Giuseppe Peano, and Bertrand Russell (if you want to read ahead). We will now go back to our main thread, the History of Mathematics. Fear not, we will be dealing with familiar stuff like 1, 3, 8, +, /, etc. for quite a while. Stay tuned.

I think math is a hugely creative field, because there are some very well-defined operations that you have to work within. You are, in a sense, straightjacketed by the rules of the mathematics. But within that constrained environment, it’s up to you what you do with the symbols. – Brian Greene – The Hidden Reality (2011)


Since the main thread of my blog is going to have some ongoing entries concerning the history of mathematics, I am including some books on the subject here to let you do a deeper dive, if you wish. You can also judge how much of my blog has been directly stolen from these references. ;-)

Otto Neugebauer – The Exact Sciences in Antiquity heads the list only because I had the privilege of having the author for a professor. It is written in a stilted form and is difficult to read. That being said, it still remains one of the primary sourcebooks on early (1700 BCE and forward) Egyptian/Babylonian mathematics and astronomy. It signs off during the twilight of Greek mathematics, around 150 CE, with a discussion of Ptolemy.

Morris Kline – Mathematical Thought from Ancient to Modern Times remains the classic book on the history of mathematics, but beware – here be dragons. This little tome (~1200 pages) contains actual equations, up to and past the partial differential equation level. The good news: the book is still eminently readable even if you don’t understand the math. Enough context is given to keep the story coherent even when the math is not necessarily comprehended. Update 6/2014 – this is now available as a three-volume set, to save on the schlepping energy. And there is always the Kindle option.

Carl B. Boyer – A History of Mathematics keeps the math to an Algebra II level while still describing the development of more advanced topics (and weighs in at only ~600 pages). The book is oriented heavily toward the people involved and kind of scrimps on describing the flow of increasing knowledge, treating progress discretely by tying discoveries to individuals rather than as a continuous process. Not wrong, just not my personal preference for how to treat the topic. De gustibus …

John Stillwell – Mathematics and Its History is truly written for a mathematician, or at least a college student pursuing a degree in Mathematics. In addition, it comes from Springer (also known as Springer-Verlag), historically a publisher of deeply intensive books. In graduate school we learned to dread those yellow books, as they signified we were in for a rough semester. However, if you have any mathematical training, say Advanced Calculus and maybe Modern Algebra (group theory, etc.), and the bucks to spend, you will find this an engrossing read.

So many books, so little time …

Luke Hodgkin – A History of Mathematics, my penultimate suggestion, is a fine middle ground for readers who can tolerate some mathematical rigor but do not possess advanced training. There are also examples for the reader to solve, using only the techniques available in the period under discussion. It is also one of the few books on this subject that discuss the crisis of modernity (you’ll have to read the book, or Google it).

and finally:

David Burton – The History of Mathematics: An Introduction – Hey, if you like lots of pictures, this is for you. The mathematics is not intense, but the impact of the principles is brought out clearly. It, too, has problems to while away those cold winter nights.

And, of course, all the above contain references to other books, which, in turn …

“All science, logic and mathematics included, is a function of the epoch – all science, in its ideals as well as its achievements.” – E. H. Moore (~1920)


Sorry for the long time lapse, but from the dearth of comments, I guess not many people missed me. I invested in renewing my hosting account, so I’d better get back to work on the blog.

I will start back with the first of what I hope will be many entries on the history of computation. Notice I didn’t say the history of computing; I intend to throw in a bit of the history of mathematics and the lives of mathematicians. I was very fortunate to take a course in same from Otto Neugebauer at Brown University in the late ’50s (the 1950s). A common thread throughout this blog will be sharing the off-beat things I have been exposed to. In this case, however, I believe that computation as a science (Mathematics) and computing as an industry are inextricably linked. This point is especially well made in a book by another of the teachers I have been fortunate to have been exposed to, Martin Davis of the Courant Institute at NYU. The book I am referring to is Engines of Logic – more about it and Martin in a later blog.

“They’re baaaaack!” – Heather O’Rourke in Poltergeist II: The Other Side (1986)

Being of a mathematical bent, I’ll start this series with my favorite books on algorithms. By this I mean books that describe the underlying concepts and algorithms of programming without regard to any particular language, though they may use a specific language to give examples.

Donald Knuth – The Art of Computer Programming (Volumes 1-4)

TAOCP is considered to be the authoritative computer programming algorithm treatise, even though only 4 of the planned 7 volumes have been published and there is doubt whether Knuth (b. 1938) will live long enough to complete his plans. The books are amazing in their breadth and depth of coverage, and are not an easy read, but anything that can be gleaned from this set will be well worthwhile. Dr. Knuth is a fascinating person – he designed and wrote a type-setting system (TeX) and devised a technique for solving mathematical “exact cover” problems, of which Sudoku is probably the most famous example. Please read this true story, which shows Dr. Knuth’s programming ability, and by all means Google him and wander through the links. Fascinating.

Robert Sedgewick – Algorithms

As far as algorithms are concerned, this could be considered TAOCP “lite”, but only by comparison. It too is a heavy tome, weighing in at over 2 pounds (it is also available in Kindle format, however). One advantage over TAOCP is that the examples are given in a high-level language, as opposed to Knuth’s MIX assembly language. The original Pascal examples have been rewritten in Java, so they are also accessible to all you C# people. Note that the algorithms referred to here are those of discrete mathematics: queues, stacks, graphs, trees, etc.

Thomas H. Cormen, et al – Introduction to Algorithms

Suffice it to say that this is the most commonly used college textbook on this topic. One topic covered especially well here, and not at all in the other books (but give Knuth some time), is multi-threading.

These are the three most famous books on algorithms, but feel free to give some feedback with your favorites or criticisms of these.


I decry the current tendency to seek patents on algorithms. There are better ways to earn a living than to prevent other people from making use of one’s contributions to computer science. – Donald Knuth (19??)

I am starting a series of blogs on what I feel are the “best” programming books I have run across in my career. While the title says “Programming”, I really mean to capture all facets of IT – design, program management, etc. I have been doing this stuff since before most of you were born, but I’ll try to keep my choices to books that are still in print. (No papyri or chiseled stone tablets.) :-)

I have chosen some arbitrary categories, and will treat each one in a separate post. Please make this truly interactive and chime in with your comments. As well as a learning experience, this is a great way to share with others what has helped you.

To me programming is more than an important practical art. It is also a gigantic undertaking in the foundations of knowledge. – Dr. Grace Hopper, USN (Ret.) (Everyone in the IT field should know of this remarkable woman. More on her in a future blog.)

Channel 9 is a Microsoft-sponsored video blog with the latest breaking news on releases, hands-on sessions on new and emerging technologies, and other Microsoft-related news. It can be accessed at:

I get it streamed via TiVo so I can see it on the large screen, and I would be surprised if the newer “smart” TVs didn’t also have this capability built in. However you get it, it is worth an hour of your time here and there to at least check out the content.

The internet? We are not interested in it. – Bill Gates (1993)

As I have mentioned before, training and education are two of my interests. As a manager, I am often asked why the company doesn’t pay for training. In some cases we do; in other cases I refer them to this excerpt from The Clean Coder by R. Martin. This is a wonderful book, and I heartily recommend it to everyone involved with anything to do with programming, and will be quoting from it frequently in these posts.

I have never let my schooling interfere with my education. – Mark Twain (18??)