“Well, I’ll be damned!”, I said. “This looks just like a GW-BASIC interpreter. Where the hell did you find this?”
“Well…”, he started.
“Never mind, everyone go away and leave me alone. I need to think.” I waved everyone away. On my table in those days always lay a copy of Kernighan & Ritchie’s “The C Programming Language”, the Second Edition, the ANSI Standard one. It had lain there for oh so long. And today its time had come. I opened it and started to read.
For all the hoopla over Steve Jobs, there was hardly a murmur when Dennis Ritchie died. And arguably without his work, the Apple Lisa, NeXT computers and their successor, the Mac series, would not have existed. For Ritchie co-invented Unix (with Ken Thompson) and he invented C. And in a rather belated response he was awarded the National Medal of Technology “For co-inventing the UNIX operating system and the C programming language which together have led to enormous advances in computer hardware, software and networking systems and stimulated the growth of an entire industry, thereby enhancing American leadership in the Information Age”.
Anyway, Brian Kernighan wrote the first tutorial, and K&R is still one of the best technology manuals in the world. It jumps right in and gets down to business without too much fuss. As the authors say, “C is not a big language, and it is not well served by a big book”, which should be a lesson to all tech authors who build 600-page books to demonstrate the most basic of functions. And right off the bat, K&R tell us that we must learn to print the words “Hello World” on the screen and proceed to show us just that on page 2.
Here is the program.
printf("Hello World! \n");
Of course, no male member of my family has ever learned to take direction. So naturally, I skimmed over that on my way to find out how to read a stream of data. The next big exercise was converting Fahrenheit to Celsius and back. Who cares? And then section 1.5 hit. Wham! There it is! Whoeewee!
And within Sections 1.5.1 and 1.5.2 I had the basic building blocks. All that was left was to figure out how to construct the code within those curly brackets and keep track of those pesky semicolons.
With K&R next to me or on my lap, by the next day I was able to demonstrate a rudimentary program that could take in a file and print it on the screen. All I had to do then was (a) cut the stream into record-size blocks, (b) append record markers to the blocks, and (c) write them out into a file on disk. The next step was to ask for the location of the input data stream and the location of the output file. Done. Then a thought: why not ask for the record size as well and make that variable? For in this case each record was 80 characters in length, but in the future we could use this for variable record lengths. Done.
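The original source is long gone, so what follows is only a sketch of what the heart of that program must have looked like, in the getc/putc style of K&R section 1.5 — the function name, the return value, and the choice of a newline as the record marker are all my assumptions:

```c
#include <stdio.h>

/* Sketch of the core chop loop (a reconstruction, not the original):
 * copy bytes from in to out, appending a newline record marker after
 * every reclen bytes. Returns the number of complete records written. */
long chop(FILE *in, FILE *out, long reclen)
{
    int c;
    long count = 0, records = 0;

    while ((c = getc(in)) != EOF) {
        putc(c, out);
        if (++count == reclen) {   /* record boundary reached */
            putc('\n', out);       /* append the record marker */
            count = 0;
            records++;
        }
    }
    return records;
}
```

For the 80-character card images on those floppies, reclen would simply have been 80.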
The next morning the new program was demonstrated and much relief swept through the camp. Operators were shown how to use it, and the first of the floppies started being converted. The interpreter was naturally slow, and so the hunt had already started for a C compiler. Within the week, the system had an executable file called CHOP.EXE.
chop [input-file], [output-file], [record-length]
If any of the parameters was missing, it gave you a message and dropped out. The data was beginning to flow and my superhero status was saved from extinction. That, ladies and gentlemen, is the first time (and only time) that I’ve ever read a manual before starting to use a product.
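The usage line above is all that survives of CHOP’s interface, so this parameter check is only a guess at its shape — the function name and the messages are hypothetical:

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical sketch of chop's parameter check: all three arguments
 * must be present and the record length must be a positive number,
 * otherwise print a message and report failure so main can drop out. */
int check_args(int argc, char *argv[], long *reclen)
{
    if (argc != 4) {
        fprintf(stderr, "usage: chop input-file output-file record-length\n");
        return 0;
    }
    *reclen = atol(argv[3]);   /* zero or negative means a bad number */
    if (*reclen <= 0) {
        fprintf(stderr, "chop: record-length must be a positive number\n");
        return 0;
    }
    return 1;   /* all three parameters look sane */
}
```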
But, ladies and gentlemen, I never learned to write the most popular program in the world; the one that prints the words “Hello World” on the screen. It parallels a life where everything I’ve learned, I’ve learned on the job under pressure. And I’ve always wished that someday I could go into a classroom and learn things before I have to use them. And so, six years later, as the head honcho at my own software products company, I treated myself to a classroom course to learn C programming.
And found that I knew it already.