Yes, the PDP-8 was a fun system due to its simplicity. DEC provided, free of charge, a number of wonderful books on their devices, and one of them taught me how to program the PDP-8 in assembler. Since our machine had only a very slow ASR-33 Teletype/tape punch for I/O, I did all my programming by entering the 12-bit instructions using the switches on the front panel, and I tested each routine by single-stepping through it and watching the lights. Once a routine worked, I would punch it out on paper tape using the ASR-33.
This approach was slow but fun, and I learned a lot. I used the two 8-bit DACs DEC provided to drive the x-y inputs of a Tek RM503 scope, where I displayed my data (it was a signal averager).
What a difference between this and using Xcode to write programs in Swift and Metal!
I could run Think Pascal on my original Macintosh 128K with a floppy drive. I think that was after I upgraded it to 512K; I used the extra memory for a RAM disk.
I created a program "Data Disker", which I uploaded to a BBS. On the original Mac, with no hard drive, you had to insert a system disk when you booted it. If you inserted a data disk, it would display an icon to indicate that it wasn't what was needed. But if you had copied the System file to a disk, then deleted it to make the disk a data disk, inserting that disk at boot would cause a crash.
The purpose of Data Disker was to change these disks back to true data disks, by erasing the boot sector.
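For the curious: on those early Macs the boot code lived in the first two 512-byte sectors of the floppy, so "erasing the boot sector" amounts to zeroing the first 1024 bytes. Below is a minimal modern-analog sketch that does this to a disk-image file; the image path is an assumption for illustration, and this is not Data Disker's actual source.

    // Hypothetical sketch: zero the two 512-byte boot blocks at the
    // start of a floppy disk image, turning a crash-prone ex-system
    // disk back into a plain data disk.
    #include <fstream>
    #include <iostream>
    #include <vector>

    int main() {
        const char* imagePath = "floppy.img";       // assumed image file
        const std::size_t kBootBytes = 2 * 512;     // two boot sectors

        std::fstream disk(imagePath,
                          std::ios::in | std::ios::out | std::ios::binary);
        if (!disk) {
            std::cerr << "cannot open " << imagePath << '\n';
            return 1;
        }
        std::vector<char> zeros(kBootBytes, 0);
        disk.write(zeros.data(), zeros.size());     // overwrite bytes 0..1023
        return disk ? 0 : 1;
    }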
For a twist on this topic, how about ancient software on modern systems?
The software behind my public-service TowerBells Website is the direct descendant of a
report-generator program that was originally written in Fortran IV for a mainframe,
with program and data both residing in boxes of 80-column IBM punched cards, and output
on line printers. The first personal computer with enough storage to handle that volume
of data was a DEC Rainbow; I ordered the first available hard drive for it, which was
external and was announced at 5MB for $5K; what was delivered was 10MB for the same price.
I rewrote the program in Pascal to run on that system, with output to a daisy-wheel
printer/typewriter. Years later, when a third-party 40MB hard drive died (long after the
Rainbow marketplace had), I converted the program to Borland Turbo Pascal 5.5 on an IBM
AT-compatible PC, with output to an HP LaserJet printer. When the Web came along, I added
output routines for HTML and XML, with sneakernet to a Mac 9600, where final polishing was
done. In the latest change, just a couple of years ago, I migrated all of the PC-based
software to DOSBox-X, running on the Mojave system where I am composing this message.
That includes not only the report-generator source code but also the Turbo Pascal
compiler, a text editor (Kedit, a PC version of IBM's widely used Xedit mainframe editor)
and an elegant file-management utility (Stereo Shell). Of course there have been many
incremental improvements in the report-generator program over the last four decades, but
the data that feeds it is still flat files of 80-column card images. The next major
improvement will be to add JSON output, for migrating the totality of the data to
a more modern system so that it can be maintained by people less ancient than I am.
Incidentally, my first exposure to computers was via a free short course in Fortran II,
offered as an adjunct to a graduate course in numerical analysis (which was still being
done primarily with paper and pencil in those days). The computer on which it ran
(the first one that the university owned!) was an IBM 650, which had 20K words of memory
on a rotating drum. While I was later exposed to various assembler languages, I have
always been thankful to have started with a high-level language, because it gave me the
necessary mental framework for procedural problem-solving. The best language that I
encountered during my USAF career was Fortran V, Univac's proprietary customization
of Fortran IV for its 1100-series mainframes. It was an important predecessor of the
eventual Fortran 77 standard, and produced amazingly efficient machine language.
But that's a story for another day.
I loved Fortran IV for doing what I most like to do on a computer: numerical calculations. In the mid-to-late 60s I used the Fortran compilers that ran on our University's IBM 7094 and later 360/91 computers. To get the most out of it on the 91, one needed to specify REGION=350K, which allowed the optimizing compiler to run in 350K of memory. I was very impressed with the quality of the optimization in these ancient compilers.
Compare that with the need for 32 GB of RAM to comfortably develop on modern Macs! That is almost a factor of 100,000! Does the GUI or the use of highly abstracted object-oriented code really need so much more memory?
I am using the 64-bit version of DOSBox to run the DOS (!) app Open Access on an M2 MacBook Air running Sonoma.
In the late 1980s I wrote a comprehensive business management package for my father-in-law to help him with accounts, payroll, stock control and invoicing.
When I started my consulting engineering business in 1990 I used the same package. Over the decades I saw no need to change to expensive, less functional commercial software, and so migrated Open Access across various PC/Windows and Mac OS versions.
Open Access has a very powerful programming language that uses an early version of SQL. I can still tweak my programs and compile the source to generate updated apps.
The Year 2000 transition was a bit traumatic but fortunately a talented developer issued a free patch for users.
I moved to Mac in 2003. Before upgrading OS X/macOS I needed to check that DOSBox would still work.
My main limitation (apart from remembering that I cannot use a mouse) is that I cannot directly print files or generate spreadsheets for my accountant. I have to print to a text file and use a Mac app to print or convert to a PDF/XLS.
My journey with Open Access is briefly documented here:
Development tools definitely required fewer resources in the past. In 1987, I ran Microsoft Fortran on an 8 MHz 8088 PC with 512K RAM and no hard drive (two floppy drives, one with my code and the other with the compiler, and I'd have to swap compiler disks a few times during the build).
But modern compilers are much more complicated than they were 35 years ago. They optimize the generated machine code much more, and they include much, much larger standard libraries, so you don't need to reimplement standard things like sorting algorithms, hash tables, etc. And the languages themselves include many advanced features that would've been impossible to implement on the computers back then.
C++ is a perfect example of this. The compiler itself isn't particularly huge (although it's definitely a lot bigger than the pre-ISO version of the language from the 80s), but the standard library (see https://www.cppreference.com/) is massive. This doesn't bloat your application, because unused features aren't linked in, but it does consume a lot of space on any installation.
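To make the standard-library point concrete, here is a small sketch of a task that once meant hand-rolling a sort routine and a hash table but is now a few library calls (the word list is just made-up sample data):

    // Counting and sorting words with only the standard library:
    // std::unordered_map is the hash table, std::sort the sort routine.
    #include <algorithm>
    #include <iostream>
    #include <string>
    #include <unordered_map>
    #include <vector>

    int main() {
        std::vector<std::string> words = {"fortran", "pascal", "cobol",
                                          "pascal", "fortran", "pascal"};

        std::unordered_map<std::string, int> counts;  // word -> count
        for (const auto& w : words) ++counts[w];

        // Copy into a vector and sort by descending count.
        std::vector<std::pair<std::string, int>> sorted(counts.begin(),
                                                        counts.end());
        std::sort(sorted.begin(), sorted.end(),
                  [](const auto& a, const auto& b) { return a.second > b.second; });

        for (const auto& [word, n] : sorted)
            std::cout << word << ": " << n << '\n';
    }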
You might find it interesting to compare the size of the standards documents for these languages, as they have grown over the years:
ANSI X3J3/90.4: the FORTRAN-77 standard. Approximately 237 pages (estimated from the number of times the footer text appears in the file; if printed in two columns, it would probably come to about 100 pages).
It doesn't require 32 GB of RAM to comfortably develop software on modern Macs. Plenty of people develop major applications with much less. A lot of RAM helps when developing large applications because it will let you compile many files simultaneously. It will also let you run the app in a debugger environment while still allowing it access to as much RAM as your customers are likely to have.
But that having been said, a modern environment like Xcode does a lot more than the development environments from the 70s and 80s.
In the old days, you'd have a simple text editor to work on your application and a standalone compiler to compile those files into executable code. And if you needed a debugger, that was a third app. And back then, you probably didn't run them all at once either.
A modern IDE, on the other hand, is a massive application, designed to simplify development of large projects. It is going to be comparable in size and complexity to a modern office suite, maybe even larger. Some things developers today expect of one include:
Editor
Track all the files in your project, no matter how big they are or how many there are.
Let you edit as many as you want at once.
Color syntax highlighting. Color the text in your code based on the syntax of the language(s) you are using.
Identify the entities in your application (functions, variables, objects, etc.) and automatically create cross-references so you can quickly view/edit their definitions and all the places where they are used.
Include specialized editors for specialized file types, including images, fonts, sounds, property lists, window layouts and many other kinds of system resources.
Integration with most popular version control systems for tracking code changes over time, as they are created by a variety of other developers.
Compiler
Parallel compilation, up to the limits of your system
Dependency tracking, so if you change one part of your application, the IDE knows which other parts do and do not need to be recompiled. You don't waste time rebuilding files that don't need to be rebuilt, you don't create errors by forgetting to rebuild code that depends on your change, and you don't have to manually keep track of what depends on what (see the sketch after this list).
Distributed builds. For a large project, you may have developer tools installed on multiple computers on your network. Your IDE may hand off parts of your project to these other computers in order to speed up the build process.
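As mentioned under dependency tracking, the core idea is decades old; here is a minimal sketch of the timestamp comparison at the heart of make-style builds (the file names are hypothetical, and a real IDE tracks dependencies at a much finer grain):

    // Rebuild a target only when it is missing, or when any of its
    // dependencies is missing or newer than it.
    #include <filesystem>
    #include <iostream>
    #include <vector>

    namespace fs = std::filesystem;

    bool needsRebuild(const fs::path& target, const std::vector<fs::path>& deps) {
        if (!fs::exists(target)) return true;            // never built yet
        const auto targetTime = fs::last_write_time(target);
        for (const auto& dep : deps)
            if (!fs::exists(dep) || fs::last_write_time(dep) > targetTime)
                return true;                             // dependency changed
        return false;
    }

    int main() {
        if (needsRebuild("app.o", {"app.cpp", "app.h"}))
            std::cout << "recompiling app.cpp\n";
        else
            std::cout << "app.o is up to date\n";
    }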
Debugger
Let you start, stop, pause, single-step and otherwise take control of how your code is executing.
Examine any parts of memory used by your app, associating it with the specific variables and objects in your code. You can view variables and also watch them change over time as the application runs.
Integration with the editors, so you can just point to objects in the code and quickly see their values in the running app.
Remote debugging, so another computer on the network can take control of your app. Which is critical if your app is doing something that requires it to take control of your local screen, keyboard or mouse. Also for debugging iOS apps running on a device connected via USB.
Automation
Integration with automated testing tools
Integration with CI/CD environments, so changes can be reviewed by other developers and then be automatically deployed (especially for cloud-based apps) as they are approved.
And all this is in addition to the compilers needed for all the supported languages (for Xcode, this will be at least C, C++, Objective-C, Objective-C++ and Swift), their standard libraries, Apple's extensions to those libraries and the interfaces to all of the OS APIs.
None of this is anything like it was several decades ago.
Sorry, I was being a bit hyperbolic in my claims, but current systems seem to require much more memory than similar systems did 20 years ago. In the early 2000s (a couple of years after OS X appeared) I wrote a fairly complicated optical design program (raytracing 30,000 rays through many lenses and curved mirrors, with a 3D presentation of the system) in Objective-C and OpenGL on a G4 PowerBook with 500 MB of memory. It compiled fairly quickly, implying that it wasn't hung up with a lot of swapping, and it even ran smoothly and quickly on the old hardware.
Now, when I attempt to use the iOS simulator on my 16 GB M1 Pro, it often requires swap, which could impact performance and SSD lifetimes. I am giving this machine to my daughter and just bought an M3 Pro MBP with 36 GB. I intend to further develop this program and add many more features.
I have a lovely little book called Tales from the Computer Room, which I bought when I was an undergraduate a lifetime ago (late 1970s). It has one story about a "grey-haired programmer" called Alfred. Your story brought it to mind:
The cost accounting suite was written for the IBM 650 and now runs on the IBM/370. Alfred wrote a program (in 650 SOAP) which converted any SOAP program to IBM 1410 Autocoder. The 1410 was superseded by a /360 with a 1410 emulator, and then by a /370 without one, and at that stage a program was written which converted 1410 Autocoder into COBOL, at the instigation of Bob Peaseblossom's predecessor but three, in the vain hope that the Cost Accounting suite might become intelligible to someone else besides Alfred. The COBOL version was however unhelpful, consisting of such statements as
ADD P13649 TO P14930 GIVING P63341
The program for converting 650 to 1410 was now all running in COBOL on the /370.
Jeremy's story reminds me of the first time I encountered an IBM 360. I think it was emulating an IBM 709, which was even older than the IBM 7094 that I had previously used elsewhere. In that emulation mode, it would read a card, print a line, read a card, print a line, ... at a dreadfully slow speed. My first task was to recompile the program to run without emulation, after which both the card reader and the line printer just purred.
BTW, I always refused to learn COBOL, but I have successfully debugged it.
Same here. It was sort of common knowledge that if you become a COBOL expert, you will never have a problem finding a job, but the rest of your life will be spent developing COBOL software.
Pascal was the first language I learned after I learned to program in Basic (this was the mid-70s, while I was in my teens); a big learning jump for me: the first time I'd used a compiler, and definitely the first time I was exposed to structured programming, which Pascal handled significantly better than the other widely used languages available to me and my fellow students.
I grew to like it, but never love it, mostly because other languages captured my attention and imagination before long (Simula-67, C).
Same here. Pascal was the first language I learned after Basic (and Fortran). I spent a couple years at ETH in Zurich and Wirth (who got his PhD from Cal) was very present there. Treated by many like a god, but he never acted that way. I remember hearing a lot about Modula in those days, but I never learned it. I had been hooked on C (although we wrapped it around lots of what was still Fortran-77). And later of course C++ which I still happily use to this day. RIP Niklaus Wirth.
I remember reading this quote from many years ago.
"Whereas Europeans generally pronounce my name the right way ('Ni-klows Wirt'), Americans invariably mangle it into 'Nick-les Worth'. This is to say that Europeans call me by name, but Americans call me by value."
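For anyone whose parameter-passing terminology is rusty: the pun contrasts Algol-style call by name, where the argument expression is re-evaluated at every use, with call by value, where it is evaluated once and copied. C++ has no call by name, but a lambda "thunk" can emulate it; a quick illustrative sketch:

    #include <functional>
    #include <iostream>

    // Call by value: the argument is evaluated once, then copied in.
    int byValue(int x) { return x + x; }

    // Emulated call by name: the expression is re-evaluated per use.
    int byName(const std::function<int()>& x) { return x() + x(); }

    int main() {
        int n = 0;
        std::cout << byValue(++n) << '\n';                 // ++n runs once: 2
        n = 0;
        std::cout << byName([&n] { return ++n; }) << '\n'; // ++n runs twice: 3
    }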
Indeed. His name is German, and in German there's no real difference in the pronunciation of v and w; both sound like v (at least to an English speaker)*. Same for t and th: while that difference is huge in English, in German they sound exactly the same. In fact, the German word for host is Wirt, and it sounds identical to Niklaus Wirth's last name.
*) This, by the way, is the reason the accent of the German villain in the movies always has a cliche w, along the lines of "ze vestern allies". And sure enough, a friend of mine who teaches English speech for foreign scholars here tells me that a common exercise for German-speaking participants uses phrases such as "a window with a vividly worthy view of Mt. Vesuvius". The Germans happily return the favor by teaching us beautiful words such as Maschendrahtzaun ("chain-link fence").