Back in the mid-to-late '80s and early-to-mid '90s, I went through a series of hastily foreshortened computing experiences. I was a bit late to the personal computing scene, so there was an element of catch-up involved, but, from the outset, I was really more interested in development than in personal use.
My first exposure to computing was a class in Fortran, spring semester 1972. The work was done on a card punch machine, and programs were run by taking card decks to a counter that separated the computer operators from the rest of us, loading the deck into a card reader, then returning later to pick up the printout. In addition to the Fortran itself, each deck had a bit of JCL at the beginning (and end?). I also had a single experience with using a printing terminal (no screen), having figured out enough to get started from the documentation on a nearby table.
I did well in Fortran, but bombed out of PL/I (Programming Language One) the following semester. This was probably more due to extracurricular complications than anything to do with that language, but that, combined with my initial failure to comprehend calculus, did put any notion I had of going into computer science on hold for more than a decade. (At that time, computer science was widely viewed as a branch of mathematics, and calculus was required for admission into the degree program.)
I dropped out after my second year in order to pursue an entirely different, non-academic interest, which involved moving to a different city in a different state. Nevertheless, what little I'd learned informed how I then saw the world, and I was beginning to view everything largely in terms of information.
I caught my first sight of a personal computer of some sort in late summer 1976, just after the end of a summer program I'd attended at a small college in Vermont. I didn't get close enough to it to read any labels, and it's possible that it was merely a desk-sized word processor, but that thought didn't occur to me until much later. I thought I was looking at a stand-alone computer dedicated to use by a single person. That idea began worming its way into my consciousness; it prepared me to understand the significance of Moore's Law when I finally encountered it as such, and it (re)awakened an interest in microprocessors and integrated circuitry in general, and in the programmable code behind digital computation. Clearly there was something there that would only become increasingly important, and that would sooner or later enable things that were all but inconceivable at the time.
Even so, when I did go back to school full-time, in 1978, it was in biology, with the half-hearted intention of switching into engineering after completing my bachelor's degree. I've never really regretted my foray into biology, because without it I might never have found my way to (the layman's version of) systems theory, from which I gleaned a few key concepts, including the open system, emergence, and the elusive idea of a strange attractor.
All of this had nothing at all to do with what I was doing for a living at the time. So, shifting gears...
The first computer I actually owned was an Atari 600XL (great keyboard!), which came with just 16 kilobytes of RAM, though I also purchased the external module that increased that to 64 kilobytes, along with a 300 baud modem, which I used to connect to a mainframe at the local university so I could work online and file programs for a course in Pascal.
Next I moved up to the Atari 1040ST and bought the developer's kit (Mark Williams C), but I made the mistake of getting the lower-resolution color screen instead of the higher-resolution monochrome screen, so coding was difficult. I was also unfamiliar with C, with operating system APIs in general, and with GEM/TOS in particular (DRI's Graphics Environment Manager and TOS, commonly glossed as the Tramiel Operating System, after Jack Tramiel, who'd purchased Atari). I did learn a bit, but it was overwhelming, and I ended up selling that machine, dev kit included, and switching to MS-DOS running on an Epson computer for my next go-round.
As I recall, I used that machine for both a class in data structures, using Pascal, and an accelerated course in C, but that whole period is a blur for me, so I might have the details wrong. I do remember owning a copy of Turbo C for it.
Next came an Amiga 500, which made the Epson superfluous, followed by an Amiga 3000, which made the 500 superfluous. I got the 3000 because I had this idea for a program I wanted to write. There was also a dongle involved, the name of which I don't remember, that converted the Amiga's video output to a standard SD video signal, recordable on VHS. In any case, as with my early attempts at programming the Atari ST, this project involved programming in C using operating system APIs, but this time I had a clue about C and the available documentation was much better!
This project involved first creating a map from code, then scrolling it around (similar to how UIScrollView works in iOS) while the mouse cursor, positioned in the middle of the screen, traced out a path. To avoid artifacts, and as a means of establishing pacing, the map scrolling had to be done during the vertical blank, the interval when the electron beam in the monitor moved back from the bottom of the screen to the top. I don't clearly recall how this worked, but it probably involved creating a callback function and passing a pointer to it to the OS, so it would be invoked each time the vertical blank occurred. In any case, this was my first exposure to time-constrained (real-time) processing.
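If memory serves, the usual way to do this on AmigaOS was to register a server on the vertical-blank interrupt chain through exec.library's AddIntServer() call. What follows is only a minimal sketch of that hookup, not anything recovered from my old floppies: the names (vblankServer, frameCount), the priority, and the loop that does the actual repositioning are all placeholders, and compiler specifics of that era varied.

```c
/* A sketch of hooking the vertical blank on AmigaOS via exec.library.
   Names and priorities are illustrative; error handling is omitted. */
#include <exec/types.h>
#include <exec/nodes.h>
#include <exec/interrupts.h>
#include <hardware/intbits.h>
#include <proto/exec.h>

static volatile ULONG frameCount = 0;   /* bumped once per vertical blank */

/* The server must be short and must avoid most OS calls; per the docs,
   a VERTB server returns zero so the rest of the chain keeps running. */
static LONG vblankServer(void)
{
    frameCount++;
    return 0;
}

int main(void)
{
    struct Interrupt vbInt;
    ULONG lastFrame = 0;

    vbInt.is_Node.ln_Type = NT_INTERRUPT;
    vbInt.is_Node.ln_Pri  = 0;
    vbInt.is_Node.ln_Name = "map-scroll-vblank";
    vbInt.is_Data         = NULL;
    vbInt.is_Code         = (VOID (*)()) vblankServer;

    AddIntServer(INTB_VERTB, &vbInt);    /* join the vertical-blank chain */

    while (lastFrame < 50 * 10)          /* run for roughly 10 seconds on PAL */
    {
        if (frameCount != lastFrame)
        {
            lastFrame = frameCount;
            /* ...reposition the map bitmap here, once per frame... */
        }
    }

    RemIntServer(INTB_VERTB, &vbInt);    /* always unhook before exiting */
    return 0;
}
```

In practice the repositioning itself may have happened inside the server, or a waiting task may have been signalled so the work landed within the blanking interval; I honestly don't remember which way I did it.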
I might have a hard time proving this, since, if I do still have the code for it squirreled away somewhere, it's probably on an Amiga-formatted floppy disk, but that program did work, and I was able to make a video recording. I was not, however, successful in getting anyone else interested in the project, so I also lost interest. At the same time, my interest in the Amiga as a platform was ebbing. I'd recognized the limitations (in the absence of scale and adequate investment) of their custom-hardware approach, and was ready to jump ship.
I nearly forgot one chapter of this story: I replaced the Amiga with one of the original Pentium machines, a Packard Bell Legend 100 as I recall, which came with Windows for Workgroups and a ridiculous graphical wrapper. It had an optical drive as well as a hard disk and a couple of floppy drives. This was a machine I could reasonably have used for development, but my heart wasn't in it, and this was right when the web was taking off. I picked up a bit of HTML, but otherwise mainly used that machine as a fancy terminal emulator.
I'd been casually following Steve Jobs's NeXT since the beginning, and around the time of Apple's 'acquisition' of NeXT I donated the Pentium. Soon after that, I switched up my whole situation, which was disruptive, so it wasn't until the iMac came along that I again owned a computer, and it was even longer before I got into Mac programming. But that's another story, for another time.