
Rik Farrow Bio

The Short Version

Rik Farrow was born in Washington, DC, and attended Catholic grade school and Good Counsel High School, graduating in 1968. He graduated with a BS in Psychology from the University of Maryland, with more than enough credits for a minor in Computer Science. He worked briefly for Altec in Silver Spring, and for North Star Computers in Berkeley, before beginning a long career as a consultant in 1980.

Rik wrote manuals for microcomputer system hardware, then moved on to writing about Unix. He wrote the Programmer’s Guide to UNIX, working as a ghostwriter, then Unix System Administrator’s Guide to System V (1989), and Unix System Security (1990). He began writing for UnixWorld Magazine in 1986, and was their Technical Editor from 1989 to 1994. In 1994, Rik began teaching Internet security (which was mostly Unix security with networking), something he did until Windows became the more popular choice around 2003. In 1998, Rik began editing one issue a year of USENIX ;login:, and became the full-time editor in 2005.

Rik still consults, and occasionally makes presentations for groups ranging from neighbors to conference attendees.

Ancient History

I began my computing career in 1968, when I took a FORTRAN course at the University of Maryland. In the summer of 1969, I worked as a software librarian for GE in Bethesda, Maryland, where I thought I had first encountered MULTICS. To be honest, I didn’t understand the importance of MULTICS at the time, and was much more impressed with the analog speed gauge attached to the cabinet containing the dual CPUs of this GE 645 computer. Typical speed was around 650,000 instructions per second. A Raspberry Pi 2 does about 900 million instructions per second.

While I had long wondered if this mainframe ran MULTICS, Tom Van Vleck told me that no GE 645 in the Bethesda area ever ran MULTICS, and that this system was likely running GECOS-III. At least I finally knew where the old name for the comment (GECOS) field in UNIX password entries came from.

While in college, I majored in Psychology, and worked as a lab assistant. I learned how to teach rats and pigeons, skills I used later in life, for example, to teach my cats to sit on command. I also wrote software for the group of professors I worked for. I was a ‘special student’, but I didn’t know that this meant I was invited to do research. My advisor, Frederick Hegge, vaguely hinted at this by jiggling keys in front of rats–apparently hoping to inspire me to do something involving sound and rats–while a graduate student warned me not to use the equipment I needed. I followed the clear instructions from the graduate student.

When I graduated, the job market was tough–but then, when hasn’t it been tough? I wound up working for the USPS as a clerk, and later as a truck driver, learning how to drive semis in the Washington DC area. While this job has little to do with my later career focus, I did learn that people of average intelligence could memorize long lists of addresses or ZIP codes, and the carrier or distribution center associated with each. I also got a lot stronger, because working at that level for the USPS involved lots of physical activity. On my days off, I would bicycle 20 miles or cut cords of firewood, as I had become accustomed to working hard.

The USPS job eventually became just too boring to continue. I was fortunate that some friends re-introduced me to the world of computer science at that time.

Interrupts and Microprocessors

First, Tom Mapp, a psychology grad student from the lab I worked at and my best friend, had gone back to the University of Maryland to get his Masters in Computer Science. Tom brought me into the lab, where there were two PDP-11s for the use of students taking the operating systems class. Tom showed me the code for handling interrupts, a bit of concurrent programming that I had never heard of before, in the small amount of code needed for the keyboard device driver. I was amazed.

My other influence was a bit more subtle. Madison Jones, ex-Marine sergeant, ex-President of the U Md student body (the only non-frat boy to be elected at that time), had asked me if he could store a box of books where I was living while he traveled around the US. On top of this box of books was a Zilog Z80 manual. The Z80 was an early eight-bit CPU that competed with Intel’s 8080. The Z80 ran at 4 MHz, making it a bit faster than the GE 645 mainframe I had encountered ten years earlier. With only 16 bits of address space, the Z80 could directly address just 64K of memory, plus an additional 256 I/O ports, so it was puny in comparison to what the multiprocessor mainframe could do. But it was also cheap enough that people were building their own computers using the Z80.

My roommate, Steve Salzman, worked for ADP at the time, and he knew of some companies that were using the Z80. I visited LSD (Language, Systems, and Development), a company where one of the CS grad students I had met at U Md was developing systems software. The second company I visited, Altec, was using the Z80 in an embedded system. I was interviewed on the spot and offered a job, because I could explain how to build a counter out of two J-K flip-flops. I had been taking a computer architecture course and the operating systems course at U Md, which is how a postal employee came to have a clue about using simple integrated circuits to count in binary.

Altec built digitizing tables, large plates of frosted glass with a grid of wires epoxied to the underside. The wires were attached to shift registers, and a clock would shift several bits to enable current to flow in a group of wires across the height and width of the table. The operator would place a photo or diagram on the table, move a cursor that contained a coil over a spot of interest, and press a button, sending a signal to the Z80. Then a clever phase-locked loop (PLL) would determine which set of wires was under the cursor when the button was pressed, and software running in the Z80 would return X and Y coordinates. Modern digitizing tablets work the same way, as do touchscreens, although our little screens use capacitance instead of magnetism to do their magic.

Altec hired me as a programmer, the assistant to Grant Woods, but they soon had me working in the field, installing tables and instructing new users. My first cross-country trip brought me to Menlo Park, California, to install two tables for the USGS. I visited San Francisco, where Tom Mapp was living, and decided to leave Maryland, which alternated between steamy summers and ice-filled winters, for the more moderate climate of San Francisco and the allure of Silicon Valley.

North Star Computers

I eventually got a job in customer support (1979) at North Star Computers, working for Alan Bowker. Alan taught me about the physical circuitry in the Horizon, and encouraged me to build one from a kit. I bought a kit with my employee discount ($800), and built my own Horizon, which I used for many years. I later added an eight-inch Fujitsu hard drive to the Horizon, and sold it to a commune that used the computer to run their grocery home-delivery business for many years. My initial intuition that simple CPUs like the Z80 would revolutionize the world seemed to be coming true.

Working in support for a company that sold both hardware and software, when the entire support team consists of five people, is unlikely to ever be easy. And given that anyone could buy a kit and call us with questions about their problems with the resulting system, it was a stressful job. We were lucky when the problem was as simple as the computer not being plugged in. Many times, we would listen to the sounds the computer made over the phone, or ask the caller to describe the sounds to us; the floppy disks were both noisy and slow, and we could diagnose a lot of problems just by listening during an attempt to boot.

I quit working at North Star before the end of 1979. They were moving the company from Berkeley to San Leandro, and I would have wound up in a cube farm instead of a room with a window (even though it faced the railroad tracks at the back of the building). The stress of dealing with an unending series of calls from desperate folks who had little clue about building or using computers had also taken its toll on me.

Self Employed

I began working as a consultant, doing simple programming jobs that involved North Star Horizon computers at first. I helped people integrate other systems, like a gas chromatograph (for Chevron) or a machine-controlled vertical mill, because I understood the simple operating system CP/M and could patch in other devices. That’s how I managed to add the Fujitsu hard drive to my homebuilt Horizon: I patched device drivers into CP/M’s primitive file system, mapping the M: drive to the hard disk device driver.

Finding work was hard at first, and I’ve always recommended that anyone who plans on becoming self-employed start consulting part-time first, and have a large amount of savings to tide them over for the slack times once they quit (or lose) their full-time jobs.

I’ll be honest with you: working part-time suited me well. I had a difficult time sitting in an office every day, because while some days I would be really focused on working, other days I just wanted to be someplace else. I imagine that many people feel this way–but few are actually willing to take the risks that I did by not working full-time. Much later in life, I discovered that my restlessness might be related to undiagnosed ADHD. And maybe not: perhaps I just needed a lot of stimulation (not stress!), and cubicle life wasn’t going to work for me.

I didn’t sit around and wait for people to call me with work. I called people at North Star and Morrow Designs, letting them know I was looking for work. Eventually, a call to Jean Morrow did result in my being asked to revise a manual for the Morrow floppy disk controller board. The existing manual was written for the earlier version of the board, which, like North Star’s controller, was memory-mapped. In a 64K system, giving up 8K bytes of your memory for a disk controller that only used about 256 bytes of it was terribly wasteful. George Morrow had the brilliant idea that he could use the Z80’s I/O ports instead of memory mapping for the disk controller, and he designed a board to do this. George’s design was also much more flexible than other controllers of the day. There were too many ‘standards’ for floppy disk formats, and George’s controller was flexible enough to handle most of them, including both 5 1/4 and 8 inch floppy disks.
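The difference is easy to see in code. Here is a minimal sketch of the two approaches, with a hypothetical status register; the address and port number are invented, and inp() stands in for the non-standard port I/O helpers that CP/M-era C compilers provided (the underlying Z80 instructions are IN and OUT):

```c
#include <stdint.h>

/* Memory-mapped: the controller's registers occupy real addresses, so a
   window of the Z80's 64K space belongs to the board, not to programs. */
#define DISK_STATUS ((volatile uint8_t *)0xE800)   /* invented address */

uint8_t status_memory_mapped(void)
{
    return *DISK_STATUS;               /* just a memory read */
}

/* Port-mapped: George's approach uses the Z80's separate 256-port I/O
   space, so the controller costs zero bytes of precious memory. */
extern uint8_t inp(uint8_t port);      /* assumed compiler helper */

#define DISK_STATUS_PORT 0xC0          /* invented port number */

uint8_t status_port_mapped(void)
{
    return inp(DISK_STATUS_PORT);      /* compiles to a Z80 IN instruction */
}
```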

I received the old manual as a WordStar file on a floppy disk – fortunately one my Horizon could read – a working I/O-based controller (DJ-IO), and the schematics for the board. I quickly rewrote the manual, but ran into a problem while testing its ability to boot my North Star Horizon system. It really shouldn’t have mattered which disk controller I used, but booting kept failing. I discovered, by reading the board schematics, that George had flipped a low-order address bit, so the first bytes of the bootstrapping code were written in one place in a ROM, but the hardware was reading a location 16 bytes away. When I showed this to George, he grumped at me, removed the ROM from the board, and burned in the three bytes needed at the address being used. It was a much more elegant solution than a hardware patch, as those three bytes had previously been all ones. Now they contain a jump to the address of the bootstrap code.

Morrow allowed me to keep the DJ-IO, and the 8 inch floppy drive I had been provided with for testing. I really needed the ability to read and write 8 inch floppies formatted for CP/M, and the controller in my Horizon was no help. CP/M was written in assembly, and I suppose I could have written the code to switch between controllers and drive types. Instead, I used a ‘new’ programming language, C, that I had discovered, and wrote the file system and device driver as a separate program. It took just two pages of C code, and I could copy files between my 5 1/4 inch and 8 inch drives, rename files, list the directory, and delete files. Still, C was just a couple of steps above assembler, a programming language written by geniuses, [Dennis Ritchie and Ken Thompson](https://en.wikipedia.org/wiki/C_(programming_language)), and really not suitable for everyday use by average programmers. C allows the programmer full access to memory and hardware (my version had an extension for access to I/O ports too), and has resulted in generations of programmers who write unstable and insecure programs.
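For flavor, here is a sketch of the on-disk structure that little program had to walk. This is the documented CP/M 2.2 directory entry layout, reconstructed from memory, with my own field names:

```c
#include <stdint.h>
#include <stdio.h>

/* One 32-byte CP/M 2.2 directory entry. */
struct cpm_dirent {
    uint8_t user;          /* user number 0-15; 0xE5 marks an erased entry */
    char    name[8];       /* filename, space padded */
    char    type[3];       /* extension; high bits carry read-only/system flags */
    uint8_t ex, s1, s2;    /* extent counters for files larger than one extent */
    uint8_t rc;            /* number of 128-byte records in this extent */
    uint8_t block[16];     /* allocation blocks holding the file's data */
};

/* List the files in a raw directory image read off the disk. */
void list_dir(const struct cpm_dirent *dir, int nentries)
{
    for (int i = 0; i < nentries; i++) {
        if (dir[i].user == 0xE5 || dir[i].ex != 0)
            continue;      /* skip erased entries and later extents of big files */
        /* real code should mask the flag bits: dir[i].type[j] & 0x7F */
        printf("%.8s.%.3s\n", dir[i].name, dir[i].type);
    }
}
```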

Ritchie and Thompson had also been the creators of Unix, along with a handful of other Bell Labs employees. I was to learn about Unix shortly, after a short interlude with a group who wanted to build a little internet of sorts: Community Memory.

Community Memory

Community Memory had come and gone before I moved to the Bay Area. But I met a group of people attempting to revive the idea of an online, open-access, free database for storing information, opinions, calendars, housing and job openings, and other information. The original Community Memory had done all of these things, but used an inefficient computer, and had just three terminals at its peak. The group I worked with planned on using X.25, a networking protocol, to connect a much larger number of terminals across the Bay Area. I was hired to locate prospects that would license the X.25 implementation they had written in C; the money would be used to fund the project.

I did find one prospect, a vice president at a large company, and was quickly let go. But I had made a connection, Sandy Emerson, who would later play a small but important role in my career. And I learned that I really hated cold-calling: phoning high-ranking corporate people and trying to sell them software for handling a protocol few people understood. I was only asked to work at Community Memory because no one who worked there wanted to do that job.

Dual Systems

In 1982, desktop PCs were just catching on, but so were small, multi-user systems. One of my connections contacted me and asked if I could write a systems manual. I had no idea what he was talking about, but he convinced me to drop by Dual Systems, another Bay Area microcomputer startup, one that has since disappeared with hardly a trace.

Dual Systems used a Motorola 68000, a CPU with 32-bit internal registers and a 16-bit external bus, running a version of Unix, probably System III. This was the first I had heard of Unix, and I was really impressed with this operating system. I was accustomed to editing assembler, or poking bytes into memory locations, to change the workings of the systems I had used, while Unix had actual configuration files. Unix also supported multiple users simultaneously, something I hadn’t experienced since my college days with mainframes.

I succeeded in writing a systems manual for Dual Systems, learning the basics of Unix system administration the hard way: no examples, no books, and of course, no Internet. I did talk to the people at Mt Xinu, but I would often get answers like, “No one needs to know that, so just ignore it.” It was a frustrating time.

Morrow Micronix

Morrow Designs, George’s renamed company, was also experimenting with Unix. One of their programmers had written a Unix knock-off, which Morrow called Micronix. Micronix could run both CP/M and Unix-style programs on the same system. I was hired again to write a systems manual, this time for Micronix.

Morrow later became a Unix licensee, and developed their own Motorola 68010 system. The system was a bit of a kludge, as it started life as a Z80 system, with an 8-bit S-100 bus, which was modified to be a 16-bit bus to support the Motorola CPU. Unlike previous devices I had worked on for Morrow, I had to buy parts of this system, as I wanted a Unix system of my own, and Morrow couldn’t just give it to me. Hard drives were still small in capacity and very expensive. I bought my own hard drive, at an OEM price of $1000 for 34 megabytes of capacity. I had to modify the motherboard to support 24-bit addressing between the CPU and DRAM cards. My total system cost was around $2000, still a bargain for a Unix system in 1983.

I wrote yet another Unix system manual, this time starting with a fictional chapter, A Day in the Life of a Unix System Administrator. I had meant this part to be obviously fictional, and to help people understand their daily duties: booting, monitoring, user and printer management, backup, and shutdown. Some people thought I had actually videotaped a real person, so I guess I was more successful than I thought at writing my first published fiction.

I was facing another challenge around this time. While Morrow Designs was still flourishing, I could tell that things were not going well. IBM had entered the PC business years before, and small manufacturers of competing desktop products were failing left and right.

I got my last consulting job for Morrow, writing a manual for a very cool hard drive controller, one that used a RISC processor and behaved like channel I/O. I had just returned from a three month sailing cruise, from San Francisco to the Panama Canal, and really needed the work. And I had fun writing this manual, as the designer, not George Morrow, had written the code for the RISC processor using his own programming language, and I used his source code as the basis for the manual. I later heard from people who had used the manual that it was the easiest manual to read and use they had experienced for writing a device driver, but I think the credit goes to the designer, not me. He had designed an elegant interface, where the programmer wrote values into an in-memory table, then sent a command telling the board where to find the table. Commands could be chained, so that as each command finished, the next would take over. And data was read from or written directly to memory, using Direct Memory Access (DMA).
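I no longer have the manual, but the scheme looked roughly like this; every name below is my invention, not Morrow's:

```c
#include <stdint.h>

/* A hedged sketch of the command-table interface described above. The
   driver fills in a table in memory, then tells the board where the
   table is; the board moves the data by DMA and, when finished, starts
   the next chained command. */
struct hd_command {
    uint8_t  opcode;               /* read, write, seek, format, ... */
    uint8_t  drive;                /* which attached drive */
    uint16_t cylinder;
    uint8_t  head, sector;
    uint8_t *buffer;               /* DMA target: data moves straight to or from RAM */
    struct hd_command *next;       /* chaining: the board starts this one when done */
    volatile uint8_t status;       /* the board writes completion status back here */
};
```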

A Problem with Documentation

Have you ever had issues with documentation? You follow the instructions carefully, but the device or software you are trying to use doesn’t work the way the documentation says it should. At this point, you might be suppressing negative feelings about this. Or, perhaps, you have thrown the device across the room, or more likely stomped out because the device or software in question is too expensive to destroy or give up on.

There are often good reasons for this. One of the most obvious is that software and hardware often take twists and turns during development and testing. Another is that the engineers doing the development hate the process of documenting their work, just can’t be bothered, or simply forgot to document the changes they made to the design and functioning of their project.

I have seen these things happen, or experienced what may have been the result of these behaviors.

But I had another story to tell, one that I left out of my first version of this long history. When I returned from my three-month sailing cruise, something I barely survived, I headed to Morrow in search of work. I was told by the manager I worked with that they had already contracted a professional documentation writer to handle the nifty hard drive controller.

I was disheartened, but decided to hang around for a while with the hardware engineers because I liked listening to them talk about new technology. If you are wondering about my NDA with Morrow, I didn’t have one. Things were much more informal in those days. I did have an employee name badge though, as this was a factory, with devices like hard disks being as expensive as used cars, and I needed the badge, plus the willingness to have my backpack searched, to get into and out of the building.

While we were chatting, a woman walks up and cheerily announces that she has written the first ten pages of the documentation for the disk controller. Then she asks how she can install it in her computer–one that doesn’t include a bus for installing additional boards. The engineers and the manager in charge turn and look at me, and the manager says, “Stick around for a bit, would you?” Then he goes off with the professional writer.

When he comes back, I am given the job of writing the manual, one suitable for writing device drivers, which I do. I later heard that the woman complained at a Bay Area meeting of professional documentation writers that “men get all the work”, when it was clear that she didn’t know what she was doing, other than writing. It could have been a man in the role of cluelessness here, and the result would have been the same: a quiet moment shared among the clueful, and a disappointed writing professional.

The finished manual was 22 pages long, and worked. She had written ten pages without knowing that she couldn’t even use the device in her computer. I used that device in my first Unix system, as well as patching it into my kit computer running CP/M. Sometimes you really want an engineer, or someone like an engineer, writing your documentation: someone who can read code and schematics and understand systems programming.

Becca Thomas

About this time, Sandy Emerson contacted me and told me that she knew of another person I could work for, if I was interested in learning more about Unix. When I met with Sandy, she told me that there would be a problem if I decided to go ahead: these people were only interested in ghostwriters. Anything I wrote would be published under their names. I could tell that Sandy was warning me about a somewhat unpleasant reality, but I was still interested. So far, everything I had written was essentially a work-for-hire; that is, I was already a ghostwriter.

I arranged a meeting with Becca Thomas, and visited her at her house near Cole Valley (San Francisco). There I met an English professor, Joe Campbell, from UC Berkeley, and discovered that I had to pass a writing test before I could go any further. I found this interesting, but not that farfetched: these people were writing books, not manuals, and the bar for comprehensibility was going to be much higher, I supposed.

Joe had me write a description of the Unix file system for non-programmers. I had already attempted doing this several times, for my systems manuals and on my own, so I quickly turned in a couple of pages which Joe found acceptable.

I also met Jean Yates, Becca’s partner in writing her first book, A User Guide to the Unix System. Sandy had told me that their first Unix book was a best-seller, and that if I wanted to check that out, I should go to Stacy’s bookstore on Market Street. When I went to the basement of Stacy’s, where their technical books were, I asked at the help desk about User’s Guide, and the man there pointed to stacks of the book piled in front of his desk. There were no other books stacked there, and I was impressed.

I soon began ghostwriting A Programmer’s Guide to the Unix System. User’s Guide covered the 44 most popular Unix utilities, based on process accounting records from Unix systems running at UC Berkeley. The book I was writing needed to cover programming tools, like the C compiler, linker, and debugger, along with the Unix system calls. And I had a Xenix system that I could play with at Becca’s and Jean’s house.

Becca was a wonderful person to work for, most of the time. One day, I moved the Xenix box out into the backyard, and sat in the shade of a tree playing with Xenix and drinking a beer. Jean Yates was not so nice, but she and Becca were splitting up around this time. I would write sections of the book and share them with Becca, who would mark them up (with red ink, on paper) and return them to me for revising. She would also run Writer’s Workbench to provide more feedback. I got drilled many times on using the active voice.

While I did my research at Becca’s, I did my writing at home, in a flat in the Western Addition. I would write on my Morrow Designs system, running Micronix, then use my C program to copy the output to 8 inch floppies for delivery to Becca. I wrote all of Programmer’s Guide, and the system administration appendix for a business user’s guide to Unix, during this period.

The books I had written were not nearly as successful as User’s Guide, perhaps because the market for Unix programmers was much smaller. When my ghostwriting work dried up, Becca introduced me to another person who was using Unix, Tom Lubinski.

Sherrill-Lubinski

Peter Sherrill and Tom Lubinski had a small office in Corte Madera, a few miles north of the Golden Gate Bridge. When Tom interviewed me, he questioned me on both graphics and Smalltalk, two topics I knew little about. I went to Stacy’s and bought a book by Smalltalk-80’s creators, Alan Kay and Adele Goldberg, and started learning about object-oriented programming.

Tom was a talented programmer. He had already written an interpreter for Objective-C, and was interested in building an object-oriented graphical modeling system. He had a basic working system, and needed someone who could write documentation, using the source code for reference. While the system was object-oriented, the underlying language was C, and that was something I understood well enough to become immediately useful. I worked for Sherrill-Lubinski, later SL, on and off from 1985 until 1991.

System Administrator’s Guide to System V

It was at this time that I convinced Becca Thomas that the world needed a book on Unix system administration. As the already successful author, she negotiated a contract with Prentice Hall. We had wanted to call this book The Handbook, but when we suggested this title to our editor at Prentice Hall, he told us it was already taken. In other words, there was another group busy writing a Unix system administration book, and Prentice Hall also had a contract with them. Evi Nemeth and her students wrote the first edition of a vastly more popular system administration book.

Our book was finished first, even though it took much too long. If you used System V, or needed to use Unix-to-Unix Copy (UUCP), our book was the one for you. I had become an expert in serial communications, modems, and UUCP long before we finished our book. I had wired and configured the offices at SL and UnixWorld Magazine, and routinely used UUCP for email and file transfer. Also, early Linux looked a lot like System V Unix, and Linux users sometimes used our book. But we had picked the wrong operating system to spend four years writing about. It does seem strange to me that you can still buy used copies of our book, which is now terribly outdated.

Evi Nemeth, working at the University of Colorado in Boulder, focused on Berkeley Unix, BSD. Sun Microsystems used BSD for their workstations, as did other workstation vendors, like DEC with ULTRIX. Because of the difficulty of licensing Unix from AT&T, many students got their experience using BSD-related versions of Unix at their universities, and that was, of course, what Evi and her students wrote about. They also had a much lighter style than we did, and covered TCP/IP networking, which we didn’t.

I also convinced Becca that if we were going to write a book about Unix system administration, we should try teaching. I thought that we would learn what worked and what didn’t when explaining system administration principles, and that would make the book better. We first taught a small class at a business, then were invited to teach at the USENIX Technical Conference in Atlanta, Georgia, in 1986. That class went well, and it was my first USENIX conference as well.

We were invited to teach at the next USENIX conference in January of 1987. The person in charge of tutorials at that time wanted to advertise our class as ‘advanced’, and I told him that wouldn’t work: we didn’t have enough experience to call ourselves advanced. But he did it anyway in the advertising.

The result was very unpleasant for me. We had a few people expecting an advanced class in system administration. I later worked with Dan Klein organizing the tutorial program for USENIX for many years, and while there were tutorials on advanced topics in system administration, there was no such thing as an advanced class, even at LISA conferences with as many as 40 tutorials on system administration topics. But we had people asking impossible-to-answer questions about problems they were having running Unix on mini-computer-class hardware we had never heard of. I literally became sick at one point, but returned to the classroom and finished the day.

On the other hand, Becca and I focused on system administration for years, and probably could have been considered advanced by the time we had finished the book. By then, we were getting invited to teach at some interesting places, including a mini-computer vendor and large corporations. We sometimes solved problems in just a few minutes that had stumped the sysadmins at the sites we visited for weeks. But that came later.

By 2010, many companies were moving to cloud-based systems, and system administration started moving toward Site Reliability Engineering (SRE), at Google first, but then at all the large cloud providers. SRE is very different from the system administration that I had learned and taught. Instead of managing a server and perhaps a handful of workstations, you have hundreds or even tens of thousands of servers. Automation is key, as is having the means to simply create, set up, and configure new servers, as well as the virtual machines running on them. Backup isn’t the same as it was, either, as storage is also a cloud service, and backup is mostly handled by keeping distributed copies of data or by striping.

## UnixWorld Magazine

Becca was writing a column, called Wizard’s Grabbag, about useful tricks for using Unix systems when we started working on our book. One day, she convinced me to drive with her to Mountain View, California, and sit in on an editors’ meeting at the UnixWorld offices. I went along, and based on some criticism I made of some of their published articles, I was asked by Diane Jacobs, the executive editor, if I would tech edit their articles. I soon started reading drafts of articles, always on the lookout for authors who didn’t know what they were talking about, or were just plain bullshitting.

I also took over managing columnists from Becca. I had to corral five people into turning in their articles on time, plus check them for accuracy, every month. I also had to find a guest author every month. By the time I was dropped from the magazine (1994), my section of the magazine was twelve to fourteen pages long. That may not sound like much, but when you consider that UnixWorld had a minimum of 38% editorial content, a ratio based on US postal regulations, that worked out to nearly 40 pages of content in an average issue.

By the mid-90s, Unix had become so mainstream that boutique magazines, like UnixWorld and even Unix Review, were falling on hard times. The advertising that had supported these narrowly focused magazines was now appearing in larger, more mainstream magazines like Business Week.

## Introduction to Security

Soon after I became involved with Joe Campbell and Becca Thomas, Joe suggested that I look into security for them. The first thing I did was laugh, as I had heard amazing stories about the lack of Unix security, including a magic password that the creators had installed (true, although those versions never escaped Bell Labs), and constant security breaches via sendmail, a complex mail server program.

Joe managed to provide me with a printout, about 15 pages long, of ways that Unix systems at UC Santa Cruz had been hacked. UCSC, like most universities using Unix, had source code licenses, and students could not only read the source to programs and the operating system, they could edit and create their own versions of them. Most of the hacks involved changing a system program, then tricking a system administrator into installing it. Or, just about as common, discovering a terminal logged into root, the system administration account, and using that to install their hack.

While the UCSC hacks were interesting, the vast majority of non-university Unix users didn’t have access to AT&T’s source code, so these examples didn’t help me a lot. I would also hear stories when working at Morrow, where engineers would talk about bizarre-sounding hacks, like using sendmail to append a new user account to the password file. It turns out that early versions of sendmail did support appending to a local file as root, and this actually worked at first. I found this a great example of unexpected consequences, in that Eric Allman, sendmail’s creator, didn’t expect that his mail server software would be used for hacking. Not that this didn’t wind up happening often…as I will write about later.

I also find it difficult to understand how I learned anything in the 80s. The internet was only available to some universities and companies, and there were no search engines, or even page layout standards like HTML. There was netnews, a way of sharing text messages, and later images, that used dial-up connections or the internet. Netnews was a little like Reddit, in that you could follow categories, for example, unix-security, and perhaps learn some useful things. But this wasn’t useful unless you had both an internet connection and someone dedicated to managing the flood of news feeds, which amounted to megabytes a day in the late 80s. In an era when disk space was precious, netnews could quickly consume all your free disk space unless carefully managed.

I did manage to learn a few things. One was that people were fascinated with security, so I decided I would give a talk on security. The venue I chose was the Silicon Valley Unix Users group, which had access to a large auditorium owned by Hewlett-Packard in Palo Alto. I was in way over my head, perhaps because I was naive, or just crazy bold, going in front of such a high-end, technical audience.

I had eleven slides for an hour-long presentation, and the talk had to be cut off at 90 minutes because of all the questions and people adding to the little I had. I only remember one slide, an example /etc/passwd file, that had examples of common hacks, like having a user account with 0 as the user ID number, making it a root account. One SCO bug (SCO’s Unix was essentially Xenix or System III) let you become root if you used a letter instead of a number in the user ID field. For example, for 1O4, the character-to-integer function would return zero as its failure indicator, but the return value wasn’t checked, which meant that the account would work as a root account.
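That slide is long gone, but here is a hedged reconstruction of the kind of entries it showed; the account names, password hashes, and paths are all invented:

```
root:Xp9qLmA2edFhc:0:1:Operator:/:/bin/sh
games:WzoPqRst1uvNo:0:100:uid 0 makes this a second root account:/usr/games:/bin/sh
uucpa:AqT5rWx8yZjkM:1O4:20:letter O in the uid field, parsed as 0 on the buggy systems:/tmp:/bin/sh
```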

By 1987, I had a one-day course in Unix security that I started teaching. UniForum, a Unix users group for commercial users, sponsored me at first. During that first public class, I told students that set-user-id didn’t work for shell scripts, and that was a good thing. Someone told me that this wasn’t true on HP’s Unix (HP-UX). I later learned that this was true–and an awful thing too. HP-UX had over one hundred set-user-id root scripts, generally vulnerable to simple hacks. At the time, I hadn’t tried this on HP-UX, so I just said it didn’t work on the systems I had tried, but it certainly might work on HP-UX.

My honesty had an unexpected side effect. There was a representative from the Danish Unix Users Group, DKUUG, in the class, and he later sent me an email inviting me to speak at a conference in Denmark. He suggested I buy myself a business class ticket, and show up in Odense for the conference. He would pay me for the airfare and provide accommodations.

Email, especially until strong anti-spam methods were implemented after 2000, was incredibly easy to spoof. The protocol uses text commands that were not validated, other than having the correct format. You could connect to a mail server and type “HELO spirit.com” and “MAIL FROM:<rik@spirit.com>” no matter what system or username you were actually using. I actually had a shell script for sending email with binary file attachments that I used in the 90s, as my favorite mail tool didn’t support attachments then. In other words, my invitation could have been easily faked.
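A session went roughly like this, typed by hand at port 25; the hostnames here are invented, and the replies are the classic sendmail ones from the RFC 821 era:

```
$ telnet mail.victim.example 25
220 mail.victim.example Sendmail ready
HELO spirit.com
250 mail.victim.example Hello, pleased to meet you
MAIL FROM:<rik@spirit.com>
250 <rik@spirit.com>... Sender ok
RCPT TO:<someone@victim.example>
250 <someone@victim.example>... Recipient ok
DATA
354 Enter mail, end with "." on a line by itself
From: rik@spirit.com
Subject: whatever I claim

None of the above was checked against who I really was.
.
250 Message accepted for delivery
QUIT
```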

I did agree to speak, bought a $3000 ticket, and flew to Copenhagen. There I was comforted to discover quite a few men all waiting for the flight to Odense for the conference I was to be speaking at. I did get to speak, and was handed a check by the organizer. Over the years, I was invited to teach in Copenhagen a dozen times, and those visits led to teaching in Sweden, Norway, Iceland, Paris, Budapest, Bucharest, and London as well. I eventually stopped teaching in Copenhagen because the long flights, even in business class, left me feeling ill, like the worst hangover I’d ever experienced. I missed getting to present a keynote in Copenhagen because I was pressured to volunteer in the midst of dealing with the flight-induced hangover. Note that I didn’t drink alcohol on those ten-hour flights. Eventually, I stopped flying to Europe.

I started writing a book named Unix System Security, based on my class. I had also learned some useful things about network security from a file I found online. I was at a loss for how to credit the author, David Curry, or provide the source info, as things like URLs didn’t exist, so, to my later embarrassment, I didn’t say anything. I didn’t plagiarize his content, as I tried everything he suggested and wrote about my experiences using his suggestions, but I didn’t credit him either. Curry later wrote his own book on Unix security.

My book sold thousands of copies, but nowhere near as many as my book with Becca Thomas, which sold tens of thousands. But both books helped establish me as a known figure. Decades later, David Brumley, a professor at Carnegie Mellon University, told me that it was my book that got him interested in security, something several other people have told me as well. I am grateful that I managed to help some people with my early work.

### The Internet Worm

In November of 1988, the internet ground to a halt. A worm, that is, a malicious network-spread program, was running on so many key systems that the usual means of communication between connected sites were unworkable. The Morris worm had been created and launched by Robert Tappan Morris, then a graduate student at Cornell University.

The worm attacked using several vectors, with sendmail being one of the most prominent and effective. A software engineer at Sun Microsystems had insisted on adding a new command to sendmail, the debug command, that could be used to execute shell scripts on any server running the software. The worm also tried remote execution, which worked between systems set up to trust users, as well as a buffer overflow in the fingerd service, a way of listing logged-in users on remote systems.
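The fingerd hole is worth a sketch, because it became the template for decades of attacks. This is the shape of the bug, not fingerd's actual source:

```c
#include <stdio.h>

/* fingerd read its one-line request with gets(), which has no idea how
   big the destination buffer is. A longer-than-expected request runs
   past the buffer and overwrites the stack, including the function's
   return address, letting the worm redirect execution into bytes it
   supplied. A minimal sketch of the pattern, not the original code. */
int main(void)
{
    char request[512];
    gets(request);   /* no bounds check: an overlong line smashes the stack */
    /* ... look up the requested user and print the answer ... */
    return 0;
}
```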

There were several problems with the worm. First, it was irresponsible and illegal to break into systems. Second, the worm could infect each server multiple times. Although the worm checked to see if a system was already infected, with a one-in-seven (14%) chance it would re-infect anyway. And finally, the worm ran a password cracker, a program that tried a short list of common passwords, then used any cracked passwords to attempt to log into other systems. Password cracking is extremely CPU intensive, meaning that infected systems quickly became overwhelmed and unresponsive.
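The re-infection decision amounted to a coin flip. A hedged sketch of the logic as it is usually described, not the worm's actual code:

```c
#include <stdlib.h>
#include <unistd.h>

/* When a fresh copy of the worm found the host already infected, it was
   supposed to exit, but one time in seven it kept running anyway, so
   copies piled up until the machine was spending all its cycles
   cracking passwords. */
void negotiate(int already_infected)
{
    if (already_infected && (random() % 7) != 0)
        _exit(0);    /* six times out of seven, the newcomer backs off */
    /* otherwise carry on: crack passwords, probe neighbors, re-infect */
}
```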

Morris, the son of Robert Morris, a famous cryptographer at the NSA, said that he didn’t intend for the worm to cause any harm, but to make a point about the insecurity of the internet of that time. Simply creating a directory named /usr/tmp/sh would prevent the worm from working. But by the time Morris decided to send out this fix, it was already too late: email servers were overwhelmed.

I sympathized with Morris, as I had become aware of how insecure the internet was at this time; it would remain terribly insecure until around 2000, when the security of servers started to get better, although still far from secure. Others were incensed about what Morris had done, the time wasted cleaning up infections, and perhaps embarrassed about being caught with their pants down too. They petitioned the court at Morris’ trial that he be severely punished. Morris was fined $10,050 plus the cost of supervision, given three years’ probation and 400 hours of community service, and forbidden to do computer security research as part of his probation terms.

Morris went on to become a professor at MIT, and a founder of Viaweb, where he made enough money to become a founding member of Y Combinator, the famous venture capital firm. I’ve met Morris several times and liked him. I’ve published multiple articles written by his students, all of them because of the quality of their research.

As for the long-term impact of the worm, very little happened. The Computer Emergency Response Team (CERT) was founded as a clearinghouse for information about ongoing attacks and network malware. But, in general, internet-connected systems remained largely insecure. sendmail continued to be used, remote execution based on trust was still allowed, and systems were still delivered with very insecure configurations. I once asked some folks from Sun Microsystems, who had presented a paper at a USENIX conference about securely configuring Sun servers, why those configurations were used only within Sun; they answered that systems for the public were left insecure because they were easier to use that way.

In 1991, I won an award for an article I wrote for UnixWorld about the worm. I met Jonathan Littman (San Francisco Chronicle) and John Markoff (New York Times), and rode an elevator with the two of them up to the ballroom where the awards ceremony was held. They had also written about the worm, for much more prominent publications, and I would meet Littman again a few years later and work on a book with him.

### Kevin Mitnick

I met Kevin Mitnick after he was released from prison, during a party associated with the Las Vegas BlackHat conference in 2000. Mitnick was circling the room, shaking hands with people, mumbling, “Hi, I’m Kevin Mitnick”, then moving to the next person. I was sitting with members of the L0pht, men I had gotten to know because we sometimes worked together to correct, or at least attempt to correct, mistakes made by journalists while covering hacking and computer crime.

Years before I ran into Mitnick, Littman had asked me to drop by his Mill Valley (Marin County) home and help him by writing a few pages about the attack that had gotten Mitnick into more trouble. I was familiar with the attack because it used a technique that Steve Bellovin had written about in 1989, but one that had seemed farfetched at the time. The attack involved overwhelming a server that was trusted by the target with internet traffic, then spoofing its internet address. The attack relied on the use of remote execution and trust, and quickly provided Mitnick with access to a server he was interested in.
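The trust being abused was the Berkeley r-command kind, where authentication is just the client's address. A hedged illustration, with invented hostnames:

```
# On the target, /etc/hosts.equiv or ~/.rhosts lists hosts allowed
# password-less rsh/rlogin access:
#
#     trusted-server.example  root
#
# So an attacker who can silence trusted-server (the flood) and forge
# packets from its address (the spoof) gets to run a command as root.
# The classic payload makes the hole permanent:

rsh target.example "echo + + >> /.rhosts"   # now any host, any user is trusted
```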

Tsutomu Shimomura, a consultant who worked in the San Diego area, had written software for cloning cellphones. He had used his software to demonstrate how easily this could be done in testimony to the US Congress, but kept it on a server in his home office. That server had a live connection to the server Mitnick had just compromised, and Mitnick soon had copied the cell phone cloning software. Shimomura was off skiing, leaving me wondering why he left an internal system holding the criminal software (cloning cell phones is illegal) logged into from an internet-connected server. Mitnick allegedly left voicemail for Shimomura, mocking him for his weak security.

Mitnick surely had provoked Shimomura, who contacted the FBI and started working with them to apprehend Mitnick, who was already wanted for parole violations from a previous conviction. Shimomura and Markoff wrote a book, which later became a movie (Takedown), sensationalizing Mitnick. In prison, and for years afterward, Mitnick was forbidden from using any communication device other than a landline phone. Some even thought that Mitnick could hack into NORAD and launch nuclear weapons just by whistling into a phone. I hope that sounds as ridiculous to you as it does to me.

While Mitnick was on the run, he kept in touch with Littman. Mitnick would break into The Well, an early mail service, and leave messages in Littman’s email. Mitnick would tell Littman to head to a nearby payphone, where Mitnick would call and talk to him. Mitnick lived and worked for years in Seattle, but left when he detected that the FBI agents who had been searching for him had appeared in the area. How did he know? Mitnick had left software running on a telephone company switch that would alert him when the FBI agents’ cell phones appeared in the Seattle area. When Mitnick left Seattle for North Carolina, he left donuts in his apartment’s refrigerator for the FBI agents he expected.

Mitnick later became a security consultant and public speaker.

### Firewalls

In 1991, I was asked if I could provide an analysis of the ANS Interlock, an early firewall product. Someone wanted to acquire that product, and wanted an expert to examine it. In 1991, there were few experts in firewall technology (perhaps Marcus Ranum), but I agreed to do my best. I read the documentation for the Interlock, as well as for several other commercial firewalls of the time. I didn’t actually test any of them.

I learned that the Interlock had a feature that I considered much more secure than most of the competing products. The Interlock had application gateways, small server programs that recognized the protocols for internet services and enforced the correctness of any commands sent using those protocols. For example, an attempt at a sendmail buffer overflow would be blocked by the Interlock, as it enforced the 256-byte input line limit that was part of the mail server standard (RFC 821).
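A packet filter passes or drops packets; an application gateway actually speaks the protocol. A minimal sketch of the idea, with hypothetical helper functions:

```c
#include <string.h>

#define LINE_LIMIT 256   /* the input-line limit the Interlock enforced */

extern int is_smtp_command(const char *line);        /* hypothetical: HELO, MAIL, ... */
extern int forward_to_real_server(const char *line); /* hypothetical */

int relay_smtp_line(const char *line)
{
    if (strlen(line) > LINE_LIMIT)
        return -1;    /* overlong line: likely a buffer overflow attempt */
    if (!is_smtp_command(line))
        return -1;    /* malformed command: never reaches the real server */
    return forward_to_real_server(line);
}
```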

Marcus Ranum had worked on the DEC firewall, one that involved using two DEC workstations and packet filtering, called the AltaVista Firewall. I would later get a job from DEC promoting firewalls in general.

My work studying firewalls helped me understand this early, and still used, network defense. I later teamed up with Brent Chapman for USENIX sessions where we would answer questions about firewalls. Brent championed packet filtering, while I specialized in commercial firewalls.

By about 2000, most companies were using stateful packet filtering. Whereas packet filtering uses a simple ruleset for allowing or blocking packet transit, stateful packet filtering (SPF) is a bit like if-this-then-that: a packet that is allowed transit through the firewall adds state that will allow a response packet back through the firewall. SPF rules can support even more complex filtering behaviors, for example, changing state based on the data within a packet.
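In modern Linux terms (iptables came years after the early SPF products, but the idea is the same), stateful filtering looks like this:

```
# Admit only packets that belong to, or relate to, a connection we have
# already allowed; the connection-tracking table is the 'state'.
iptables -A INPUT  -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

# Let inside hosts open new outbound connections; the rule above then
# admits the replies automatically.
iptables -A OUTPUT -m conntrack --ctstate NEW,ESTABLISHED -j ACCEPT

# Everything else arriving unsolicited is dropped.
iptables -P INPUT DROP
```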

Check Point, an Israeli company, introduced SPF in the US in 1994. The UnixWorld product editor grabbed me during Comdex and walked me to their booth. I wound up calling Check Point’s product a ‘firewall even a manager could understand’, but pointed out that it lacked application gateways. The editor deleted that paragraph from the published version of the review, so I posted it to Chapman’s Firewalls mailing list.

I left for a work trip, and while I was away, my wife answered my work phone to an angry investor in Check Point, who said to her, “Why did you do that? We paid you!” I presumed that the investor meant that the products editor had been paid for a favorable review, and had cut the unfavorable paragraph. But he knew better than to involve me in taking a bribe. I wonder what happened to that editor? Or perhaps that was business as usual, and I was too naive to realize I had missed out on money-making opportunities? Honestly, I wouldn’t have taken bribes.

## Internet Security

Shortly after I lost my work with UnixWorld, I was contacted by Ken Cutler, a VP at American Express, and invited to start teaching classes in Internet security. In those days, the mid-to-late 90s, that meant Unix security, my specialty. I soon signed a letter of agreement to teach classes for $1000/day. At the time, that was about what I had been getting through other sources.

I had enough material about Unix security, networking and firewalls for a three day class. But that company wanted five days, so I would just spend the final two days reviewing the material I taught in the first three, a technique I had learned from Timothy Leary.

Talking for six hours a day was grueling for me. My feet hurt, and I would sometimes be hoarse by 2 PM. I was responsible for paying the hotel for breaks–things like $3 cookies and cans of Coke–plus taking class members to the hotel’s bar when the class was finished. I hated that part because I didn’t like drinking much, and I especially didn’t want to go to a bar where I would have to shout to be heard when my voice was already giving out.

As time went on, I discovered a couple of things about this company. First, they were primarily a ‘paper mill’, a training company focused on presenting paper certificates to people who had sat through a class. Other trainers would rush through the material, finishing by lunchtime on the last day, while I was rushing just to finish all my material before the end of the last day.

The other discovery was my share of the door. For five days, I would get paid $5,000, not bad, but the company was getting $50,000. A typical consulting agreement resulted in the company getting a third of the take, not 90%. That certainly felt unfair to me, my aching feet, and my hoarse voice.

And then a manager from DEC who was attending my class asked me to have lunch with him. He wanted me to present 75-minute talks, twice a day, in a string of cities, for $2000 a day. I thought that this sounded more like fun than what I was doing. But I also thought, “Hey, this guy thinks I get paid $2000/day!” That, and my disgust for the company setting up the training, would lead to my ending my work with them a few months later.

## Digital Equipment Lectures

DEC wanted me to present 75-minute lectures on firewalls for them, once in the morning and again in the afternoon, in 13 cities over a four-week period. Not only was the pay good, close to ridiculous for me at the time, but I really liked the idea. I didn’t have to praise DEC’s firewall (and didn’t), but could say whatever I liked. I created a slide deck–in those days, a set of transparencies displayed with overhead projectors–that I could use. Then I used the twice-daily talks as speaking practice.

I treated each audience differently, depending on my perception of their technical and social background. I also spoke differently in the afternoons, because I thought my audience might be sleepy from having just eaten. As a result, every presentation was different. One of the DEC representatives noticed this, and introduced me by saying that I had never delivered the same presentation twice. I was pleased. And I got a lot better at speaking, as I could practice my presentation skills on a variety of audiences.

## Computer Security Institute

Once I stopped working for the unnamed company, I was soon approached by Pam Salaway, who set up classes for the Computer Security Institute (CSI). While the name sounds impressive, I was to learn that the institute had just a handful of employees, and made their money by setting up classes and running two conferences each year. What made CSI different from the start was the way Pam Salaway approached me, by asking, “What do you want?” I thought for a moment, then replied, “$2000 a day, or a third of the gate, whichever is higher.” And Salaway said yes.

CSI was different in other ways. They seemed committed to actually teaching people, not just presenting them with pieces of paper. They were also much easier to work for, and allowed me to buy full-fare air tickets. While I was nominally supposed to buy the cheapest tickets, when there’s a problem, those with cheap tickets are treated as tourists, while those with full-fare tickets are much more likely to actually get to where they need to be on time. A lot of my stress during this period had to do with air travel, as weather and mechanical failures frequently disrupt flights.

I worked for CSI both as an instructor and a consultant for the next eight years, and even gave a keynote speech for one of their San Francisco conferences.

## NT

NT stood for New Technology, and was Microsoft’s answer to Unix and Novell. Novell had local networking support for user logins, file sharing, and printing, which Microsoft Windows did not. And Unix was killing Microsoft in the higher-end business and technology markets. NT, now called Windows, was going to change all that.

I read the NT course notes of an instructor, who will go unnamed. He would stop lecturing any time I walked into his classroom and make jokes about me. I found this very suspicious, and that’s why I read his course notes. It turns out that he didn’t understand NT security at all, but was an enthusiastic and bombastic instructor that people really liked.

I found a programming book for NT that had a very good section on NT security. The one thing that other instructor had right was the emphasis on groups. NT has lots of groups, and a user’s privileges are associated with their account based on what groups that user belongs to. The user’s account ID and their group IDs are all contained in a security token, one that is issued either by the local security authority (LSA) or by a domain controller (DC), using an extension of Kerberos for securely sending the IDs across the local network.
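On any current Windows system you can still see exactly the token contents I taught; output abridged, with the SIDs shortened:

```
C:\> whoami /user /groups

USER INFORMATION
----------------
User Name   SID
=========== =========================================
corp\rik    S-1-5-21-...-1104

GROUP INFORMATION
-----------------
Group Name      Type              SID            Attributes
=============== ================= ============== ==========================
Everyone        Well-known group  S-1-1-0        Mandatory group, Enabled
BUILTIN\Users   Alias             S-1-5-32-545   Mandatory group, Enabled
```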

I put together a one-day course, and taught that for a couple of years. Microsoft liked what I had done, and set me up with their developer network. That meant getting a box of CD-ROMs with new versions of NT and other software several times a year. I found it overwhelming, as I didn’t want to get sucked any deeper into the Windows world, and stopped teaching my NT class–for the most part.

## Department of Defense

In many of the classes I taught, I’d get lists of the people attending and the companies they worked for. In the security world, most of the people who said they worked for the Department of Defense worked for the NSA. That was fine with me, and I taught several classes there, in their most open facility, the Friendship Annex, or FANX. This still meant visiting a trailer outside the facility every day I’d be teaching there, passing through a gate guarded by a contractor carrying a submachine gun, and leaving all digital devices outside. Then you’d pass through a metal detector before getting to teach in a drab, windowless room–actually, there were windows, but they were covered in a mirrored coating–and you had to be escorted to the bathroom.

I was asked to license the 2000 version of my course to the DoD, and part of the deal was that I had to teach the course’s instructors. I spent five half-days, twice, teaching the class to the potential instructors, plus two men who might be described as ringers, as they were not at all like the potential instructors. The potential instructors were unimpressive: people with little to no technical background. I thought this was not only a bad idea, but also a bit insulting. How could you teach people about security when the threat landscape, as well as the software and hardware, was constantly changing? I taught people from examples so they could learn how to think about security, not learn rote behaviors. And I had spent many years learning about programming, operating systems, networking, and exploits to do what I did.

The ringers were a Major and another man, whose name I really wish I could remember. They definitely knew the subject area, and were creating the hands-on portion of the class that would be taught in the afternoons. The Major said little, but the other guy did talk to me some, and left me feeling intrigued. Perhaps if I had taken their offer of getting a security clearance I could have learned more, but I really didn’t want to spend much time in that windowless world, and I was also afraid I would fail to compartmentalize my knowledge. I always spoke from notes, never from a written speech, and on the spur of the moment I might have started spouting something that was classified.

I had other DoD experiences, teaching Unix and NT security at CENTCOM. But my favorite time was at a Strategic Air Command (SAC) base on the Florida panhandle. I had a group of US and Canadian officers in a conference room where I could stand on a little platform at the end of the conference table. For some reason, I felt I could push things a bit, for example, asking if I could teach in my socks and not wear my shoes. The rug was a nice thick one, and teaching almost barefoot was a real relief for me.

The other thing I had noticed was that the room had network jacks. After teaching the network portion of the class, I asked if I could try connecting my laptop to their internal network, thinking I might put together a nice on-the-spot demonstration. I was told to go ahead; I would not be able to get a network address, they assured me. They thought they had their network locked down.

I plugged in my network cable and started sniffing their network. Their first line of defense was a non-standard network mask. Most people keep things simple, using a netmask of eight or 16 bits, but they had chosen something else. Routers, though, will tell you what the netmask is. I assigned my laptop an IP address and the unusual netmask, and I was in.
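
Python’s standard ipaddress module shows how little a non-standard mask buys you once you can sniff one host address and learn the mask; the addresses and the 22-bit mask below are made up:

```python
import ipaddress

# One sniffed host address plus the netmask learned from the routers
# (both invented here) reconstruct the whole subnet.
host = ipaddress.ip_interface("10.40.7.93/255.255.252.0")  # unusual 22-bit mask
net = host.network
print(net)                # 10.40.4.0/22
print(net.num_addresses)  # 1024 addresses to choose a free one from
print(net[200])           # 10.40.4.200, a candidate address for my laptop
```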

I wasn’t about to start a network scan, but I thought I could get away with something fairly benign. I asked if I could look at their DNS information, and got a yes. Again, they felt certain there was nothing I could find there. I guessed the address of their DNS server and requested a zone transfer. A quick glance through the downloaded information turned up some very unusual names: random-appearing strings of 25 letters. I asked if these were important systems, and was asked to disconnect from the network.
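
For the curious, that request is easy to reproduce with the third-party dnspython library (the server address and zone name below are stand-ins), or with `dig axfr` from a shell:

```python
# Request a full copy of a zone (an AXFR). Any DNS server that permits
# zone transfers from strangers dumps every hostname it knows.
# The server address and zone name are stand-ins.
import dns.query  # from the dnspython package
import dns.zone

zone = dns.zone.from_xfr(dns.query.xfr("10.40.4.1", "example.mil"))
for name in sorted(zone.nodes.keys()):
    print(name)
```

Restricting zone transfers to known secondary servers would have stopped this cold; hiding the hostnames behind random strings did nothing.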

I’m not a hacker, someone who spends lots of time doing what I did for that quick demonstration. I got lucky, but I also knew enough to quickly bypass their defenses, which were essentially security by obscurity. Don’t rely on hiding things or obscuring what you are doing. For me, that just made things stand out as different, and thus interesting.

###Information Technology

IT stands for Information Technology, the term for people who manage computer systems and networks in corporations and government offices. I taught many IT people, and I also taught auditors, and I can say I was not overly impressed with either group. These were people who knew how to do their jobs, but had no interest in really learning about what they were doing. The auditors just wanted checklists, and the IT people could keep their systems working–just don’t ask them to do something different or new.

I had been spoiled by working with excellent programmers (SL) and hardware engineers (Morrow), as well as by teaching classes at USENIX and the Scandinavian Unix user groups. These were people interested in learning how things worked, how to make their services more reliable and easier to operate, and how to set up configuration management, and they were usually awake enough to notice when an intrusion had occurred–very often because something was subtly different.

The reality is that not everyone works at a university, research center, or cloud provider, places where pushing the limits is normal behavior. IT for most people is just a job working with computers, and we shouldn’t expect anything more from them.

##USENIX

First, I have a confession to make: I have never paid to attend a conference. I’ve had magazine editor credentials that would get me into just about any computer-related conference. I taught short sessions at SANS conferences twice; the first time, when I was told I needed to pay to get in, I said I would leave instead. They let me in for free.

After teaching at the 1986 summer and 1987 winter USENIX conferences, the next USENIX conference I attended was in 1991. I was teaching at UniForum, and the USENIX conference was also in San Francisco, just a mile or so away. I was working as the technical editor for UnixWorld but didn’t have a business card, so I made one for myself, using the laser printer at SL and some used card stock. This was enough to get a press pass to attend a portion of the USENIX conference, and allowed me to walk into ongoing tutorials.

I soon started getting asked to teach at USENIX conferences, and in 1996 and 1997 I co-chaired the Invited Talks tracks at LISA, the Large Installation System Administration conference. LISA had few papers, so Invited Talks and tutorials were the main attractions. I began working with Dan Klein, the official tutorial manager, at first just helping arrange the tutorials into timeslots, then reading proposals and commenting on the instructors, and later taking over as tutorial manager. Tutorials were a huge money-maker for USENIX, helping to support the five or six other conferences USENIX ran each year. But by the end of 2018, tutorials were no longer attracting enough students to remain viable, and they were ended.

I enjoyed teaching at such a technical conference. I changed my Unix security class into a Linux one, then began experimenting with making my classes hands-on. Doing this wasn’t easy, as you can’t share a single system when everyone is either trying out hacking examples or working as root to secure their systems. I had been commissioned to teach a hands-on course for NASA designed to encourage good security practices, especially patching and updating systems. The NASA class included 24 laptops with RedHat Linux installed and a wired network. That was a good start, but it was up to me to get them all set up, on the network, and configured with the exercises I wanted the students to work on.

I used cfengine (configuration engine), a tool Mark Burgess created in 1993, to configure and install software on each laptop. I had to do some work on each laptop first, as DHCP wasn’t available to provide network addresses, and cfengine also required each client to have a key for authorization and a correctly set clock. Once that was working, I could modify the configuration of each laptop, adding examples of hacks for the students to uncover. As an important side effect, I provided an example of how cfengine could be used for configuration management and patching, although patching wasn’t something cfengine did natively.
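
cfengine’s core idea is convergence: declare the state you want, and the agent repeatedly compares the actual state against it, repairing only what differs. A rough Python sketch of that idea, with invented paths, rather than cfengine’s own policy language:

```python
import hashlib
import shutil
from pathlib import Path

# Desired state: each target file should match its master copy.
# The paths here are invented for illustration.
DESIRED = {
    Path("/etc/ssh/sshd_config"): Path("/masterfiles/sshd_config"),
}


def digest(path):
    return hashlib.sha256(path.read_bytes()).hexdigest()


def converge():
    """Repair only what has drifted from the desired state."""
    for target, master in DESIRED.items():
        if not target.exists() or digest(target) != digest(master):
            shutil.copy2(master, target)
            print(f"repaired {target}")


if __name__ == "__main__":
    converge()  # run repeatedly (e.g., from cron) to stay converged
```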

I tried doing something similar at USENIX, with Tony Delporto’s help setting up the wired network. Using WiFi wasn’t something we wanted to do, as a roomful of people scanning the network and trying tools like ettercap on the conference WiFi would have been a terrible idea. It turned out that setting up the wired network was too labor-intensive, but it was useful experience.

For my final class in Copenhagen, the plan was to use old desktop systems with Linux installed. The day after I arrived, I felt horrible, as if I had the worst hangover ever, but without the drinking! The old systems were flaky, and just getting them running was difficult. In the end, fewer than ten systems were ready. I never wanted to do any of that again.

The next year I started using a bootable Linux CD, called Knoppix, that I customized by including the exercises I wanted on it. Knoppix already filled a CD-ROM, so I removed portions that weren’t required for the class and added tools like nc and nmap that the class would be using. I also included a vulnerable version of fingerd, so people could experiment with the same buffer overflow used by the Internet Worm. The CD worked well, but I wasn’t done yet.
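
The historic fingerd hole was a gets() call that read the single request line into a fixed 512-byte buffer. Here is a sketch of the kind of probe students could aim at the deliberately vulnerable daemon; the target address is made up, and the payload is plain padding, not working shellcode:

```python
import socket

# The finger protocol is one request line ending in CRLF. The old
# fingerd read it with gets() into a 512-byte buffer, so an oversized
# line overflows the stack. The target address here is invented, and
# the payload is padding only.
request = b"A" * 600 + b"\r\n"  # comfortably past the 512-byte buffer

with socket.create_connection(("192.168.1.50", 79), timeout=5) as s:
    s.sendall(request)
    reply = s.recv(4096)  # a crashed daemon typically returns nothing
    print(reply.decode(errors="replace") or "(no reply: daemon likely crashed)")
```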

As virtualization software for workstations became more common, I started distributing virtual machines as large files. VMware had established a standard format for VM images, and that made the images fairly portable. All students needed to do was show up with a laptop, virtualization software already installed, and about two gigabytes of free disk space; they could then install the image from a USB drive. This wasn’t perfect, as a USB drive used to install the image on multiple systems could itself become infected, but fortunately that didn’t happen as far as I know. I still had the USB drives after the class, and they appeared clean of any malware.

###Running Servers

I began operating my own mail server in 1994. I thought that if I was going to teach Unix security, I needed to practice what I taught. I had covered the UNIX System Laboratories, Inc. v. Berkeley Software Design, Inc. lawsuit for UnixWorld magazine in the early 90s, and was later granted a license with free support for BSDI, and that’s what I ran on my mail server.

I also registered the spirit.com domain name that year. I had been contacted by Trent Hein, one of the co-authors of Evi Nemeth’s Unix System Administration Handbook, who suggested I get “on the Internet” using a dial-up connection to Colorado SuperNet. That meant I was only part of the Internet while dialed in to a server in Colorado, a long-distance call.

I fostered a relationship with the local ISP, and eventually was allowed to set up a ‘permanent’ connection to the Internet through their office using ISDN routers. ISDN required that I pay for two pairs of wires running from my house to the ISP’s office over the telephone network (POTS), buy two identical ISDN routers, and pay the ISP for the connection. Altogether, I paid about $350 for a 112,000 bits-per-second always-on Internet connection (two 56 Kbps channels bonded together). But I was always online at that point.

I started getting requests from people who wanted my domain name. Some were silly, like the woman with a horse named Spirit. Others made more sense, like the band named Spirit or Spirit Car Rental. I was doing a lot of work, and getting clients, via email, so giving up the domain name wasn’t an option as far as I was concerned.

Around the end of 1999, a lawyer for Spirit Airlines contacted me, saying that the airline wanted to lease my domain name. I had never heard of leasing a domain name, and doing so would have meant I could no longer use spirit.com for my email address. I told him no, but offered to add a link with a permanent redirect for them. If that link got a lot of traffic, I would discuss having them pay me for it.

I had thought I had an interesting site, with lots of security articles posted, but it turned out that 95% of the traffic was people looking for Spirit Airlines. I asked them to pay me for each redirect, and we settled on 1.5 cents per click-through. And I had a money machine.

Money machines are things, like royalties for books or residuals for actors, where you continue to get paid while doing nothing. Having a rental property is a little like this, although a good landlord actually maintains their property. In my case, I just needed to keep the server and its Internet connection running, and send Spirit Airlines an invoice every month. Creating the invoice was done via a script, and I emailed them the invoice along with some logs.
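
The script itself could hardly have been simpler. A minimal sketch, assuming a common-format web server access log; the paths are invented, and the rate is the 1.5 cents we had agreed on:

```python
from pathlib import Path

# Count click-throughs on the redirect link in an access log and
# price them at 1.5 cents each. Log path and link path are invented.
LOG = Path("/var/log/httpd/access_log")
REDIRECT = "/spirit-airlines"
RATE = 0.015  # dollars per click-through

clicks = sum(1 for line in LOG.read_text().splitlines() if REDIRECT in line)
print(f"{clicks} click-throughs: ${clicks * RATE:,.2f} due this month")
```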

But in the spirit of things, this being a rather infamous low-cost airline (read the Wikipedia page), they weren’t very good at actually paying me. After a couple of years, with them several months in arrears, I threatened to add Google Ads to my home page if they didn’t pay up. When they didn’t respond, I removed the redirect and added the JavaScript for Google Ads. And then I really had a money machine!

For the next seven years, I didn’t have to work, although I still did, because of all the money I received, mostly from Google Ads. Spirit Airlines did become somewhat better at actually paying me, but Google Ads was incredibly addictive. You just included the JavaScript along with your account ID, and they would direct-deposit your advertising profits every month.

I received a somewhat serious offer for spirit.com in 2010. In the past, people who typed “spirit” into the browser address bar were sent to my site, as browsers appended “.com” to single words entered there. But Google’s Chrome browser turned the address bar into actual search, and I was seeing the number of daily hits decline. I could see the end of my money machine, so I approached Spirit Airlines, telling them I had a serious offer for the domain. After some negotiations, they bought the domain, although they delayed releasing the escrowed funds for ten days after I turned over control, out of spite, it appeared.

You may think I was being a bit corrupt, taking advantage of Spirit Airlines. But they were never very nice to me, apparently believing that they should own the domain, even though to me they were just like the dozen other groups that had asked for the domain name. And yes, I was addicted to the regular inflow of money that came just from owning the name. I noticed that Spirit Airlines jets later included the domain name as part of their livery, for example on the winglets at the tips of their newer planes.

For my part, I was no longer receiving a thousand spam emails a day, on average. Soon after I registered spirit.com, I had received the first of hundreds of thousands of spam emails, and this one was special: an offer of a CD-ROM with a million email addresses on it. I assumed my email address was among them, and by the 2000s I was getting 30,000 spam emails a month. I didn’t miss that at all.

My server was also being attacked, just like any other server on the Internet. I had changed to Linux after I started teaching Linux instead of Unix security. BSDI had stopped giving me free upgrades by 2000, and Linux had improved its default security posture by then, so changing over made sense, given that I needed to understand Linux, not BSD Unix.

I also started being attacked by someone who timed their attacks for when I was at USENIX conferences. Sometimes the attacks had nothing to do with my servers but instead involved the credit card I typically used for travel. I would be on the road and discover the card had been cancelled. Once, I got an email from a travel agent because someone had booked a United flight for two from Chicago to Las Vegas. My credit card number had been sold on the black market. This happened every year for at least five years, but only to that particular card. I had others that I used, for buying airfare or just for online purchases, so it seemed like someone who worked at the credit card company. I finally reached the right person there, and I imagine they found the perpetrator, because I stopped having to have that card replaced every year.

I seem to have done a pretty good job of securing my own server. I monitored the server for changes to its configuration, and left traps in place of programs typically used by attackers after a successful exploit; these traps were never set off. I did suffer a denial-of-service in the form of opened connections to my mail server that never completed, revealing a bug in Wietse Venema’s Postfix software, which was supposed to prevent such attacks. Another time, an attacker used rik@spirit.com as the From address in a spam campaign that generated 10,000 bounces to my email account. Just like the credit card attacks, these occurred while I was traveling for USENIX.
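
To give a flavor of what I mean by a trap: replace a program attackers typically reach for with a stub that raises an alarm. This sketch is illustrative, not what I actually ran:

```python
#!/usr/bin/env python3
# Installed in place of a program intruders commonly run after a
# break-in (which binary, and where the real one went, is up to you).
# A legitimate admin never runs it; an intruder trips the alarm.
import os
import sys
import syslog

syslog.openlog("trap")
syslog.syslog(syslog.LOG_ALERT,
              f"trap tripped: argv={sys.argv} uid={os.getuid()} ppid={os.getppid()}")
sys.exit(1)  # fail like a broken binary; the alert is the point
```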

I stopped running my own servers around 2013, when I was no longer teaching Linux security and the friendly local ISP got bought by someone incompetent, and I had to give up my /26 IPv4 network.

##SELinux

Security Enhanced Linux (SELinux) was merged into the mainline Linux kernel in 2003. SELinux modified the operating system so that Linux could support mandatory access control (MAC) and role-based access control (RBAC). MAC means that access control is no longer discretionary, that is, no longer up to the users to manage. RBAC is a style of MAC that better matches the needs of commercial and defense users than the early MAC systems, which attempted to mirror the security classifications used by governments.

I became interested in SELinux because I saw it as a big step toward OS distributors locking down systems, following earlier work on FLASK by the University of Utah and the NSA to create a core framework for securing an operating system. Developers at the NSA and a handful of corporations completed SELinux, a task that involved adding checks to all security-sensitive operations inside the kernel, along with policy for applying those controls to applications and files.

What really caught my attention was that when system administrators had a problem with their Linux system, the first thing they would do was disable SELinux. This usually didn’t fix anything, but people left SELinux disabled just in case. I thought this was stupid, and created a half-day course, later expanded to a full day. Like my Linux security class, this was hands-on, giving people experience working with SELinux administrative commands, as well as some experience creating or editing policy.
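
The most basic administrative check is asking the kernel what mode SELinux is in, which is what the getenforce command does. A small Python equivalent that reads the same kernel interface:

```python
from pathlib import Path

# The kernel exposes the SELinux mode under /sys/fs/selinux; the
# getenforce(8) command reads the same information.
enforce = Path("/sys/fs/selinux/enforce")
if not enforce.exists():
    print("SELinux is disabled or not built into this kernel")
elif enforce.read_text().strip() == "1":
    print("SELinux is enforcing")
else:
    print("SELinux is permissive: violations are logged, not blocked")
```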

I never became an SELinux expert. The policy language is complex, with things like macros obscuring what sections of the policy are doing. Still, I advise people to leave SELinux enabled, as disabling it only helps attackers. While RedHat and Fedora were early adopters, Debian used AppArmor instead until February 2025. AppArmor has an easier-to-understand policy and uses the same hooks in the Linux kernel, but unlike SELinux it bases its policy on file and directory names, making it easier to bypass.

Another weakness of SELinux is that it relies on the kernel itself being secure. I read about an attack on Android, Google’s Linux-based smartphone operating system: once the attacker manages to subvert the kernel, the next step is to flip the kernel flag that disables SELinux.

##;login: Editor

USENIX has had a member magazine since the early 80s. Originally, it was several pages that got mailed to about 80 organizations. Over time, it became an actual magazine serving the USENIX membership, which had grown to around 10,000 members by the late 90s.

The name ;login: got its weird punctuation from ‘invisible’ control characters: the semi-colon was the residue of the sequence sent to command a particular terminal to clear its screen, followed by the login: prompt asking for a username.

I was asked in 1996 by the Executive Director if I would be willing to take over editorship of ;login:. I wrote a proposal, but nothing came of it at that point.

Two years later, I was invited to edit a security special edition. I would be paid, and payment included attending USENIX Security with all expenses covered! That was amazingly good fortune for me, as I loved attending Security, but I had to take time out from my teaching and consulting to do it–and at that time, I wasn’t even taking vacations, much less attending conferences often.

In April of 2005, Rob Kolstad, then the editor of ;login:, discovered that he didn’t have any articles on hand for the June issue. The USENIX Board asked if I’d like to take over, and I sent them the proposal I had written in 1996. I’ve been editing ;login: ever since, although I have learned that the Board wants to discontinue ;login: in late 2025. I am still trying to keep the door open for new articles.

##Retirement

I consider Peter Neumann, of SRI, a model: in 2025, he’s almost 90 and still working in computer security. I don’t intend to fully retire unless I suffer brain damage and can no longer read papers or write.

I used to joke that I’ve already ‘retired’ several times: after quitting the USPS, after moving to California, and after moving to Sedona. In each case, I was really just between jobs, but did have a lot of free time. I am really glad I took advantage of that free time, because when you reach official retirement age, your body is often too worn out from stress and hard work to do the things you might have imagined you’d do when retired.

After moving to San Francisco, and then quitting my job at North Star, I met some folks who ran a sailboat-leasing company based at Pier 39. I soon began crewing for them whenever they had a charter with a skipper on a large sailboat. That meant I would handle lines while docking, raise and lower sails, and handle the jib when tacking, as well as adjust the sheets. Oftentimes the paid skipper was drinking or snorting cocaine with the paying clients, and I would be the one sailing a beautiful sailboat on San Francisco Bay. The afternoon winds in the summertime were usually above 25 knots, and it never rained in the summer there either.

When I moved to Sedona, I was still consulting as technical editor for UnixWorld magazine. I would work early in the morning, go hiking and mountain biking mid-day, then work some more in the late afternoon. I was able to do things in my 40s that became impossible in my 60s, and if I hadn’t taken advantage of partial retirements along the way, I would have missed the hiking and biking the Sedona area offers.

So will I ever retire? Who knows. I certainly hope not.