GNU/Linux Desktop Survival Guide
by Graham Williams


GNU/Linux is fashioned on Unix. Unix dates from 1969, when Ken Thompson at Bell Telephone Laboratories initiated work on this new operating system. Others involved in the project included Dennis Ritchie and Brian Kernighan. The name Unix is a pun on MULTICS (MULTiplexed Information and Computing Service), an operating system of the time developed by the Massachusetts Institute of Technology, General Electric, and Bell Labs. Unix was originally spelt UNICS, an acronym for UNiplexed Information and Computing Service.

Some of the basic ideas introduced by Multics and then Unix were the tree-structured file system, a program for command interpretation (called the shell), the structure and nature of text files, and the semantics of I/O operations. The philosophy that arose with the development of Unix included the desire to write programs that each perform one task and perform it well, programs that work together to accomplish larger tasks, and programs that communicate with one another using streams of text.

The advantages of Unix were quickly identified by many, and quite a few varieties of Unix emerged over time. Sun Microsystems pioneered many of the developments in Unix, followed by such greats as the old Digital Equipment Corporation (DEC, which was swallowed by Compaq, which was swallowed by Hewlett-Packard), Silicon Graphics Incorporated (SGI), International Business Machines (IBM), and Hewlett-Packard (HP). A variety of flavours have existed, including SunOS, Solaris, Ultrix, Irix, BSD, System V, HPUX, and so on. Although computer programs written for one version of Unix could sometimes be ported to other versions, it was not always an easy task. The diversity of Unix implementations (more so than the proprietary nature of most of them) made it difficult for Unix to become a commodity operating system. The GNU project worked hard to free software development from the nuances of each of the different Unix versions by providing a common programming language environment (GNU C) and sophisticated packaging tools (autoconf and automake) to carefully hide the differences. GNU/Linux has now become the most popular Unix variant, and all the major Unix players support GNU/Linux in some way.

A particularly touted feature of Unix comes from a tools philosophy where complex tasks are performed by bringing together a collection of simpler tools. This is contrasted with the philosophy of providing monolithic applications that in one fell swoop solve all your problems, supposedly. The reality is often different.

Most operating systems supply a collection of basic utility programs for managing your files (things like arranging your files into folders, trashing files, and copying files from one place to another). Large applications then provide the word processing, spreadsheet, and web browsing functionality.

Unix places less emphasis on monolithic applications. Instead, tools provide simple functionality, focusing on doing well what they are designed to do, and simply pass their results on to another tool once they are done. Unix pipes provide the mechanism for doing this: one tool pipes its output on to another tool. This allows complex actions to be performed by piping together a collection of simpler commands.

A typical example is to determine the number of users logged on to your system:

  $ who | wc -l

The who command lists each logged-on user, one per line. The wc command counts the lines, words, and characters in its input, with the -l option restricting it to counting lines only. (GNU tools, like Unix, introduce options with the minus sign.)
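The same approach scales to longer pipelines. As an illustrative sketch (using the standard Linux /etc/passwd file, which lists one user account per line with colon-separated fields), the following counts how many distinct login shells are configured on the system:

```shell
# Extract field 7 (the login shell) from each line of /etc/passwd,
# reduce the list to unique values with sort -u,
# then count the resulting lines with wc -l.
cut -d: -f7 /etc/passwd | sort -u | wc -l
```

Each command in the pipeline does just one job: cut extracts a column, sort -u removes duplicates, and wc -l counts what remains. None of the three knows anything about the others; the pipe is the only glue required.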

For various reasons, though, this tools philosophy was often overlooked as large monolithic applications arose that did not adhere to it--they did not share components. Common tools such as Netscape, ghostview, Acrobat, FrameMaker, and OpenOffice share essentially very little. Compare that with the Microsoft community where, for example, an application like Internet Explorer is component-based. This is now changing in the GNU world, with the operating system software and the GNOME project encouraging component-based architectures.

Another feature of Unix is that its applications tend to use open file formats, allowing a variety of tools to work together on the same data. Indeed, this has been key in recent developments to remove the stranglehold of Microsoft proprietary formats. Rather than electronic document storage providing a longer-term solution to the archival of documents, it is delivering an even shorter lifetime than paper-based archives! How can that be so? The formats created by proprietary software are often binary and not fully publicly specified. How many packages today can read old WordPerfect and MS Word documents? Standardisation on open formats, often text-based formats like XML that anyone can read, provides a solution to this problem.

So why Unix? It is a conceptually simple operating system that facilitates creativity by not restricting the developer. Many have found it a fun operating system to work with, allowing innovative developments to be combined in new and even more innovative ways to deliver powerful ideas. A very large worldwide group of people willingly provides excellent, free support over the Internet. Anyone can learn more about the operating system by studying the code itself. Anyone can contribute to porting the operating system to their favourite computer.

And finally, the much-touted stability. There is very little doubt that GNU and Linux are extremely stable. The habit of rebooting your computer every time you come back to it is something Microsoft seems to encourage because of its notorious instability and the operating system's tendency not to carefully manage its use of memory. Also, install a new package under MS/Windows and chances are you need to reboot the computer. Most Unix users rarely need to reboot their machine. Check the uptime and you will generally find the machine has not been rebooted for months or years. Installing packages invariably does not require rebooting. Indeed, the only time it does is when you upgrade your Linux kernel!
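The uptime command reports how long the machine has been running. As a minimal sketch, assuming a Linux system with the usual /proc filesystem mounted, the raw figure (seconds since boot) can also be read directly and converted to days:

```shell
# /proc/uptime holds the seconds since boot as its first field;
# divide by 86400 (seconds per day) to report whole days of uptime.
awk '{printf "%d days\n", $1 / 86400}' /proc/uptime
```

On a long-running Unix machine this number is often in the hundreds; on a machine rebooted daily it will rarely exceed zero.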

Copyright © 1995-2006 [email protected]