Copyright © 2002 Paul Sheer.
The capabilities of LINUX are constantly expanding. Please consult the various Internet resources listed for up-to-date information.
This section covers questions that pertain to LINUX as a whole.
LINUX is the core of a free UNIX operating system for the PC and other hardware platforms. Development of this operating system started in 1984 as the GNU project of the Free Software Foundation (FSF). The LINUX core (or kernel), named after its author, Linus Torvalds, began development in 1991--the first usable releases were made in 1993. LINUX is often called GNU/LINUX because much of the OS (operating system) results from the efforts of the GNU project.
UNIX systems have been around since the 1960s and are a proven standard in industry. LINUX is said to be POSIX compliant, meaning that it conforms to a definite computing standard laid down by academia and industry. This means that LINUX is largely compatible with other UNIX systems: the same program can usually be ported to run on another UNIX system with few (sometimes no) modifications, and LINUX will network seamlessly with other UNIX systems.
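One quick way to see POSIX compliance in action is the getconf utility--itself specified by POSIX--which reports the revision of the standard that the system claims to implement. A minimal sketch:

```shell
# Ask the C library which POSIX revision it claims to conform to;
# on a typical LINUX system this prints a date-like number such as 199506.
getconf _POSIX_VERSION
```

The same command should work unchanged on other POSIX systems, which is precisely the point of the standard.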
Some commercial UNIX systems are IRIX (from Silicon Graphics); Solaris or SunOS for Sun Microsystems SPARC workstations; HP-UX for Hewlett Packard servers; SCO for the PC; OSF/1 for the DEC Alpha machine; and AIX for the PowerPC/RS6000. Because the UNIX name is a registered trademark, most such systems are not called UNIX.
Some freely available UNIX systems, such as NetBSD, FreeBSD, and OpenBSD, also enjoy widespread popularity.
UNIX systems are multitasking and multiuser systems, meaning that multiple concurrent users running multiple concurrent programs can connect to and use the same machine.
UNIX systems are the backbone of the Internet. Heavy industry, mission-critical applications, and universities have always used UNIX systems. High-end servers and multiuser mainframes are traditionally UNIX based. Today, UNIX systems are used by large ISPs through to small businesses as a matter of course. A UNIX system is the standard choice when a hardware vendor comes out with a new computer platform because UNIX is most amenable to being ported. UNIX systems are used as database, file, and Internet servers. UNIX is used for visualization and graphics rendering (as for some Hollywood productions). Industry and universities use UNIX systems for scientific simulations and UNIX clusters for number crunching. The embedded market (small computers without operators that exist inside appliances) has recently turned toward LINUX systems, which are being produced in the millions.
LINUX itself can operate as a web, file, SMB (WinNT), Novell, printer, FTP, mail, SQL, masquerading, firewall, and POP server to name but a few. It can do anything that any other network server can do, more efficiently and reliably.
LINUX's up-and-coming graphical user interfaces (GUIs) are the most functional and aesthetically pleasing ever to have graced the computer screen. LINUX has now moved into the world of the desktop.
LINUX runs on
Other projects are in various stages of completion. For example, you may get LINUX up and running on many other hardware platforms, but it would take some time and expertise to install, and you might not have graphics capabilities. Every month or so support is announced for some new esoteric hardware platform. Watch the Linux Weekly News <http://lwn.net/> to catch these.
(See also ``What is GNU?'' and ``What is LINUX?''.)
In 1984 the Free Software Foundation (FSF) set out to create a free UNIX-like system. It is only because of their efforts that the many critical packages that go into a UNIX distribution are available. It is also because of them that a freely available, comprehensive, legally definitive, free-software license is available. Because many of the critical components of a typical LINUX distribution are really just GNU tools developed long before LINUX, it is unfair to merely call a distribution ``LINUX''. The term GNU/LINUX is more accurate and gives credit to the larger part of LINUX.
Hundreds of web pages are devoted to LINUX. Thousands of web pages are devoted to different free software packages. A net search will reveal the enormous amount of information available.
But don't stop there--there are hundreds more.
All applications, network server programs, and utilities that go into a full LINUX machine are free software programs recompiled to run under the LINUX kernel. Most can (and do) actually work on any other of the UNIX systems mentioned above.
Hence, many efforts have been made to package all of the utilities needed for a UNIX system into a single collection, usually on a single easily installable CD.
Each of these efforts combines hundreds of packages (e.g., the Apache web server is one package, the Netscape web browser is another) into a LINUX distribution.
Some of the popular LINUX distributions are:
There are now about 200 distributions of LINUX. Some of these are single-floppy routers or rescue disks, and others are modifications of popular existing distributions. Still others have a specialized purpose, like real-time work or high security.
LINUX was largely developed by the Free Software Foundation <http://www.gnu.org/>.
The Orbiten Free Software Survey <http://www.orbiten.org/> came up with the following breakdown of contributors after surveying a wide array of open source packages. The following lists the top 20 contributors by amount of code written:
|Rank||Contributor||Amount of code||Share||Projects|
|1||Free Software Foundation, Inc.||125565525||(11.246%)||546|
|2||Sun Microsystems, Inc.||20663713||(1.85%)||66|
|3||The Regents of the University of California||15192791||(1.36%)||156|
|6||Thomas G. Lane||8746848||(0.783%)||17|
|7||The Massachusetts Institute of Technology||8513597||(0.762%)||38|
|13||Lucent Technologies, Inc.||4991582||(0.447%)||5|
|18||Carnegie Mellon University||4272613||(0.382%)||23|
|19||James E. Wilson, Robert A. Koeneke||4272412||(0.382%)||2|
|20||ID Software, Inc.||4038969||(0.361%)||1|
This listing contains the top 20 contributors by number of projects contributed to:
|Rank||Contributor||Amount of code||Share||Projects|
|1||Free Software Foundation, Inc.||125565525||(11.246%)||546|
|3||The Regents of the University of California||15192791||(1.36%)||156|
|6||Sun Microsystems, Inc.||20663713||(1.85%)||66|
|7||RSA Data Security, Inc.||898817||(0.08%)||59|
|12||Alfredo K. Kojima||280990||(0.025%)||40|
|13||The Massachusetts Institute of Technology||8513597||(0.762%)||38|
|14||Digital Equipment Corporation||2182333||(0.195%)||37|
|15||David J. Mackenzie||337388||(0.03%)||37|
|20||Peter Mattis, Spencer Kimball||1981094||(0.177%)||28|
The preceding tables are rough approximations. They do, however, give an idea of the spread of contributions.
If you are a private individual with no UNIX expertise available to help you when you run into problems and you are not interested in learning about the underlying workings of a UNIX system, then you shouldn't install LINUX.
This section answers questions about the nature of free software and the concepts of GNU.
The LINUX kernel is distributed under the GNU General Public License (GPL) which is reproduced in Appendix E and is available from the FSF Home Page <http://www.gnu.org/>.
Most of all other software in a typical LINUX distribution is also under the GPL or the LGPL (see below).
There are many other types of free software licenses. Each of these is based on particular commercial or moral outlooks. Their acronyms are as follows (as defined by the LINUX Software Map database) in no particular order:
GNU (pronounced with a hard G) is a recursive acronym for GNU's Not UNIX. A gnu is also a large beast, and is the motif of the Free Software Foundation (FSF).
Richard Stallman is the founder of the FSF and the creator of the GNU General Public License. One of the purposes of the FSF is to promote and develop free alternatives to proprietary software. The GNU project is an effort to create a free UNIX-like operating system from scratch; the project was started in 1984.
GNU software is software licensed under the GNU General Public License--such software is called Free software. GNU software is designed to meet a higher set of standards than its proprietary counterparts.
GNU has also become a movement in the computing world. When the word GNU is mentioned, it usually evokes feelings of extreme left-wing geniuses who in their spare time produce free software that is far superior to anything even large corporations can come up with through years of dedicated development. It also means distributed and open development, encouraging peer review, consistency, and portability. GNU means doing things once in the best way possible, providing solutions instead of quick fixes and looking exhaustively at possibilities instead of going for the most brightly colored or expedient approach.
GNU also means a healthy disrespect for the concept of a deadline and a release schedule.
Proprietary software is often looked down upon in the free software world for many reasons:
The result of these limitations is that proprietary software
GNU software, on the other hand, is open for anyone to scrutinize. Users can (and do) freely fix and enhance software for their own needs, and then allow others the benefit of their extensions. Many developers of different areas of expertise collaborate to find the best way of doing things. Open industry and academic standards are adhered to in order to make software consistent and compatible. Collaborative effort among different developers means that code is shared and effort is not replicated. Users have close and direct contact with developers, ensuring that bugs are fixed quickly and that user needs are met. Because source code can be viewed by anyone, developers write code more carefully and are more inspired and more meticulous.
Possibly the most important reason for the superiority of Free software is peer review. Sometimes this means that development takes longer as more people quibble over the best way of doing things. However, most of the time peer review results in a more reliable product.
Another partial reason for this superiority is that GNU software is often written by people from academic institutions who are at the center of IT research and are most qualified to dictate software solutions. In other cases, authors write software for their own use out of their own dissatisfaction with existing proprietary software--a powerful motivation.
The following is quoted from the GPL itself.
When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it.
For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.
See ``Where do I get LINUX?'' on page .
This situation is not possible. Because of the legal terms of the GPL, for LINUX to be distributed under a different copyright would require the consent of all 200+ people who have ever contributed to the LINUX source code. These people come from such a variety of places that such a task is logistically infeasible. Even if it did happen, new developers would probably rally in defiance and continue to work on the kernel as it is. This free kernel would amass more followers and would quickly become the standard, with or without Linus.
There are many kernel developers who have sufficient knowledge to do the job of Linus. Most probably, a team of core developers would take over the task if Linus no longer worked on the kernel. LINUX might even split into different development teams if a disagreement did break out about some programming issue, and it might rejoin later on. This is a process that many GNU software packages are continually going through, to no ill effect. It doesn't really matter much from the end user's perspective, since GNU software by its nature always tends to gravitate toward consistency and improvement, one way or another. It also doesn't matter to the end user because the end user has selected a popular LINUX distribution packaged by someone who has already dealt with these issues.
Open Source is a new catch phrase that is ambiguous in meaning but is often used synonymously with Free. It sometimes refers to any proprietary vendor releasing source code to their package, even though that source code is not free in the sense of users being able to modify it and redistribute it. Sometimes it means ``public domain'' software that anyone can modify but which can be incorporated into commercial packages where later versions will be unavailable in source form.
Open Source advocates vie for the superiority of the Open Source development model.
GNU supporters don't like to use the term Open Source. Free software, in the sense of freedom to modify and redistribute, is the preferred term and necessitates a copyright license in the same vein as the GPL. Unfortunately, it's not a marketable term because it requires this very explanation, which tends to bore people who don't really care about licensing issues.
Free software advocates vie for the ethical responsibility of making source code available and encouraging others to do the same.
Shareware refers to completely nonfree software that is encouraged to be redistributed at no charge, but which requests a small fee if it happens to land on your computer. It is not Free software at all.
This section covers questions about how LINUX software is packaged and distributed and how to obtain LINUX.
You as the user are not going to download arbitrary untested software any more than you would if you were using Windows.
When you get LINUX, it will be inside a standard distribution, probably on a CD. Each of these packages is selected by the distribution vendors to be a genuine and stable release of that package. This is the responsibility taken on by those who create LINUX distributions.
Note that no corporate body oversees LINUX. Everyone is on their own mission. But a package will not find its way into a distribution unless someone feels that it is a useful one. For people to feel it is useful means that they have to have used it over a period of time; in this way only good, thoroughly reviewed software gets included.
Maintainers of packages ensure that official releases are downloadable from their home pages and will upload original versions onto well-established FTP servers.
It is not the case that just anyone is free to modify original distributions of packages and thereby hurt the names of that package's maintainers.
For those who are paranoid that the software they have downloaded is not the genuine article distributed by the maintainer of that software, digital signatures can verify the packager of that software. Cases where vandals have managed to substitute a bogus package for a real one are extremely rare and entirely preventable.
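As a sketch of how such verification works (the file names here are hypothetical), a maintainer publishes a checksum alongside the package, and the downloader checks it. GPG digital signatures work along the same lines, but additionally prove who created the checksum:

```shell
# Fabricate a "downloaded" package so the verification steps can be shown:
echo 'pretend package contents' > package-1.0.tar.gz

# The maintainer publishes a checksum file alongside the package ...
md5sum package-1.0.tar.gz > package-1.0.tar.gz.md5

# ... and the downloader verifies the package against it:
md5sum -c package-1.0.tar.gz.md5
# prints: package-1.0.tar.gz: OK
```

If even one byte of the package were altered, the check would report FAILED instead of OK.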
(See also next question.)
The LINUX kernel is at release version 2.4.3 as of this writing. The previous stable release was the 2.2 series, which was the standard for more than a year.
The LINUX kernel version does not affect the LINUX user. LINUX programs will work regardless of the kernel version. Kernel versions speak of features, not compatibility.
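You can see which kernel version you are running with the uname command:

```shell
# Print the release string of the running kernel, for example "2.4.3".
# This identifies the kernel only, not the distribution.
uname -r
```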
Each LINUX distribution has its own versioning system. RedHat has just released version 7.0 of its distribution; Caldera is at 2.2, Debian at 2.1, and so forth. Each new incarnation of a distribution will have newer versions of the packages contained therein and better installation software. There may also be subtle changes in the file system layout.
The LINUX UNIX C library implementation is called glibc. When RedHat brought out version 5.0 of its distribution, it changed from the older libc5 library to glibc. Because all packages require this library, the change was said to introduce incompatibility. However, multiple versions of a library can coexist on the same system, and hence no serious compatibility problem was ever introduced by this transition. Other vendors have since followed suit in making the transition to glibc (also known as libc6).
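You can see for yourself which C library a given binary is linked against using ldd, which lists a program's shared library dependencies:

```shell
# List the shared libraries that a binary requires. On a glibc (libc6)
# system the output typically includes a line like:
#     libc.so.6 => /lib/libc.so.6 (0x...)
ldd /bin/ls
```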
The LINUX community has also produced a document called the LINUX Filesystem Standard. Most vendors try to comply with this standard, and hence LINUX systems will look very similar from one distribution to another.
There are hence no prohibitive compatibility problems between LINUX distributions.
The different distributions are very similar and share binary compatibility (provided that they are for the same type of processor of course)--that is, LINUX binaries compiled on one system will work on another. This is in contrast to the differences between, say, two UNIX operating systems (compare Sun vs. IRIX). Utilities also exist to convert packages meant for one distribution to be installed on a different distribution. Some distributions are, however, created for specific hardware, and thus their packages will only run on that hardware. However, all software specifically written for LINUX will recompile without any modifications on another LINUX platform in addition to compiling with few modifications on other UNIX systems.
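This binary compatibility is visible at the file level: LINUX executables share the ELF object format no matter which distribution built them. A quick way to confirm this is to look at the first four bytes of any binary:

```shell
# Every LINUX ELF executable begins with the same four-byte magic number.
head -c 4 /bin/ls | od -c
# first output line typically reads:  0000000 177   E   L   F
```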
The rule is basically this: If you have three packages that you would need to get working on a different distribution, then it is trivial to make the adjustments to do this. If you have a hundred packages that you need to get working, then you have a problem.
If you are an absolute beginner and don't really feel like thinking about what distribution to get, one of the most popular and easiest to install is Mandrake. RedHat is also supported quite well in industry.
The attributes of some distributions are:
What's nice about RPM based distributions (RedHat, Mandrake, and others) is that almost all developers provide RedHat .rpm files (the file that a RedHat package comes in). Debian .deb package files are usually provided, but not as often as .rpm. On the other hand, Debian packages are mostly created by people on the Debian development team, who have rigorous standards to adhere to.
TurboLinux, SuSE, and some others are also very popular. You can find reviews on the Internet.
Many other popular distributions are worth installing. Especially worthwhile are distributions developed in your own country that specialize in support for your local language.
Once you have decided on a distribution (see previous question), you need to download that distribution or buy or borrow it on CD. Commercial distributions may contain proprietary software that you may not be allowed to install multiple times. However, Mandrake, RedHat, Debian, and Slackware are all committed to freedom and hence will not have any software that is not redistributable. Hence, if you get one of these on CD, feel free to install it as many times as you like.
Note that the GPL does not say that GNU software is without cost. You are allowed to charge for the service of distributing, installing, and maintaining software. It is the nonprohibition to redistribute and modify GNU software that is meant by the word free.
An international mirror for LINUX distributions is Metalab distributions mirror <ftp://metalab.unc.edu/pub/Linux/distributions/>. Also consult the resources in Chapter 13, ``What web pages should I look at?'' on page , and the Web sites entry in the index.
Downloading from an FTP site is going to take a long time unless you have a really fast link, so rather ask around about who locally sells LINUX on CD. Also make sure you have the latest version of whatever it is you're buying or downloading. Under no circumstances install from a distribution that has been superseded by a newer version.
It helps to think more laterally when trying to get information about LINUX:
Would-be LINUX users everywhere need to know how to install LINUX. Surely the Free software community has long since generated documentation to help them? Where is that documentation?
Most distributions have very comprehensive installation guides, which is the reason I do not cover installation in this book. Browse around your CD to find it or consult your vendor's web site.
Also try a net search for ``linux installation guide'' and see what comes up. You need to read through the installation guide in detail. It will explain everything you need to know about setting up partitions, dual boots, and other installation goodies.
The installation procedure will be completely different for each distribution.
This section explains where to get free and commercial help with LINUX.
LINUX is supported by the community that uses LINUX. With commercial systems, users tend not to share their knowledge because they feel that, having paid for the software, they owe nothing to anyone.
LINUX users, on the other hand, are very supportive of other LINUX users. People can get far better support from the Internet community than they would from their commercial software vendors. Most packages have email lists where the very developers are available for questions. Most cities have mailing lists where responses to email questions are answered within hours. New LINUX users discover that help abounds and that they never lack friendly discussions about any computing problem they may have. Remember that LINUX is your operating system.
Newsgroups provide assistance where LINUX issues are discussed and help is given to new users; there are many such newsgroups. Using a newsgroup has the benefit of the widest possible audience.
The web is also an excellent place for support. Because users constantly interact and discuss LINUX issues, 99% of the problems a user is likely to have would have already been documented or covered in mailing list archives, often obviating the need to ask anyone at all.
Finally, many professional companies provide assistance at comparable hourly rates.
This section discusses the relative merits of different UNIX systems and NT.
LINUX has several times the installed base of any UNIX system.
Nobody really knows the answer. Various estimates have been put forward based on statistical considerations. As of early 2001 the figure was about 10-20 million. As LINUX begins to dominate the embedded market, that number will soon surpass the installed base of all other operating systems combined.
What is clear is that the number of LINUX users is doubling consistently every year. This is evident from user interest and industry involvement in LINUX: journal subscriptions, web hits, media attention, support requirements, software ports, and other criteria.
Because it is easy to survey online machines, it is well-established that over 25% of all web servers run LINUX.
Although LINUX is free, a good knowledge of UNIX is required to install and configure a reliable server. This tends to cost you in time or support charges.
On the other hand, your Windows or OS/2 server, for example, has to be licensed.
Many arguments put forward regarding server costs fail to take into account the complete lifetime of the server. This has resulted in contrasting reports that either claim that LINUX costs nothing or claim that it is impossible to use because of the expense of the expertise required. Neither of these extreme views is true.
The total cost of a server includes the following:
When all these factors are considered, most companies will probably make an enormous saving by choosing a LINUX server over a commercial operating system.
(See previous question.)
Proprietary UNIX systems are not as user friendly as LINUX. LINUX is also considered far easier to maintain than any commercial UNIX system because of its widespread use and hence easy access to LINUX expertise. LINUX has a far more dedicated and ``beginner friendly'' documentation project than any commercial UNIX, and many more user-friendly interfaces and commands.
The upshot of this is that although your proprietary UNIX system will perform as reliably as LINUX, it will be more time consuming to maintain.
UNIX systems that run on specialized hardware are almost never worth what you pay for them in terms of a cost/performance ratio. That is doubly true if you are also paying for an operating system.
LINUX typically performs 50% to 100% better than other operating systems on the same hardware. There are no commercial exceptions to this rule for a basic PC.
There have been a great many misguided attempts to show that LINUX performs better or worse than other platforms. I have never read a completely conclusive study. Usually these studies are done with one or other competing system having better expertise at its disposal and are, hence, grossly biased. In some supposedly independent tests, LINUX tended to outperform NT as a web server, file server, and database server by an appreciable margin.
In general, the performance improvement of a LINUX machine is quite visible to users and administrators. It is especially noticeable how fast the file system access is and how it scales smoothly when multiple services are being used simultaneously. LINUX also performs well when loaded by many services simultaneously.
There is also criticism of LINUX's SMP (multiprocessor) support, and lack of a journalling file system. These two issues are discussed in the next question.
In our experience (from both discussions and development), LINUX's critical operations are always pedantically optimized--far more than would normally be encouraged in a commercial organization. Hence, if your hardware is not performing the absolute best it can, it's by a very small margin.
It's also probably not worthwhile debating these kinds of speed issues when there are so many other good reasons to prefer LINUX.
LINUX is supposed to lack proper SMP support and therefore not be as scalable as other OSs. This was somewhat true until kernel 2.4 was released in January 2001.
LINUX has a proper journalling file system called ReiserFS. This means that in the event of a power failure, there is very little chance that the file system would ever be corrupted, or that manual intervention would be required to fix the file system.
LINUX supports a full 64 gigabytes of memory, with 1 gigabyte of unshared memory per process.
If you really need this much memory, you should be using a 64-bit system, like a DEC Alpha, or Sun UltraSPARC machine.
On 64-bit systems, LINUX supports more memory than most first-world governments can afford to buy.
LINUX supports as much swap space as you like. For technical reasons, however, the swap space formerly required division into separate partitions of 128 megabytes each.
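A rough sketch of setting up swap (a swap file is used here since it needs no repartitioning; the size and path are illustrative):

```shell
# Reserve 128 MB of disk space--the historical upper limit for a single
# swap partition--and write a swap signature onto it.
dd if=/dev/zero of=/tmp/swapfile bs=1M count=128
mkswap /tmp/swapfile
# swapon /tmp/swapfile    # activation requires root, hence commented out
```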
The principles underlying OS development have not changed since the concept of an OS was invented some 40+ years ago. It is really academia that develops the theoretical models for computer science--industry only implements these.
There are a great many theoretical paradigms of operating system that vary in complexity and practicality. Of the popular server operating systems, UNIX certainly has the most versatile, flexible, and applicable security model and file system structure.
FreeBSD is like a LINUX distribution in that it also relies on a large number of GNU packages. Most of the packages available in LINUX distributions are also available for FreeBSD.
FreeBSD is not merely a kernel but also a distribution, a development model, an operating system standard, and a community infrastructure. It is hence better compared to Debian than to LINUX.
The arguments comparing the FreeBSD kernel to the LINUX kernel center around the differences between how various kernel functions are implemented. Depending on the area you look at, either LINUX or FreeBSD will have a better implementation. On the whole, FreeBSD is thought to have a better architecture, although LINUX has had the benefit of having been ported to many platforms, has a great many more features, and supports far more hardware. It is questionable whether the performance penalties we are talking about are of real concern in most practical situations.
Another important consideration is that the FreeBSD maintainers go to far more effort securing FreeBSD than does any LINUX vendor. This makes FreeBSD a more trustworthy alternative.
GPL advocates take issue with FreeBSD because its licensing allows a commercial organization to use FreeBSD without disclosing additions to the source code.
None of these arguments offset the fact that either of these systems is preferable to a proprietary one.
Most companies tend to underestimate how entrenched they are in Windows skills. An office tends to operate organically with individuals learning tricks from each other over long periods of time. For many people, the concept of a computer is synonymous with the Save As and My Documents buttons. LINUX departs completely from every habit they might have learned about their computer. The average secretary will take many frustrating weeks gaining confidence with a different platform, while the system administrator will battle for much longer.
Whereas Windows does not offer a wide range of options with regard to desktops and office suites, the look and feel of a LINUX machine can differ as much between two users' desktops as Windows 98 differs from an Apple Macintosh. Companies will have to make careful decisions about standardizing what people use and about creating customizations peculiar to their needs.
Note that Word and Excel documents can be read by various LINUX office applications but complex formatting will not convert cleanly. For instance, document font sizes, page breaking, and spacing will not be preserved exactly.
LINUX can interoperate seamlessly with Windows shared file systems, so this is one area where you will have few migration problems.
GUI applications written specifically for Windows are difficult to port to a UNIX system. The Wine project now allows pure C Windows applications to be recompiled under UNIX, and Borland has developed Kylix (a LINUX version of Delphi). There are more examples of LINUX versions of Windows development languages. However, any application that interfaces with many proprietary tools and is written in a proprietary language is extremely difficult to port. The developer who does the porting will need to be an expert in both UNIX and Windows development. Such people are rare and expensive to hire.
The following is based on my personal experience during the migration of three large companies to LINUX.
Commercial UNIX third-party software that has been ported to LINUX will pose very few problems. You can generally rely on performance improvements and reduced costs, and should have no hesitation in installing it on LINUX.
Managers will typically request that ``LINUX'' skills be taught to their employees through a training course. What is often missed is that their staff have little basic UNIX experience to begin with. For instance, it is entirely feasible to run Apache (a web server package) on SCO, IRIX, or Sun systems, yet managers will request, for example, that their staff be taught how to configure a LINUX ``web server'' in order to avoid web server licensing fees.
It is important to gauge whether your staff have a real understanding of the TCP/IP networks and UNIX systems that you are depending on, rather than merely using a trial-and-error approach to configuring your machines. Fundamentally, LINUX is just a UNIX system, and a very user-friendly one at that, so any difficulties with LINUX ought not to be greater than those with your proprietary UNIX system.
Should their basic UNIX knowledge be incomplete, a book like this one will provide a good reference.
Many companies also develop in-house applications specific to their corporation's services. Being in-house applications, the primary concern of the developers was to ``get it working'', and that may have been accomplished by only a very small margin. Suddenly running such code on a different platform will unleash havoc, especially if it was badly written. In this case, it is essential to hire an experienced developer who is familiar with the GNU compiler tools.
Well-written UNIX applications (even GUI applications) will, however, port very easily to LINUX and, of course, to other UNIX systems.
Before installing any LINUX machines, you should identify what each person in your organization does with their computer. This undertaking is difficult but very instructive. If you have any custom applications, you need to identify what they do and create a detailed specification of their capabilities.
The next step is to encourage practices that lean toward interoperability. You may not be able to migrate to LINUX immediately, but you can save yourself enormous effort by taking steps in anticipation of that possibility. For instance, make it policy that all documents be saved in a portable format that is not bound to a particular word-processor package.
Wean people off tools and network services that do not have UNIX equivalents. SMTP and POP/IMAP servers are an Internet standard and can be replaced with LINUX servers. SMB file servers can be replaced by LINUX Samba servers. There are web mail and web groupware services that run on LINUX servers that can be used from Internet Explorer. There are some word processors that have both UNIX and Windows versions whose operation is identical on both OSs.
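For example, Samba configuration is straightforward; a minimal sketch of a share definition might look like the following (the workgroup name, share name, and path here are hypothetical examples, not taken from any particular installation):

```
# /etc/samba/smb.conf -- minimal sketch of an SMB file server under
# LINUX.  The workgroup, share name, and path are hypothetical.
[global]
   workgroup = MYGROUP
   security = user

[documents]
   path = /home/shared/documents
   read only = no
   browseable = yes
```

Windows clients would then see the machine and its ``documents'' share in their network neighborhood exactly as they would a Windows file server.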
Force your developers to test their Web pages on Netscape/Mozilla as well as Internet Explorer. Do not develop with tools that are tied closely to the operating system and are therefore unlikely ever to have UNIX versions; there are free, cross-platform development tools that are more effective than popular commercial IDEs: use these instead. If you are developing in a compiled language, your developers should ensure that the code compiles cleanly under independent brands of compiler. This will not only improve code quality but will also make the code more portable.
Be aware that people will make any excuse to avoid having to learn something new. Make the necessary books available to them. Identify common problems and create procedures for solving them. Learn about the capabilities of LINUX by watching Internet publications. A manager who is not prepared to do this much should not expect their staff to do better.
This section covers various specific and technical questions.
Yes. You can browse the installation documentation on the CD (if it has any) using Internet Explorer. LINUX software tends to be distributed on Windows-compatible floppy disk formats and standard ISO9660 CDs, so the media themselves are readable under Windows, even though LINUX uses different formats for almost everything else.
Yes. LINUX will occupy two or more partitions, while Windows sits in one of the primary partitions. At boot time, a boot prompt asks you to select which operating system you would like to boot.
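The boot prompt is provided by a boot loader such as LILO. A sketch of a dual-boot /etc/lilo.conf might look like the following (the partition device names here are hypothetical examples; yours will depend on where each system was installed):

```
# /etc/lilo.conf -- sketch of a dual-boot setup.  The device
# names are hypothetical examples.
boot=/dev/hda           # install LILO in the master boot record
prompt                  # show the boot: prompt at startup
timeout=100             # wait 10 seconds, then boot the default

image=/boot/vmlinuz     # the LINUX kernel
        label=linux
        root=/dev/hda2
        read-only

other=/dev/hda1         # the Windows primary partition
        label=windows
```

Typing ``linux'' or ``windows'' at the prompt then selects the operating system for that session.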
A useful distribution of packages that includes the X Window System (UNIX's graphical environment) will occupy less than 1 gigabyte. A network server that does not have to run X can get away with about 100-300 megabytes. LINUX can even run from a single 1.4-megabyte floppy disk and still perform various network services.
LINUX runs on many different hardware platforms, as explained above. Typical users should purchase an entry-level PC with at least 16 megabytes of RAM if they are going to run the X Window System (UNIX's graphical environment) smoothly.
A good LINUX machine is a PII 300 (or an AMD K6, Cyrix, etc.) with 64 megabytes of RAM and a 2-megabyte graphics card (i.e., one capable of running at 1024x768 screen resolution in 15/16-bit color). One gigabyte of free disk space is also necessary.
If you are using scrap hardware, an adequate machine for the X Window System should not have less than an Intel 486 100 MHz processor and 8 megabytes of RAM. Network servers can run on a 386 with 4 megabytes of RAM and a 200-megabyte hard drive. Note that scrap hardware can be very time consuming to configure.
Note that some distributions have recently come out with Pentium-only compilations, which means that your old 386 will no longer work. You will then have to compile your own kernel for the processor you are using and possibly recompile packages as well.
About 90% of all hardware available for the PC is supported under LINUX. In general, well-established brand names will always work, but will tend to cost more. New graphics/network cards are always being released onto the market. If you buy one of these, you might have to wait many months before support becomes available (if ever).
To check on hardware support, see the Hardware-HOWTO <http://users.bart.nl/~patrickr/hardware-howto/Hardware-HOWTO.html>
This may not be up-to-date, so it's best to go to the various references listed in this document and get the latest information.
LINUX has read and write support for all these file systems. Hence, your other partitions will be readable from LINUX. In addition, LINUX supports a wide range of other file systems like those of OS/2, Amiga, and other UNIX systems.
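For example, Windows partitions can be listed in /etc/fstab so that they are mounted automatically at boot time and appear as ordinary directories. A sketch might look like the following (the device names and mount points are hypothetical examples):

```
# /etc/fstab -- sketch: making DOS/Windows partitions visible
# under LINUX.  Device names and mount points are hypothetical.
/dev/hda1   /mnt/win   vfat    defaults   0  0
/dev/hdb1   /mnt/dos   msdos   defaults   0  0
```

After mounting, the files on those partitions can be read and written with ordinary LINUX commands and applications.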
LINUX contains a highly advanced DOS emulator. It will run almost any 16-bit or 32-bit DOS application. It runs a great number of 32-bit DOS games as well.
The DOS emulator package for LINUX is called dosemu. It typically runs applications much faster than normal DOS does because of LINUX's faster file system access and system calls.
It can run in an X window just like a DOS window under Windows.
Yes. WineLib is a part of the Wine package (see below) and allows Windows C applications to be recompiled to work under LINUX. Apparently this works extremely well, with virtually no changes to the source code being necessary.
Yes and no.
There are commercial emulators that run a virtual 386 machine under LINUX. They enable mostly flawless running of Windows under LINUX, if you really have to, albeit at a large performance penalty. You still have to buy Windows, though. There are also some free versions of these emulators.
There is also a project called Wine (WINdows Emulator) that aims to provide a free alternative to Windows by allowing LINUX to run Windows 16- and 32-bit binaries with little to no performance penalty. It has been in development for many years now and has reached the point where many simple programs work quite flawlessly under LINUX.
Get a grip on what this means: you can run Minesweeper under LINUX, and it will come up on your X Window screen next to your other LINUX applications and look exactly as it does under Windows--and you don't have to buy Windows. You will even be able to cut and paste between Windows and LINUX applications.
However, many applications (especially large and complex ones) do not display correctly under LINUX or crash during operation. This has been steadily improving to the point where Microsoft Office 2000 is said to be actually usable.
Many Windows games do, however, work quite well under LINUX, including those with accelerated 3D graphics.
See the Wine Headquarters <http://www.winehq.com/faq.html> for more information.
A virus is a program that replicates itself by modifying the system on which it runs. It may do other damage. Viruses are small programs that exploit social engineering, logistics, and the inherent flexibility of a computer system to do undesirable things.
Because a UNIX system does not allow this kind of flexibility in the first place, there is categorically no such thing as a virus for it. For example, UNIX inherently restricts access to files outside the user's privilege space, so a virus would have nothing to infect.
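The point about restricted privilege space can be sketched at the shell: a file created with owner-only permissions is inaccessible to any other non-root user, so a virus running under someone else's account would have nothing to infect. The file name below is a hypothetical example.

```shell
# Create a file and restrict it to its owner only.
echo "my private data" > private.txt
chmod 600 private.txt        # read/write for the owner, nothing
                             # for group members or other users
stat -c %a private.txt       # prints 600
```

Under Windows 95/98, by contrast, any running program could modify any file on the disk, which is precisely the flexibility that viruses exploit.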
However, although LINUX cannot itself execute a virus, it can pass on a virus meant for a Windows machine when a LINUX machine acts as a mail or file server. To avoid this problem, numerous virus detection programs for LINUX are now becoming available; this is what is meant by virus software for LINUX.
On the other hand, conditions sometimes allow an intelligent hacker to target a machine and eventually gain access. The hacker may also mechanically attack a large number of machines by using custom programs, and may go one step further and cause the compromised machines to begin executing those same programs. At some point, this crosses into what is called a ``worm'': a thwarting of security that exploits the same security hole recursively through a network. See the question on security below.
At some point in the future, a large number of users may be using the same proprietary desktop application that has some security vulnerability in it. If this were to support a virus, it would only be able to damage the user's restricted space, but then it would be the application that is insecure, not LINUX per se.
Remember also that with LINUX, a sufficient understanding of the system makes it possible to easily detect and repair the corruption, without having to do anything drastic, like reinstalling or buying expensive virus detection software.
LINUX is as secure as or more secure than typical UNIX systems.
Various issues make it more and less secure.
Because GNU software is open source, any hacker can easily research the internal workings of critical system services.
On one hand, they may find a flaw in these internals that can be indirectly exploited to compromise the security of a server. In this way, LINUX is less secure because security holes can be discovered by arbitrary individuals.
On the other hand, individuals may find a flaw in these internals that they can report to the authors of that package, who will quickly (sometimes within hours) correct the insecurity and release a new version on the Internet. This makes LINUX more secure because security holes are discovered and reported by a wide network of programmers.
It is therefore questionable whether free software is more secure or not. I personally prefer to have access to the source code so that I know what my software is doing.
Another issue is that LINUX servers are often installed by lazy people who do not take the time to follow the simplest of security guidelines, even though these guidelines are widely available and easy to follow. Such systems are sitting ducks and are often attacked. (See the previous question.)
A further issue is that when a security hole is discovered, some system administrators fail to heed the warnings announced to the LINUX community. By not upgrading the affected service, they leave a window open to opportunistic hackers.
You can make a LINUX system completely airtight by following a few simple guidelines, like being careful about what system services you expose, not allowing passwords to be compromised, and installing utilities that close possible vulnerabilities.
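As a sketch of the first of these guidelines, unneeded network services can simply be commented out of inetd's configuration so they are never exposed at all. The particular mix of services below is a hypothetical example:

```
# /etc/inetd.conf -- sketch: expose only the services you need.
# Commented-out services are never started and so can never be
# attacked.
#ftp     stream  tcp  nowait  root    /usr/sbin/tcpd  in.ftpd
#telnet  stream  tcp  nowait  root    /usr/sbin/tcpd  in.telnetd
#finger  stream  tcp  nowait  nobody  /usr/sbin/tcpd  in.fingerd
pop3     stream  tcp  nowait  root    /usr/sbin/tcpd  in.pop3d
```

Here only the POP3 mail service remains enabled (wrapped in tcpd for access control and logging); everything else presents no target to an attacker.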
Because of the community nature of LINUX users, there is openness and honesty with regard to security issues. It is not found, for instance, that security holes are covered up by maintainers for commercial reasons. In this way, you can trust LINUX far more than commercial institutions that think they have a lot to lose by disclosing flaws in their software.