With Lightworks for Linux in development, Phil Rhodes answers the tricky questions about Linux that everyone assumes you knew the answers to anyway.
I write this article with a degree of trepidation, because if anything is universally accepted about Linux, it's that almost nothing about it is universally accepted, and no matter what I say, someone will argue with me. We'll start by defining what we're talking about which, with any luck, shouldn't be too controversial.
In the very strictest sense, Linux is a kernel, the core part of an operating system that is responsible for fundamental tasks such as the management of memory, low-level control of peripherals such as hard disks and USB ports, and the sharing of processing time between programs that allows modern computers to appear to do several things at once.
It's all about the Kernel
Given this purpose, it's clear that the behaviour and performance of the kernel is critical to the sort of modern computing experience that most people are used to. Even so, the kernel is only part of the story – the parts of an operating system with which we're most familiar, the windows, icons, menus and pointers, are quite separate. Some cellphones, for instance, also use a Linux kernel, and the experience of using any of them is wildly different to that provided by what most people call Linux on a desktop computer – so different that we don't even call it Linux, we call it Android.
In any computer system, the most basic, fundamental levels of usable software include things like the compiler, a text editor, and associated tools, with which other things can be created. In most Linux installations, these things are generally provided by, or at least closely based on the work of, the GNU project. Founded in the early 80s, GNU was intended to create a free equivalent to the then-popular Unix. By 1991, when Linus Torvalds released the first version of the Linux kernel, GNU still lacked a kernel of its own, and the two have since been closely associated. GNU is the reason so many Linux commands begin with a G.
It's a GNU
Beyond the GNU tools are the things that create what we actually expect of a modern computer. These include a myriad of graphics, audio and other subsystems (particularly X.Org, the current incarnation of the X Window System, which draws the user interface) which may be provided by any number of people, projects and organisations.
The complete package of software, from the Linux kernel to the GNU tools, via the graphics and sound systems right up to the desktop calculator, is generally referred to as a distribution (colloquially, “distro”). The familiar names of Debian, Red Hat, Gentoo and many others are Linux distributions; packages of software that allow a desktop computer running the Linux kernel to present the resources of the machine to the user in a way that's useful and usable.
As such, what we refer to, perhaps glibly, as “A Linux PC”, is actually running:
⁃ The Linux kernel
⁃ At least some GNU tools
⁃ Some other tools to provide graphics, sound, etc.
⁃ As part of a distribution
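For the curious, each layer in that list can be glimpsed from a terminal on a typical distribution. This is a sketch rather than a guaranteed recipe – exact output varies by system, and it assumes standard GNU tools are installed:

```shell
# The kernel: prints "Linux" and the running kernel version
uname -sr

# A GNU tool: GNU coreutils' ls identifies itself when asked
ls --version | head -n 1

# The distribution: many modern systems describe themselves here
# (the file may be absent on older installations, hence the fallback)
cat /etc/os-release 2>/dev/null | head -n 2 || echo "os-release not present"
```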
There will, of course, be a written test at the end of this article.
You can learn an awful lot about the Linux community – and perhaps get your first taste of the prevailing approach of these people – by knowing two things. First, GNU stands for GNU's Not Unix, which is what these people like to call a “recursive acronym” because apparently that helps. Second, to all practical purposes GNU, or at least GNU plus Linux, absolutely is a kind of Unix, at least according to the opinion of one of the people who actually invented Unix.
I won't dwell on the subject of Linux's relationship with Unix because it is complicated and divisive and doesn't really illuminate very much that's useful to modern users of either system. Suffice to say that the very early experiments in software that were the recognisable forerunners of modern operating systems occurred in the late 1960s, that one of these experiments was (and is) called Unix, and that Linux and Mac OS X are both more like Unix than Windows is.
Sharp-eyed readers will notice that I've already sneaked in a reference to the Free Software Foundation. Most people are aware of the idea of free software, most frequently encountered in the wild as the concept of open source, the approach of publicly releasing a program in a format that allows it to be expanded and changed, usually with the proviso that any changes are also made publicly available to anyone at no cost. This concept is closely associated with Linux, and the majority of any practical Linux install, including the kernel, will be free, open source software. It's important to differentiate between free as in cost, and free as in liberated. It is entirely legitimate to charge money for open source software, although it is usually difficult to do so because it is intrinsically available free of charge; more usually, people charge to support and work on the software.
What this means in practical terms is that you can get access to the text-based source code, often in the C programming language, which can be compiled to produce a working program. In the closed-source world, the source code to an application such as Photoshop is a closely-guarded proprietary property of Adobe, to prevent the theft of intellectual property. The intention of open source is to encourage community involvement in software development, to prevent people's work being co-opted and made private, and to encourage the continuous improvement of software. In terms of software such as the Linux kernel, this model has been successful and it is now in its twenty-first year of continual refinement. Several critical pieces of international infrastructure, particularly the internet, the world wide web that operates on it, and closely associated database and page-preprocessing technologies, rely heavily, and in some cases entirely, on Linux as an operating system. It is still more commonly used in back-room server farms than on the desktop, but these roles are not limited to just web and business applications – Linux is also commonly used in storage servers for film and TV work, for instance.
There is a cost
Agreeably egalitarian as this sounds, though, there are problems with open source. The concept was developed by people writing computer code in an academic environment where making a living was not contingent on meeting deadlines and satisfying customers. Companies such as Red Hat make a living by offering paid support contracts for the Linux distribution that bears their name, but the overwhelming majority of open source is written by unpaid volunteers whose abilities and dedication to the project are likely to be less reliable than in the commercial world. Unless you are a competent software engineer yourself, the much-vaunted ability to modify source code and alter the behaviour of the software is irrelevant. Because there is no effective managing agency, as Microsoft manages Windows, there can be a lack of standardisation. Unpaid software engineers are often willing to put enormous effort into solving interesting technical problems, but less interesting, less technical tasks, such as documentation and user support, are often patchy at best.
Perhaps the greatest difficulty is that it can be difficult to create and maintain unpaid teams big enough to produce large-scale software, and despite efforts like Cinelerra, there has never been a really competent, feature-complete video editing application on Linux until, tellingly, the commercial world ported Lightworks, which is not open source, to the platform. Probably for this reason, there is in general a paucity of media software, with an honourable mention to the excellent Audacity audio editor as almost the only exception. While many high-end devices, such as colour correctors and visual effects tools, do run Linux very effectively, it is difficult to consider these examples of “media software for Linux”, as they're generally sold and installed as complete turnkey systems with the underlying operating system as something the user is rarely aware of.
Easy to use?
It is a mistake to approach Linux with the idea that there is a wide variety of full-featured, easy-to-use and reliable software available at no charge. Basic office tasks are possible, and distributions such as Ubuntu are increasingly able to automatically recognise and use most of the hardware of any desktop PC in the same way Windows does. Linux is the almost unchallenged champion of networking, and the overwhelming majority of the internet is run on Linux, as are many of the fastest (and all of the top ten) supercomputers in the world, because “a supercomputer” in 2012 is generally a lot of more normal computers networked together. Because Linux is free of charge and good at networking, it is also the darling of people who are interested in creating render farms or banks of machines to do hard work such as video encoding on a massive scale. Linux can be highly reliable in these situations, with continuous running times of many months – something that other operating systems struggle to match.
Easy to try!
Although it still isn't perfect, there has been a huge degree of advancement in the general usability of Linux over the past five to ten years. It's now possible to download a version of Ubuntu that can be installed to, and run from, a USB flash key, so it's entirely possible to try Linux on an existing Windows PC without compromising the existing setup. This has made it easier than ever to try Linux and find out if any of its rather particular characteristics are either a boon or a dealbreaker for your particular situation, which, ultimately, is the only way to be sure. In any case, it's hard to beat the price.
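For those who want to see what creating such a key involves, it typically means writing the downloaded image raw onto the stick with dd. As a sketch only – the ISO filename is an assumption, and on a real machine the target would be the USB device itself (something like /dev/sdX, which dd will erase completely) – this demonstration writes to a scratch file instead, so it can be tried safely:

```shell
# Assumed name of a downloaded Ubuntu image; yours will differ.
ISO=ubuntu-12.04-desktop-i386.iso

# Stand-in for the USB device. In real use this would be /dev/sdX,
# and everything on that device would be overwritten.
TARGET=usb-key.img

# Create a small dummy "ISO" so the example is self-contained.
dd if=/dev/zero of="$ISO" bs=1M count=4 2>/dev/null

# Write the image to the target, block by block, flushing at the end.
dd if="$ISO" of="$TARGET" bs=4M conv=fsync 2>/dev/null
```

Many distributions also offer graphical tools that wrap this same step, so the command line is never compulsory.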