Linux Magazine UK – Issue 13




COMMENT

General Contacts
General Enquiries: 01625 855169
Fax: 01625 855071
www.linux-magazine.co.uk
Subscriptions: subs@linux-magazine.co.uk
Email Enquiries: edit@linux-magazine.co.uk
Letters: letters@linux-magazine.co.uk

Editor

John Southern jsouthern@linux-magazine.co.uk

CD Editor

Richard Smedley rsmedley@linux-magazine.co.uk

Contributors

Alison Davis, Colin Murphy, Richard Smedley, Richard Ibbotson, Jono Bacon, Jason Walsh, Chris Brown, Jim Cheetham

International Editors

Harald Milz hmilz@linux-magazin.de Hans-Georg Esser hgesser@linux-user.de Ulrich Wolf uwolf@linux-magazin.de

International Contributors

Simon Budig, Mirko Dölle, Björn Ganslandt, Georg Greve, Jo Moskalewski, Christian Perle, Stefanie Teufel, Patricia Jung, Oliver Kluge, Ulrich Wolf, Bernhard Röhrig, Thomas Drilling, Anja Wagner, Andreas Jung

Design

Renate Ettenberger vero-design, Tym Leckey

Production

Bernadette Taylor, Stefanie Huber

Operations Manager

Pam Shore

Advertising

01625 855169
Carl Jackson, Sales Manager: cjackson@linux-magazine.co.uk
Verlagsbüro Ohm-Schmidt: Osmund@Ohm-Schmidt.de

Publishing

Publishing Director

Robin Wilkinson rwilkinson@linux-magazine.co.uk
Subscriptions and back issues: 01625 850565
Annual Subscription Rate (12 issues) – UK: £44.91, Europe (inc Eire): £73.88, Rest of the World: £85.52
Back issues (UK): £6.25

Distributors

COMAG, Tavistock Road, West Drayton, Middlesex England UB7 7QE

Print

R. Oldenbourg

Linux Magazine is published monthly by Linux New Media UK Ltd, Europa House, Adlington Park, Macclesfield, Cheshire, England, SK10 4NP. Company registered in England. Copyright and Trademarks © 2001 Linux New Media UK Ltd. No material may be reproduced in any form whatsoever, in whole or in part, without the written permission of the publishers. It is assumed that all correspondence sent (for example letters, emails, faxes, photographs, articles and drawings) is supplied for publication or licence to third parties on a non-exclusive worldwide basis by Linux New Media unless otherwise stated in writing. ISSN 1471-5678. Linux is a trademark of Linus Torvalds.

INTRO

CURRENT ISSUES

ONWARDS AND UPWARDS

Nearly the whole of my month has been dedicated to setting up networks and dealing with people who seem to excel in adding new requirements at the last minute – people who want to print to all printers, want to put all their cheap inkjets onto a network, or want to monitor and control users' desktops. Admittedly, there haven't been many requests for Linux on the desktop, but for everything else there has been great demand. Corporate-wise, Lotus Notes seems to be very popular at the moment, along with PHP and, surprisingly, Fortran. In each case Free Open Source software has coped, and in a lot of cases has been better than some of the outrageously priced commercial options. The ability to change some part of a package so that it suits a client, rather than have the client change to suit the package, is very refreshing.

On the other hand, the big Linux distributors have started to release commercial packages. Here they are using an Open Source base and adding all the services you would expect, to let you get a good night's sleep without worry. These range from firewalls to Web-based catalogues. Yes, we can do it all from free packages, but sometimes it is nice just to throw in a disc and not have to run through a mental checklist. Finally, having more choice gives any potential IT proposal more flexibility. The option of no cost is still the biggest win but, if needed, having business products sold with backup from a third party may also swing the balance.

At the end of the day, formulating proposals simply comes down to meeting customer needs. To that end – when a client requires a little more guidance – introducing the subject of Open Source software certainly won't prejudice a proposal. Now where have I put that Fortran primer? Happy coding!

John Southern, Editor

Linux New Media UK Ltd is a division of Linux New Media AG, Munich, Germany. Disclaimer: Whilst every care has been taken in the content of the magazine, the publishers cannot be held responsible for the accuracy of the information contained within it or any consequences arising from the use of it. The use of the CD provided with the magazine, or of any material provided on it, is at your own risk. The CD is comprehensively checked for any viruses or errors before reproduction. Technical Support: Readers can write in with technical queries, which may be answered in the magazine in a future issue; however, Linux Magazine is unable to directly provide technical help or support services, either written or verbal.

We pride ourselves on the origins of our magazine, which come from the very start of the Linux revolution. We have been involved with the Linux market for six years now through our sister European-based titles Linux Magazine (aimed at professionals) and Linux User (for hobbyists), and through seminars, conferences and events. By purchasing this magazine you are joining an information network that enjoys the benefit of all the knowledge and technical expertise of all the major Linux professionals and enthusiasts. No other UK Linux magazine can offer that pedigree or such close links with the Linux community. We're not simply reporting on the Linux and open source movement – we're part of it.

13 · 2001 LINUX MAGAZINE 3


NEWS

LINUX NEWS

Red Hat database dates released

Red Hat has launched a new database training course, the first in a planned series of courses on Red Hat Database, which is based on PostgreSQL. It is an intensive four-day course that covers topics ranging from installation and SQL fundamentals through to more advanced topics such as transactions and stored procedures. Scheduled for late November in Europe, it will appeal to database developers, Web and application developers and system administrators migrating to Red Hat Database or PostgreSQL, as well as current users. The course will provide the foundation for additional courses, such as database administration, Web development and application development, as well as migration and co-habitation with Oracle and DB2. Information regarding Red Hat's e-business training, course descriptions, registration and facility locations is available at http://www.europe.redhat.com/training/ebusiness/. For information about the Red Hat database training course please visit http://www.europe.redhat.com/training/ebusiness/rdb147_desc.php3

European course dates UK: 20-23 November 2001 Germany: 26-29 November 2001 France: 4-7 December 2001

Borland supports Linux on the Web

Borland has launched Web services support for Linux. As Linux continues to make inroads as the fastest-growing operating system for servers, the benefits of Web services and the Apache Web server will help to accelerate the adoption of Linux as an e-business server. Borland now plans to provide a RAD solution for Web services that will expand the Internet capabilities of Apache Web servers and applications for Linux using the Borland Kylix RAD development platform. Borland RAD solutions for Web services enable companies to create enterprise applications easily and simplify business-to-business integration of applications running on any platform, with support for the emerging industry standards XML, SOAP and WSDL.

Smaller Red Hat

Red Hat has unveiled the Red Hat Embedded Linux Developer Suite, a packaged offering that combines new versions of its embedded Linux platform, development tools, runtime technologies and support, designed for developers looking for a standardised open source platform for faster creation, deployment and testing of target software components for embedded devices. It targets the MIPS, SuperH, x86, PowerPC and ARM/StrongARM/XScale architectures and is based on Red Hat eCos, an open source real-time operating system for deeply embedded applications supporting the µ-Itron and EL/IX APIs.

RedBoot is a standardised embedded debug and bootstrap solution that provides firmware for running and debugging embedded Linux, eCos and GNUPro applications. GNUPro ToolSuite is Red Hat's commercial software development suite built around the open source GNU standard; GNUPro products are tested, certified and produced as an integrated tool suite for developers of both desktop and embedded products. http://www.redhat.com/embedded ■



Ask Mandrake

MandrakeSoft has launched the second version of its Web-based support centre, the MandrakeExpert V2 support centre, accessed at MandrakeExpert.com. The initial site was launched in early 2000 to connect enterprises and users with online Linux experts to answer a variety of technical questions, and the first version was extremely successful in providing community-oriented free support. The new MandrakeExpert V2 allows MandrakeSoft experts, along with 2,000 MandrakeSoft Affiliates, to provide support for a fixed fee, with reductions for multiple incidents purchased. MandrakeExpert.com V2 now offers: · Support provided by the Linux community – the free support that everybody is used to.

· Premium support provided by MandrakeSoft and affiliate Experts – this paid support channel provides a top level, fast support option. Pricing starts at $10 when buying a single incident and falls to $6.70 per incident when buying 10 incidents, for subscribed users. MandrakeExpert currently has 85,000 subscribed users. http://www.mandrakeexpert.com ■
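The volume discount is easy to check. Using only the two per-incident prices the announcement quotes, a quick sketch of the arithmetic (the variable names are ours, not MandrakeSoft's):

```python
# Per-incident prices quoted by MandrakeSoft: $10 for a single
# incident, $6.70 each when ten incidents are bought at once.
single_price = 10.00
pack_price_each = 6.70
pack_size = 10

pack_total = pack_price_each * pack_size      # cost of a 10-incident pack
single_total = single_price * pack_size       # ten incidents bought singly
saving = single_total - pack_total            # absolute saving
saving_pct = 100 * saving / single_total      # relative saving

print(f"10-pack: ${pack_total:.2f}, saving ${saving:.2f} ({saving_pct:.0f}%)")
```

Buying the ten-pack therefore saves a third compared with ten single incidents.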

Bundling BlueCat

LynuxWorks Inc has announced a new series of bundle options to include service and support with enhanced tools for its popular BlueCat Linux operating system platform. At the core of the bundle options is LynuxWorks BlueCat Linux 3.1. BlueCat supports architectures including Intel IA-32 and the XScale micro-architecture, MIPS, the ARM family (including Thumb extensions), StrongARM, PowerPC (including PowerQUICC) and Hitachi SuperH. ● BlueCat Standard Bundle: BlueCat 3.1, commercial license and support package for Windows or Linux host development, priced at $2,699.

AMD is good for Linux – official

AMD has announced that the major Linux distributors Caldera, MandrakeSoft, Red Hat, SuSE and Turbolinux have certified the AMD Athlon MP processor and the AMD-760 MP chipset, AMD's multiprocessing platform for one- and two-way workstations and servers. This certification assures end users and computer manufacturers that AMD's multiprocessing platform has been thoroughly tested by the major Linux distributions, and demonstrates that business customers can take advantage of robust and reliable AMD Athlon MP processor-based workstation and server solutions running on the Linux operating system. AMD's multiprocessing platform supports Double Data Rate (DDR) memory. theresa.zimmer@amd.com

● Professional Trace Bundle: BlueCat 3.1, commercial license, SpyKer kernel trace tool and support package for Windows or Linux host development, priced at $3,799. ● Professional IDE Bundle: BlueCat 3.1, commercial license, LynuxWorks' VisualLynux Integrated Development Environment (IDE) or CodeWarrior IDE Edition and support package for Windows or Linux host development, priced at $4,899. ● Developer's Bundle: BlueCat 3.1, commercial license, SpyKer kernel trace tool, VisualLynux IDE or CodeWarrior IDE Edition and support package for Windows or Linux host development, priced at $5,999.

Exabyte and Ecrix merge

Exabyte Corporation, which makes network storage, and Ecrix Corporation, which makes tape solutions for midrange servers and high-end workstations, have announced an agreement to combine the two companies. Under the terms of the merger agreement, Exabyte will issue 10 million shares of common stock in exchange for all of the outstanding equity of Ecrix. Additionally, certain Ecrix investors and persons related to them will invest $9.4 million in new Exabyte Series H preferred stock at $1.00 per share at the time of closing. The preferred stock will be convertible into common stock on a share-for-share basis. Ecrix also announced that Compaq has selected both the SCSI and IDE versions of VXA-1 for its commercial desktop and workstation product lines.




Technical Command Prompt

Command Prompt Inc has launched DocPro, a compilation of tools designed to allow technical writers to effectively process their DocBook SGML and XML layout. DocBook itself is a powerful markup language. Command Prompt has created a completely self-contained environment, which allows for easy upgrading and maintenance. DocPro Basic includes DSSSL stylesheets for SGML transformation, XSLT stylesheets for XML transformation, an SGML/XML processor and an XSLT processor, and it works with Ghostscript and LaTeX. http://www.commandprompt.com/products_DocPro.lxp ■

Compaq trains with Red Hat

Compaq will be offering its newly announced Accredited System Engineer (ASE) and Accredited Platform Integrator (API) level Linux certification programs in conjunction with Red Hat's Certified Engineer (RHCE) program. These aim to extend the scope of professional training available to Linux systems and software professionals and position them to take greater advantage of the expanding Linux marketplace. The API qualification certifies basic networking, system integration and support skills for system engineers working with Compaq customers. The ASE qualification certifies advanced networking, systems integration and support skills. The RHCE is now listed as one of the credentials required for ASE certification.

National grid

IBM has been selected to build the world's most powerful computing grid, an interconnected series of Linux clusters capable of processing 13.6 trillion calculations per second. The grid system, or Distributed Terascale Facility (DTF), will enable thousands of scientists across the USA to share computing resources. IBM Global Services will deploy clusters of IBM eServer Linux systems at the four DTF sites beginning in the third quarter of 2002. The Linux clusters will be connected to each other via a superfast 40-Gigabit per second Quest network, creating a single computing system able to process 13.6 teraflops, making the DTF more than a thousand times faster than IBM's Deep Blue supercomputer. Using open protocols, the Linux clusters will smoothly connect to a heterogeneous collection of existing high performance computers at the four labs, creating a giant virtual computer that may be accessed from any point on the grid.

New sphere for NuSphere

NuSphere Corporation – creator of the first Internet application platform based on open source components – has appointed the Logicsoft Group as its first reseller. Logicsoft supplies boxed application development and systems software tools and applications to developers and small to medium sized enterprises, as well as larger organisations. The deal covers all the NuSphere products, including MySQL and Advantage, which includes a comprehensive set of editing tools to speed development of database-driven Web applications using PHP. Demand for the products led to Logicsoft requesting the reselling agreement from NuSphere, which was keen to take advantage of the offer of local support and services for the growing number of UK users.

Virgin data

Quadratec has announced that Virgin.Net has adopted its Time Navigator software for enterprise data protection, to protect data assets at the company's Leicester Square headquarters. Virgin.Net says it is Solaris-focused but also uses MS and Linux platforms where they make sense. Time Navigator's graphical user interface, visual file versioning and redundant delta cataloguing are key advantages of the Quadratec product. Time Navigator 3.5 is now being used by Virgin.Net to maintain an operational availability level of 99.99% with a maximum allowable downtime window of four hours, excluding system boot time. Backup data is sent to large third-party tape arrays, with several hundred server tapes fed into them and cycled on a regular basis with a remote, off-site facility. Total storage capacity in use is 500GB, with 160GB of modified data being backed up daily.


BRAVE GNU WORLD

COMMUNITY

Pitbull dogs Linux

Argus Systems Group Inc has announced that its award-winning Pitbull LX security software can now lock down all popular distributions of Linux and the 2.4 kernel, securing Web-facing servers against malicious attack and Code Red-type threats. With turnkey installation in thirty minutes, Pitbull provides rapid security for Red Hat (6.2, 7.0, 7.1), SuSE (7.1, 7.2), Mandrake (7.2, 8.0), Debian GNU/Linux (2.2) and TurboLinux (6.5). The software also supports custom installations of the 2.4 Linux kernel. Linux is used on 29% of public Web servers, more than any other non-Windows OS, and has had its share of attacks in the past. Pitbull LX confines bugs and security flaws that could grant an attacker system-wide access within a single security compartment. It can also restrict users to authorised areas, eliminating superuser vulnerabilities.

Mandrake 8

MandrakeSoft has announced the availability of the final release of Mandrake Linux PPC 8.0, building on the Mandrake Linux 8.0 release for the x86 architecture. It is now available for PowerPC G3 and G4. Mandrake 8.0 features DrakX for an entirely graphical installation, easy and safe partitioning with DiskDrake, the Mandrake Control Centre for one-place configuration, automatic management of software packages, and supermount for transparent access to CDs and peripherals. The tools have been updated, the distribution is based on version 2.4 of the kernel and it now boots from CD-ROM on most machines.

Enterprise blasts off

SuSE Linux has presented its Enterprise Server version, based on the latest 2.4 Linux kernel and optimised for deployment on high performance servers. Enterprise Server 7 comprises all the server services relevant to Linux, enabling the implementation of email, Internet and application services as well as file and print services in heterogeneous networks. SuSE Linux Enterprise Server 7 has powerful support for various hardware architectures and a high degree of fail-safety, making it an ideal medium for consolidating complex server structures. It provides professional Linux users with a full service offering, with release cycles of one year and maintenance periods tuned to meet customer requirements. Since the end of August, SuSE Linux Enterprise Server 7 has been available for AMD and Intel 32-bit and 64-bit architectures and for IBM's mainframe platform S/390; versions for IBM's iSeries, pSeries and zSeries are to follow in the late autumn.

Real 3D from Realsoft

Realsoft Graphics has released a version of its popular animation software Realsoft 3D for the Linux operating system. Realsoft 3D is a full-featured, high-end 3D modelling, rendering and animation package. The product offers an impressive collection of advanced features with a price tag that will suit a large proportion of graphics-oriented computer users. Realsoft 3D for Linux is currently shipping as a commercial beta version. The list price is $300 (education price $200), which includes a free upgrade to the final product release.



Open Vistas

MontaVista Software is now offering embedded developers using its flagship product, HardHat Linux 2.0 Professional Edition, a choice of graphical development environments – the first full graphics add-ons for HardHat. HardHat Graphics and the Qt/Embedded graphics toolkit from TrollTech Inc are now available from MontaVista. HardHat Graphics is an open source environment for building graphics-based applications for embedded devices and provides an extensible multimedia platform for developing the next generation of digital multimedia devices. Qt/Embedded for HardHat Linux is a package designed to work with HardHat Linux and includes Qt Designer, a full-function GUI builder, and Qt Linguist, which provides international language translation. Both packages will help developers create and deliver powerful graphics-based applications for HardHat Linux; both are extensible and modular, provide access to source code and are scalable for embedded devices. MontaVista has also announced the introduction of the High Availability Framework for HardHat Linux 2.0, aimed at data communications applications with heavy I/O loads. It provides a standards-based open architecture software platform for building fault-resilient systems based on industry standard CompactPCI and customer-specific hardware. Marketing manager Glenn Seiler said, "With this release, MontaVista continues the trend of moving embedded development away from proprietary, closed software and hardware designs toward open, standards-based systems."

SuSE suits you

German textile manufacturer Lauffenmühle has installed the first SuSE Linux application on an IBM eServer iSeries to be used for commercial purposes. Based in Lauchringen, Germany, the company uses SuSE version 7.1 running in a partition on an iSeries 820 machine for its stock planning and control system. An initial Linux partition was run in a test environment on an AS/400 720; once active and stable, it was migrated to the iSeries 820. The LPAR technology available on the iSeries now runs the Linux stock planning and control application in parallel with the test partition, OS/400 and traditional Web applications such as a firewall. The company will also consolidate its servers using Linux and the iSeries. The project was implemented in six weeks by IBM Business Partner Fritz and Macziol GmbH, which now aims to create additional applications for Lauffenmühle using the GNU tools provided by SuSE.

IBM sends mail with Sendmail

Sendmail has announced the availability of its complete line of products for Linux on the IBM S/390 mainframe. Sendmail Switch, Sendmail Advanced Message Server and Sendmail Mobile Message Server on Linux for the mainframe provide a reliable and manageable email system for business. Joann Duguid, director of marketing for Linux on the IBM eServer zSeries, said: "Consolidating the management of critical applications, such as email, is a priority for our customers, particularly in today's economic environment. Companies like Sendmail, Inc. and their products and expertise in messaging add to the benefits of Linux for the mainframe, with a low cost of ownership and the ability to grow with a company's needs without requiring additional investment."

Red Hat gets mobile

Red Hat and 3G Labs have announced that they are jointly developing a real-time operating system for 2.5G and 3G mobile devices. Ecos/M3 will aim to give developers a cheaper and more reliable base on which to develop new mobile devices. The hardware and software will be supported at 3G Labs in Cambridge.



SuSE Database

SuSE Linux has announced the UK availability of the SuSE Linux Database Server, combining the SuSE Linux Enterprise Server operating system platform with IBM's DB2 database. DB2 supports the widest variety of hardware architectures, allowing the implementation of very large databases on various hardware architectures as well as data exchange between the platforms. It offers a scalable architecture for customers that want to expand their infrastructure as their businesses grow; it also supports smaller PC-based databases and is very flexible. The workgroup edition of the SuSE Linux Database Server comes with a multiple workstation version of the DB2 database for applications and data that are jointly used by a workgroup or department over a PC-based LAN. The Internet edition delivers stored information to an unlimited number of users over the Internet.

Panda protection

Panda Software is offering Panda Antivirus for Linux as freeware. The application provides effective protection against viruses specifically designed to attack Linux, such as Ramen and Lion, and integrates with the most common Linux distributions – Red Hat, Mandrake and Debian – as well as OpenBSD. The system scans both DOS and Unix/Linux disks. It also offers email protection, as it scans Base64 MIME-encoded files. Panda Antivirus for Linux can be downloaded at http://www.pandasoftware.com/com/linux/linux.asp


HP tees off with Chai

HP introduces Linux security software and its first pervasive Linux product for intelligent devices. Hewlett-Packard has strengthened its commitment to open source with several new offerings for Linux-based systems: HP Secure OS Software for Linux and the HP Chai-LX embedded Linux software platform, expanding the company's comprehensive Linux offerings into two key strategic markets where customers are demanding reliable Linux solutions – security and intelligent devices. HP also announced its intention to open source its ChaiServer product, unveiled a broad offering of manageability and high availability products for Linux environments, announced its Embedded Software Developers Network, and unveiled two new Linux-based workstations in Europe. Chai-LX aims to make Linux pervasive by extending it to intelligent, connected devices; HP developed Chai-LX to power new HP consumer appliances. DevNet provides a Web-based collaborative development environment for Chai. http://devnet.hp.com ■

More support from Tivoli

Tivoli Systems Inc has announced an extension of its Linux support to its security and Web management software. Tivoli now offers Linux support on a number of key products within its e-business infrastructure management solutions, including security, storage, performance and availability software. Tivoli plans to extend Linux support to Policy Director 3.8 secure access management software from October and to Risk Manager 3.8 from September; Tivoli User Administration already supports Linux. Both Web Services Manager and Web Services Analyser, which enable businesses to manage their infrastructures across the Internet without compromising security, will support Linux by the end of the year. Tivoli Storage Manager and Storage Network Manager 1.2 will support Linux from September.

Lara under Linux

Remember the Net Yaroze development kit for the original Sony PlayStation? Well, Sony is planning to launch a Linux development kit for the PlayStation 2 if enough interest is shown. The development kit is already available in Japan. To show your interest, sign up at http://www.technology.scee.net



COVER FEATURE

FOCUS ON PRINTING

13 Laser Printers on Test

CAPTURED ON PAPER

OLIVER KLUGE, MIRKO DÖLLE

The Competence Centre took over three rooms at Linux Magazine for two weeks recently. 13 laser printers were put through their paces in a completely new, rigorous procedure.

Linux Magazine is breaking new ground with this test of laser printers: our team developed more than 200 tests to examine and assess printers on the Linux platform in as precise and detailed a way as Windows users have long been accustomed to in their reviews. These tests fall into three distinct parts, probing for speed and gauging both graphical and photographic resolution. To accurately test the speed of the laser printers we used StarOffice and a simple letter, which also allowed us to do the costing and timing calculations. To let us fully compare graphical capabilities, a complete range of graphic tests had to be produced. These were created by Felix Marcon and Oliver Kluge using CorelDRAW, the only package that could accurately cope with the fine detail required for such a task. Their efforts incorporate some novel features, putting the subjects through a most testing time. As you can see from the sample printed opposite, the first part of the test is a colour saturation test, which mono printers should correctly render in shades of grey. The second of these graphical tests is the most telling, with coloured lines on white or black backgrounds. As the lines become narrower they fall below the printer's resolution and interference patterns form; the lines begin to merge into a solid block, the size of which is proportional to the resolution of the printer. Test three shows the shading capabilities of rastered images. Test five is in five per cent graduations of the main printing colours – Cyan, Magenta, Yellow and blacK (CMYK). The remaining tests try out the rastering capabilities of the printer on fine angled lines. The test with yellow text on a black background, and vice versa, proved very difficult for most printers once the point size dropped below 4. A red-brown relief picture permits a good optical check of the overall print quality. The outer limits of the page allowed us to calculate printable coverage, colour graduations showed how tones run into one another, and the straight lines in yellow, white and black very quickly determined the precision of printing.

The photographic test was carefully controlled, with the set-up taking longer than the actual shoot and calling on the skills of a professional photographer to arrange the still-life elements. Using a large format camera means we now have a high resolution image with which we can also test scanners later on. The Agfa IT8 standard colour chart allows us to accurately determine ink saturation levels for both scanners and colour printers. The typewriter has an open side, allowing us to check the colour depth of rastering, as does the toy Tux, with his image being reflected in both black and white. Bianca, the model, is a challenge for printers and scanners: her fine flyaway hair fails to print if resolution is poor, while the dark shirt also gives a good optical impression. The resulting image was printed to the test subjects via Gimp.
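A graduation chart like the one described for test five is simple to mock up. This sketch is not the magazine's actual test file (which was built in CorelDRAW); it just emits a minimal PostScript page of grey bars in five per cent steps, which a mono printer should render as distinct shades:

```python
# Generate a one-page PostScript chart of grey bars in 5% steps,
# similar in spirit to the saturation/graduation tests described above.

def graduation_chart(steps=21, bar_width=24, bar_height=120, x0=40, y0=600):
    """Return PostScript source drawing `steps` bars from 0% to 100% coverage."""
    lines = ["%!PS-Adobe-3.0", "/Helvetica findfont 8 scalefont setfont"]
    for i in range(steps):
        level = i / (steps - 1)            # 0.00, 0.05, ... 1.00 coverage
        x = x0 + i * bar_width
        # PostScript setgray: 0 is black, 1 is white, so invert coverage.
        lines.append(f"{1 - level:.2f} setgray")
        lines.append(f"{x} {y0} {bar_width} {bar_height} rectfill")
        lines.append(f"0 setgray {x} {y0 - 12} moveto ({level * 100:.0f}%) show")
    lines.append("showpage")
    return "\n".join(lines)

ps = graduation_chart()
print(ps.splitlines()[0])
```

The output can be sent straight to a Postscript-capable printer with `lpr`; how cleanly the adjacent steps remain distinguishable is exactly what the saturation test measures.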



Recommended: Kyocera FS-3800

One printer proved its strengths so convincingly that the editors found their choice easy: the FS-3800 from Kyocera Mita comes in first. Its permanent photoconductive drum, made from ceramic instead of plastic, gives the device unbeatably low running costs – practically all this printer needs is paper and toner, nothing else. With a smooth print unit, a network interface and smart paper handling (which accepts a full 500-sheet stack in the tray and, with the aid of a sensor pin, also prevents paper overflow), this printer is the best choice for the Linux office. There are good expansion possibilities through add-on extra trays and other accessories. 2001/13

Printing under Linux is one of the areas that has been running for a long time without any great problems. Obviously there are differences – and the purpose of the test was to work these out. The range of printers we tried consisted mainly of black and white laser printers with a print output of at least twenty pages per minute and a built-in network interface. This class of printer, which can cost anywhere up to £1,000, is the typical choice when furnishing small to medium sized offices, especially when the printer is likely to be used by several people on a network. A colour laser printer was also examined, for comparative purposes and to give us all a chance to dream. Office work calls for mastery of as many emulations as possible (PCL and Postscript) and smooth data handling. It goes without saying that printing should also be fast and produce clean copy, but speed and quality must be weighed against the costs. The purchase price is only part of the Total Cost of Ownership (TCO), since over the printer's lifetime the cost of consumables such as toner and drums will amount to more than the original investment. This is why we went to the trouble of printing on each device until a toner cartridge was completely empty, using a high capacity cartridge where available. The laser printers were also measured for electrical power consumption, which is not insignificant: a laser printer can draw up to 800 watts in pulses if the fixing roller has to be reheated.
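The TCO point is easy to quantify. The figures below are purely illustrative – none of them are measurements from the test – but they show the shape of the cost-per-page arithmetic behind such a comparison:

```python
# Rough total-cost-of-ownership model for an office laser printer.
# All numbers are hypothetical, chosen only to illustrate the calculation.

purchase_price = 900.0    # GBP, the printer itself
toner_price = 80.0        # GBP per high-capacity cartridge
toner_yield = 10_000      # pages per cartridge
drum_price = 150.0        # GBP per replacement drum
drum_life = 100_000       # pages per drum
lifetime_pages = 300_000  # pages printed over the printer's life

consumables = (lifetime_pages / toner_yield) * toner_price \
            + (lifetime_pages / drum_life) * drum_price
tco = purchase_price + consumables
cost_per_page = tco / lifetime_pages

print(f"consumables: £{consumables:.0f}, TCO: £{tco:.0f}, "
      f"{100 * cost_per_page:.2f}p per page")
```

Even with these made-up numbers, the consumables (£2,850) dwarf the purchase price (£900) over the printer's life – which is why a permanent drum, as on the winning Kyocera, matters so much.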

Planning

In planning the installation and location of an office laser printer, you must remember that these devices give off smells. Although many manufacturers advertise them as such, there is no question of them being ozone-free, and toner dust and the softeners in plastics that heat up are not exactly pleasant either. The software side of controlling the printers caused no aggravation in the test. All printers could be configured on our system – SuSE Linux 7.2 – with YaST, without any problems. The printers all obtained an IP address from the DHCP server and worked smoothly with it. PCL or Postscript, it made no difference: apart from a rare exception, all the devices worked straight off. Both Gimp and StarOffice printed willingly and with good quality on the devices offered by the system. Only with Wine (used for CorelDRAW) was there some initial bother. Wine has problems with over-long printer and alias names in /etc/printcap: names which are too long cause Wine to crash immediately after starting. If the entry is edited by deleting alias names that are defined several times, everything works. Thus configured, there is no longer any obstacle to printing under Linux. For use in larger companies, though, the configuration software that is sometimes also available for Linux, or installation of the devices via their built-in Web interfaces, is recommended. The use of additional print software such as CUPS or Turboprint can also be a good idea – both are presented in this issue. ■ 13 · 2001 LINUX MAGAZINE 13
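The kind of over-long /etc/printcap name that trips Wine up can be spotted mechanically. A minimal sketch – the sample entry and the 32-character threshold are our own illustration, not something prescribed by Wine:

```shell
# Hypothetical helper: list printcap entry names/aliases longer than a
# given length, the kind observed to crash Wine. Sample entry and the
# 32-character threshold are assumptions for illustration.
check_printcap() {
  # printcap name lines look like: name|alias1|alias2:capabilities...
  # skip continuation lines (leading whitespace) and comments, then
  # report any name field longer than $1 characters
  awk -F: -v max="$1" '/^[^ \t#]/ {
    n = split($1, names, "[|]")
    for (i = 1; i <= n; i++)
      if (length(names[i]) > max) print names[i]
  }'
}

cat <<'EOF' | check_printcap 32
lp|kyocera|a-very-long-alias-name-that-upsets-wine-applications:\
        :sd=/var/spool/lpd/lp:
EOF
```

Deleting or shortening the reported aliases is exactly the fix described above.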


[Printer test page as reproduced in the magazine: a chart recording manufacturer, model, emulation, resolution, date/time and print length; sample lines reading "Penguins can see text at N point" in sizes from 10 point down to 3 point; 12-point colour bars in cyan, magenta, yellow, red, green and blue; and the imprint "Linux Magazine October 2001".]


FOCUS ON PRINTING

COVER FEATURE

Brother HL-2460N

The 24-pager from Brother offers only 600 dpi print resolution, but this brings no real disadvantage: equipped with 48MB of RAM, it easily keeps up in quality with its 1200 dpi competitors, even on the Bianca test image. And the device is speedy – only two devices in the field are faster. The display is a strong point: it uses not only text but three types of backlighting, making it clear from a long way off what is going on. Unfortunately the printer reports an overfilled paper output tray as a paper jam, which misleads the user. Configuration is practical via the built-in Web interface, although you have to look hard for the password in the CD version of the network manual. In summary, the Brother is a good, reliable printer for office use.

Marking system

Marks in this hardware test are awarded out of a possible 5. Note that the scale is inverted: 1 out of 5 is the highest possible award and 5 out of 5 the lowest.

Brother HL-2460N
Price: £750
Output: 24 P/min, 600x600 dpi
Print quality: 3 out of 5
Speed: 3 out of 5
Cost of printing: 2.5 out of 5 (0.97 pence / page)
Overall mark: 3 out of 5

Epson EPL-N2050+

It is obvious that the Epson is related to the Xerox printer, but the innards are not as similar as the common print engine would lead you to think. The EPL-N2050+ does not offer Postscript, so the device had to be driven with PCL in the test. A bug prevents the use of resolutions higher than 300 dpi: at 600 dpi, groups of lines are displaced a couple of dozen dots to the right, making that resolution unusable. This limitation is also responsible for the poor showing in the quality test. If Epson corrects the bug the device might do better, but at the moment it is in last place for quality. On the other hand, Epson's control panel is a winner: with six LEDs and eight buttons the Epson is very pleasant to use.

Epson EPL-N2050+
Price: £580
Output: 20 P/min, 1200x1200 dpi
Print quality: 4.5 out of 5
Speed: 3.5 out of 5
Cost of printing: 4 out of 5 (1.04 pence / page)
Overall mark: 3.5 out of 5

Genicom Micro Laser 210

The printer from Genicom also looks enough like the Xerox to be mistaken for it, but unlike the Epson it offers Postscript and 1200 dpi resolution. The high resolution means that four times as much data has to be transferred as at 600 dpi. The extra sharpness gives the Genicom a clear typeface; the quality of photo reproduction, however, does not greatly improve, and the Microlaser sits in the middle of the field here. It was striking that the device did not automatically switch into energy-saving mode – the switch-off time had to be re-entered via the printer menu – and even in energy-saving mode the fans continued to run. At present the manufacturer names only two sources of supply, plus its Internet presence.

Genicom Micro Laser 210
Price: £1028
Output: 21 P/min, 1200x1200 dpi
Print quality: 2.5 out of 5
Speed: 5 out of 5
Cost of printing: 3 out of 5 (1.23 pence / page)
Overall mark: 3.5 out of 5




HP Laserjet 4100N

Just by looking at the HP printer, you can see the long printing tradition of its American manufacturer. In many details it persuades with solid, well-thought-out quality, starting with the expansion slot, which can be used without tools, and extending to the very well-made Web front-end. In all test categories the printer delivered completely satisfactory results, and thus it takes second place. The costs of consumables show particular commitment: despite the disposable drum, the HP is one of the most economical printers. The seals on the toner cartridges could be improved, though – if the cartridge is shaken just before it comes to the end, some toner trickles out. The HP remains true to itself: printers from HP have the image of an office workhorse, and this device lives up to it.

Kyocera-Mita FS-3800N

The crucial construction feature that sets this printer apart is its almost everlasting photo drum. With it, an impressive page output as well as extremely low running costs are possible; in practical terms, this printer needs only toner and paper. At 0.33 pence of toner per printed page, even normal copier paper (0.3 pence) becomes a dear item in the printing business. A sensor ensures that the FS-3800 does not overfill the output tray in its excessive zeal: printing is paused in good time, before the papers fly off all over the place. The display provides concise but comprehensible information about current status, and the Web interface looks tidy and businesslike. The Kyocera is the ideal printer for small to medium sized offices.

HP Laserjet 4100N
Price: £980
Output: 24 P/min, 1200x1200 dpi
Print quality: 3.5 out of 5
Speed: 3 out of 5
Cost of printing: 3 out of 5 (0.98 pence / page)
Overall mark: 3 out of 5


Kyocera-Mita FS-3800N
Price: £900
Output: 24 P/min, 1200x1200 dpi
Print quality: 3 out of 5
Speed: 2.5 out of 5
Cost of printing: 1 out of 5 (0.63 pence / page)
Overall mark: 2.5 out of 5

Lexmark Optra M412n

The smaller of the two Lexmark printers promises low-priced output for the office. The control panel and the Web interface are easy to operate, and we liked the Mark Vision control software. The page output, at 17 pages per minute, is fast, but the paper tray holds only 220 sheets and has no fill-level indicator. The covers rattle or stick out, and the clamp for keeping the sheets in order keeps slipping out of its loops. In energy-saving mode the device demands 23 watts, and its high pulse load makes light bulbs on the same power circuit flicker. After 90% of the rated toner output, stripy printouts appear; in all, both test devices achieved only three quarters of the output claimed by the manufacturer. The result is a cost per page of 1.87 pence, the highest in the test field.


Lexmark Optra M412n
Price: £770
Output: 17 P/min, 600x600 dpi
Print quality: 3.5 out of 5
Speed: 3 out of 5
Cost of printing: 5 out of 5 (1.87 pence / page)
Overall mark: 4 out of 5



Lexmark Optra T620n

The larger of the two Lexmarks is distinguished by its poor quality finish and, as with the Optra M412n, the output tray clamp is shoddy. Instead of the claimed 500 sheets, the paper cassette can only hold 490 – the remaining sheets get jammed between cassette and printer. We are accustomed to better in the £1000+ category. The Web interface, though, can be described as a success: the super-fast device, at 28 pages per minute, can be smoothly configured and remote controlled with it. What remained incomprehensible was why the device is not recognised by the company's own Mark Vision software. Marks were deducted for the documentation too, since for a printer in this price class we expect more than just the installation instructions in printed form.

Lexmark Optra T620n
Price: £1085
Output: 28 P/min, 1200x1200 dpi
Print quality: 3 out of 5
Speed: 2.5 out of 5
Cost of printing: 5 out of 5 (1.63 pence / page)
Overall mark: 3.5 out of 5

Minolta-QMS Page Pro 4100 GN

The Minolta-QMS shows that a good device does not always come with an impressive appearance. The Minolta is plain – it doesn't even have a display, so printer configuration is possible only via the Web front-end. Also impractical are the open, small-dimensioned paper supply and the output flap, which has to be open before printing starts; otherwise a jam forms and the flap cannot be opened. It is also the slowest in the test: 18 pages per minute are claimed by the manufacturer, but the lab result was 16.5. The big surprise came in the quality test, where it showed the rest of the field what a good photo print is – sharp, rich in detail and well rastered. Only the Xerox can match this quality, though the Minolta's page rate does not quite keep up.

Minolta-QMS Page Pro 4100 GN
Price: £750
Output: 18 P/min, 600x600 dpi
Print quality: 3 out of 5
Speed: 3.5 out of 5
Cost of printing: 5 out of 5 (1.83 pence / page)
Overall mark: 4 out of 5

Oki Okipage 20 plus/n

At first glance the Oki Okipage 20 can scarcely be told apart from its big brother, the Okipage 24 – even the consumables for both printers are identical. Changing toner on an Oki is not a very clean affair: to swap a cartridge it is rotated through 90 degrees in its holder, which dirties the drum unit, and when it is taken out, toner often falls out. Every 25 pages the printer inserts a short pause, giving a comparatively slow print speed. Also, after the end of a print job the message "Data loaded" is displayed for about 30 seconds although no more data follows; however, the next job can be sent without any problem and is then started after a few seconds. One good point: Oki has come up with the idea of separating toner and drum, and the latter only has to be replaced after 30,000 sheets.

Oki Okipage 20 plus/n
Price: £800
Output: 20 P/min, 600x600 dpi
Print quality: 3 out of 5
Speed: 4 out of 5
Cost of printing: 3 out of 5 (1.42 pence / page)
Overall mark: 3.5 out of 5




Oki Okipage 24 dx/n

The larger of the two Oki printers differs from the smaller one in its mechanism: unlike the Okipage 20 it has a built-in duplex unit, which makes it heavier. The print unit is identical to that of its little brother, yet the Bianca test image clearly showed differences in quality between what are actually identical print units. The Okipage 24 also prints the image darker, as the colour bars of the graphics test show, and grey areas are crossed by more stripes. Oki earns praise for the extremely comprehensive documentation which comes with both printer models: apart from the manual, the CD includes films on removing paper jams and changing toner (see "Laboratory Report").

Oki Okipage 24 dx/n
Price: £999
Output: 24 P/min, 600x600 dpi
Print quality: 3 out of 5
Speed: 3.5 out of 5
Cost of printing: 3.5 out of 5 (1.45 pence / page)
Overall mark: 3.5 out of 5

Samsung ML-7300N

Samsung demonstrates some good solutions. We liked the backlit display, which can be raised to make it easier to read, and the mains switch is in the only correct position – at the front. Unfortunately one equally nice feature is not – at least not yet – usable under Linux: the duplex unit. Samsung does pack a PPD file among the accessories, with the aid of which any generic Postscript driver should be capable of controlling duplex printing, but even under the CUPS printing system, which is not exactly short of features, it was not possible to print double-sided. Nor is the device as fast at printing as it is at waking up: it leaves standby speedily, but its page output is at the rear of the test field.

Samsung ML-7300N
Price: £800
Output: 20 P/min, 1200x1200 dpi
Print quality: 4 out of 5
Speed: 4.5 out of 5
Cost of printing: 5 out of 5 (1.61 pence / page)
Overall mark: 4 out of 5

Xerox Docuprint N2125

The low operating noise is one of the best features of this device. Apart from the high print rate, the print quality impresses most: only the Minolta-QMS can beat the Xerox, and none of the devices of the same construction (Epson and Genicom) comes close to the quality of its fine raster. Images are printed evenly and with good contrast. Caution is advised when inserting paper – if you put in a full 500-sheet stack, the bottom 50 sheets become crimped. Tip: put in about 100 sheets first and then the rest. The print is almost borderless, with only fractions of a millimetre left white; when this print mode is used something occasionally goes wrong with the mechanism on one sheet or another, but otherwise it causes no problems.

Xerox Docuprint N2125
Price: £990
Output: 21 P/min, 1200x1200 dpi
Print quality: 2.5 out of 5
Speed: 3 out of 5
Cost of printing: 4 out of 5 (1.32 pence / page)
Overall mark: 3.5 out of 5



Minolta-QMS Magi Color 2200 N

The Minolta-QMS Magi Color 2200 took part as an unofficial competitor, the only colour laser printer in the field. Fully equipped with network card, Postscript and PCL module, its price of £1395 is all the more astonishing – it is more what you would expect of a GDI printer. The print unit holds four large colour toner cartridges in a carousel; since only one colour can print at a time, colour printing takes longer than monochrome. The print image is good, and the colours of our test image were well reproduced, if with a slight orange tinge. The printer menu impressed: almost everything can be adjusted, right up to colour correction. On the downside, the three-minute warm-up phase is too long, and recalibration every 24 pages is a nuisance when printing longer text documents.

Laboratory report

The very different levels of equipment and finish of the printers were especially noticeable. Lexmark falls out of the frame with the Optra M412n and Optra T620n: both printers made a poor impression. In particular, the manual paper feed is extremely flimsy – even loaded with 100 sheets it sags visibly, and anyone who collides with it in passing is liable to break off a good-sized chunk. In terms of documentation, all printer manufacturers could take a leaf out of Oki's book. The manual is more than comprehensive: apart from the usual content such as installation, maintenance and expansion, the annex even gives the pin configuration of all the ports, with signal descriptions, timing diagrams and sometimes even component designations. The documentation CD rounds the whole thing off with films on the most important maintenance tasks. It could hardly be better.

The toner cartridges for the printers are recognised by means of coding plates: Xerox (top), Epson (middle) and Genicom (bottom). The plates can, however, easily be swapped.

Minolta-QMS Magi Color 2200 N
Price: £1395
Output: 20 P/min, 1200x1200 dpi
Print quality: 2 out of 5
Speed: 3.5 out of 5
Cost of printing: 3.5 out of 5 (2.02 pence / page)
Overall mark: 3 out of 5

To finish off, one curiosity: although Xerox, Epson and Genicom use the same printer engine, we were unable simply to exchange the toner cartridges. Obviously no manufacturer wants to give up the after-sales business, hence the code plates illustrated below were placed on the otherwise completely identical cartridges. The plates can, however, easily be swapped using a knife – so you can always obtain cartridges from the most economical source.

Conclusion

Printing is no problem under Linux. Apart from minor details such as the Wine stumbling block, laser printers are completely non-critical. But even under the aegis of the penguin not all printers are equal – the differences are enormous. The Competence Centre Hardware even found three printers of identical construction which, through different firmware, delivered such differing printouts that in the quality test one came out in front, one in the middle of the field and one as tail-end Charlie. In printers of identical construction the manufacturers like to try to prevent the use of competitors' toner by means of lugs; with a little DIY it is easy to save a pound or two. One aspect is still worth pointing out: most printer manufacturers promote their devices as low-ozone or even ozone-free. The test crew determined the complete opposite. The penetrating sharp smell of O3 wafted through the corridors, and sensitive people could feel it on their skin. Tester and volunteer fireman Mirko Dölle even felt obliged to order breathing protection from the industrial safety equipment supplier Dräger. Hence Linux Magazine must issue a warning: install office printers only in well-ventilated places. ■



Equipment and results

1. Kyocera-Mita FS-3800N – www.kyocera.co.uk – price approx. £900
Equipment: 1200 x 1200 dpi; 24 pages/min (manuf); memory (stand/max) 32/272 MB; emulations PS 2, PCL 6, KPDL 2, XL24E, LQ-850, etc.; ports Parallel, Serial, Ethernet; paper feeds (stand/opt) 1/2, total capacity 500/2500; duplex/sorter optional/optional; Web interface yes; 358 x 301 x 388 mm (W x H x D); 13.95 kg
Consumables: toner £70, lifetime 21,000 pages; drum replaceable separately: yes, lifetime 300,000 pages [1], price – [1]
Measured: 30,034 pages at 5% coverage; cost of printing (incl. paper) 0.63 pence/page; quality (DTP/graphics/photo/area; low value = good) 4.0 / 2.0 / 3.0 / 3.5; speed (DTP/graphics/photo) 38.4 / 418.5 / 199.5 secs

2. Hewlett-Packard Laserjet 4100N – www.hp.com/uk – price approx. £980
Equipment: 1200 x 1200 dpi; 24 pages/min; memory 32/256 MB; PS 3, PCL 6, PCL 5e; Parallel, Ethernet; feeds 1/2, capacity 500/1000; duplex/sorter optional/no; Web interface yes; 395 x 344 x 515 mm; 19.10 kg
Consumables: toner £85, lifetime 10,000 pages; drum price –
Measured: 14,707 pages at 5%; 0.98 pence/page; quality 5.0 / 3.0 / 3.5 / 5.0; speed 54.5 / 411.8 / 216.2 secs

3. Brother HL-2460N – www.brother.co.uk – price approx. £750
Equipment: 600 x 600 dpi; 24 pages/min; memory 16 [2] /272 MB; PS 3, PCL 6, FX-850, IBM XL, HPGL; Parallel, Serial, Ethernet, USB; feeds 1/3, capacity 500/3500; duplex/sorter optional/optional; Web interface yes; 469 x 423 x 469 mm; 22.45 kg
Consumables: toner £95, lifetime 11,000 pages; drum replaceable separately: no, price –
Measured: 16,594 pages at 5%; 0.97 pence/page; quality 4.0 / 2.5 [2] / 3.5 [2] / 1.5; speed 55.0 / 423.1 [2] / 222.8 [2] secs

4. Xerox Docuprint N2125 – www.xerox.co.uk – price approx. £990
Equipment: 1200 x 1200 dpi; 21 pages/min; memory 32/192 MB; PS 3, PCL 6, PCL 5e; Parallel, Ethernet, USB; feeds 1/2, capacity 550/1100; duplex/sorter optional/optional; Web interface yes; 428 x 406 x 453 mm; 26.60 kg
Consumables: toner £130, lifetime 15,000 pages; drum replaceable separately: no, price –
Measured: 14,060 pages at 5%; 1.32 pence/page; quality 2.5 / 3.0 / 2.0 / 1.5; speed 46.2 / 422.8 / 185.4 secs

5. Oki Okipage 24 dx/n – www.oki.co.uk – price approx. £999
Equipment: 600 x 600 dpi; 24 pages/min; memory 16/80 MB; PS 2, PCL 6, Epson FX, IBM Proprinter; Parallel, Serial, Ethernet; feeds 1/2, capacity 530/1060; duplex/sorter yes/no; Web interface yes; 375 x 337 x 461 mm; 22.45 kg
Consumables: toner £40, lifetime 6,000 pages; drum lifetime 30,000 pages, price £130
Measured: 4,722 pages at 5%; 1.45 pence/page; quality 2.5 / 3.0 / 4.0 / 4.5; speed 59.8 / 440.2 / 277.0 secs

6. Lexmark Optra T620n – www.lexmark.co.uk – price approx. £1085
Equipment: 1200 x 1200 dpi; 28 pages/min; memory 32/384 MB; PS 3, PCL 6, PPDS; Parallel, Serial, Ethernet, USB; feeds 1/2, capacity 490/3000; duplex/sorter optional/optional; Web interface yes; 419 x 399 x 493 mm; 19.50 kg
Consumables: toner £315, lifetime 30,000 pages; drum replaceable separately: yes, price –
Measured: 25,645 pages at 5%; 1.63 pence/page; quality 2.5 / 3.0 / 3.5 / 3.0; speed 36.9 / 407.7 / 119.4 secs

Results – verbal ratings as printed (rows: print quality, print rate, cost of printing, equipment, handling, service, overall rating):
satisfactory satisfactory satisfactory good satisfactory satisfactory very good satisfactory good satisfactory satisfactory satisfactory satisfactory satisfactory satisfactory satisfactory satisfactory poor good satisfactory satisfactory
good satisfactory satisfactory satisfactory adequate satisfactory satisfactory adequate satisfactory good adequate adequate satisfactory satisfactory
satisfactory good poor satisfactory adequate satisfactory satisfactory

[1] Lifetime is in excess of the depreciation limit (3 years)
[2] Tested with 48MB RAM


7. Oki Okipage 20 plus/n – www.oki.co.uk – price approx. £800
Equipment: 600 x 600 dpi; 20 pages/min; memory 16/80 MB; PS 2, PCL 6, Epson FX, IBM Proprinter; Parallel, Serial, Ethernet; feeds 1/2, capacity 530/1060; duplex/sorter optional/no; Web interface yes; 375 x 377 x 461 mm; 20.30 kg
Consumables: toner £40, lifetime 6,000 pages; drum lifetime 30,000 pages, price £130
Measured: 4,949 pages at 5%; 1.42 pence/page; quality (DTP/graphics/photo/area; low value = good) 2.5 / 3.0 / 3.5 / 4.5; speed (DTP/graphics/photo) 64.5 / 438.3 / 269.4 secs

8. Epson EPL-N2050+ – www.epson.co.uk – price approx. £580
Equipment: 1200 x 1200 dpi; 20 pages/min; memory 16/272 MB; PCL 6, ESC/P2, HPGL, FX-8800, IBM 239plus; Parallel, Serial, Ethernet; feeds 1/2, capacity 550/1100; duplex/sorter optional/optional; Web interface yes; 428 x 406 x 453 mm; 26.70 kg
Consumables: toner £130, lifetime 15,000 pages; drum replaceable separately: yes, price –
Measured: 20,356 pages at 5%; 1.04 pence/page; quality 5.0 / 4.0 / 5.0 / 3.0; speed 50.8 / 406.9 / 222.2 secs

9. Genicom Micro Laser 210 – www.genicominternational.com – price approx. £1028
Equipment: 1200 x 1200 dpi; 21 pages/min; memory 16/96 MB; PS 3, PCL 5e; Parallel, Ethernet, USB; feeds 1/2, capacity 550/1000; duplex/sorter optional/optional; Web interface yes; 428 x 406 x 453 mm; 26.55 kg
Consumables: toner £170, lifetime 15,000 pages; drum replaceable separately: no, price –
Measured: 20,463 pages at 5%; 1.23 pence/page; quality 2.5 / 2.5 / 2.5 / 5.0; speed 49.7 / 456.7 / 588.6 secs

10. Lexmark Optra M412n – www.lexmark.co.uk – price approx. £770
Equipment: 600 x 600 dpi; 17 pages/min; memory 8/132 MB; PS 3, PCL 6, PPDS; Parallel, Ethernet, USB; feeds 1/1, capacity 220/500; duplex/sorter optional/no; Web interface yes; 404 x 315 x 429 mm; 14.65 kg
Consumables: toner £163, lifetime 15,000 pages; drum replaceable separately: no, price –
Measured: 11,078 pages at 5%; 1.87 pence/page; quality 2.5 / 3.0 / 4.0 / 4.5; speed 51.1 / 419.0 / 126.6 secs

11. Minolta-QMS Page Pro 4100 GN – www.qms.co.uk – price approx. £750
Equipment: 600 x 600 dpi; 18 pages/min; memory 8/104 MB; PS 3, PCL 6, PCL 5e; Parallel, Ethernet; feeds 1/2, capacity 250/1000; duplex/sorter optional/no; Web interface yes; 435 x 394 x 502 mm; 13.45 kg
Consumables: toner £120, lifetime 9,000 pages; drum replaceable separately: no, price –
Measured: 8,404 pages at 5%; 1.83 pence/page; quality 2.5 / 3.0 / 3.0 / 1.5; speed 57.1 / 451.7 / 299.4 secs

12. Samsung ML-7300N – www.samsungelectronics.co.uk – price approx. £800
Equipment: 1200 x 1200 dpi; 20 pages/min; memory 16/208 MB; PS 3, PCL 6; Parallel, Ethernet; feeds 1/1, capacity 500/500; duplex/sorter yes/no; Web interface no; 424 x 299 x 455 mm; 16.90 kg
Consumables: toner £120, lifetime 10,000 pages; drum replaceable separately: no, price –
Measured: 9,898 pages at 5%; 1.61 pence/page; quality 5.0 / 3.0 / 4.5 / 5.0; speed 67.1 / 465.8 / 533.8 secs

(Non-official competitor) Minolta-QMS Magi Color 2200 N – www.qms.co.uk – price approx. £1395
Equipment: 1200 x 1200 dpi; 20 pages/min; memory 64/384 MB; PS 2, PCL 6, PDF, HP-GL, TIFF, etc.; Parallel, Ethernet; feeds 1/1, capacity 500/500; duplex/sorter optional/optional; Web interface yes; 463 x 511 x 522 mm; 44.50 kg
Consumables: toner £50 (black), lifetime 6,000 pages; drum replaceable separately: yes, lifetime 25,000 pages [3], price £275 [3]
Measured: 10,371 pages at 5%; 2.02 pence/page; quality 3.0 / 2.0 / 1.5 / 4.5; speed 131.3 / 43.9 / 198.8 secs

Results – verbal ratings as printed (rows: print quality, print rate, cost of printing, equipment, handling, service, overall rating):
satisfactory adequate good satisfactory adequate satisfactory poor satisfactory satisfactory satisfactory adequate poor adequate adequate satisfactory adequate good satisfactory satisfactory adequate adequate satisfactory adequate satisfactory satisfactory satisfactory satisfactory adequate
satisfactory adequate satisfactory adequate poor poor poor adequate adequate good satisfactory adequate adequate satisfactory
good satisfactory satisfactory satisfactory satisfactory good satisfactory

[3] Costs of all consumables averaged over 25,000 pages (drum, fixer, image carrier tape etc.)



PRINTING WITH CUPS

The Common UNIX Printing System

PRINT-OMATIC ULRICH WOLF

CUPS is the printing system of the Internet era. Our overview shows what can now be done with it, and what goes on behind the scenes. CUPS implements version 1.1 of the Internet Printing Protocol (IPP) and can completely supersede the ageing Line Printer Daemon, both locally and on the network. A few Linux distributions such as Mandrake, Caldera, EasyLinux or Conectiva have already stopped installing the LPD, and in the latest SuSE variants CUPS can now be configured with YaST. On a stand-alone workstation CUPS makes full use of the individual capabilities of the attached printer, but its real strengths only come into play on the network. There, any free printer can be used with all its usual device options without a driver having been installed on the client. More than that, the client automatically recognises which printers are installed in the network, provided they are attached to a CUPS server.

Gimp Print for inkjet printers – a universal driver with special abilities
● About 120 inkjet printers from Epson, HP, Canon and Lexmark are supported by the latest alpha version (4.1.99-a2) of the Gimp Print project (the last stable version is 4.0.5).
● Gimp Print maintains a joint code base for three different target projects: GIMP, Ghostscript and CUPS.
● ./configure --help displays the build options. CUPS support is enabled with --with-cups. The options --with-ghost --with-foomatic generate the source files which are merged with the Ghostscript source code; Ghostscript is then recompiled and gains a new driver, stp. The CUPS/Gimp Print filter, together with the PPDs created, and the normal Ghostscript driver stp can then be used with CUPS.
The following points are important to remember:
● The PPDs must match the filter: the foomatic files belong to stp, recognisable as GIMPPRINT+foomatic or stp+foomatic
● The CUPS form of Gimp Print is called CUPS+gimp-print
● Only copy PPDs from other computers together with the appropriate conversion filter
● Avoid Gimp Print, Foomatic or stp and their associated PPDs with PostScript printers

Architecture like a webserver

The beating heart of CUPS is the server daemon, or scheduler, cupsd, which behaves much like an ordinary webserver: it can be addressed with any browser via the reserved port 631, and clients communicate with the server via this port too. The exchange of data with the printers connected to the server, however, can occur in a variety of ways. Ideally the printers themselves cope with IPP, but only some 200 models do. Good laser printers, such as the ones in our test, are more likely to understand the AppSocket protocol, and printers shared by Windows computers can be addressed with Samba via SMB. For the rest, LPD is used. Communication with the printers via these protocols is handled by back-ends, which are configured separately.
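The back-end a queue uses is selected through its device URI. A small sketch of the URI forms corresponding to the protocols above (host and queue names are invented for the example):

```shell
# Sketch: device URI forms for the CUPS back-ends discussed above.
# Host and queue names are placeholders, not real devices.
device_uri() {
  case "$1" in
    ipp)    echo "ipp://printserver:631/printers/laser" ;;  # native IPP
    socket) echo "socket://laser:9100" ;;   # AppSocket/JetDirect, port 9100
    smb)    echo "smb://winbox/laser" ;;    # Windows share, via Samba
    lpd)    echo "lpd://oldserver/lp" ;;    # classic LPD queue
    *)      return 1 ;;
  esac
}

device_uri ipp
device_uri socket
```

Such a URI is what you hand to lpadmin's -v option when setting up a queue.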

CUPS and Postscript

Anyone printing on UNIX systems cannot avoid Postscript, and CUPS is no exception to this rule. The simplest way is therefore to use Postscript-compatible printers: their integrated processors turn PS files into raster data in the highly calculation-intensive Raster Image Process (RIP). For printers which cannot cope with Postscript, a computer has to do this; in prepress there are often dedicated RIP servers for the purpose. In a company network where printers that are not Postscript-compatible are used regularly, the print server takes over these tasks and should therefore have sufficient computing power. CUPS is flexible enough to allow the print data to be prepared on the client instead, but then the client obviously needs its own driver. The characteristics of each printer model are defined in a Postscript Printer Description (PPD), an ASCII-format file.
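A PPD is plain text; a minimal illustrative fragment (the model name and the choice of options are invented for this sketch, following the Adobe PPD 4.3 conventions) looks like this:

```
*PPD-Adobe: "4.3"
*ModelName: "Example Laser 1200"
*LanguageLevel: "2"
*DefaultResolution: 600dpi
*OpenUI *Duplex/Double-sided printing: PickOne
*DefaultDuplex: None
*Duplex None/Off: "<</Duplex false>>setpagedevice"
*Duplex DuplexNoTumble/Long-edge binding: "<</Duplex true/Tumble false>>setpagedevice"
*CloseUI: *Duplex
```

Front-ends read exactly these *OpenUI entries to build their option dialogues, and the quoted Postscript snippets are what gets injected into the job.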



Printing without Postscript devices

The CUPS filters are based on the older Ghostscript version 5.5, but have been modified and improved. Some filters are in place after the standard installation: for PCL-compatible inkjet and laser printers from Hewlett-Packard's Deskjet and Laserjet families, for the Epson Stylus Color and Stylus Photo, and for 9-pin and 24-pin printers from Epson. These are usually found, together with the filters derived from Ghostscript, in the directory /usr/lib/cups/filter. For other, non-Postscript-compatible printers there is Cups-O-Matic, a version of the Foomatic script collection adapted to CUPS by the Linux Printing HOWTO author Grant Taylor. It covers all Ghostscript drivers for use with the CUPS system and thus requires that Ghostscript is installed on the print server. On the Cups-O-Matic website you first create a PPD for the existing printer in combination with a Ghostscript driver. From the point of view of the user this is already the printer driver, and it can be installed in the CUPS system by the usual methods (command line, GUI or browser). Internally, however, you will need cupsomatic, a Perl script which must sit in the filter directory /usr/lib/cups/filter. With the aid of this script the printer-specific data from the PPD is sent to the relevant Ghostscript filter of the system, which then prepares the print data accordingly.

CUPS and Ghostscript

Cupsomatic passes the print data out of the usual CUPS environment to the Ghostscript filter specified in the Cups-O-Matic PPD and returns the result to CUPS. With Cups-O-Matic, then, CUPS prints as well as any other Ghostscript-based system, while all of CUPS' network features remain available for the printers Ghostscript supports. Apart from Cups-O-Matic there are other options for integrating filters in CUPS, namely for Turboprint and Gimp Print.

Big print runs

If there are several printers of the same type in a network, or if certain printers are only available for a short time, they can be combined into classes. If a print job is sent to a class, the job goes to the next free device; in this way it is possible to achieve load balancing.

Configuration of CUPS

The CUPS print system can be monitored and administered via various command line tools, chiefly lpadmin, lpoptions and lpstat. lpstat -p -d displays the available printers; lpadmin configures printers, printer classes and user rights. CUPS has its own administrator, so needs no root rights. With lpoptions all the printer characteristics that can be configured via the PPD can be displayed and altered, but this is tedious, since they vary from model to model. This is why graphical tools like xpp, gtklp, kups, or the Web interface via http://[computername]:631 are so useful: they read out the respective PPDs and make them configurable with a mouse. In KDE 2.2, KDEPrint acts as an intermediate layer between KDE applications and various printing subsystems, which means CUPS can be administered more simply from KDE applications; the former configuration tool kups is no longer being developed. General settings for the CUPS print system live in /etc/cups/cupsd.conf, which is similar to the Apache configuration file httpd.conf. In cupsd.conf it is possible to announce printers to the whole network (Browsing On), define browse relays for inter-network printing, set options for authentication and encryption, and much more. CUPS is largely ready to use with its default settings as soon as it has been installed. ■

Info
CUPS homepage: http://www.cups.org
IPP specification: http://www.ietf.org/html.charters/ipp-charter.html
Linux printing homepage/Cups-O-Matic: http://www.linuxprinting.org
Gimp Print: http://gimp-print.sourceforge.net/
CUPS documentation: http://www.cups.org/documentation.html

The graphical front-end Xpp is heavily based on the ESP PrintPro software of the inventor of CUPS, Michael Sweet

In KDE 2.2 the CUPS configuration is integrated seamlessly, thanks to KDEPrint, into the system control
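The cupsd.conf directives mentioned in the configuration section use an Apache-like syntax. An illustrative fragment – the addresses and the relay pair are examples only, not taken from the article:

```
# /etc/cups/cupsd.conf – example directives (all values are placeholders)
Browsing On
BrowseAddress @LOCAL                       # announce our printers on the local subnet
BrowseRelay 192.168.1.255 192.168.2.255    # forward announcements between networks

# restrict the administration pages, Apache-style
<Location /admin>
  AuthType Basic
  AuthClass System
  Order Deny,Allow
  Deny From All
  Allow From 127.0.0.1
</Location>
```

After editing, cupsd has to be restarted for the changes to take effect.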



NETWORK PRINTING

Linux printing on heterogeneous networks

EASY PRINTING ON ANY NETWORK DR BERNHARD RÖHRIG

Thanks to Samba, printing from Linux and Windows machines on heterogeneous networks is not a problem. Although established printing systems such as Berkeley and System V now have competition, they are still the most commonly used. We are going to tell you how to print on heterogeneous networks, irrespective of the operating system. Linux users tend to be quite spoilt when it comes to the networkability of their system: after all, Linux – as a member of the great UNIX family – acquired the TCP/IP protocol suite in its infancy. These protocols also form the basis of all Linux printing. These days all operating systems understand TCP/IP, so it is not difficult for a Linux machine to act as print server or client in any network.

Setting up a server under Linux

First of all, let's look at the server: the central element is the print daemon lpd, which runs as a background service and administers all of the user applications' print jobs (Figure 1). Only lpd needs to be set up for a Linux print server, while for Windows workstations the Samba server is required as well. Current Linux distributions install lpd as standard. The quickest way to find out whether it is active is with a ps command:

[user@host]$ ps ax | grep lpd
  459 ?     S    0:00 lpd
  725 tty1  S    0:00 grep lpd
[user@host]$

If necessary, the daemon must be started or – very rarely – installed. Starting and stopping is done using a script that has the same name as the daemon and can be found in the runlevel scripts directory (/etc/init.d, /etc/rc.d or /etc/rc.d/init.d). If there are problems starting the daemon, this is also where you can see whether one or more system variables need to be set, for instance START_LPD for SuSE Linux or NETWORKING for Red Hat.

Configuration files for the lpd print daemon

Name           Contents
apsfilterrc.*  Special printer driver configuration
printcap       Configuration of the print queue
hosts.lpd      Print server access control by host

A special configuration program for the relevant distribution will normally help with this. For installation you need the package lprold-*.rpm or lpr-*.rpm, both of which are bound to be included with your distribution. Developed at the University of California, Berkeley, lpd is still the most widespread version of the print daemon; its configuration files live in the /etc directory and are listed in the table above. The most important of the three files is /etc/printcap. Here the capabilities of the printers and queues are specified. The print data are deposited in a directory below /var/spool/lpd, while the actual printing is a background process carried out by the lpd daemon. Manual printcap configuration is complicated and prone to errors. It is therefore advisable to use the relevant tool from your Linux distributor (YaST for SuSE, printtool for Red Hat). The same is true for the configuration file apsfilterrc, which normally does not need to be amended. That leaves /etc/hosts.lpd. Because the Berkeley print daemon does not handle user authentication, access to the server is limited by host, as a safety measure. In this file the administrator simply lists the names or IP addresses of all hosts



that are permitted access. This type of operation does not require the print server to be rebooted. Windows workstations do not have to be included on this list if the Samba server is running on the same machine as lpd.
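The effect of hosts.lpd can be mimicked in a few lines of Python. This is only a sketch of the check that lpd performs internally; the host names are invented:

```python
# Toy version of the host-based access check that /etc/hosts.lpd drives:
# a client may print only if its name or IP address appears in the file.
def host_allowed(client, hosts_lpd_text):
    allowed = set()
    for line in hosts_lpd_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#"):   # skip blanks and comments
            allowed.add(line)
    return client in allowed

hosts_lpd = """# hosts permitted to use this print server
workstation1.paradix.own
192.168.1.22
"""
print(host_allowed("192.168.1.22", hosts_lpd))          # True
print(host_allowed("intruder.example.com", hosts_lpd))  # False
```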

Printing with Linux on Linux A Linux workstation that’s going to use a remote Berkeley-type print server must also have lpd set up. The only difference is the configuration of the print queue. Again, the Linux distributor’s tool can help. The short listing in the box shows an example of a resulting printcap entry.

Remote print queue

lpremote|laser printer on adamix:\
    :lp=/dev/null:\
    :rp=lp:\
    :rm=adamix.paradix.own:\
    :sd=/var/spool/lpd/lpremote:\
    :lf=/var/spool/printers/lpremote/log:\
    :bk:sh:mx#0:
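To make the structure of such an entry explicit, here is a small Python sketch that splits a printcap entry into its capability fields. It is illustrative only; a real printcap parser also handles the numeric mx#0 syntax and included files:

```python
# Minimal reader for a printcap entry: continuation lines are joined,
# fields are separated by ':' and names from values by '='.
def parse_printcap(entry):
    text = entry.replace("\\\n", "")      # join backslash-continued lines
    parts = [p for p in text.split(":") if p.strip()]
    names = parts[0].split("|")           # queue name plus aliases
    caps = {}
    for field in parts[1:]:
        if "=" in field:
            key, _, value = field.partition("=")
            caps[key.strip()] = value
        else:
            caps[field.strip()] = True    # boolean capability such as sh
    return names, caps

entry = """lpremote|laser printer on adamix:\\
:lp=/dev/null:\\
:rp=lp:\\
:rm=adamix.paradix.own:\\
:sd=/var/spool/lpd/lpremote:"""
names, caps = parse_printcap(entry)
print(names[0], caps["rm"])               # lpremote adamix.paradix.own
```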

Figure 1: It is simple to print under Linux

The remote printer rp, the remote machine rm and the local spool directory sd are mandatory entries, and :lp=/dev/null: must always be present as well. For a local printer the print device would be indicated here. Filtering of the data to be printed, as is required for local printers, is not necessary: the local lpd simply puts the files into the pigeonhole of its colleague on the server, so to speak, which then has to work out what to do with them. This is why it is so important to choose as your remote printer the print queue on the server that best ensures the correct processing of the selected data. When selecting the name, it's always worth having a look at the server's /etc/printcap. In most cases you will want to use the auto queue, which is simply called lp on almost every system and which – apart from other little luxuries – can also determine automatically from the file content how it should be prepared for printing. :sh: suppresses unnecessary banner pages, :mx#0: removes the file size limit. As with local configuration, it is also possible for network printers to have several entries in /etc/printcap and to label them with their own device names, depending on what is to be printed in each case and how. If you want the printer lpremote to also act as standard printer for the client, prefix the relevant printcap entry with lp as another alias name:

lp|lpremote|laser printer on adamix:\
    ...

After each creation or amendment of a print queue the appropriate print daemon needs to be restarted. Calling the startup script with the parameter restart, or sending a HUP signal to the running process, is normally sufficient. Configuration programs will usually take care of this step automatically.

From Windows to Linux

The Samba daemon also provides print services and uses the Linux spool system to do this, so setup shouldn't pose any problems. All that is required is a section in the file /etc/smb.conf that gives access to all print queues set up under Linux and has the following structure:

[printers]
    comment = Linux print server
    path = /var/spool/smb
    print ok = Yes
    guest ok = Yes
    browseable = No

This section often already exists or can be quickly created with SWAT. After restarting the smbd and nmbd daemons the Linux queues are available to Windows machines on the same network. Whether they will also be visible in the network environment depends on several factors:
● TCP/IP has been set up on the workstation and is bound to the network card.
● The network client on the workstation is bound to TCP/IP.
● Client and server machines are in the same Windows workgroup or domain.
● If you have not set guest ok in /etc/smb.conf then you must be logged on to the Windows workstation with a name that also exists as a Samba user on the Linux station.

If the requirements have been met, the result in Windows Explorer should be similar to Figure 2: on the left you can see the workgroup computers, on the right – apart from the shared directories (folder



Figure 3: Configuration of a Windows printer under Linux with Red Hat’s printtool

Figure 2: Linux printers become visible in the Windows environment

Info

Network Printing
Todd Radermacher, Matthew Gast
O'Reilly UK, ISBN 0596000383

Linux for Windows NT/2000 Administrators: The Secret Decoder Ring
Mark Minasi, Dan York
Sybex International, ISBN 0782127304

icon) – are the print queues available on the selected machine. This example, from Windows Me, should be similar to what you can expect under Windows 95, 98 and NT 4. In Windows 2000 the elaborate security system may at first present you with some stumbling blocks; that subject would be enough to fill a separate article in itself. Documentation is available if you want to find out how to get older versions of Windows (3.1, 3.11, NT 3.5) to print on Linux print servers. In order to use a Linux print queue under Windows, click on the required printer icon with the right mouse button – preferably on the Linux default printer, which automatically identifies all data formats. It can be recognised by the description auto, lp, lp2 or similar. For older versions of Samba or lpd it may be necessary to use the raw queue. In that case the parameter PRINT_RAW_SUPPRESS_FORMFEED in the server file /etc/apsfilterrc.<Driver name> must be set to yes to avoid unnecessary form feeds between the individual print jobs. On the context menu of the print queue use Install. Windows will inform you that a printer driver needs to be installed locally. The process is slightly different for each version of Windows, but it always involves the selection of manufacturer and model and the transfer of the required files to the workstation. Once you have successfully managed all of that, a new network printer is available.

From Linux to Windows Of course, the whole thing also works the other way. In the simplest case, after a good look at the manual pages on smbclient you can just build yourself a command line or a script which takes care


of the whole printing process, including connecting to the Windows machine and user authentication. If this seems like too much work, don't despair – the distributors have come to your rescue again. Figure 3 shows the configuration of the Linux queue lpwin, which establishes a link to a Windows machine and the Epson printer connected to it. We used the Red Hat printtool; the process is very similar with SuSE's YaST: System administration, Configure network, Address printer via Samba. Most Windows systems require a valid user account with password in order to print at all. This is great in terms of security, but the Linux machine's BSD spool system currently stores this information unencrypted, either in .config in the spool directory or in the aforementioned /etc/apsfilterrc.<Driver name>. To make matters worse, this file is world-readable in some configurations. The usual recommendation is therefore to set up an additional user on the Windows side specifically for printing, one that doesn't own any files. The only possible damage would then be unauthorised printing.
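The kind of smbclient one-liner mentioned above can be wrapped in a small script. The host, share and account names below are invented examples; smbclient's print command sends a file to a remote print share:

```python
# Sketch of a wrapper around smbclient for printing to a Windows share.
# It only builds (and optionally runs) the command line; all names here
# are hypothetical examples.
import shutil
import subprocess

def smb_print_argv(host, share, user, password, filename):
    """Build the smbclient command line for printing one file."""
    return ["smbclient", f"//{host}/{share}",
            "-U", f"{user}%{password}",
            "-c", f"print {filename}"]

argv = smb_print_argv("winbox", "epson", "printonly", "secret", "out.ps")
print(argv)

# Only actually run it if smbclient is installed on this machine:
if shutil.which("smbclient"):
    subprocess.run(argv, check=False)
```

A dedicated "printonly" account, as recommended above, keeps the plain-text password in such a script relatively harmless.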

Conclusions Printing on heterogeneous networks poses no real problems these days. lpd and Samba allow mutual utilisation of Linux and Windows printers. Sophisticated configuration tools from the distributors also play their part in providing simple solutions in terms of setup. However, there is still room for improvement regarding security. ■

Dr Bernhard Röhrig writes books on the subjects of Linux and UNIX and holds seminars, workshops and training for administrators. He can be contacted at http://www.roehrig.com/.


TURBO PRINT 1.4


Professional colour Printing under Linux

SUMPTUOUS COLOURS THOMAS DRILLING

The driver package Turboprint allows (colour) printers on Linux PCs to produce high quality output easily. Version 1.42, just out, also works with CUPS. The old print solutions for Linux (lpr, Apsfilter, Ghostscript) suffer primarily from the fact that this system is orientated towards outdated printers. The latest colour printers demand specific drivers, which seems impossible without the co-operation of the printer manufacturers. Turboprint from Zedonet is a driver package for Linux which builds on the tried and trusted Ghostscript-based print standard. Turboprint functions as a magic filter and thereby replaces Apsfilter. With the aid of a Turboprint driver, Linux users can use all the features of their printer, such as adaptation to special papers and formats and maximum print resolutions. Equally, brightness, contrast and colour reproduction are adjustable. The configuration tools of Turboprint work in administrator or user mode. In administrator mode new printers can be added and installed, which are then available to all users. Any user can also individually alter and save settings.

Turboprint and CUPS together Since version 1.2, Turboprint has the option of working with the CUPS printing system. CUPS, apart from using the Internet Printing Protocol (IPP), can make unlimited use of the PPD files (PostScript Printer Description) from the printer manufacturer. For Postscript printers these files define all the features of the respective devices. PPD files have a simple ASCII format. PPD acts as a standardised description language for printer

control, occasionally expanded by manufacturer-specific commands. With CUPS it is possible to make full use of the contents of these PPD files without additional changes. Owners of Postscript printers can choose from a wealth of PPD files on the Adobe homepage. These are self-extracting EXE archives for Windows, but they can be unpacked under Linux with the appropriate utilities. While real Postscript printers can already be used without limitation where such original PPD files are available, for non-Postscript-compatible printers (pseudo) PPD files have to be generated via Ghostscript.
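Since PPD files really are plain ASCII keyword text, a few lines of Python are enough to illustrate the format. This is a toy sketch, not CUPS code, and the sample PPD fragment is invented:

```python
# Toy reader for the '*Keyword: value' lines of a PPD file.
# Lines starting with '*%' are comments in the PPD format.
def read_ppd(text):
    options = {}
    for line in text.splitlines():
        if line.startswith("*") and ":" in line and not line.startswith("*%"):
            key, _, value = line.partition(":")
            options[key[1:].strip()] = value.strip().strip('"')
    return options

sample = """*% Invented fragment of a PPD file
*ModelName: "Example LaserWriter"
*DefaultPageSize: A4
*DefaultResolution: 600dpi
"""
ppd = read_ppd(sample)
print(ppd["ModelName"], ppd["DefaultPageSize"])
```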

PPDs under CUPS and Turboprint Since the print output is done in the Postscript format, such PPD files must be formatted

Turboprint
Version 1.4: Free Edition
Version 1.42: Full version, EUR 19.95
Download and info: http://www.turboprint.de/german/index.html
Manufacturer: Zedonet, Meinrad-Spieß-Platz 2, D-87660 Irsee
Tel. 0049 8341 908 3905, Fax 0049 8341 120 42
Email: mail@turboprint.de




appropriately for the respective printer – a task for the commercial package Ghostscript from Aladdin. A somewhat older version of Ghostscript is available for free; in most Linux distributions this version serves as the basis for printing without Postscript. PPD files based on the free Ghostscript are available for all the usual printers. The Perl script Cupsomatic prepares the free PPDs for CUPS. On the linuxprinting.org website it is possible to click together your printer's features with a few mouse clicks and then download the result using Create CUPS-PPD. The Cupsomatic script is also downloaded locally; it provides for the integration of the dynamically created PPDs into CUPS. SuSE Linux 7.2, for example, combines such PPD files in the package cups-drivers from series n. After installation they can be found, sorted according to manufacturer, in /usr/share/cups/model. The filter script is installed by SuSE under /usr/lib/cups/filter/cupsomatic. Pure CUPS solutions with CUPS-O-Matic-generated PPDs and front-ends like Til Kamppeter's free XPP (X Printing Panel) or Kups, and the commercial software ESP Print Pro from Easy Software Products, concentrate on the abilities of the IPP. This makes it easier to manage a large number of printers from any location, based on one or more CUPS servers, and to control specific features of high-performance professional printers. Turboprint is aimed more at the inkjet market. It also uses its own process, which goes beyond the CUPS-O-Matic method, to prepare PPD files for CUPS.

Installing and starting up Turboprint

The Gtk front-end Xtpconfig is used to configure the special features and the colour correction system. CUPS options can also be configured from here.

The program comes in a tar archive, which is unpacked and installed very easily with an installation script. The script also creates the PPD files for co-operation with CUPS. The software is dynamically linked to the Gtk library 1.2. The heart of Turboprint is the command line program /usr/bin/tpprint. Purists can use this to control Turboprint via the shell. The Turboprint configuration dialogs are xtpsetup, to install the spool system, and xtpconfig, for configuration of the driver.

Keeping order

The Turboprint drivers (*.TPP) are found under /usr/share/turboprint/printers. The PPDs are in /usr/share/cups/model/turboprint, if you prefer to print with Turboprint via CUPS instead of with lpd. Details of the variety of drivers supplied can be found both in the list under /usr/share/turboprint/printers and on the Zedonet homepage. Turboprint uses two configuration files in /etc/turboprint: turboprint.cfg contains the printer descriptions created in administrator mode with tpsetup, while system.cfg mainly lists environment variables with paths to the print spooling system in use, such as CUPS. To use the CUPS filter with Turboprint, version 1.4 now includes /usr/share/turboprint/lib/

History

Turboprint does not have the typical Linux genesis. The concept goes back to a development at the end of the eighties for the Commodore Amiga. The developers were – as they still are – somewhat dissatisfied with printing under Linux. The primary outcome was an image printing program; later, its own printer drivers emerged from this, which over the years were expanded bit by bit to cover additional printer models. Right out in front came the Turboprint development, with support for the first colour inkjet printers like the legendary HP DeskJet 500 Color. At that time, many options for image reproduction and rastering were integrated into the Turboprint driver. Its own algorithms were added to improve the colour purity. Because the situation was comparable with that of the Amiga era, the developers of the Linux version were able to channel all their experience of Turboprint for the Amiga into the new development. The objective was (and is) to conjure up colour and photo printing from modern inkjet printers under Linux of at least as good quality as under Windows with the original drivers from the printer manufacturers. After two years of development



The printer model selection list (TPP drivers) of Turboprint includes over 100 devices.

To work with CUPS, Turboprint needs its own integration into the PPD files. These are created automatically during installation and are kept in the CUPS directory under /usr/share/cups/model/ turboprint

rastertoturboprint. The PPD files created in the course of the Turboprint setup are in the CUPS driver directory under /usr/share/cups/model/turboprint. Any CUPS configuration front-end, such as Kups, can use the Turboprint PPDs as an alternative to the CUPS printer drivers. The user will also find, in the CUPS printer selection alongside the respective standard CUPS printer model, Turboprint-specific drivers or models listed as <Model>-Turboprint. When tpprint itself is called, each individual print job is performed by either the filter /usr/share/turboprint/lib/tpfilter or /usr/share/turboprint/lib/rastertoturboprint respectively.

The Turboprint installer makes up its own mind On first installation, Turboprint analyses the computer environment, identifying the print system currently in use, and adapts itself to the respective requirements. Since modern print systems like CUPS have their own API, the environment variables and paths in system.cfg only come into use if the print system will not integrate elegantly via the respective API. The installation script dynamically creates the PPD files. Turboprint can communicate with the existing print system in two ways. If Turboprint sets up lpd, then /usr/share/turboprint/lib/tpfilter acts as a substitute for the default printer filter of a standard print system like Apsfilter or Magicfilter and performs the Turboprint run with the classic Ghostscript – but taking account of the Turboprint driver. The free Ghostscript is then used for the RIP process. If on the other hand Turboprint is used together with CUPS, the more powerful version of Ghostscript integrated into CUPS is used for the RIP process.

Turboprint for Linux 1.0 emerged from the Amiga program in the spring. The Amiga code was written completely in Assembler – for porting onto Linux in C++ it had to be almost completely re-written. Only a few algorithms and the printer database with the colour profiles were taken over. In this way a few of the new print techniques and improvements got a look in, so that the Linux version of Turboprint will cope admirably with professional demands. In the first versions, Turboprint still integrated itself in the usual way as a classic magic filter in

the LPR/LPRng printer spooling system. There is a graphical Gtk-based configuration menu for the expanded setting options available with Turboprint, such as brightness and colour saturation. As it became apparent that the powerful CUPS spooling system would sooner or later replace lpr, the developers modified the latest Turboprint 1.2 in such a way that it works as easily with CUPS as with lpr. The Turboprint settings menu xtpconfig adapts itself specifically to the special features of CUPS, but still offers the original Turboprint-specific setting options as well.




If the Turboprint drivers have been correctly installed, any Turboprint printer can be addressed with Kups


True Match – colour processing with colour profiles Most colour printers produce any colour from the four basic colours yellow, magenta, cyan and black. In computing terms, yellow, cyan and magenta would be sufficient to represent colours, but black is still needed, since in practice the three basic colours don't produce a true black tone – and printing text with three colour inks would incur high costs. The question now arises as to how the millions of colour nuances needed for photo-realistic printing should be created. In principle it would be necessary to print the basic colours in various levels of brightness, but only a few special

Xtpsetup is an easy-to-use Gtk front-end for the management and configuration of the Turboprint environment. New printers can easily be added by the administrator with Add


printers are able to do this. In practice, a process for rastering print colours is used. Colours are applied in a regular (ordered) or irregular pattern (error-diffusion), so that by placing more or fewer coloured dots, the viewer gets the impression of areas of different colour intensity. To do this, the necessary intensities of the printer complementary colours for a true colour reproduction of cyan, magenta and yellow must be calculated from the red, green and blue components of the screen colours. So much for theory. One major problem arises in practice. Printer manufacturers each use different colour nuances of the basic colours, which never correspond exactly to the pure complementary colours of the screen colours red, green, blue. So it is also impossible in principle to create a pure blue from the screen, such as by mixing a theoretical 50 per cent magenta and 50 per cent cyan. Depending on the colour cast of the basic printer colours, the proportions of colour are weighted differently. Also, there are always slight impurities in the print colours (for example a small proportion of yellow in the cyan) due to manufacturing. To make things harder, paper variation allows the inks to flow into each other to varying degrees and the chemical coating of inkjet printer paper slightly alters each colour tone. The result is that the colours in a photo printed with the same printer and driver will look different on different types of paper. All in all, the effect on the resulting colour tone due to ink (colour tone, flow characteristics, mixture properties and so on) and paper (chemical coating, absorption behaviour) is so complex that the ideal amount of ink to be applied can only be calculated approximately, even with a complicated formula, from the screen colours. A really true colour print result is only achievable if one knows the ideal mixture of printing ink for almost every screen colour and lays it out in some form – such as a table or a colour profile. 
This is produced by doing test prints with various colour patterns and analysing them with a colour measurement device. Separate colour profiles are needed for different sorts of paper. The print quality in this system is critically dependent on the quality of the colour profile.
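The naive complementary-colour arithmetic described above can be written down directly; it is exactly this step that real drivers replace with measured colour profiles. A sketch only:

```python
# Naive RGB -> CMYK split: take the complements of the screen colours,
# then pull the shared grey component out as black (K) ink.
def rgb_to_cmyk(r, g, b):
    """r, g, b in 0..1; returns (c, m, y, k) in 0..1."""
    c, m, y = 1.0 - r, 1.0 - g, 1.0 - b   # theoretical complements
    k = min(c, m, y)                      # shared grey becomes black ink
    if k == 1.0:
        return 0.0, 0.0, 0.0, 1.0         # pure black: K only
    # rescale the remaining colour after black removal
    return ((c - k) / (1 - k), (m - k) / (1 - k), (y - k) / (1 - k), k)

print(rgb_to_cmyk(0, 0, 0))   # (0.0, 0.0, 0.0, 1.0): black uses K only
print(rgb_to_cmyk(1, 0, 0))   # (0.0, 1.0, 1.0, 0.0): red = magenta + yellow
```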

Colour correction and raster For Linux, the Turboprint drivers have so far been the only ones to implement a real colour profile technique. The True Match colour correction system developed by Zedonet also contains colour profiles for specialist printer papers, so that high-quality colour reproduction can be achieved in practically every situation. Correct rastering is of almost equal importance to the appropriate mixing proportions of the colours, so that the individual droplets of ink are discernible as little as possible and an even, photorealistic impression is created. When set to error diffusion, Turboprint works with a special raster algorithm which distributes the droplets evenly over the paper and at the same time arranges them, as far as possible, so that they produce no pattern discernible to the eye. Also, light grey tones are not printed using black ink but mixed from coloured inks, so the visibility of the print raster is greatly reduced. A few printer models offer special techniques for even application of colour, which Turboprint also supports. The best known is six-colour or seven-colour printing with special photo cartridges, such as those supplied with the Epson Stylus Photo printers. The process uses less colour-intensive inks to print lighter colour tones, which produces scarcely visible raster dots. Other processes, such as C-Ret III from Hewlett-Packard, are based on variable droplet sizes. This technique is found in all modern inkjet printers, as is a process of overlapping the print lines for stripe-free printing, which Epson calls microweaving. These processes can only be controlled and used if the printer driver takes account of them right from the start in the colour processing. Printers with such techniques can also be driven by simpler drivers, but they will not achieve their maximum print quality. In the standard setting Turboprint always aims for a printout which is true to the screen. The sliders provided can be used to alter the brightness, contrast, colour saturation or colour cast.
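The principle of error diffusion can be shown with a one-dimensional toy version; real rasterers, including Turboprint's own algorithm, distribute the rounding error in two dimensions:

```python
# 1-D error diffusion sketch: each pixel is thresholded to "ink" or
# "no ink" and the rounding error is pushed to the next pixel, so a flat
# mid-grey comes out as a scattered dot pattern of roughly 50% coverage.
def error_diffuse_row(row):
    """row: grey levels 0..255; returns a list of 0 (white) / 255 (ink)."""
    out = []
    error = 0.0
    for value in row:
        value += error
        dot = 255 if value >= 128 else 0
        out.append(dot)
        error = value - dot            # carry the residual forward
    return out

print(error_diffuse_row([128] * 8))    # alternating dots: half carry ink
```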

Handling Turboprint

The core of the Turboprint package is /usr/bin/tpprint. The program can be addressed from the command line using the following syntax:

tpprint -ddrivername [Options] \
    Input [Output]

but it is controlled more easily via the Turboprint configuration programs Xtpsetup and Xtpconfig. Both programs also come in a command line version: Tpsetup and Tpconfig. The tasks of tpprint and its printer configuration can also be done by Kups. Turboprint's own drivers then appear in the printer selection list of Kups with the addition Turboprint. Tpprint starts, whenever any print job is done, from /usr/share/turboprint/lib/tpfilter.

Costs

Originally the manufacturer had intended to make Turboprint available to the Linux community free of charge and to have the driver development financed by the printer manufacturers. But so far, most manufacturers have not been won over to such a commitment to Linux. At EUR 19.95 (about £13), though, the licence costs are very reasonable. Anyone who shies away from spending even this amount can obtain the free version 1.4 from the Turboprint homepage, which at the maximum resolution and with special paper places an advertising logo of Turboprint on every printout.

Conclusion

The old Linux print system comes to grief on today's colour printers for lack of specific driver control. The new CUPS, through its ability to read in Windows PPD files, counters the printer manufacturers' lack of commitment to Linux in a sophisticated way. This is also an advantage when administering network printers on the basis of the IPP protocol. But it is only in combination with Turboprint that the full extent of its capability unfolds, as Turboprint, with its True Match colour correction system, is finally lending colour and photo printing under Linux the same professionalism as under Windows. And the whole thing is easy to configure. From the point of view of Turboprint, though, CUPS serves as an easy and welcome vehicle for printer administration which is the same throughout the system, even on a network. Excellent print results are also achieved by Turboprint without CUPS, on the basis of the old print system. Which is why this economical software should be part of the obligatory kit of every Linux user with a modern inkjet printer. ■

Info

Turboprint homepage: http://www.turboprint.de
PPD files from Adobe for Postscript printers: http://www.adobe.com/products/printerdrivers/winppd.html
Site of Grant Taylor with CUPS-O-Matic and Cupsomatic: http://www.linuxprinting.org
List of printers supported by Turboprint: http://www.turboprint.de/printers.html



KNOW HOW

QT

ON THE QT JONO BACON

When the KDE project was started by Matthias Ettrich, he needed to decide which graphical toolkit to base the environment on. There were several options available, but one of the most capable toolkits was Qt, from Trolltech in Oslo, Norway. One of the great things about Linux is the sheer number of options and choices you can make; there are often multiple solutions for a single problem. An example is windowing environments: we have KDE, Gnome, fvwm, AfterStep, Enlightenment and so on. One of the most popular windowing environments on Linux and UNIX-based systems is the K Desktop Environment (KDE). KDE has been around for approximately five years and has developed into a mature project with hundreds of developers and some very competent, stable releases. A graphical toolkit is a collection of buttons, checkboxes, scrollbars and other facilities for writing graphical software. There are a few toolkits available, but Qt is one of the most capable, if not the most capable. From those early days five years ago, Qt has developed into an incredible piece of software, with thousands of users around the world on a number of platforms and operating systems. This series of articles is going to look at Qt, what it can do and what kind of software you can write with it.

Getting to know Qt

To start with, it is a good idea to get a grasp of what Qt can do and what facilities are available, so in this issue we will look at these features and get a good grounding in Qt's facilities. Firstly, Qt is a very portable toolkit. Qt has been ported to the following platforms:
● Linux (all major distributions)
● UNIX (all major UNIXes)
● Microsoft Windows
● Mac
● Embedded devices (Cassiopeia, iPAQ etc)
There are also some other ports, so Qt always remains a portable toolkit. Not only is Qt portable, it is licensed to your needs in a very flexible way. Qt is available in three versions: Free, Professional and Enterprise. The free version covers Qt for X11, Embedded (and recently Windows). The free versions for X11 are licensed under the GPL and include full source code. The free version of Qt/Windows is binary only and does not contain the source code of the toolkit. For those wishing to write closed source or commercial software there are the Professional and Enterprise editions, which are licensed differently to enable this. This licensing scheme makes Qt a very flexible toolkit, and as it is maintained by a company full time it always remains competent, competitive and, in our case, free.

What can it do? In these articles I am going to be focusing on the newly available Qt 3.0. There are previous editions, such as the 2.x.x and 1.4.x series, still floating about, but I suggest you use the most current version available. Although at first glance Qt looks like a number of

Qt Designer main window



widgets (controls) that you can use to develop software, it is much more than that. Qt not only offers a number of pre-designed widgets, but also the capability to roll your own. Qt also includes a number of classes for managing data such as strings, numbers, vectors, linked lists, stacks, XML, DOM trees etc. On top of this Qt offers a number of networking features, support for multiple image formats and international text support. Qt does not stop there; there's also the following:
● Multiple monitor support – Qt allows applications to utilise multiple screens. On UNIX, this supports both Xinerama and the traditional multi-screen technology.
● New component model – Qt provides a platform-independent API for runtime loading of shared libraries and access to their functionality using a COM-like interface concept.
● Support for the latest evolutions in GUIs – Qt supports the docking/floating window concept of modern, complex GUIs. It also adds a GUI control for interactive editing of rich text.
● Regular expressions – Qt 3.0 features a new and powerful regular expression engine, greatly simplifying complex text manipulation operations. The syntax is compatible with, and as powerful as, Perl regular expressions, while at the same time including full support for Unicode.
● Accessibility support – Qt controls provide information for accessibility architectures, so that visually or mobility-impaired people can use applications written in Qt with the standard tools provided (eg the Windows Magnifier and Narrator).
● 64-bit safety – The emerging, next generation of 64-bit hardware is supported by Qt 3.0.

KNOW HOW

QCAD application written with Qt

C++ support

Qt itself is written in, and natively supports, C++. You don’t need to hunt far on the Net to find a raging debate on whether C or C++ is better for GUI development, but it is acknowledged in a number of places that C++ is inherently better suited to it. Qt is a good example of a well-crafted toolkit built on C++ and object-oriented paradigms. In the coming months I will be giving tutorials on using Qt, and some C++ knowledge is assumed. If you are unfamiliar with C++, there are a number of good books and tutorials available on the Net.

Database Programming

Qt 3.0 includes a platform- and database-independent API for accessing SQL databases. The API has both ODBC support and database-specific drivers for Oracle, PostgreSQL and MySQL, and custom drivers may be added. Database-aware controls that provide automatic synchronisation between GUI and database are included in Qt 3.0. Qt Designer has full support for these new controls, resulting in a RAD solution for database applications.

Qt Designer

Qt includes a graphical interface building tool called Qt Designer, which can be used to build dialog boxes, interfaces and more. Qt Designer is an important component in your software development with Qt and can save a lot of time in creating your software. Qt Designer also has support for KDE widgets if desired, and can be compiled with this support. I will be covering the installation of Qt in the next issue.

cRadio for controlling PCI Radio Cards

Internationalisation

If you want to write software in multiple languages (natural languages, not programming languages), then you are going to need to translate strings of text across your programs. Trolltech has developed Qt Linguist to assist in this process. Qt Linguist provides a number of features for making your programs more internationally aware.

Uses for Qt

Those of you new to Qt may be reading this article and wondering just what kind of software you can write with it. Well, basically, you can write any kind of software that you need to. Qt provides most of the visual GUI elements that you will need, and it also provides the backend and internal facilities. Remember that Qt is a commercial product with a free edition that is not restricted or cut down. You are getting an industrial-strength toolkit for free.

MuX2d music typesetting package for TeX

Documentation

One of the great benefits of Qt as a development tool is its incredible documentation. Qt includes documentation on all of its classes, as well as a tutorial, information on Qt modules and more. This documentation is essential when coding, for looking up method names and details. Trolltech has built an application to support it called Qt Assistant. Qt Assistant provides an interface to the documentation shipped with Qt, and also enables more documentation to be added.

KDE Integration

Many people are starting to use KDE as their standard interface for Linux. This has been due to a number of solid KDE releases, an easy to use interface, and good quality application software. KDE itself is written in Qt. This is a major benefit, as it means there is a massive wealth of code already written for KDE available as a reference. KDE has also built extensive features on top of Qt, and turning a Qt program into a KDE-aware program is a nominal job. This means that by learning Qt, you are also making yourself skilled enough to write software for one of the most popular desktops for Linux.

Where to now?

Qt GUI Designer running in a Solaris environment

Now that we have had a brief look at Qt, we will move on to writing some software with it. This will begin in the next issue, but before then there are some preparations I suggest you make. Qt is written in C++ and uses C++ as the language in which Qt software is written (although there are unofficial bindings for other languages). If you are unfamiliar with C++, I suggest you take a look at it and get to grips with it. It is a powerful language and needs some practice to use properly. There are plenty of good books and tutorials available to get you started. Qt is a powerful, flexible toolkit for professional-grade software development. Qt has been written from the ground up as a capable toolkit for the development of free and commercial software, and using it is very satisfying. Next issue we will begin using Qt to write some software. I will hopefully see you then! ■


SENDMAIL

INTERVIEW

Eric Allman

EMAIL ARCHITECT

Eric Allman wrote sendmail in the early eighties so that he didn’t have to administer lots of user accounts. Linux Magazine talked to him about this seminal coding and its development over the last 20 years.

Linux Magazine: In 1981, when you first developed Sendmail, did you expect email to change the world?

Eric Allman: Yes and no. I was an email lover from the first time I used it and thought it absolutely wonderful. I thought everyone who was in science or academia and into technology would be using it, so yes, I did think it was important. But I didn’t predict that everyone – even my mother – would eventually be using it. It’s now at the stage where people go out and buy a computer just so they can use email, so no, I didn’t envisage it becoming that pervasive.

LM: How is Sendmail the company coping with giving away the source code that it has spent so much time developing?

EA: As a hybrid company we do not give everything away. The core technology is Open Source, and that is the right thing to do. There are lots of cases where companies have taken over Open Source and both the Open Source part and the commercial part die. We are well known and respected, and that would be harmed if we were not Open Source.

LM: Would you agree that giving away the source encouraged innovation?

EA: Absolutely. The Open Source community of developers is small, and the group that contributes back is smaller, but it is still significant. Even if it is just a bug report, it helps to move the software forward.

LM: Has the initial business plan that you developed with Greg Olson worked out well?

EA: Again, yes and no. The original concept was mass sale, with the intention of converting one per cent of the Open Source user base to commercial customers at between one and ten thousand dollars each. That failed because of resistance due to the “It already works, so why change?” attitude. What has worked well is selling to people who have problems. They want high-end technical support. They want everything, including the design of the whole email system.

LM: Is the hybrid business model a good commercial success?

EA: It is, in fact, working very well for us. It seems to marry the benefits of Open Source – innovation – with traditional business. This has been useful in financing rounds. The general IT downturn has affected us, but less so than lots of companies.

LM: How do you see email developing?

EA: It was fairly obvious that email was the killer application for the Internet. The Web may pull people in, but email keeps them there. Email will continue and we will get low-cost access for the world. Digital signatures will become more widely accepted, but overall it will become bigger, faster and there will be more of it.

LM: Although you support Linux and NT, do you still use BSD at home?

EA: Yes – FreeBSD.

LM: Do you see Lotus Domino and Microsoft Exchange as a serious threat?

EA: No, they’re complementary: Sendmail Inc. is about standards-based email. Many companies use Sendmail as their Internet gateway into the company and then run groupware behind it. Microsoft sees us as a competitor; less so Lotus. We have a good partnership with IBM.

LM: What have been the benefits and downsides of having partnership programmes rather than adding the content yourself?

EA: We cannot do everything. We are not a virus filtering company and never will be. But rather than say we cannot do this, we have partners. IBM is a partner, and so we both win. The downside is sharing revenues.

LM: Do you see a market for policy-driven content management, such as the removal of spam and Internet blocking?

EA: Absolutely – this will be one of the biggest areas for growth. As the Internet has become bigger, it is no longer possible to say “tut, tut” to someone when they have made a mistake. Filtering of outgoing mail for viruses and legal liabilities, such as sexual harassment and customer lists, will become necessary. So will the use of encryption for financial and medical matters. ■


REPORT

BEER HIKE

Bouillon, Belgium

LINUX BEER HIKE 2001 RICHARD IBBOTSON

The Linux Beer Hike took place at Bouillon in Belgium this year. This might not seem like an obvious place to go, but the facilities and scenery certainly make up for it.

Coupled with six days of sun and only two days of rain, it made this year’s event one of the best so far. A large crowd of people assembled over at the Archeoscope at Bouillon. Much beer was consumed and most were doing their level best to keep up with everyone else. The 10th anniversary of Linux was celebrated in some style. If only Linus Torvalds could have been there with us. The Friday and Saturday at the beginning of the week saw a whole army of geeks arrive in Bouillon from all over the world. In the seven days that the Beer Hike took place, much of the local area was covered on foot and much of the local beer was consumed by the beer-crazed fans of da penguin. The author himself was able to find a “leetle more room for ze local Belgian ospitalite” even after a whole seven days of partying (mostly with the help of local people who did not give in even after midnight).

Bouillon town

At the start of the week we had to assemble a network, which included the cluster and the wireless link up to the campsite. The latter proved to be something of a pain to set up and dragged on into Monday. People were seen standing around in groups discussing the difficulties and what to do next. As well as this, the cluster kind of clustered on and the rest of us shoved off for more beer, which at eight per cent volume was not the kind of thing to be missed out on. One of the first social events was over at the bar next to the campsite. Most people turned up and the first overly large quantity of beer was consumed. The Beer Hike isn’t only about beer – there are also many events organised for everyone during the seven days that it runs. Willem Konynberg started this off with his presentation, which asked how we can make Linux better and why it needs to be so much like a Microsoft product when there’s no real need for it. He says that he wanted to raise a few eyebrows and question current thinking about computers and software in general. The assembled crowd loved it and gave him a big round of applause. On Sunday evening, after a successful day of hacking at the keyboard, most of us went on the torch-lit tour of the castle. This was something of a gem from beginning to end. Very touristy, but also well thought out and not to be missed. It was all about how the crusade was led from the local castle in ancient times and how the political shape of Europe was changed as a result. It’s well worth going to Bouillon just for this trip alone. The tour was organised by Juergen Braukman. We would all like to thank him for his efforts. On Monday we had Writing PHP Extensions from Hartmut Holzgraefe and IP Networking from Andy Fletcher – both proved to be really popular. At about the same time the trip round the caves of Han-sur-Lesse took place. Most people said that they enjoyed it. There were more talks and presentations in the afternoon for those of us who could just about manage to hold our heads up after the night before.
On Tuesday, Lance Davis from UKLinux.net took a group of people out for the rail hike. Everyone who went said it was really good. I saw Lance when he came back. He looked about 10 years younger and smiled all the time that I talked to him.

Hacking away

Wednesday turned out to be the Perl and Apache day. By now we were all suffering the effects of the beer and the heat. Some of us just dozed off. Thursday and Friday turned out to be the rainy days, when Brad Knowles took people along to two very famous abbeys to look at the Chimay brewery and some of the others. Brad was distracted by others on Thursday morning. They left an hour late and were only able to see the grounds of the abbey and not the brewery. Thursday was Trappiste Monasteries, Chimay, and Friday was Trappiste Monasteries, Abbaye d’Orval. Everyone said they enjoyed it, though, and more beer was consumed on Thursday night. Brad also gave us an Introduction to Basic DNS Administration on the last Friday of the week, which was fascinating from beginning to end and led to some well-informed discussion afterwards. Later on he gave us Design and Implementation of Highly Scalable Email Systems. He has given this talk in many places and it never gets boring or stale, however often he presents it. I think that Brad probably did more for the beer hike than we realised at the time. One of the little-noticed highlights of the week, other than the construction of the internal LAN and cluster, was the installation of QPE onto a Compaq iPAQ palmtop.

Alban Pearce from Salford University and myself spent a whole six days installing and configuring his iPAQ. It was well worth it. The talk that he gave about how to install such a beastie with a cut-down version of Debian GNU/Linux was brilliant, and the demo MPEG sent across the LAN to his handheld iPAQ finished everyone off as, once again, Bill Gates got a custard pie in the face. He even replayed it a few times by popular request. John Hearns helped him with the presentation. Together they make a fine team. Have a look on the Net for the Familiar project if you want to know more. The final discussion on Saturday afternoon, after clearing up, was all about where to go next year. There have been many suggestions. The Saturday night session can only be described as head-banging. The venue was the Cafe Baratin, and much piano music was heard while some of us browsed some very nice oil paintings produced by a local artist. Excellent food as well. The locals thought we were great and they want us back again. All of us are waiting to find out what comes next year. If you want to join in with the rest of us, please do subscribe to the Linux Beer Hike list. You can join the list by sending mail to majordomo@lists.cyberware.co.uk with the word “subscribe” somewhere in the message. We do hope that you will all be able to come along to the next one. ■ Richard is Chairman and Organiser for Sheffield Linux User’s Group – you can view their website at http://www.sheflug.co.uk

Info

Linux Beer Hike: http://lbw2001.ynfonatic.de/
Webcam site: http://killefiz.de/lbw/online/index.html
The Familiar Project: http://familiar.handhelds.org/
UK Linux: http://www.uklinux.net



KNOW HOW

APPLE MAC

RUNNING LINUX ON MAC JASON WALSH

Anyone with an interest in computing, especially in UNIX-based OSs, can hardly have failed to notice the hype surrounding the release of Apple’s Mac OS X. OS X is a whole new ball game in Macintosh computing.

Mac on Linux

For many years Apple have been trying to find a suitable replacement for the sophisticated, but rapidly dating, Mac OS. After a flirtation with Jean-Louis Gassée’s BeOS, the return of Apple co-founder Steve Jobs precipitated a shift of focus to NeXT technologies. NeXT was the company formed by Jobs following his unceremonious eviction from Apple by the then CEO, former Pepsi man John Sculley. NeXT manufactured the famous 680x0-based black hardware, which offered performance beyond that available on Macs and PCs of the time, but never really found a commercial base for the machines and instead opened up their famous NeXTSTEP OS. It is this OS that is the basis of Mac OS X. Reworked and ported to the PowerPC platform, NeXTSTEP has become Darwin – the open source kernel of Mac OS X. Including key technologies such as the Mach kernel and having full POSIX compatibility, Darwin/OS X does for UNIX what NeXT tried to do in the early 1990s: it combines extreme power with ease of use.


There is a problem, however: the entry price is steep. You must have at least a G3 processor, and 128MB of RAM is required to do any serious work. If your Mac is pre-G3, your options for running Mac OS X are limited to expensive processor upgrade cards, and even then Apple will not guarantee support. This is where Linux comes in. There are versions of Linux for nearly every Mac made in the last 10 years, even 680x0-based machines. Over the next few years, software for the ‘Classic Mac OS’ will dry up as users – even in the publishing industry, famed for its inertia – switch to OS X. Where does that leave your once-prized Mac? Many people are now booting Linux as a second system on their Macs. Old Power Macs such as the 7200 make fine firewalls and small office servers, and if configured properly can actually make great desktop machines. iMacs make good, small and inexpensive Linux desktops, and if aesthetics are not a concern for you, you can add a second IDE hard drive quite easily, though it will hang out of the back of the machine. The G4 PowerBook is probably the best Linux-capable laptop on the market. The problem, as always, is user-level software. Despite the fact that the Mac absolutely dominates the creative industries and some areas of scientific computing, it is a minority platform. Linux on the Mac, or indeed on any PPC system, is a minority within a minority. However, things may change very soon; let’s look at why. First off, OS X has an open source core in the form of Darwin. Darwin has even been ported to x86 systems. This has allowed a lot of standard UNIX applications to make their way to OS X and then to PPC Linux, and vice versa. GUI applications may take a little longer, but work is already underway. There are several X Window servers for OS X, including open source projects and Tennon Intersystems’ excellent commercial effort. Secondly, Apple and Adobe are currently having a lovers’ tiff.
These two companies have been codependent for years, and despite the availability of Adobe Photoshop on other platforms (notably Windows and IRIX), it has remained one of the three or four ‘killer apps’ on the Mac. However, recent encroachment by Apple into Adobe’s territory with video editing software such as iMovie and Final Cut Pro has hampered sales of Adobe Premiere. To add insult to Adobe’s injury, Apple is soon to launch a bitmap editing application at the low end of the market, which will spell problems for Adobe’s recent Photoshop Elements. Adobe has decided to boycott the upcoming Macworld Expo in response. This dispute is unlikely to go on for long, as both companies’ fates are increasingly tied together, but for the meantime there is no OS X-native version of Photoshop. As a result, development of the PPC version of GIMP has stepped up, and promises have been made about CMYK support. GIMP on OS X is, however, a kludge; running it under PPC Linux is much more pleasurable. Linux is free – as in ‘free beer’. Yes, despite the hoo-hah about free meaning open, the fact remains that Linux has many more no-cost usable small application programs, such as MP3 players, calendars, PIMs and so on, than any other OS. This may seem relatively unimportant, but if money is a concern for you or your organisation – and let’s face it, if you’re using a Power Macintosh 7200 rather than a G4, it probably is – then a highly configurable OS which you can alter to suit your machines, with a large selection of simple productivity applications, is likely to be of interest. Linux is ugly compared to the Mac OS. Many Linux users dispute this and can come up with a hundred and one reasons to say otherwise, but from the perspective of user interface, Apple have consistently got it right where others, notably Microsoft, haven’t. That said, KDE and Gnome have come on in leaps and bounds, and though I still prefer other GUIs such as those of the Mac OS, BeOS and, of course, OS X, they are a pleasure to use compared to their predecessors.
Add to this the fact that with the great Mac-On-Linux application the Mac OS can actually be run in an X Window and it’s dream come true time. If you think that this sounds awkward, remember that BeOS users have been doing it for years with SheepShaver and that Mac OS X itself uses similar technologies for running classic Mac applications. ■

Mac Linux resources

LinuxPPC site: http://www.linxppc.org
MK Linux site: http://www.mklinux.org
Mac On Linux site: http://www.maconlinux.com

Related non-Linux resources

Tennon Intersystems: http://www.tennon.com
Apple’s Mac OS X site: http://www.apple.com/macosx
Apple’s official Darwin site: http://www.opensource.apple.com
Be Inc: http://www.be.com
GNUStep: http://www.gnustep.org



KNOW HOW

GIMP WORKSHOP

Image processing with Gimp: part 6

GIMPRESSIONS SIMON BUDIG

Gimpressionist is a flexible plugin by Vidar Madsen which enables the user to turn ordinary images into works of art. Linux Magazine takes a closer look at this handy tool. One of the classic applications in image processing is that of adding a hand-painted touch to photos. Every program has at least one function for making photos into oil paintings or suchlike. Gimp is no exception to the rule – the relevant plugin can be found under <Image>/Filters/Artistic/Oilify.... Now you may be wondering why we don’t sound especially enthusiastic. There is a simple reason for this: It’s just boring. Great, I can turn my picture into an oil painting – but anyone can do that with any old program. The results from Gimp’s filter are not particularly wonderful either. The true reason for my boredom, though, is that there is something a great deal more exciting.

Curtain up for Gimpressionist

With Gimpressionist, Vidar Madsen has written a plugin that can be used very flexibly to turn images into works of art. The basic idea is simple: the image is reassembled from small paintbrush images, which can adapt to the image. Gimpressionist is one of those plugins with so many parameters that it is necessary to save any especially effective combinations so as to be able to load them in again later. Start <Image>/Filters/Artistic/GIMPressionist; the default settings are the first thing you see (Figure 1).

Figure 1: Gimpressionist – the starting place

Figure 2: Paper tiger

To make use of them, click on the name of a combination – for example Dotify – and then on Apply. Now the default settings are distributed over the individual index cards. Since Gimpressionist does not work especially quickly, you should ask for a preview by clicking on Update. In the little box, you can roughly assess the effect. With a click on OK the image is then edited. The default setting Dotify has the effect of making the image look like a confetti mosaic (Figure 2). In order to show the ideas behind Gimpressionist, we will now convert the effect step by step into a sort of chalk drawing on a wall. The wall is a sort of structure onto which the whole image is to be stamped. The setting for this can be found on the Paper tab. Select the “paper” bricks2.pgm. To be able to see the effect (this was not required for the confetti), drag the Relief slider up to about 70. The structure of the wall becomes visible after a click on Update. On the Brush tab you will now see where the round basic pattern for the confetti came from. We would rather have chalk strokes, so select the paintbrush chalk01.pgm. If you have the preview drawn again, the preview image is black apart from the structure of the wall. The paintbrush is drawn a bit too small, so change to the Size tab and set the minimum and maximum size to 30. In the preview, the image becomes visible again. Obviously it is not desirable for all the strokes to run in the same direction. To make up for this, change to the Orientation tab and under Start angle select 360 degrees and under Directions about 10. Now Gimpressionist can turn the paintbrush from the starting angle in 10 steps through 360 degrees and adapt it to the image. This adaptation can work according to various criteria, and one nice option is that of following the rough contours in the image (Adaptative). Obviously, the more Directions are available, the more precisely Gimpressionist can adapt the paintbrush strokes. Now you have probably noticed the large black marks, which are due to the Random Placement strategy. With Evenly distributed, the black marks are less common. The size can also be adapted to the image. If, on the Size tab, you set the minimum size to 15, the maximum size to 30 and (similar to the directions) the number of sizes to 10, Gimpressionist has various sizes of paintbrush at its disposal. Here, too, it is possible to define a selection strategy; Adaptative orients itself to the structures in the image. Be aware that with many directions and many sizes the computing time rises steeply, and the result could keep you waiting a long time. You should avoid combining the maximum size setting of 30 with the Adaptative strategy. Now we have a tiger which has been painted onto the wall with chalk (Figure 3). To get a feel for the options, you should try out the various default settings and then see which settings have been made to achieve each effect. With a bit of creativity it is possible to turn an image not only into an oil painting, but also into other – jollier – things.

Figure 3: Tiger on the wall

Animation

In part 3 of the Workshop we asked you to submit suggested topics for a continuation of the series (to sbudig@linux-user.de). The most frequent question was how to create GIF animations with Gimp – or, somewhat more generally – how Gimp copes with films. To answer the last question first: Gimp is not the right tool for editing films several minutes in length. You can certainly read in films using the plugins under <Image>/Video/Split video into frames, but they are then stored on the hard disk as uncompressed individual images – anyone who doesn’t have terabytes of space to spare will very soon run out. When it comes to making more complicated animations, you will notice that Gimp is not the ideal tool. The method of showing individual images as layers is a fast hack, to enable GIF animations without overwhelming the internal data structures. In particular, one no longer has the option of working with layers within animations. Especially in animations where objects are moved back and forth, and may thereby overlap each other, this would be very useful. This is where the plugins from the domain of <Image>/Video/... come in. But even with the classic methods it is possible to make GIF animations with lots of effects.

GIF in motion

GIF is still the only format for animations supported by almost all Web browsers. This is why, despite the licensing problems concerning LZW compression, it is still popular in Web design. You have to have a licence from Unisys to be able to legally distribute GIF images produced with Gimp on the Internet. For this reason, in many Gimp packages the GIF (and TIFF) plugin is not installed as standard. You must then install another package (gimp1.2-nonfree) to be able to create GIF images. The same applies to the Windows version of Gimp; you can find out more on this at http://www.gimp.org/win32/. Animations are, in reality, nothing more than a collection of images shown one after the other. To this extent it seems obvious that Gimp should save the individual images of an animation in layers. If you load any old animated GIF from the Web into Gimp (you can simply give a URL as the filename; Gimp then downloads the image from the Net using wget), you will see in the layer dialog that the individual stages of the animation are visible. How long an individual image is visible is something you can determine from the name of its layer. If it is called layer (500ms), the image will be shown for half a second. Bear in mind that only ms (milliseconds) is permitted as a unit. If you do not specify a period, a standard period will be asked for when the finished image is saved.



A big yawn

One fast method of creating an animation is the Iwarp plugin, which you will find under <Image>/Filters/Blur/Iwarp. With this plugin you can distort images freehand, in a similar way to the Goo programs. It is especially easy to create caricatures of faces, by exaggerating distinctive facial features. You can displace image areas, blow them up, shrink them and turn them clockwise and anticlockwise. With Delete you take an image back to its original condition. The two sliders define the size of the affected area and the intensity of the effect. With the mouse you can then distort the preview image; you will get the hang of it after a few tries. The Iwarp plugin also has the option of creating an animation from the undistorted to the distorted image (and back again if you like). To do this, simply click on the Animation tab and select the number of intermediate steps. With Reverse the animation goes not from the original to the distorted image but – surprise – the other way round. With ping-pong, after distorting it animates back to the original image. Using the tiger image from last month, we have made our tiger yawn using this method (Figure 4). Maybe you can even get your mother-in-law to grin... You can view the finished work of art using <Image>/Filters/Animation/Animation playback. The plugin is easy to use, and there is also another neat trick here: you can click on the display of the animation and drag it out of the window. This is especially practical if you want to quickly assess how the finished animation would look in a website, and do not feel like faffing around in HTML code. Simply drag the animation over the Web browser; it disappears again when the window is closed. Keep the Layers & Channels dialog open all the time when you are working with animations. It is a very useful tool for quickly duplicating a layer, changing the sequence of the layers (and thus the order of the individual images) and combining two layers with each other. As a little example, we can make a little text appear.

Figure 4: Giiiiimp!

Figure 5: Sorting the layers

Fading in text

Create a new image, 500x100 in size. Select the text tool and create the text which is to appear, in a layer of its own. Now duplicate the background and the text layer 10 times each. Using drag and drop you can sort the copies of the text layer between the copies of the background layers (Figure 5). When you play the animation back, you will see a flashing white text against a black background. But we would rather have a text which fades in, so using the slider in the layer dialog, give the text layers opacity values between 0% and 100%, in 10% steps. For technical reasons you cannot see this effect yet in the animation preview; in principle it is still a flashing text. To get rid of this, merge each pair of sequential layers. To do this you must click the mouse to activate the text layers one after another and press Shift+Ctrl+M to trigger the Merge down command (Figure 6). Now our text fades in gently. We can now save this image as an animated GIF. Simply specify a filename ending in .gif. The export dialog will appear automatically (Figure 7), informing you that several layers can be combined before saving – but we don’t want to do that in this case. So click on Save as animation. Since the GIF plugin only supports indexed colours, the image is automatically converted into such a format. Then click on Export. In the dialog which appears, just click on OK; the default settings are reasonable. If you now look at this image in Netscape, you will see the effect.

Figure 6: Merging layers

Web design

GIF animations are sometimes a nice enhancement for a website – but if they are used to excess and there is something flashing and moving wherever you look, visitors will be put off. Please be sparing with the use of GIF animations. Sometimes a small effect is much more effective than all that flashing. For example, on Slashdot after a report on Gimp there are always astonished comments that the eyes of Wilber (the Gimp mascot) can move – and yet one had never noticed it before. They do actually move, by one or two pixels, and that will never change. But a Wilber who rotates about his own axis, changes his colour and at the same time hops up and down would never trigger this Aha! effect.

Size matters

The other thing you should bear in mind is the size of files. GIF animations can become very large and drastically increase the loading time for a Web page. If the animation is designed well right from the start and some effects are made slightly differently, you can save a lot of space. The animation we have just created (Figure 8) is about 34KB in size; over an ISDN connection it would take five seconds to get onto your home PC. But since it is only one second long, it will run too slowly and be jumpy. In order to reduce the size, you should do two things: firstly, index the image by hand (as few colours as possible and, if possible, without dithering) and then select the menu item <Image>/Filters/Animation/Animation optimize. This command tries to remove redundancies from


the image and to exploit a couple of special features of the GIF format, in order to save a bit more space. This is especially worthwhile when large areas remain the same from one image to the next and therefore do not have to be saved again. In our case, we have got it down to about 24KB. But the main problem with file size is home-made. Since our text fades in slowly, many pixels change colour from one image to the next. It would be better if only small areas were to change each time – and then less image data would be transferred, too. As an experiment I have redesigned our animation so that it appears letter by letter. This means that only a small area ever changes from one image to the next. If you want to do this, it is worthwhile starting with the full text, making a copy of the layer and then deleting one letter. Repeat this process until the text is blank. The result looks something like Figure 9. Once the image has been indexed and optimised it is just 4KB in size. So the slight adjustment to the animation has certainly paid off. Generally it is possible to say that movements and fading in take up more space than the appearance of parts of an image. Obviously there is a lot more to discover, but that’s enough for this time. Have fun! ■

Figure 8: Fading in text – large files

Figure 9: Fading in text – small files

The author

Figure 7: The relevant dialogs for GIF animations

Simon Budig is now battling with compiler construction. That’s why there are no philosophical comments this time. A parser is an algorithm, formal proof of...



KNOW HOW

INSTALLATION

Installing Open-Source Software on Linux

GOING TO THE SOURCE CHRIS BROWN

Sometimes we need to get new programs for the Linux system, but installing them can be a chore. This month we look at installation from the source.

Installing from source code

The idea of installing from source code might seem daunting – all those nasty curly brackets and stuff. Isn’t this option out of the question for the non-programmers amongst us? No, certainly not! Installing from source usually does not require you to modify, understand, or even look at the actual source code at all. However, since most of the software is written in C or C++, it does require that you install the C/C++ development tools on your system. Most of the established open-source sites use a format known as a tarball (also known as a compressed tar archive) to package source code for distribution. These files usually have names ending in .tar.gz. As an example, we’re going to take a look at installing the latest version of the Apache Web server from source. The process is much the same for other packages. As I’m writing this article, the most recent version of Apache I can find as an RPM is 1.3.20; however I know that there’s a beta version of 2.0 available. We’ll try the obvious place – www.apache.org. Sure

Figure 1: Finding files

enough, after a couple of minutes poking around I find a listing of files available for download (see Figure 1). The tarball I need is called httpd-2_0_16-beta.tar.gz. The file below it in the list is a PGP signature for the file, so I can be sure it’s authentic. I decide to download the tarball to my home directory, /home/chris. The next thing to do is to uncompress and unpack the archive. With the right switches, tar can do both of these in one step:

$ cd /home/chris
$ tar xzvf httpd-2_0_16-beta.tar.gz

You’ll see a long list of the files as tar extracts them from the archive and puts them into the subdirectory httpd-2_0_16. If you look in that directory you’ll see some documentation files with names like INSTALL and README, which you’ll probably want to look at.

Now it’s time to build the software. Not many years ago this typically involved quite a bit of fiddling around to customise the build process to your platform, following instructions in the INSTALL file which said lots of intimidating things like “if you don’t have the library libfoobar.o, add the flag -DNOFOOBAR to the CFLAGS macro definition in the makefile”. Nowadays this customisation is mostly automatic thanks to an amazing tool called autoconf from the Free Software Foundation. Autoconf is used by the package developer to create a script called configure which is included in the tarball. The configure script performs lots of tests on your system and builds a makefile depending on what it finds. By the way, I do not recommend actually looking at the configure script; like most automatically generated code, it’s not a pretty sight. If configure finds that necessary components are absent from your system, it will tell you what’s missing and abort. In any event, you’ll see a long list of all the tests that the configure script is making scroll by. If all goes well, you end up with a makefile to control the build of the software.



There’s not space enough here to talk about makefiles and the make command in depth. Suffice for now to say that the makefile specifies what files need to be created, which files they need to be created from, and what commands are needed to do the job. The make program interprets the makefile and runs the necessary commands. Usually all you need to do is to run make with no arguments. This will compile and link the programs that make up the package, and may take some time depending on the complexity of the package and the speed of your computer. Time to go and top up the bird feeders with peanuts perhaps. If configure ran successfully, the make is unlikely to fail. Now you have the compiled version of the package. Note that everything so far has been contained within the directory you did the build in – in our example, that’s /home/chris/httpd-2_0_16. If I were to empty and remove this directory, and delete the tarball from my home directory, I would remove all traces of the package. The final step is to install all the pieces into the correct places in your system. This might be as simple as putting the program into /usr/bin for example, but will typically also install documentation and maybe some configuration files. Because this operation updates system


directories, it must be run as root. This operation is also automated via entries in the makefile and the command is simply ‘make install’. In most cases, that’s all you’ll need to do. It takes longer to explain than to actually do. In summary, the sequence of commands is usually:

<download the tarball to /somewhere>
$ cd /somewhere
$ tar xzvf package_name.tar.gz
$ cd package_name
$ ./configure
$ make
(su to root)
# make install

Once the package is installed you can recover some disk space by deleting the directory you unpacked the tarball into – for example:

$ rm -rf /somewhere/package_name

Installing new software onto Linux isn’t hard and doesn’t require any programming skills. It’s an excellent way of expanding your system and keeping what you have up to date. And of course, it’s free! Happy hunting! ■




MIGRATION

Customising the desktop with Control Center

TAILOR-MADE DESKTOP ANJA M WAGNER

Working with a graphical user interface starts being fun once the settings have been adapted to your own preferences, so that the interface makes working easier.

You can also click on the Gnome icon on the left and select Program/Configuration/GNOME Control Center.

The Control Centers for Windows and KDE differ in terms of their appearance. KDE divides the screen into two halves: the configuration areas are listed on the left, the relevant dialog window appears on the right. The elderly gentleman in billowing robes who welcomes you is called Kandalf, by the way. The categories in the left column can be represented as a tree structure or an icon view. You can change the display mode using the View option on the menu bar. In the following text we will be referring to the tree view.

Figure 1: The Windows Control Panel is accessed via the Start menu

Figure 2: KDE adopts the same route to its Control Center as Windows

Under Windows, the principal configuration options for the desktop can be found in the Control Panel. KDE, probably the most commonly used graphical interface for Linux, takes a very similar approach. You will find the important options for customising your work environment in the KDE Control Center. In this workshop we will show you in KDE how to adapt the settings which you would access through the Display section of the Windows Control Panel. We will be referring to Windows 98 SE and SuSE Linux 7.1 with KDE 2.0.1. The path to the configuration controls is similar for both desktops. Under Windows it is Start/Settings/Control Panel. After installing KDE you will see a panel similar to the Windows taskbar at the bottom of the screen. This is the KDE control panel, often simply called Panel. If you click on the button with the KDE icon at the far left, a menu opens which contains, among many other things, the Control Center. Click on this menu item and you will find yourself at the heart of the controls. The Control Center for Gnome (another graphical interface) can be reached most quickly via the button with the toolbox on the control panel. You can also


Appearances

The visual aspect of the Windows user interface is adapted mainly through the Display option in the Control Panel. Here you can choose wallpaper and screensavers, and configure display options for desktop components such as windows, title bars, icons and menus. In order to do this in KDE, click on the “+” in front of Look and Feel in the tree view of the KDE Control Center. Further sub-items will appear. The first, Desktop, itself contains four sub-items. Under General you can specify some general desktop properties.

Figure 3: Kandalf welcomes you to the Control Center




Figure 4: Basic desktop settings are implemented in the Display section of the Control Panel, for instance the selection of wallpaper

Figure 5: Here you can specify what happens when you click on the desktop with the middle mouse button

What should happen if you click on a free area of the desktop with the mouse? If you preset the option Window List Menu, clicking will open a menu that shows all windows currently open. You can display these tiled or cascading, raise them or change desktops altogether (see below). The Desktop Menu may be more familiar to you under the name Context Menu. Here you find basic actions such as Create new for directories or documents, Paste, Help or commands to rearrange the icons on the desktop. By default, this menu is opened with a right-click. This is true of almost all KDE applications. If you like, KDE could also show the application menu when you click on any free space on the desktop. This normally appears when you click on the K-Button on the panel. In some cases, such as clicking on the desktop, you will need a middle mouse button to use KDE. For a two-button mouse you normally need to press both buttons together to simulate pressing the middle mouse button. Whichever options you have chosen for each of the three buttons, confirm the settings with Apply and test the effect. On the second tab, Appearance, you can vary the font size for the desktop between small, medium and large, as well as setting the standard font and normal text colour. If you have experimented with lots of different options and would like to return to the original settings just click on Standard.

Backgrounds

We continue with the sub-item Background in the left column. Here you will notice a special feature of graphical Linux interfaces. KDE provides four desktops as standard. You can set up a maximum of 16 desktops (see below). If you want to design different desktops, uncheck Common background. In the lower half you can now select a background for each desktop. Select a gradient under Mode if the background is going to have two or more colours. If you would like a pattern, activate the relevant option and click on Setup. KDE offers seven patterns as standard. Confirm your selection with OK and check the appearance in preview. Use the same procedure to activate a program as background. kdeworld displays the Earth’s time zones, updated every 10 minutes; xearth shows the globe as it rotates slowly. Activating background programs does take up a lot of computing capacity and can slow down machines significantly depending on their resources. Under Wallpaper you can deactivate the preset logo and specify how an image should be arranged on the desktop. You can select your own image using the Browse button. To have more than one wallpaper display, enable Multiple, select the images through Setup and determine the order and time interval of their appearance. The Advanced tab allows detailed settings such as blending and limiting the pixmap cache. The

Figure 6: Rather than having an image, you can run a program as desktop background

Figure 7: You can decorate the desktop with your favourite pictures, tiled, blended or alternating




Figure 8: Ease-of-use and plenty of space is provided by up to 16 virtual desktops

Figure 9: Each desktop is represented on the panel by a button

easiest way to find out how these affect the desktop is to try them out. Fortunately you can create many different desktops rather than having to settle on one design. Desktop and window borders can be magnetised. This means that windows that are moved close to another window or to the edge of the screen will snap onto these once they have got within a certain distance – the snap zone. This makes it very easy to position windows. Use the slider under Magic Borders to specify, in pixels, how large the zone in which the magnetism effect works is going to be. Active Desktop Borders are not yet available. As mentioned already in the Background section, KDE starts off by providing four desktops. In the Virtual Desktops configuration section you can define a maximum of 12 additional desktops, which you can customise according to taste. To increase the number of active desktops, use the slider. If you would like the desktops to have names, you can write these in their text boxes, for example Work, Games, Experiments, etc. Each desktop will now appear as a button on the control panel. Small boxes indicate which desktop has windows open. “Why all this effort?” you may ask yourself. If you work with many windows open simultaneously you can arrange these, possibly by subject, on the different desktops. Each individual desktop remains clear, and by clicking on the panel buttons you can quickly jump between desktops. Screensavers are selected using the Screensaver menu item in the left column. First, activate the option Enable screensaver in the dialog window. Set the time in minutes before it starts and apply a password if required. The Priority slider determines how much processor performance is made available to the screensaver. You ought to keep the preset, Low.


Figure 10: Screensavers to suit every taste

Windows combines the screensaver configuration with the adaptation of the monitor’s energy-saving features. This is the same in Gnome. KDE gives the energy settings their own menu item. Select Power Control/Energy in the left-hand column of the Control Center. Once you have enabled the display energy saving mode, use the sliders to set the time after which it should be activated.

More style

You have already been introduced to a number of configuration options for the desktop, but there are more. Themes allows you to design your desktop’s style and icons. KDE offers several different styles. This means that the appearance of windows, icons, buttons, title bars, etc. follows a consistent style. For example, if you choose the style Qt Windows the appearance of your desktop will be very similar to the familiar Windows environment. Some general settings for KDE program toolbars are also applied here. Should toolbar buttons be represented as icons, text or a combination of both, should active buttons indicate when the mouse pointer touches them (by being highlighted briefly), and should the toolbar be transparent when it is being moved?

Figure 11: Saving energy




Figure 12: Change the appearance and style of your desktop

Some surprising effects can be achieved with Colors. You can vary the colours of windows, title bars and menus. After you click on Apply the desktop elements are assembled according to the chosen colour scheme and displayed in a preview. Depending on how outrageous your desktop design is you may see unusual things. Tip: If you have designed very different backgrounds for your desktops, there is an easy way to check the window layout for all desktops. Open a window in any desktop and right-click on the title bar. In the context menu, select To desktop/All desktops. Now all you have to do is to select all desktops in turn from the panel and to check whether the colour and design suits all backgrounds. Another very convenient configuration option can be found under Window Behaviour. If your machine is slow you should disable Display content in moving windows and Display content in resizing windows. Only the frame is now displayed when windows are moved, as familiar from Windows. If there is more than one window open on the desktop, KDE can apply a Smart placement policy so that as much as possible is visible of each window on the desktop. Other options are Cascade or Random. The window focus policy is designed to make things easier; the active window is the one in focus and receives all keyboard input. You can specify the behaviour of the window for mouse clicks or mouse contact. The settings can be combined so that a window is in focus as soon as the mouse moves over it and is raised automatically: select the relevant option from the Focus Policy drop-down menu, then enable the Auto raise option. The raising of the focus window can be delayed using the slider. What should happen when you click on title bars, frames or windows with one of the three mouse buttons is specified under Mouse Behaviour. The default settings are similar to those in Windows. A useful feature is the shade facility, which is

Figure 13: The more outrageous the better

started by double-clicking on the title bar of a window. The open window is reduced to its title bar. Another double-click causes it to expand back to its full size. This avoids cluttering up the desktop when many windows are open at the same time. Under Taskbar you can enable the Show all windows option. This causes all open windows on all desktops to be displayed in the window panel. The window panel is integrated into the control panel and is situated to the right of the desktop buttons. If the option is disabled, only the windows on the current desktop are shown. The second option, Position, has not yet been implemented. The control bar or panel is an essential component of the desktop and can be arranged in a variety of ways. It can be dragged to the edge of the desktop using drag and drop as in Windows. You can also specify its position under General on the menu item Panel. The panel has a small arrow at either end. If you click on this, it disappears, apart from the button with the small arrow. Another click causes the panel to return. It can also be hidden automatically. Use the slider to specify after how many seconds this should happen. The panel reappears when you touch the edge of the screen where it is normally placed with the mouse pointer; by default this is at the bottom. On the Look and Feel tab you will find the

Figure 14: The mouse rules over title bars and windows





Figure 15: Select fonts for different elements of the graphic interface, like window titles or menu entries

Figure 16: Desktop icons can have both size and colour customised.

option Fade out applet handles. If such applets are present on the panel you will see a small area on their left, which looks like it is covered in small rivets. This is the handle. By right-clicking on it you can move buttons and applets or delete them from the panel.

If you switch between desktops you will notice that the panel remains the same. In Gnome you are able to use and configure several panels. Right-click on a free space on the control bar and select Panel/Create Panel. You can insert elements into the new panel in the same way by choosing Add to Panel and selecting the relevant elements from the menus that follow. You can configure a different control panel for every area of work, which greatly reduces the time it takes to access the respective applications. Simply close panels that you do not require at present using the small arrows. Back to KDE: Font type and size for the desktop components are set under Fonts. The Choose button leads to a dialog window where font, font style and font size for the specified desktop components are set separately. These options allow you to differentiate in more detail than under Desktop/General/Appearance. If the size of the icons on the desktop does not appeal to you, Icons is the place to go. Using the example of the cogwheel application icon, KDE shows how icons will appear in panels and on the desktop. The size can vary over three levels. In addition, pixels

can be displayed double sized. The Effects tab contains further design options, for example colouring the icons and adjusting the saturation. KDE can keep you informed about the system activity of the graphical interface and the window manager. Four options are available for each action under System Notifications: Log to file, Play sound, Show message box and Standard error output. If you would like to play a sound for a system notification, tick the relevant option – this will enable the text box Filename. Click on the directory icon to the right of the box and choose the appropriate sound file. The last sub-item of the large Look and Feel complex lists the valid keyboard shortcuts under Key bindings. Many of these shortcuts such as Cut (Ctrl+X), Copy (Ctrl+C) are identical to the Windows shortcuts. If you would like to change a key combination, click on the relevant action in the dialog box and select Custom key. As an example we will change the key combination Alt+F1, which calls the Start menu, to Alt+F5. Choose this action from the list. If you now click on Custom key, the Alt key remains selected. Click on the function key icon and press F5. The new setting appears in the selection window. Actions that do not have any keyboard shortcuts yet are defined in the same way. The screen resolution and the colour depth cannot be amended through the KDE Control Center. You have to start another tool in order to do this, which is why we will be covering this point in a later workshop. ■

Figure 17: System under control

Figure 18: Key shortcuts instead of mouse

Gnome panels

KDE: KDE stands for K Desktop Environment and is a graphical user interface for Linux. Together with Gnome, KDE is the most widely used interface and these days the most sophisticated. The number of programs and applets for KDE is enormous. The project has been in existence since the end of 1996. KDE consists of the modules login display manager (kdm), the window manager (kwm), the file manager (Konqueror) and the control panel (kpanel). Further information can be found under http://www.kde.org.

Gnome: GNU Network Object Model Environment, where GNU stands for GNU’s Not Unix, has been a project of the Free Software Foundation since 1997. It consists of the login display manager (gdm), the file manager (gmc) and the control panel (panel). Information is kept at http://www.gnome.org.



PROGRAMMING

COMMAND

Locate and Find

COMMAND COLIN MURPHY

There are many branches in a UNIX-type file system, so it will come as no surprise to you when you realise that you have forgotten on which twiglet you have left a now desperately sought-after file.

Even if you have an infallible memory you may still have to deal with an unfamiliar landscape: different distributions have different file layouts, which will only frustrate you when you are trying to find a system file – the XF86Config file is a good example of this. Thankfully, there are a number of tools and utilities that can help you find the files you need. Most Linux distributions will come with the locate utility, which will give you the location of any files that have the text you are looking for in their filenames. So the command:

[colin@localhost colin]$ locate XF86Config

will give me

/etc/X11/XF86Config
/etc/X11/XF86Config.old
/etc/X11/XF86Config.test
/usr/X11R6/lib/X11/XF86Config-4.eg
/usr/X11R6/man/man5/XF86Config.5x.bz2

showing me all the files that have the text ‘XF86Config’ in the filename, some of which I really should get around to deleting the next time I do some tidying up! If you have some idea of the filename then locate will help you track down its location. locate relies upon its own database, which, to be of any use, needs to be updated regularly. If you are in the habit of leaving your machine on overnight then this will usually happen automatically when the nightly crond jobs are run. If your machine is never on late enough for these jobs to run then you will need to run the updatedb command yourself at some point – maybe as you log out, something which could be automated by adding the command to the ~/.bash_logout file in your home directory. The advantage of having locate use a database is that queries will be answered very quickly; the disadvantage is that the database will be out of date, especially if you have been creating or copying a lot of files in one session. Even this can be solved by running updatedb from the command line, leaving you with time to go and get a coffee.

Crond
Crond is the batch daemon which starts other processes at predetermined times, as described in the control file /etc/crontab.

Bash logout
~/.bash_logout is one of the bash shell command files, this one being run when you exit from a bash shell session, usually when you are shutting the machine down to turn it off.

find is another command line tool that will help you find files, but this time you can search for files based on other criteria like file size or the date the file was last modified. This is a feature-rich and powerful command, as the size of its manpage makes obvious. The basic thing to remember when calling ‘find’ is that you need to provide the search criteria as well as the search pattern, so the command:

[root@localhost /]$ find /var/log/ -name ‘*.log’

will give me

/var/log/security.log
/var/log/auth.log
/var/log/user.log


amongst many other files. Breaking this down: we will only look in the /var/log directory and any directories below it, we are basing our search criteria on the names of files only, and we only want to know about filenames that end in “.log”. There are many search criteria other than -name, all of which are listed in the manpages; here are just some:
● -mmin -n allows you to look for files that were modified less than n minutes ago – useful if you wanted to see which log files had just been written to. +n would allow you to look for files that are older than n minutes.
● -size +n will look for files that are bigger than n 512-byte blocks. Put a “c” after the value to search in bytes or a “k” to search in kilobytes.
● -user name will allow you to look for files that belong only to the named user.
The search criteria can be combined to make for a more powerful search, so

[root@localhost /]$ find -mmin -30 -user colin

will only tell me about files that have been modified in the last half an hour and belong to the user colin. ■



PYTHON

Object persistence in Python

PYTHON POWER ANDREAS JUNG

In a new series on Python, Linux Magazine will be reporting on current developments every other month and introducing the concepts that make Python unique. Our first topic is object persistence.

Welcome to Linux Magazine’s new Python series. We will be looking at topics on all aspects of Python, for beginners as well as advanced users. This includes reports on current Python developments and solutions, but also basic articles on certain subjects. The first article deals with the permanent storage of objects. But first a brief overview of Python...

Python is a scripting language that was developed at the beginning of the 90s by Guido van Rossum and has since evolved into a universally employed programming language. Today, Python is the most important and most widely used scripting language apart from Perl. As this description implies, Python is an interpreted language; compilation of Python programs takes place at runtime. A magazine article cannot hope to give a full introduction to Python; however, we will discuss some of Python’s concepts and advantages. A detailed introduction can be found in the Python

The most important innovations in Python 2.1

Nested scopes

__future__ statements

Until Python 2.0 there were three name spaces, which are searched for variables in the following order: local name space, module name space and built-in name space. This separation is not intuitive if you look at nested functions:

New features are introduced with every version of Python. This may lead to a break in compatibility with existing applications. In order to alleviate this problem, new aspects that will become standard features in Python 2.2 can be linked using a __future__ import statement. Nested scopes will become a standard feature of Python from version 2.2 onwards. Although their implementation is already finished they have not yet been enabled in Python 2.1. To be able to use them anyway, they need to be linked and enabled with

def f():
    ...
    def g(value):
        ...
        return g(value-1)

Invocation of function g() in the return statement will cause a name error exception, because g has not been defined in any of the three name spaces. Python 2.1 removes this shortcoming and allows the nesting of name spaces by importing nested_scopes from __future__.

54 LINUX MAGAZINE 13 · 2001

from __future__ import nested_scopes

Warning framework

Over the years many modules have accumulated that are no longer supported, are obsolete or have been replaced by newer ones with improved functionality. It is difficult for developers to remove modules


PYTHON

tutorial at http://www.python.org/doc, but also in the new Open Source book Dive Into Python (http://www.diveintopython.org).

Python overview

We’d briefly like to mention a few of Python’s advantages and distinctive features:

Quick to learn and easy to read: Python’s syntax is simple and easy to understand, and its functionality is clear. Unlike Perl, even very large projects can be created, maintained and still be intelligible some time later. Its language range is orthogonal: as a rule there is no duplication of functionality. Loops are realised as for or while constructs; repeat or do..until loops are unnecessary.

Modular: related functionality (for example sockets or graphics options) is combined into modules and imported when required. In the spirit of code reuse, modules can be used by different applications.

Interactive: Python has an interactive mode, which makes familiarisation very easy, particularly for beginners.

Compact: compared to compiled languages like C or C++, Python programs are very compact. Python’s data types, such as dictionaries, lists and tuples, allow most complex operations to fit onto one line. Programs are structured into code blocks by indentation of the source code; the bracketing familiar from C is therefore redundant.

Object-oriented: in contrast to other programming languages, Python was designed from the outset to be object-oriented, rather than being extended with object-oriented concepts later on like, for example, Perl. This unified concept distinguishes Python significantly from its competitors.

Python’s increasing popularity has one main

reason: the clear and simple language structure makes it easily accessible. Python is now used by schools and universities to teach programming skills. However, Python is not only of interest to beginners; it is the Swiss Army knife of programming. It is used in areas as diverse as Web applications, string processing, administrative and other applications, numeric calculations and controlling complex production environments in factories. Python inherently offers many useful concepts that are not found in other languages, for example object persistence.

In every object-oriented programming language objects contain methods and attributes. For many applications it is desirable to deposit an object permanently on a storage medium in order to reuse it after restarting the program. Following a program termination, the last stored state of the object can then be accessed. It is always possible to write application-dependent code for the export of important data, but each modification of the object also requires the export functionality to be amended accordingly. What is required at this point is transparent object persistence: a mechanism that allows objects to be stored permanently without additional code. This should happen without requiring the programmer or the application to have special knowledge about persistence.

Python persistence

For a long time Python has contained the modules pickle and cPickle, with which objects can be serialised. Objects are serialised into character streams, which

>>> import regex
__main__:1: DeprecationWarning: the regex module is deprecated; please use the re module

Users then have one release cycle to convert their software to the newer module re for regular expressions.
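The same warnings module that powers this framework is available to application code. A minimal sketch (old_function is a hypothetical name) that emits and captures a deprecation warning:

```python
import warnings

def old_function():
    # Emit a deprecation warning, much as Python does when a
    # deprecated module such as regex is imported.
    warnings.warn("old_function() is deprecated; use new_function()",
                  DeprecationWarning, stacklevel=2)
    return 1

# Capture the warning instead of printing it to stderr.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    old_function()

print(caught[0].category is DeprecationWarning)  # True
```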

In Python 2.1 attributes can be assigned to functions:

Tuples: A tuple is a number of values separated by commas
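A quick illustration of tuples as comma-separated values:

```python
# A tuple is a number of values separated by commas; the
# surrounding parentheses are optional in most contexts.
t = 212, "Python is cool"
print(t[0])     # 212
print(len(t))   # 2

a, b = t        # tuples also support unpacking
print(b)        # Python is cool
```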

What is object persistence?

without running the risk that applications won’t work with later versions. The warning framework makes it possible to issue version-dependent warnings that a module will no longer be contained in the next version or that a functionality will be changed or removed. For example, when importing the regex module, Python 2.1 issues the warning:

Function attributes

PROGRAMMING

def func():
    ....

func.author = "Holger Müller"
func.security = 1

All attributes are stored in the function’s dictionary __dict__. Until version 2.0 it was only possible to hide additional information in the doc string, which could be read through func.__doc__.
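Runnable sketch of function attributes landing in __dict__ (the attribute names here are purely illustrative):

```python
def report():
    """Generate the report."""
    return "ok"

# Arbitrary attributes attached to a function are stored
# in its __dict__, separate from the doc string.
report.author = "A. Coder"
report.security = 1

print(sorted(report.__dict__))  # ['author', 'security']
print(report.__doc__)           # Generate the report.
```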

New installation mechanism

From version 2.1 the installation is carried out using the distutils package, which is the standard installation tool for Python modules. It is therefore no longer necessary to configure the modules manually, as was the case in older versions. The installation script automatically checks which modules it can compile (similar to configure), based on the available headers and libraries, and then builds them.




Listing 1: Pickling an object

zodb['instClass'] = instClass

# Import the pickle module
import cPickle

class myClass:
    def __init__(self, num, txt):
        self.num = num
        self.txt = txt

# Generating a myClass object
instClass = myClass(212, 'Python is cool')
fname = 'instClass.p'

# Serialising instClass into a file
cPickle.dump(instClass, open(fname, 'w'))

# Open the file with the pickled object and
# create a new object
newinstClass = cPickle.load(open(fname, 'r'))
print newinstClass.num, newinstClass.txt

The object is bound to the key 'instClass' in the ZODB and stored. In the same way, objects can easily be retrieved from the ZODB:

instClass = zodb['instClass']

That looks very elegant, and it is. But before we get to that point, the ZODB has to be installed. The ZODB is not restricted to a specific medium for storing objects. Normally it deposits objects in a file within the file system; however, adapters for databases such as Oracle or BerkeleyDB exist. The storage medium is transparent to the application. Only when opening the ZODB does the medium have to be specified, that is, if applicable, the underlying actual database layer.

Listing 2: Opening a ZODB database

from ZODB import DB, FileStorage

fstorage = FileStorage.FileStorage('Data.fs')
db = DB(fstorage)
connection = db.open()
root = connection.root()

can then be stored in files. This process is called pickling; an object is conserved, as it were, in order to be reused later. Alternatively, Python can read in such a serialised object and convert it back into an object (unpickling). Both modules are identical in their functionality: cPickle is the C reimplementation of the pickle module written in Python, and is always preferable for efficiency reasons. In the example in Listing 1 an object with the two attributes num=212 and txt='Python is cool' is created. The object is stored permanently in an internal format in instClass.p by invoking cPickle.dump(). The subsequent call cPickle.load() loads the file and generates a new instance of myClass, which has the same attributes as the original object. This approach is generally possible for every Python object; however, there are some exceptions. For example, file objects or sockets cannot be serialised, which would not be sensible anyway. Pickling allows persistent storage of any object – even ones with multiple inheritance – but the programmer still has to implement parts of the code himself.
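The same round trip works in modern Python with the pickle module (cPickle was folded into pickle in Python 3); this sketch serialises to a byte string rather than a file:

```python
import pickle

class MyClass:
    def __init__(self, num, txt):
        self.num = num
        self.txt = txt

obj = MyClass(212, "Python is cool")
data = pickle.dumps(obj)     # serialise ("pickle") to bytes
copy = pickle.loads(data)    # reconstruct ("unpickle") a new instance

print(copy is obj)           # False: a fresh object...
print(copy.num, copy.txt)    # ...with the same attributes
```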

Persistence in Python using the ZODB

Based on the pickle mechanism, the Zope Object Data Base (ZODB for short) was created during the development of the Zope application server. It frees the developer from the burden of implementation as well. Its use is relatively simple: to the developer, the ZODB appears as a mapping object which is addressed in the same way as a Python dictionary:

Installation of the ZODB

The ZODB is integrated into Zope and can be used if Zope has already been installed. If you don’t need Zope, there is a stand-alone version of the ZODB which is maintained by A. M. Kuchling. After unpacking the archive, the installation is performed using the distutils tool (contained in Python 2.0/2.1; for Python 1.5.x the distutils have to be installed separately):

python setup.py install

That should automatically compile and install all ZODB sources and modules. It is advisable to use the current version of Python, 2.1.

Using the ZODB

Listing 2 shows in detail how to open the ZODB when using a file as the storage medium. The FileStorage object in this case represents the storage medium that is being used for the ZODB. When using a ZODB adapter for a relational database, the call must be amended accordingly. The subsequent calls open the database and create the actual 'root' object through which the ZODB is addressed by the application. Serialisable Python objects can now easily be deposited in the ZODB:

root['red'] = 'ZODB is cool'
root['blue'] = ['Perl', 'is', 'cool']

Assignment only stores the objects in the ZODB temporarily. In order to store them persistently – that is, permanently – the transaction must be committed:

get_transaction().commit()

A transaction is an atomic operation and consists of



Listing 3: Creating persistent classes

import ZODB
import Persistence

class PLanguage(Persistence.Persistent):
    def __init__(self, lang, easy2learn):
        self.language = lang
        self.learneffort = easy2learn
        self.authors = []
    ....

languages = []
languages.append( PLanguage('Python', 'very easy') )
languages.append( PLanguage('Perl', 'very hard') )
languages.append( PLanguage('TCL', 'easy') )
zodb['languages'] = languages

TCL = zodb['languages'][2]
TCL.learneffort = 'not easy'
get_transaction().commit()

a sequence of changes within the database. The transaction mechanism of the database ensures that either all changes are carried out or none. This guarantees data integrity between two commit calls. After the data have been stored in the ZODB they can, of course, be retrieved. Opening the ZODB is done in the same way as the writing of data. Reading the data is identical to using a dictionary:
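To make the commit semantics concrete, here is a toy model of transaction buffering. ToyTransaction is purely illustrative, not the ZODB API: changes accumulate in a pending buffer and reach the store all at once on commit, or not at all on abort.

```python
class ToyTransaction:
    """Illustrative sketch of commit/abort semantics."""
    def __init__(self, store):
        self.store = store
        self.pending = {}

    def set(self, key, value):
        self.pending[key] = value       # buffered, not yet visible

    def commit(self):
        self.store.update(self.pending) # apply all changes at once
        self.pending = {}

    def abort(self):
        self.pending = {}               # discard all changes

store = {}
txn = ToyTransaction(store)
txn.set('red', 'ZODB is cool')
print('red' in store)   # False: nothing is visible before commit
txn.commit()
print(store['red'])     # ZODB is cool

txn.set('blue', 'temporary')
txn.abort()
print('blue' in store)  # False: aborted changes never land
```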

As explained above, changes to mutable data types are not automatically recognised by the ZODB. In such cases alterations have to be explicitly indicated to the database by setting the attribute _p_changed to 1. The ZODB will then update the object accordingly:

print root['red'] -> 'ZODB is cool'
print root['blue'] -> ['Perl', 'is', 'cool']

class PLanguage(Persistence.Persistent):
    ....
    def setAuthor(self, author):
        self.authors.append( author )
        self._p_changed = 1

Changes in the ZODB

Outlook

The ZODB automatically recognises changes to objects and also stores them, with one exception: changes to lists and dictionaries are not recognised automatically. That is true on a general level for all objects that are described as mutable, or changeable, in the Python philosophy. Changes to a list or a dictionary must therefore not be made using

The Zope extension Zope Enterprise Objects (ZEO) can be used to build a distributed ZODB, which means objects can also be stored in a distributed fashion. This article has shown how easy the ZODB is to use and that it represents a powerful tool for Python developers, one which allows transparent object persistence while requiring little learning effort and only minor source code amendments.

root['blue'].append('a lot')
get_transaction().commit()

but instead require a new assignment of the object:

temp = root['blue']
temp.insert(2, 'not')
root['blue'] = temp
get_transaction().commit()
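Why reassignment is needed can be sketched with a toy change-tracking mapping (hypothetical, not the ZODB's implementation): only key assignment passes through the mapping's __setitem__ hook, so in-place mutation of a stored list is invisible to it.

```python
class TrackingDB(dict):
    """Toy mapping that records which keys were assigned."""
    def __init__(self):
        super().__init__()
        self.dirty = set()

    def __setitem__(self, key, value):
        self.dirty.add(key)             # assignment is observable
        super().__setitem__(key, value)

db = TrackingDB()
db['blue'] = ['Perl', 'is', 'cool']
db.dirty.clear()

db['blue'].append('a lot')              # in-place mutation: no hook fires
seen_after_mutation = 'blue' in db.dirty
print(seen_after_mutation)              # False

temp = db['blue']
temp.insert(2, 'not')
db['blue'] = temp                       # reassignment fires the hook
seen_after_reassign = 'blue' in db.dirty
print(seen_after_reassign)              # True
```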

The author Andreas Jung lives near Washington D.C. and works for Zope Corporation (formerly Digital Creations) as a software engineer in the Zope core team. Email: andreas@andreas-jung.com

Persistent classes

Converting classes into persistent classes is particularly easy. They simply need to be derived from the class Persistence.Persistent. The process is illustrated in detail by the example in Listing 3.

Info

Python in practice: http://www.python.org/psa/Users.html
ZODB pages by A. M. Kuchling: http://www.amk.ca/zodb/
M. Pelletier: ZODB for Python Programmers: http://www.zope.org/Documentation/Articles/ZODB1
Zope Enterprise Objects (ZEO): http://www.zope.org/Products/ZEO



PROGRAMMING

TEMPLATE TOOLKIT

TEMPLATE FILE PROCESSING

JIM CHEETHAM

The Template Toolkit (TT) provides a metalanguage that can be inserted into otherwise ordinary data files, allowing you to embed data processing instructions. TT is a collection of Perl modules, so you will need to have Perl on your system to use it. Don’t worry, though: you don’t need to understand Perl in order to use TT – the template language has been designed to be usable by non-Perl hackers, and you can invoke it simply from the command line.

Scope

TT describes itself as “a fast, flexible, powerful and extensible template processing system”. I won’t dwell on the speed aspect (TT will save you plenty of time once you’re using it) nor the full extensibility (which is achieved primarily through the internal use of Perl). However, flexibility and power are TT’s watchwords. Originally designed for generating dynamic Web content, TT is applicable to a much wider range of tasks. For the purposes of examples in

siteheader.tt2

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
 "DTD/xhtml1-transitional.dtd">
<html>
[% DEFAULT
   title="TT example site"
   papercol="#ffffff"
   inkcol="#000000"
%]
<head>
<title>[% title %]</title>
</head>
<body bgcolor="[% papercol %]" text="[% inkcol %]">

index.html

[% PROCESS siteheader.tt2 %]
<h1>TT Example website</h1>
<p>Welcome to the example TT website</p>
<p>Have a look at the other site pages, and don't forget to look at the source HTML code</p>
<ul>
<li><a href="info.html">Info</a> about the site</li>
<li><a href="about.html">Contact</a> information for the site</li>
</ul>
[% PROCESS sitefooter.tt2 %]

this article, I’ll be describing a system for quickly building a set of static Web pages, using the command line tools tpage and ttree.

Installing Template

The current version of Template Toolkit is 2.02, and it is available from the main website, http://www.template-toolkit.org. For those of you used to Perl, it is also available from the CPAN archives, http://www.cpan.org. Install with the normal CPAN commands:

$ perl -MCPAN -e shell
cpan> install Template

The example website

Throughout this article, I will be providing examples from a simple website. Because I’m keeping the examples short, the website might look a little contrived, but I hope you can see the wider applications of TT. The site will consist of only a few files; initially we will meet only index.html (the homepage), about.html, which provides some contact details, and info.html, which provides some more information about the site. To go with these files, we’ll use a couple of template files, siteheader.tt2 and sitefooter.tt2. The exact names of all these files are pretty much unimportant, and the extensions (.tt2, .html) are doubly unimportant. I just tend to use file name extensions like this to help me organise my files while I’m working on them, and they are especially useful if you ever find yourself editing files in a Windows environment. As the examples build up, more files and templates will be introduced. The example site, and the code used to produce it, can be found at http://tt.gonzul.net

The language

The TT language is embedded into your data files, and by default the TT commands are identified by [% and



%]. These can, of course, be changed in case they would cause a conflict with your data – TT is flexible, after all. Taking the siteheader.tt2 file as our example: the file is a fragment of HTML code – specifically, it’s the document declaration, header and beginning of the body of an XHTML file. But don’t worry about that at the moment, because I’m going to get to that part of things in a minute. The <title> and <body> lines are interesting – they show what you will probably recognise as normal HTML lines of code, except that where you would expect to find text (in the case of <title>) or values (<body>) you find a TT code reference to a variable. Earlier on in the snippet there’s a section called DEFAULT, which introduces values for the variables that I’m using below. All the variables look like just plain text – if you want to use real numbers for something (and potentially do some operations on those numbers, like addition or subtraction) you can, trusting the underlying Perl system to Do The Right Thing and automatically transform from text to numeric, and back again, according to context. When this file is processed by TT, everything it finds between [% and %] will be replaced with the value TT comes up with at the time. So, with the variable title set to “TT example site”, the code

<title>[% title %]</title>

will become

<title>TT example site</title>

Notice here that the quote marks (“”) used to declare the value of title have not been kept, nor have the spaces within the [% title %] section.

Using templates by name

Now, this siteheader.tt2 file isn’t very useful on its own – it won’t produce a valid HTML file, for a start. But I can include it at the beginning of every “real” HTML page in my site, by using the PROCESS directive. There are a couple of other variations on this command, called INCLUDE and INSERT, but they don’t do quite what I want here. Have a look at the example site’s homepage, index.html. Here I have a simple HTML file, but it doesn’t start with <HTML> or even <BODY>, and therefore isn’t really a suitable homepage. Instead, it has a TT directive at the beginning, [% PROCESS siteheader.tt2 %]. Similarly, it doesn’t end with </body></html> as you might expect, but it does have a TT directive to process the file sitefooter.tt2. The PROCESS directive allows you to include another template in the current file, and it will keep track of all the variables that you are currently using. This will become clearer in the next example, but for the time being let’s just see what happens to our index.html. I’ll use the tpage command to actually process the


tpage

$ tpage index.html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
 "DTD/xhtml1-transitional.dtd">
<html>
<head>
<title>TT example site</title>
</head>
<body bgcolor="#ffffff" text="#000000">
<h1>TT Example website</h1>
<p>Welcome to the example TT website</p>
<p>Have a look at the other site pages, and don't forget to look at the source HTML code</p>
<ul>
<li><a href="info.html">Info</a> about the site</li>
<li><a href="about.html">Contact</a> information for the site</li>
</ul>
<div>
<p>Example site copyright © 2001 Jim Cheetham</p>
</div>
</body>
</html>

files. All tpage does is to read in the file you specify and run it through a Template instance within Perl, with the results coming out on STDOUT. If you want to see a practical example of how to use Template from within a Perl environment, start by having a look at the internals of tpage – however, I’m not going to cover that aspect of Template Toolkit here. You can see that above and below the actual HTML code from index.html there appears extra HTML code, produced by the siteheader.tt2 and sitefooter.tt2 files. This code has been inserted into the output, and in the case of siteheader.tt2, the variable name references between [% and %] have been substituted with their values. So we now have a simple way to make sure that all our files have a consistent header and footer, in just one TT command. Now, if you actually wanted to look at this file in a Web browser, you’d have to save this output, put it somewhere sensible, then ask your browser to read that file. But don’t worry about that just at the moment, because we haven’t met the extremely useful ttree command yet.

info.html

[% PROCESS siteheader.tt2 title="Site Information" %]
<h1>Information about the TT Example website</h1>
<p>The TT example website has been produced to illustrate the use of
<a href="http://www.template-toolkit.org">Template Toolkit</a>
when building static Web sites.</p>
<p>Have a look at the other site pages, and don't forget to look at the source HTML code</p>
<ul>
<li><a href="index.html">Index</a> page for the site</li>
<li><a href="about.html">Contact</a> information for the site</li>
</ul>
[% PROCESS sitefooter.tt2 %]




So far our example hasn’t shown any of TT’s more powerful features. PROCESS looks useful enough, but you probably don’t want to have all of your pages with the same <title> string, for example. Having multiple siteheader files would defeat the object of using TT in

Modified Listing

$ tpage info.html
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
 "DTD/xhtml1-transitional.dtd">
<html>
<head>
<title>Site Information</title>
</head>
<body bgcolor="#ffffff" text="#000000">
<h1>Information about the TT Example website</h1>
<p>The TT example website has been produced to illustrate the use of
<a href="http://www.template-toolkit.org">Template Toolkit</a>
when building static Web sites.</p>
<p>Have a look at the other site pages, and don't forget to look at the source HTML code</p>
<ul>
<li><a href="index.html">Index</a> page for the site</li>
<li><a href="about.html">Contact</a> information for the site</li>
</ul>
<div>
<p>Example site copyright © 2001 Jim Cheetham</p>
</div>
</body>
</html>

menu.html

[% PROCESS siteheader.tt2 title="Menu example" %]
<h1>Menu example</h1>
<p>This is an example of a list of values, used twice by the same template and presented in two different ways</p>
[% menuitems = [ "first", "second", "third", "fourth", "fifth" ] %]
<table border="1">
<tr><th>Vertical Menu</th><th>Horizontal Menu</th></tr>
<tr><td>[% PROCESS menu.tt2 dirn="vertical" %]</td>
<td>[% PROCESS menu.tt2 dirn="horizontal" %]</td></tr>
</table>
[% PROCESS sitefooter.tt2 %]

menu.tt2

[% DEFAULT
   dirn = "horizontal"
   menuitems = [ "firstitem", "middleitem", "lastitem" ]
%]
[% itemcount = 0 %]
[% FOREACH item = menuitems %]
  [% item %]
  [% itemcount = itemcount + 1 %]
  [% IF itemcount != menuitems.size %]
    [% IF dirn == "horizontal" %]
      ,
    [% ELSE %]
      ,<br />
    [% END %]
  [% END %]
[% END %]

the first place, so how can we easily ask for variations? For the answer, have a look at the info.html file. This file is almost identical to index.html, and you can again see the usefulness of having standard PROCESS instructions to keep our site pages consistent. But for this page, we want to have a different <title>, so in the PROCESS statement where we call for the siteheader.tt2 template, we have included the name and value of the title variable. When this gets processed by tpage, we see the results in the modified listing. In this case, the <title> declaration in the code now reads “Site Information”, instead of the default “TT example site”. This flexibility in variable declaration is a big feature of TT. The siteheader.tt2 file does set its own value for title, but we are able to override it with the PROCESS line in info.html because it is defined within a DEFAULT block, which only sets values for variables if they have not already been specified elsewhere. Similarly, it would be easy to alter the values of papercol and inkcol on a per-page basis, by including their specifications on the relevant PROCESS lines.

Decisions, decisions...

So far the templates we’ve been using have been pretty straightforward, just setting and using values. TT starts to get more interesting when you encourage your templates to make decisions (based on the values of variables) and produce different output in response. Let’s have a look at a new Web page, menu.html, which uses the template menu.tt2 to present two different variations of the same menu – a little contrived, perhaps, but you will see what I’m getting at. I’m also going to introduce you to some looping control statements, and list variables. It’ll sound easier when you see the examples in the menu.html file: we’ve seen most of this before; the two PROCESS directives at the beginning and end are the same as in the other HTML files. However, there are a couple of new things here. The first is the declaration of menuitems as a list of values, in a syntax that Perl people will be familiar with, and the second is the use of a PROCESS directive right in the middle of a table, twice. Each time menu.tt2 is called we are selecting a different value for dirn, the variable that determines how the menu items will be presented. Now for a look at the menu.tt2 template file: there are a few familiar directives in this file, so let’s deal with them first. The DEFAULT block at the beginning allows the template to set values for dirn and menuitems, in case the calling program did not specify them. I’m not so sure how useful it is for a template to provide its own data, in menuitems. It might be better for the template to check to see if menuitems has been specified, and if not, to output some sort of diagnostic message. It’s a matter of taste, I guess.



Then we set the itemcount variable to 0. This variable will be used to keep track of how far through the list of items we have progressed. Now we encounter a new directive, FOREACH. This statement sets up a loop construct, which steps through each value in menuitems in turn, setting the variable item to whatever the next list item is each time. The end of the FOREACH block is indicated by the [% END %] statement, and I’ve used indenting to make it easier for the reader to match up the END statement with the relevant beginning. The first thing we do inside the loop is to output the current list item value. Then we add one to the counter itemcount. Yes, there are lots of other ways of doing this job, but let’s stick to the simple methods. Now we have a decision to make. If we have reached the last item in the list, we just want to finish. You haven’t yet seen what we do if we’re in the middle of the list, so it may not be entirely clear why we don’t want to do it at the end of the list, but trust me for the moment, and read on. The decision is made by the [% IF %] statement. It looks at the conditional, which is the statement “itemcount != menuitems.size”, and decides whether it is true or not. If it is true – in other words, if we are not on the last item in the list – then we carry on down to the next section of code; otherwise the test fails and we END the IF block. There is a handy reversal of the IF statement, known as the UNLESS statement. Sometimes it’s easier to read your template code in a more natural voice when the test word is the opposite way round. Template Toolkit (and Perl!) tries to be easy to use. So, we’ve decided that we’re not yet at the end of the list. We’d like to put something between the items in the list, otherwise they’ll run together in the final output.
For the horizontal list, we’ll add just a “,”, and for the vertical list, we’ll add a comma and a line-break, “,<br />” (note that I’m using XHTML syntax here; it’s good practice and won’t break existing browsers). I’ll use a simple IF test on the dirn variable, to see if it is equal to the word “horizontal”. If it is, we’ll output just a comma, and if it isn’t, we’ll go for the comma and line-break. Of course, with a test like this we’re not being very thorough – if someone had set dirn to a value like “sideways” they’d end up with vertical. You could change the test around to have a different default, or even allow a situation where you could output the list with no delimiters when the dirn is not recognised. When you get round to running this example, you might be surprised to see lots and lots of extra blank lines in the final output. This is a side effect of the default behaviour of TT, which leaves the original file untouched outside of the [% ... %] blocks. This includes the line endings after the %] sections, and can be dealt with – but not with the simple tpage program. Hold your horses and wait for the ttree command, coming up next.


ttree listing

$ ttree -d destdir -s sourcedir --ignore ".tt2$"
ttree 2.03 (Template Toolkit version 2.00)
Source: ~/ttexamples/tt-website/source/
Destination: ~/ttexamples/tt-website/example/
Include Path: [ ./Websrc/templates, /usr/local/templates/lib ]
Ignore: [ \b(CVS|RCS)\b, ^#, .tt2$ ]
Copy: [ \.png$, \.gif$ ]
Accept: [ * ]
+ about.html
+ index.html
+ info.html
+ menu.html
- menu.tt2 (ignored, matches /.tt2$/)
- sitefooter.tt2 (ignored, matches /.tt2$/)
- siteheader.tt2 (ignored, matches /.tt2$/)
- sitemap.tt2 (ignored, matches /.tt2$/)

The ttree command

So far I’ve been running TT by using tpage on one file at a time, which is fairly awkward and definitely not easy to keep track of. It is time to move up to the ttree program, which offers far more flexibility and by default will process all of your files correctly. ttree has a great configuration file, but I unfortunately don’t have the space here to explain it – instead, try reading the extensive documentation that comes with Template Toolkit. When ttree first runs, it will try to create a suitable config file, in your home directory by default, and you can go off and edit this to suit yourself. However, for our example we don’t really need anything more complex than the default config file. It is useful to be able to specify an output directory for ttree that is different from the input directory – but don’t panic, ttree quite sensibly refuses to let them be the same. So on the ttree command line we’ll specify the source and destination directories, and we’ll also make sure that it doesn’t process the template files we have, by asking it to ignore all files ending in “.tt2”. See the ttree listing. ttree is your friend. I haven’t really been able to do it justice here, beyond the simplest use, but it is an excellent way to help you look after your TT source files and get them built into the right place. It also understands the modification date stamps on your source files, and next time you run it will only process files that have actually changed. This is best appreciated when running on under-powered workstations, which is pretty much what everyone has. You might remember from above the comments about tpage allowing what may seem like excessive blank space to appear in your output files. Well, with ttree you can request that TT eats up all that blank space, with a series of options to the command that look like this:

$ ttree --pre_chomp --post_chomp --trim (everything else)

Links

The example website: http://tt.gonzul.net
Template Toolkit: http://www.template-toolkit.org
Perl: http://www.perl.org
CPAN: http://www.cpan.org

Summary

This has been a brief overview of the Template Toolkit’s capabilities, introducing basic invocation methods and some simple logic and flow-control directives. With just these commands, however, it is possible to produce some quite complex static websites relatively quickly. Many of TT’s more powerful commands are more suited to dynamic work, or to invocation from within a Perl program – which I definitely encourage you to explore in the future! ■



BOOKS

REVIEWS

LINUX DEVICE DRIVERS

ALISON DAVIES

Are you frustrated by not having the drivers for a particular piece of hardware? Then write your own! The aim of this book is to teach you how to write device drivers for Linux. It is aimed at people who want to experiment with the computer and at technical programmers who need to deal with the inner workings of a Linux box. It covers kernel hacking rather than quick-fix user-space applications and so is not for the faint-hearted. The main target of the book is writing for version 2.4 of the Linux kernel, but each chapter includes a section on backward compatibility. The book covers modularization; char devices; debugging techniques; and advanced features of char devices such as blocking operations and time and memory management in the kernel. Chapter eight covers hardware with management of I/O

ports and memory buffers. These chapters require the building of a simple testing device. Chapter 10 continues the writing of kernel software and covers portability. The later sections of the book go deeper into modularization, block devices and even more advanced aspects of memory management. Network interfaces are dealt with in chapter 14. The book ends with a guide to the overall design of the kernel source. There is a good section on further information, so that if your appetite has been whetted by this book you can continue to develop your projects. ■

Author: A Rubini & J Corbet
Publisher: O'Reilly
Price: £28.50
ISBN: 0-596-00008-1

CRYPTOGRAPHY IN C AND C++

If you are interested in writing cryptography algorithms, or just want to know how they work, then this is the book for you. Complete with a CD-ROM containing Linux-tested code, the book pulls no punches with mathematical formulae: starting with low-level programming of fundamental maths operations, it expands to advanced mathematical theories. With C examples at each step explaining multiprecision arithmetic, the author concludes the opening section with C lint testing routines. The second half of the book deals with C++ and error handling. Full explanations of the RSA algorithm are used as a working example. The book concludes with Rijndael, the new American data encryption standard. Comprehensive appendices separate out directories of functions for C and C++. Due to the heavy mathematical nature of the text, this is not a book for light reading or dipping into, but the step-by-step path provides a rewarding challenge for the committed enthusiast. The ready-made assembly routines included on the CD prove to be not just informative but actually useful, and now reside on my hard drive. If you were fascinated by "Fermat's last theorem" and enthralled by tales of Enigma coding, then this is the next logical step. Overall a worthwhile book on this important topic. ■

Author: M Welschenbach
Publisher: Apress
Price: £35.50
ISBN: 1-893115-95-X


BEGINNERS

ANSWER GIRL

The Answer Girl

TEAMWORK PATRICIA JUNG

The fact that the world of everyday computing, even under Linux, is often good for surprises, is a bit of a truism: time and again things don't work, or not as they are supposed to. Answer Girl shows you how to deal elegantly with such little problems.

In this issue, Answer Girl gives an introduction to version control with CVS (Concurrent Versions System). After reading this, when you see the phrase check-in you will no longer think about air travel.

When you work on a source text for a long time, sooner or later the day will dawn when you wish you could get back that section you deleted last Monday. But deleted is deleted, and who doesn't resolve to do better next time?

Version control

Anyone who works alone and has a certain amount of staying power may start off by copying the latest version (for example classscript-2.tex), at the start of each work session, into a new file with the next serial number (classscript-3.tex). Which version was last Monday's can be told from the date that ls -l classscript* shows as the last modification date. So why not just include the date stamp in the filename? With date, a UNIX system delivers the current date and time to your doorstep, and with the backwards-pointing inverted commas (backquotes) you can induce the shell first to execute the command contained between them and then to use the result in the complete command:

[trish@lunar answergirl]$ cp classscript.tex classscript`date`.tex
cp: copying multiple files, but last argument (2001.tex) is not a directory
Try `cp --help' for more information.

cp is complaining that we want to copy more than one file and that the last argument 2001.tex is not a directory (several source files cannot be copied into a single regular file). 2001.tex? That looks like part of the date output:

[trish@lunar answergirl]$ date
Mon Oct 15 02:24:09 CET 2001

Now the scales fall from our eyes: to the shell, the spaces obviously count as separators between arguments, and thus to cp as separate filenames. With double inverted commas, though, the bash can be persuaded that the spaces are part of one argument string:

[trish@lunar answergirl]$ cp classscript.tex "classscript`date`.tex"
[trish@lunar answergirl]$ ls -l classscript*
-rw-r--r-- 1 trish users 8967 Oct 15 02:25 classscript.tex
-rw-r--r-- 1 trish users 8967 Oct 15 02:34 classscriptMon Oct 15 02:34:04 CET 2001.tex

The drawback here: classscriptMon Oct 15 02:34:04 CET 2001.tex not only looks ugly, but because of the spaces in the filename the file will force us, more than once, to put its name in inverted commas on some command line or other. So we would prefer a filename à la classscript_dd_mm_yy.tex or, so that the files always appear in calendar order in the ls output, classscript_yy_mm_dd.tex. As a glance at the date manpage shows, this works too. We just have to send date on its way with the desired format placeholders following a plus sign:

[trish@lunar answergirl]$ mv "classscriptMon Oct 15 02:34:04 CET 2001.tex" classscript_`date +%y_%m_%d`.tex
[trish@lunar answergirl]$ ls -l classscript*
-rw-r--r-- 1 trish users 8967 Oct 15 02:25 classscript.tex
-rw-r--r-- 1 trish users 8967 Oct 15 02:34 classscript_01_10_15.tex
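The same trick can be written as a small reusable sketch. The ${f%.tex} parameter expansion, which strips the suffix so the date lands before the extension, and the modern $( ) form of command substitution are my additions, not the article's:

```shell
# Copy a file to a date-stamped backup in one step.
f=classscript.tex
cp "$f" "${f%.tex}_$(date +%y_%m_%d).tex"
ls classscript_*.tex
```

With %y_%m_%d the backups sort in calendar order, exactly as discussed above.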



... and for the less conscientious

So you already feel ill at ease with all this compulsory discipline? You're not the only one. And when it comes to working with co-author(s) on a manuscript, it's not only the discipline that gets difficult but also the corrections. Who can guarantee, after all, that my co-writer will not quietly remove yesterday's typing errors from her version of a chapter while I am completely transposing the sentences of precisely that chapter in my own version? Not just to simplify the work routine, but also to avoid extra work, there is only one answer: professional version management.

If you are working with office packages or suchlike, which store your work in proprietary binary formats (such as StarWriter), you will presumably fall back on the built-in version control function. ASCII texts, on the other hand, can be managed effortlessly with the version managers used in large programming projects; source code in a programming language is, after all, nothing but text.

A search produces astonishingly little choice. While companies must license the commercial Perforce (http://www.perforce.com/) (Open Source projects can ask for a free licence, and the fully-functioning evaluation version only allows two-person projects), it is not only occasional version controllers who tend to fall back on the tried and trusted Concurrent Versions System, cvs. This also comes with most distributions. Those unable to find a suitable package would be best heading for http://rpmfind.net/linux/rpm2html/search.php?query=cvs or, for the source code, http://download.cyclic.com/pub/.

Added on

So first to the installation: on rpm-based systems this is done with a

[root@lunar software]# rpm -i cvs-1.10.7-1.i386.rpm

There it is then, the great unknown. A timid

[trish@lunar answergirl]$ cvs --help

does not exactly warm you up with its lovely muddle. But a closer look at the chaos does help:

Usage: cvs [cvs-options] command [command-options-and-arguments]
[...]

In order to use cvs, then, we must at least state cvs and then specify a CVS command afterwards. In addition, the behaviour of cvs can be altered by means of cvs-options, which must be specified


before the CVS command. To increase the complexity even more, each CVS command can also be followed by its own options and arguments. After a nice deep breath, one of the next lines also decrypts itself:

[...] (specify --help-commands for a list of commands [...]

In fact, cvs --help-commands outputs a whole range of commands, and from one or other of them we can even get something of an idea:

[...]
init         Create a CVS repository if it doesn't exist
[...]

init, that sounds like initialise, and to anyone who has ever come across the term CVS repository in some open source project or other, this looks like just what we want: create a CVS depot into which we can put, or check in, our files.

[trish@lunar answergirl]$ cvs init
cvs init: No CVSROOT specified! Please use the `-d' option
cvs [init aborted]: or set the CVSROOT environment variable.

If only it were that simple ... Luckily, the cvs manpage explains the ominous option -d ("directory") to us:

CVS OPTIONS
[...]
-d CVS_root_directory
     Use CVS_root_directory as the root directory pathname of the master source repository. Overrides the setting of the CVSROOT environment variable. This value should be specified as an absolute pathname.

So we are dealing with an option to the cvs command itself (unlike a command option, which relates to a CVS command), which takes as its argument the directory for our depot. What matters here is: we must specify it with a full path, for example ~/cvs/linuxcourse/coursedocuments.

ASCII texts: Texts whose characters are saved in the "American Standard Code for Information Interchange". In its 7-bit version this code encompasses only the characters found on an American keyboard plus a few control codes, such as CR ("Carriage Return") for Enter (originally from the typewriter) or LF ("Line Feed") for a line break. In 8-bit ASCII extensions, most special characters from languages with Latin alphabets can be coded; anyone wanting to write with Cyrillic or Hebrew characters, however, will have to use other codings such as UTF-8. Most common text editors use ASCII.

Full path: The route to a file starting from the root directory /. The full path to the program file /usr/sbin/groupadd thus leads via the directories / -> usr -> sbin and is written /usr/sbin/groupadd. Relative paths, on the other hand, always start from the current working directory. If the directory containing a program is not listed in the environment variable PATH, it is not enough to call the command by name alone. If the shell acknowledges the calls to groupadd, usermod and useradd from the Co-authors box with a command not found, the command with full path (/usr/sbin/groupadd etc.) will hopefully provide a remedy. If not, the question arises as to whether these commands are even installed.

[trish@lunar answergirl]$ cvs -d ~/cvs/linuxcourse/coursedocuments init
cvs [init aborted]: cannot make directory /home/trish/cvs/linuxcourse/coursedocuments: No such file or directory

All right, then we'll just have to create the directory ~/cvs/linuxcourse together with its parent directory ~/cvs/ and try again:



SecureShell: Safe substitute for Internet services such as Telnet and RSH (“Remote Shell”), with the aid of which one can work on remote computers as if sitting right in front of them. Data transmission is encrypted when you do so. The SecureShell package usually comes with a secure substitute for RCP (Remote Copy) named scp. Tunnelling: Using a service via a connection which another service makes. So for example CVS packages can be transmitted repackaged into SSH packages.


[trish@lunar answergirl]$ mkdir -p ~/cvs/linuxcourse
[trish@lunar answergirl]$ cvs -d ~/cvs/linuxcourse/coursedocuments init


No response this time, but in the best UNIX tradition that should actually mean everything has gone smoothly. And so it has: ls ~/cvs/linuxcourse/coursedocuments shows that this directory has been created and contains a subdirectory called CVSROOT, full of very oddly named files. Now the course documents we have begun just have to go in there, and there was something else. That's right: I don't want to do the work of writing scripts and slides all on my own, so my co-author must also have access to the depot. So as not to bore those who want to use their repository alone, the work steps necessary for this are shown separately in the Co-authors box.

Initial database

The CVS command overview lists, with

import       Import sources into CVS, using vendor branches

a command with which it appears to be possible to import our initial database, which is currently in the working directory ~/course (two directories, script and slides, each with a tex file and an illustration), into our depot:

[trish@lunar answergirl]$ cd ~/course
[trish@lunar course]$ cvs import
cvs import: No CVSROOT specified! Please use the `-d' option
cvs [import aborted]: or set the CVSROOT environment variable.

Obviously, if the depot is not specified with -d, cvs cannot even know where the data from the current directory should be imported to. But since we have no inclination to keep typing the endless -d ~/cvs/linuxcourse/coursedocuments, we take to heart the last line of the error message and set the environment variable CVSROOT:

Co-authors

Anyone who creates a depot in their own home directory doesn't want the co-author(s) to be able to poke around in all the files in ~ (root could also put the depot somewhere else, for example into /home/cvs).

Classification

So the best thing to do is to become root and create a new group course, with the group number 101 that is not yet assigned in /etc/group:

[trish@lunar answergirl]$ su
Password: root-Password
[root@lunar answergirl]# groupadd -g 101 course

The maintainer of the CVS depot should of course be included by root in the new group. This can be done by manually editing /etc/group, and obviously also with graphical user management tools. But before we start one of those as root, we'd be faster with a

[root@lunar answergirl]# usermod -G course trish

The capital -G here means: "Add another group to the other groups of which the user is a member". A

[root@lunar answergirl]# groups trish
trish : users course

reveals that trish, apart from belonging to the group users, is now also a member of course.

New user

A

[root@lunar answergirl]# useradd fred

then makes an account for the user fred. If he is only to use it for CVS purposes, this was a little premature, since then he should belong exclusively to the group course. The small -g comes to the rescue: instead of adding an extra group, it swaps fred's primary group:

[root@lunar answergirl]# usermod -g course fred

Then fred gets a new user password ...

[root@lunar answergirl]# passwd fred
New user password: password_for_fred
Retype new user password: password_for_fred
passwd: all authentication tokens updated successfully

..., the home directory pre-defined by useradd is made ...

[root@lunar answergirl]# mkdir ~fred

... and handed over (with the change-owner command fred becomes the owner of his home, and course as his primary group inherits the group rights to it):

[root@lunar answergirl]# chown fred:course ~fred

This means that root can now log out with exit. trish as CVS maintainer still has, however, one more task to do: the CVS group course must be given read, write and, in the case of directories, execute (i.e. directory-change) rights to the depot directory. If a check of the group rights shows that the rights are correct but the name of the group is wrong, the command chgrp will help to correct this.



[trish@lunar course]$ export CVSROOT=~/cvs/linuxcourse/coursedocuments

So on to something new:

[trish@lunar course]$ cvs import
Usage: cvs import [-d] [-k subst] [-I ign] [-m msg] [-b branch] [-W spec] repository vendor-tag release-tags...
        -d      Use the file's modification time as the time of import.
        -k sub  Set default RCS keyword substitution mode.
        -I ign  More files to ignore (! to reset).
        -b bra  Vendor branch id.
        -m msg  Log message.
        -W spec Wrappers specification line.
(Specify the --help global option for a list of other help options)

So it's not that simple: even with so little interaction, cvs insists that we tell it explicitly what to call the depot ("repository"). The two


additionally required arguments vendor-tag and release-tags are fortunately not necessary for simple version management, so here we can enter more or less anything. The vendor tag, a sort of identification for the publisher of the data to be imported, comes into play when one wishes to use CVS to track amendments to sources from third parties, amendments which are not themselves going to flow back to the publisher (because the latter does not consider them important, correct or general enough). If a new original version comes out, the release tag makes it possible to distinguish between the versions. Whatever the case may be, a

tex file: A text file consisting of content plus TeX or LaTeX commands, which mark up the structure of that content. Running the typesetting commands tex or latex turns this into the actual ready-to-print rendering.

[trish@lunar course]$ cvs import linuxcourse trish v2001

alerts us, by calling up the vi editor (or whatever program is stored in the environment variables VISUAL or EDITOR), to the fact that it would like a short description of the data:

CVS: ----------------------------------------

Ownership

Since files can only be given to groups to which one belongs, trish has to log in again; only then does a

[trish@lunar answergirl]$ chgrp -R course ~/cvs/linuxcourse/coursedocuments/

stop issuing error messages, because the group membership data is only refreshed by the login process. The -R in chgrp (just as in chown and chmod) ensures that the group details of ~/cvs/linuxcourse/coursedocuments/ and all files and directories underneath are altered recursively in one sweep.

Where there are several CVS users with different primary groups, another problem is worth bearing in mind: since the depot directory belongs to trish, she can also check in data with her primary group users. But that data would no longer be accessible to course-only members such as fred. This problem can be solved by giving the depot directory the s-right ("set group ID on execution") for the group:

[trish@lunar answergirl]$ chmod g+s ~/cvs/linuxcourse/coursedocuments/
[trish@lunar answergirl]$ ls -al ~/cvs/linuxcourse/coursedocuments/
total 3
drwxrwsr-x 3 trish course 1024 Oct 7 00:31 .
drwxr-xr-x 3 trish users 1024 Oct 7 00:31 ..
drwxrwxr-x 2 trish course 1024 Oct 7 00:31 CVSROOT

This ensures that all data written into the depot directory belongs to the group course, even if trish checks it in with a different primary group.

Distant relation

Now fred can log in to lunar and check out and edit the data in his home directory there, but it is fairly unlikely that he will want to be online the whole time he is working with the documents. The whole point of a revision control system is precisely to coordinate the work of people on several different computers. Since fred has a shell account on the CVS server lunar, he can easily tunnel his CVS queries via SecureShell. To do so, on his Internet computer, he sets the variable CVS_RSH (CVS Remote Shell) to the command ssh (if necessary, with full path details):

[fred@fredsbox ~]$ export CVS_RSH=ssh

Naturally, he must also set his CVSROOT variable to the depot directory /home/trish/cvs/linuxcourse/coursedocuments. Since this is on the remote computer lunar.answergirl.co.uk, this becomes somewhat more complicated: using the keyword ext he specifies that the depot is on an external machine; then follows the address of the depot computer, with the username first, and finally the destination directory. To make it quite clear where each component ends, they are separated from each other by colons. To allow cvs to proceed in the certainty that the whole monstrosity is not just a somewhat odd directory name, there must also be an initial colon:

[fred@fredsbox ~]$ export CVSROOT=:ext:fred@lunar.answergirl.co.uk:/home/trish/cvs/linuxcourse/coursedocuments

After that, fred can go online and check out the depot linuxcourse created in the main text with cvs co linuxcourse.
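The effect of the s-right discussed in the Ownership box can be checked with a quick sketch (the directory name depot is made up; which group newly created files inherit depends on your own account's groups):

```shell
# A directory with the setgid bit makes new files inherit its group.
mkdir -p depot
chmod 2775 depot     # leading 2 = setgid bit, i.e. the g+s from the box
ls -ld depot         # the group execute slot now shows 's': drwxrwsr-x
touch depot/newfile  # newfile's group is taken from depot, not from
ls -l depot/newfile  # the creating user's primary group
```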




No more variables to set, ever again

Setting the variable CVSROOT, and in the case of remote access to the CVS server also CVS_RSH, is not something one wants to repeat by hand in every new shell. Those wishing to access just one CVS depot have it easy: they enter the export lines in the ~/.bashrc read at the start of each bash, or if applicable in the initialisation file for login shells, ~/.bash_profile (at least when they are actually working with the Bash).

CVS: Enter Log.  Lines beginning with `CVS:' are removed automatically
CVS:
CVS: ----------------------------------------

An o brings us, in vi, to a new line and into insert mode, so that we can enter the text. Pressing the Escape key puts us back into the command mode of vi, where we save and end our entry with the sequence :wq. cvs now acknowledges the import with

Login shell: The command line interpreter seen after logging onto a virtual console, under Linux in most cases the Bourne Again Shell, bash. A Bash becomes a login shell when started with the option --login, so in X terminal programs under X11 one can also end up with login shells. If an echo $variablename shows that a variable set in ~/.bashrc does not appear in the current shell, one is usually dealing with a login shell, because a login Bash does not care about ~/.bashrc but about ~/.bash_profile. If you have no success with the variables set there either (as long as this is not because errors slipped in while setting them), this has presumably been blocked by the Bash parameter --noprofile. The only remedy then is to delve deeper into the system, or to source the variables shortly before use.

cvs import: Importing /home/trish/cvs/linuxcourse/coursedocuments/linuxcourse/script
N linuxcourse/script/unixcourse.tex
N linuxcourse/script/tree.eps
cvs import: Importing /home/trish/cvs/linuxcourse/coursedocuments/linuxcourse/slides
N linuxcourse/slides/unixslide.tex
N linuxcourse/slides/tree.eps

No conflicts created by this import

This means the depot is now filled with data, and the data directory ~/course together with its subdirectories have each been enriched by a directory named CVS.

In and out

It is now really easy to work with the data which has been checked in. At the start of a work session, a

[trish@lunar course]$ cvs update
cvs update: Updating .
cvs update: Updating slides
cvs update: Updating script

brings the data in the data directory up to date; if you want to update only a certain subdirectory, simply change into it before issuing the command. When you have reached the end of a work unit, check in the amendments to the files in the respective subdirectories with cvs ci ("check in"):

[trish@lunar course]$ cvs ci
cvs commit: Examining .
cvs commit: Examining slides
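A typical work cycle can thus be sketched like this. The -m option, which supplies the log message on the command line and so skips the editor session, is standard cvs but is not used in the article:

```shell
cd ~/course
cvs update                      # first pull in co-authors' changes
vi script/unixcourse.tex        # ... edit ...
cvs ci -m "reworked chapter 2"  # check the amendments back in
```

Updating before editing keeps the window for conflicting amendments as small as possible.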


It gets more complicated when you are dealing with several depots. Then it is advisable to write the export lines into an otherwise empty text file for each depot, and to read them into a given shell, before the first access to that depot, with the command source file_with_CVS_variables.
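For the several-depots case, such a variable file might look like the following sketch (the file name ~/.cvs-linuxcourse is invented for illustration):

```shell
# Write the per-depot settings once ...
cat > ~/.cvs-linuxcourse <<'EOF'
export CVSROOT=~/cvs/linuxcourse/coursedocuments
export CVS_RSH=ssh
EOF

# ... and read them into the current shell before the first cvs call:
source ~/.cvs-linuxcourse
echo "$CVSROOT"
```

One such file per depot, and switching projects is a single source command.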

cvs commit: Examining script

Here again you will need vi knowledge to describe the amendment. And of course, individual files can be "committed":

[trish@lunar course]$ cvs ci slides/unixslide.tex
CVS: ----------------------------------------
CVS: Enter Log.  Lines beginning with `CVS:' are removed automatically
CVS:
CVS: Committing in slides
CVS:
CVS: Modified Files:
CVS:    unixslide.tex
CVS: ----------------------------------------

Qualms?

If you quickly want to back out now, simply quit the editor without making any amendments: :q! in the command mode of vi will then make cvs start whinging:

Log message unchanged or not specified
a)bort, c)ontinue, e)dit, !)reuse this message unchanged for remaining dirs
Action: (continue)

An a confirms that we are serious about stopping, unlike a simple Enter, which checks in nevertheless, and e, which brings us back into the editor.

Fresh Data

New ideas in new subdirectories: sometimes in retrospect it turns out to be quite helpful to give the initial passion for work some structure ...

[trish@lunar course]$ mkdir concept
[trish@lunar course]$ cd concept

If the concept, concept.tex, is in this subdirectory, it should also be checked in. Only how, if it's not yet in the depot?

[trish@lunar concept]$ cvs add concept.tex



cvs add: cannot open CVS/Entries for reading: No such file or directory
cvs [add aborted]: no repository

... so that was not quite the right idea: of course cvs is uncertain what to do with the file, since the subdirectory concept does not yet contain a CVS directory. So it's back one command, or rather one directory ...

[trish@lunar concept]$ cd ..


Pre-programmed conflicts?

If several people are working on one document, there is not a CVS server in the world which can prevent amendments that one person checks in from colliding with amendments that another wants to commit somewhat later:

[trish@lunar linuxcourse]$ cvs commit
cvs commit: Examining .
cvs commit: Examining slides
cvs commit: Examining script
cvs commit: Up-to-date check failed for `script/unixcourse.tex'
cvs [commit aborted]: correct above errors first!

... and one after the other:

[trish@lunar course]$ cvs add concept
? concept/concept.tex
Directory /home/trish/information/classscript/concept added to the repository
[trish@lunar course]$ cd concept
[trish@lunar concept]$ cvs add concept.tex
cvs server: scheduling file `concept.tex' for addition
cvs server: use 'cvs commit' to add this file permanently

That appears to have worked, except that the file is obviously not inside yet. So we follow the instruction and commit the new addition too. Whether you use the long cvs command commit or the short ci for this, the result is the same:

[trish@lunar concept]$ cvs commit
cvs commit: Examining .
CVS: ----------------------------------------
CVS: Enter Log.  Lines beginning with `CVS:' are removed automatically
CVS:
CVS: Committing in .
CVS:
CVS: Added Files:
CVS:    concept.tex
CVS: ----------------------------------------

After an o, the log message New concept, then Escape and :wq, cvs carries on:

RCS file: /home/trish/information/classscript/concept/concept.tex,v
done
Checking in concept.tex;
/home/trish/information/classscript/concept/concept.tex,v <-- concept.tex
initial revision: 1.1
done

Reconstruction

concept.tex changed back and forth; once the structure was in place, the file kept changing until it became the present course script. After some time and a few work phases the names at the front and back no longer matched each other anyway, while

Obviously someone else has also tinkered with script/unixcourse.tex, and the depot contains a version which is newer than trish's working version. This calls for an update:

[trish@lunar linuxcourse]$ cvs update
cvs update: Updating .
cvs update: Updating slides
cvs update: Updating script
RCS file: /home/trish/cvs/linuxcourse/coursedocuments/linuxcourse/script/unixcourse.tex,v
retrieving revision 1.2
retrieving revision 1.3
Merging differences between 1.2 and 1.3 into unixcourse.tex
rcsmerge: warning: conflicts during merge
cvs update: conflicts found in script/unixcourse.tex
C script/unixcourse.tex

cvs tries to merge the amendment made in the meantime and trish's new amendments. If this works, there is nothing else to worry about; if, however, as here, it goes awry, trish must set to work in person and load the conflicting file unixcourse.tex into the editor again. The file has by now been amended by CVS so that the conflict is visible and easily found for manual editing:

<<<<<<< unixcourse.tex
Summer course Information Oxford Uni
=======
Summer 2001 Oxford University
>>>>>>> 1.3

Above the ======= line is her own version, below it the current depot version. All that is left to do is to remove the <, = and > lines and to merge the contradictory lines into exactly the text which is now to be checked in, e.g.

Summer course Information Oxford Uni

Then all that remains is to check it in.

nobody had yet edited the previously checked-in unixcourse.tex in the script directory. In short: it is now time to make a cut and update the system, to overwrite unixcourse.tex with the content of concept.tex, and to take concept.tex out of the repository. This is done with:



Attic: In directories with this name, CVS stores the content and the history of files deleted from the depot.

PGP: "Pretty Good Privacy", probably the commonest program for encrypting and signing emails and other data.


[trish@lunar course]$ mv concept/concept.tex script/unixcourse.tex
[trish@lunar course]$ cvs remove concept/concept.tex
cvs remove: scheduling `concept/concept.tex' for removal
cvs remove: use 'cvs commit' to remove this file permanently
[trish@lunar course]$ cvs ci

There are now two challenges during check-in: the new content of unixcourse.tex wants to be commented (for example with concept.tex now unixslide.tex), and in the documentation of the removal of concept.tex, cvs kindly presents the same comment again:

concept.tex now unixslide.tex
CVS: ----------------------------------------
CVS: Enter Log.  Lines beginning with `CVS:' are removed automatically
CVS:
CVS: Committing in concept
CVS:
CVS: Removed Files:
CVS:    concept.tex
CVS: ----------------------------------------

Once saved, we want witnesses to the amendments in the depot:

cvs commit: Examining .
cvs commit: Examining slides
cvs commit: Examining concept
cvs commit: Examining script
Checking in slides/unixslide.tex;
/home/trish/cvs/linuxcourse/coursedocuments/linuxcourse/slides/unixslide.tex,v <-- unixslide.tex
new revision: 1.3; previous revision: 1.2

No Shell Access?

An account on a computer does not mean that it should also be usable by all colleagues and friends for poking around in the system, for using Internet services or for filing data. On the contrary, there are lots of good reasons to limit access to the CVS alone, without giving up the security offered by the SecureShell. All the administrator of the CVS server needs is the public SSH key of each CVS user. In a similar way to PGP, the users create a key pair on their work account, of which the public key can be given out as you like, but the private key must be kept secret. This is done with the ssh-keygen command which comes with the SSH packages and was discussed at length in the Answer Girl in issue 9. ssh-keygen places the secret private key, by default, under ~/.ssh/identity. This must not be passed on! What the CVS sysadmin wants is the public key, a text file named ~/.ssh/identity.pub, which looks something like this:

1024 35 165043668525388007503675301831634125912119991502526700029105958161542269846546772572229108798152992529758074045707003573273020044380873112356724249904219939995856241718046388628225862962791292865950083481899332539835181290112611354730215142417376960062146599043065554089684980963002106747241282736545822186999 fred@fredsbox.fred.co.uk

This one long line is transferred by the sysadmin into the file .ssh/authorized_keys in the home directory of the user; fred's public key thus ends up in ~fred/.ssh/authorized_keys on the CVS server. The server admin has in the meantime poked around in the manpage of the SecureShell server sshd and, under the heading AUTHORIZED_KEYS FILE FORMAT, has stumbled across the fact that at the start of a key line it is possible to specify a command which is executed instead of a login shell whenever the corresponding user logs in with the matching private key via ssh.

So when fred comes via ssh with his cvs request, the CVS server should simply start, granting access only to the depot in ~trish/cvs/linuxcourse/coursedocuments. An appropriate CVS server is started with cvs server --allow-root=/home/trish/cvs/linuxcourse/coursedocuments, so the sysadmin only has to place a command="cvs server --allow-root=/home/trish/cvs/linuxcourse/coursedocuments" at the start of the key line of fred's public key:

command="cvs server --allow-root=/home/trish/cvs/linuxcourse/coursedocuments" 1024 35 165043668525388007503675301831634125912119991502526700029105958161542269846546772572229108798152992529758074045707003573273020044380873112356724249904219939995856241718046388628225862962791292865950083481899332539835181290112611354730215142417376960062146599043065554089684980963002106747241282736545822186999 fred@fredsbox.fred.co.uk

What matters here is that the whole rat's nest must be kept as a single line.

70 LINUX MAGAZINE 13 · 2001


ANSWER GIRL

done

Removing concept/concept.tex;
/home/trish/cvs/linuxcourse/coursedocuments/linuxcourse/concept/concept.tex,v  <--  concept.tex
new revision: delete; previous revision: 1.1.1.1
done

Now all that needs to be done is to delete the now-empty concept directory, and this is done for us by a

[trish@lunar course]$ cvs update -P

("purge", as in "cleanse").

Nothing is forever

So the concept no longer exists. But here comes the unexpected question: "Have you perhaps got a concept for me?" We haven't any more, but cvs has. All we need to do is remember at what point in time the concept was a concept. Fortunately, CVS keeps painstaking books, and one can inspect these using cvs log. The best way to do this is not to let everything rush past us, but to pipe it through less:

[trish@lunar course]$ cvs log | less

With the less search command /concept we find therein the comment on the renaming:

RCS file: /home/trish/cvs/linuxcourse/coursedocuments/linuxcourse/slides/unixslide.tex,v
[...]
----------------------------
revision 1.3
date: 2001/10/15 21:57:53;  author: trish;  state: Exp;  lines: +1 -1
concept.tex now unixslide.tex
----------------------------

Unfortunately, we are given no information as to the former existence of the file concept.tex itself. Our fault: had we not deleted the empty concept directory, with its information in the CVS subdirectory, using the -P flag of update, we would not be so helpless now. In the worst case, you will now have to rifle through the file tree of the CVS depot with less and ls until the appropriate Attic file for concept.tex (~/cvs/linuxcourse/coursedocuments/linuxcourse/concept/Attic/concept.tex,v) is found. In any case, we establish that concept.tex must still have been in existence on 15.10.2001 at about 21:55. So we adapt our working copy to the state that was current at that time:

[trish@lunar course]$ cvs update -D "2001-10-15 21:55:53"
cvs update: Updating .
cvs update: Updating slides
U slides/unixslide.tex
cvs update: Updating script
U script/unixcourse.tex

Unfortunately, concept.tex is not there.

[trish@lunar course]$ cvs update --help
update: invalid option -- -
Usage: cvs update [-APdflRp] [-k kopt] [-r rev|-D date] [-j rev] [-I ign] [-W spec] [files...]
        -A      Reset any sticky tags/date/kopts.
        -P      Prune empty directories.
        -d      Build directories, like checkout does.
[...]

... was not all that wrong: although there is no such option as --help, cvs spits out precisely what we need: help for the cvs command update. The option -d looks promising, and that's also how it turns out: update normally only updates files already checked out, but when one says update -d, it is a bit more willing to co-operate and also creates directories which are missing from the working directory.

[trish@lunar course]$ cvs update -dD "2001-10-15 21:55:53"
cvs update: Updating .
cvs update: Updating slides
cvs update: Updating concept
U concept/concept.tex
cvs update: Updating script

Now we have got concept.tex back and can print out the concept. So that you do not first have to consult the CVS book mentioned in the box "More about cvs", I can reveal to you now that the ominous -A flag in the update command is the only option for getting an up-to-date working copy without concept.tex:

[trish@lunar course]$ cvs update -A
cvs update: Updating .
cvs update: Updating slides
U slides/unixslide.tex
cvs update: Updating concept
cvs update: warning: concept/concept.tex is not (any longer) pertinent
cvs update: Updating script
U script/unixcourse.tex

Whether you now wish to delete the concept directory, using -P, is entirely up to you. ■

BEGINNERS

More about cvs

Anyone who is in utter despair with the documentation supplied with CVS should not give up. At http://cvsbook.red-bean.com/ the GPLed sections of Karl Fogel's CVS book can be found for online browsing or downloading. Since this Answer Girl can in no way provide exhaustive information on CVS, it is recommended as further reading for all those who've tasted blood.
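The "rifle through the depot with ls and less" step can be shortened with find; here is a sketch against a fabricated stand-in for the depot (the directory tree below is created only for the demo and merely imitates the article's paths):

```shell
# Build a miniature stand-in for the depot, including an Attic with the
# archive file of the deleted concept.tex (paths imitate the example).
mkdir -p ./cvs/linuxcourse/coursedocuments/linuxcourse/concept/Attic
touch ./cvs/linuxcourse/coursedocuments/linuxcourse/concept/Attic/concept.tex,v

# Instead of browsing by hand, let find dig out the ,v file directly:
find ./cvs -path '*/Attic/*' -name '*,v'
```

The same one-liner works on a real depot; only the starting directory changes.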




KORNER

K-splitter

FAIRGROUND STALL OR MONK’S CELL? STEFANIE TEUFEL

Who says there is no place for gossip and scandal in a Linux magazine? K-splitter broadcasts the latest news from the K-World and noses around here and there behind the scenes.

Anti-aliasing: Aliasing refers to the staircase effect at the edges of graphics, especially of text or lines, which arises because truly straight lines can only be displayed with pixels if they are horizontal or vertical. The solution – inserting shading pixels into the steps – is called anti-aliasing.

Library: A file containing a collection of useful C functions for specific purposes. For example there is libm, which provides mathematical functions, or libXt, containing functions for programming the X11 window system. Often libraries are used by several programs simultaneously (shared).

MIME: This abbreviation stands for Multipurpose Internet Mail Extensions, a method of specifying standardised file types. Examples of MIME types are text/plain (plain text file without formatting) or video/mpeg (MPEG-compressed video stream). MIME is used primarily in mail programs and Web browsers. Normally this data is stored in a file named mime.types.
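To make the mime.types remark concrete: each line of the file maps a MIME type to a list of file name extensions, and a lookup can be sketched with awk. The two-entry fragment below is made up for the demo, not your system's file:

```shell
# A made-up two-entry mime.types fragment in the usual format:
# the type, then whitespace-separated extensions.
cat > ./mime.types <<'EOF'
text/plain      txt
video/mpeg      mpeg mpg mpe
EOF

# Look up which MIME type owns the extension "mpg":
awk -v ext=mpg '{ for (i = 2; i <= NF; i++) if ($i == ext) print $1 }' ./mime.types
```

On most systems the real table lives in /etc/mime.types and the same awk lookup applies.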

Up close and personal

Where there are lots of options, there is great confusion. The more the KDE project grows, the more comprehensive the configuration options become. That does not exactly make life simple for new users, which is why a new tool, the so-called KPersonalizer, is now being worked on feverishly. From KDE 2.2 this is supposed to start on the user's first login and accompany them through their first steps in the desktop configuration jungle (Figure 1). Adaptability with respect to the look and feel is highly sought after: should anti-aliasing be used for fonts and icons or not? The user can decide by a mouse click whether KDE should behave like Windows – whether the desktop responds to double instead of single clicks – or whether KDE should take after the good old UNIX type. What may be even more interesting is the integrated Eyecandy-O-Meter (Figure 2). Here one will be able in future to choose – depending on the computer's equipment – between a multicoloured fairground stall and a Spartan monk's cell, at least when it comes to graphical effects. Simply move the slider in the respective direction; the rest is done by KDE. But there will be no support for the installation of printers and similar inconveniences, because the developers fear that one distributor or another could, due to the differences between distributions, quickly and comprehensively deactivate the new configuration tool.

Figure 1: Start at the beginning

Figure 2: Fairground Stall or Monk's Cell?

Making a facsimile of himself...

...is not something Michael Goffioul has been doing, although he is making a good few waves in the KDE Project. First he came up with the print library for KDE 2; then he expanded it with various print back-ends; and lastly he enhanced the KDE 2 printing system with fax and PDF support. As if that were not enough, he has now come up with the little tool KdeprintFax, with which it is possible to send a fax directly from any KDE 2 application. The rather sad background to this glad development: the program Ksendfax which used to exist has, like so many other small but useful programs, not been ported to KDE 2. The main features of the new fax maker are EFax and Hylafax support, plus a built-in filter mechanism which converts any file type into PostScript format before sending. The usual external programs act as filters here; they are freely configurable within the fax utility and can be defined for every MIME type. One nice side effect: documents that you want to send by fax need only be dragged into the main window using drag and drop. Goffioul has already integrated the source code of the program into the existing CVS tree. Whether, and to what extent, the program will make it into the next major KDE release (2.2) had not yet been determined when we went to press.

A thesaurus for KWord

KWord development is gathering pace. In order to make it more attractive for users to change over from their usual word processors, such as Word or StarOffice, the developers are constantly coming up with new ideas. The latest stems from Daniel Naber, who has presented a patch on the developer list which will, in the near future, extend KWord with a thesaurus based on Wordnet. The substantial 13MB download from http://www.cogsci.princeton.edu/~wn/ and worries about the free distribution of Wordnet have now led to a search for alternatives; under discussion is, among others, Kdict.

All bets are off

KDE games are growing up. This has apparently been noticed by the makers of the GGZ Gaming Zone, a free alternative to Microsoft's Gaming Zone, because part of their new release 0.0.4 is, for the first time, a KDE front-end. As on other online gaming sites, players can log on to the Gaming Zone and then play against other users or computers. GGZ offers, among other things, various levels of difficulty, chat rooms and statistics – and all are now available in the usual KDE look and feel thanks to the new KDE client. At present there are just three KDE games, but by the time the next release comes out their number should have doubled. In order to avoid duplicated effort, the makers of the Gaming Zone want to work closely with the people from the KDE games project. Further information and the latest gossip from the world of GGZ gaming can be found at http://ggz.sourceforge.net/. And anyone who would like not only to play, but also to make a real contribution to the further expansion of the project, can send suggestions and offers of assistance at any time to ggzdev@lists.sourceforge.net ■

Figure 4: Tic Tac Tux

CVS-Tree: Especially when many developers are working on a software project, it is necessary to have a procedure which prevents anyone overwriting changes made by their co-programmers by mistake, or destroying the only working version. Many projects therefore use a Concurrent Versions System (CVS). It is also possible at any time to reconstruct an older development version from a CVS file tree, but obviously it is most often used to hold the latest program code. The Answer Girl in this issue gives an insight into the construction and use of a CVS depot.

Wordnet: Work began on the Wordnet project in 1985 and is still going on. Those working on its development are mainly psychologists and linguists from Princeton University, USA. Wordnet is a sort of online lexicon. Its structure is in line with psycholinguistic theories about human memory, unlike more usual lexicons, which are organised alphabetically or into classes of synonyms. Wordnet is not restricted to a specific area and currently comprises over 95,600 different English word forms. One of its most important characteristics is that for any given word a distinction is made between the word form and the meaning of the word; the meaning is represented by a number of synonymous word forms. The thesaurus is mainly recommended as an aid to composition when writing an English text.

Figure 3: The new miracle weapon in printing




K-tools

RIGHT INTO THE NET STEFANIE TEUFEL

In this column we present tools which have proven to be especially useful under KDE, because they solve a problem or are just some of the nicer things in life which – once discovered – you wouldn't want to do without. In this issue we want to show you an especially easy way to connect to the Internet with QtWvDialer. This is a simple graphical front-end for the useful tool WvDial. The charm of the original program lies in the fact that it automatically finds and configures your modem, after which, without further ado, it makes a PPP connection to your ISP. All you need are your username, password and telephone number – the rest is done by your dial-up assistant. QtWvDialer inherits all these WvDial mechanisms and brings many features of its own: a summary of the PPP connections that have been made, for example, or a configuration editor.

Hand it over

Terminal emulation: The main purpose of a terminal emulation is to act, under a graphical user interface, as a substitute for the virtual terminals, between which you can hop merrily back and forth under Linux by pressing Alt and Fx. This console replacement allows you to use programs that were really intended for the command line under X. ■

You will also need one or two files. Firstly, of course, the latest version of the program itself is required. You can download it from author Matthias Toussaint's homepage at http://private.addcom.de/t/toussaint/qtwvdialer.html. You will also need tmake from the Qt developer company Trolltech (at ftp://ftp.trolltech.com/freebies/tmake/) and of course WvDial. This should actually already be on your disk, since most current distributions come with this handy dialler. If you are not sure, you can check, in the case of rpm-based distributions, with a

stefanie@diabolo[~]> rpm -q wvdial
wvdial-1.41-12

Had no luck finding it? No matter, the necessary files are available at http://www.worldvisions.ca/wvdial/. If you haven't installed tmake yet, you should make up for lost time as soon as possible. To do this, all you have to do is unpack the tarball and then set a couple of environment variables. For Bash users it looks like this:

TMAKEPATH=/path/to/tmake/lib/linux-g++
PATH=$PATH:/path/to/tmake/bin
export TMAKEPATH PATH

Then unpack QtWvDialer and change to the directory that has been created. Now all you need do is enter a ./configure and send a make after it – done. A little tip for users of an Nvidia chip: if you have installed the Nvidia OpenGL driver and Qt has been compiled with OpenGL support, there could be problems. But the author has made provision for this too: if this happens, call up ./configure with --nvidia. That's all there is to it. In the bin subdirectory you should now find an executable file qtwvdialer, which you can copy into any directory you like in your search path (say /usr/local/bin).

Getting friendly

After installation, start the program as root by entering a qtwvdialer & in a terminal emulation of your choice. You will then be confronted by a window as in Figure 1, in which your new friend tells you that he has made a directory for you. From now on you will find the settings and the monthly log files there. After clicking OK, on your way into the network of networks please make one quick stop in the main window of QtWvDialer, because it's time to enter your Internet access data. To do this, click on the little screwdriver icon.


Figure 1: Informative



Figure 2: We did warn you

Figure 3: Dive in to the joy of configuration!

Figure 4: Sinking the shot

Figure 5: I who know nothing

You will then be greeted by a window as in Figure 2, which challenges you to turn to the modem configuration. Don't worry: a click on the OK button automatically opens the necessary window (Figure 3). Click on the Run wvdialconf button. The program then immediately starts to search for a modem on your system. This could take a few seconds, but in the end your patience will be rewarded with a result as in Figure 4. Now you have to provide the tool with your access data. To do this, click on Account and then on the Add account button. In the window which pops open (Figure 6), QtWvDial asks you to give a name to the access. Do as he requires and then click on OK. Now all you need to do is enter the appropriate details in the boxes Phone number, User name and Password (Figure 5). That's all there is to it, at least for this account. If you dial up the Net using various providers, you will have to define additional accesses with different data.

You can play around with the functions of the program, if you like, using the QtWvDial tab. There are sensible presets for making your PPP connection and for logging the duration of a connection, so there is no real reason to alter the defaults. There is one other interesting option that we would like to point out to you. If you have activated the Start program after connect box, KDE's dialler automatically starts a program of your choice after dialling. The default is Netscape; if you would rather surf the Web with Konqueror or Opera, obviously this can be changed at any time.

Onto the Net

After all this configuration kerfuffle, at last you’re in business. To dial up the Internet, fish out the appropriate account in the drop-down menu in the main window and then click on Connect. You can then calmly observe the process of logging-on to your provider in a window. Once connected, the graphs in the lower part of the connection window (Figure 7) show you the throughput rate of your current connection. It’s really interesting to see what’s trickling through the wires. If you want to end the connection, just click on the Disconnect button, and QtWvDial immediately cuts you off from the Internet. Anyone who wants to risk a cautious glance at the time he has actually spent on the Net, can click again on the screwdriver icon after ending the PPP session and select the Logfile tab. This gives you the actual connections to the network of networks arranged by month and neatly separated into accounts (Figure 8). And if the sight of this gives you too much of a headache, you can delete all the log files by clicking on the Clear logfiles button. ■

Figure 6: You know best when it comes to your access data

Figure 7: Enough juice on the line?

Figure 8: Logged and filed



DESKTOPIA

Jo’s alternative desktop

XFCE JO MOSKALEWSKI

At first glance, the XFce desktop might remind the old UNIX hands amongst you of CDE (the Common Desktop Environment). But XFce's nippy, easy-to-use interface also supports some modern features. Pose almost any question in the world of the UNIXes and you're spoilt for choice: there are usually several programs that can solve your problem. The same goes for graphical user interfaces. All distributors agree that this choice must be made easier for the user, and just install one of the two big desktop environments – often without even asking if they are required. But as a general rule the distributions also carry a few other pearls, which have no need to hide behind the powerful interfaces. One of these is XFce.

echo $HOME

If there is no XFce on one of the silver discs from your distributor, we will be happy to help out: on the cover CD you will find what is, at the time of going to press, the latest version 3.8.3 – but by the time this issue hits the streets this could be out of date, as the development of XFce is anything but slow. In the six weeks it took for these lines to be prepared and typed, four new releases came out. During this period, not only were bugs corrected and details improved, but all kinds of sensible innovations were implemented. If you like things up to date, take a look at http://www.xfce.org/. Unlike the package jungle of interfaces of a similar type, the people at XFce bundle absolutely everything into a single package, which at 3.4MB (for the source texts) is not too big to download just to try this environment out. "Can't be much in it", we hear some of you mumbling. Well, the ready-compiled code at 38MB proves otherwise, and comes with an impressive list of features:

● Modest hardware requirements
● Easy-to-use window manager "xfwm"
● Session management
● Front-ends for all options: no editing of configuration files necessary (although this is possible, as a simple text format is used)
● Drag and drop
● Panel with program starter and clock
● Integration of KDE and Gnome menus
● Virtual desktops including pager
● Theme capability, including creating your own themes (also for general X applications, especially Gnome and Gtk programs)
● XFTree file manager
● System sounds with XFSound
● Manager for background images
● Gnome compatibility
● XFGlob search tool
● Recycling bin
● Tool to mount drives
● 19 languages
● Support for Xinerama (multi-monitor operation)

Figure 1: A first look




On your marks

Installation means simply playing in the RPM package from the cover CD with rpm. If you would prefer to use the sources, you will need a few (not very spectacular) developer packages, which are listed in the INSTALL file included. Once XFce is on the hard disk, every user also needs an initial configuration. Here again the XFce developers have done all the work: a simple xfce_setup (typed on the console or in an XTerm) arranges things so that in future XFce will fill the monitor, both after a graphical log-in and when starting X via startx. An xfce_remove can revive any settings which have fallen victim to the set-up. Users with a bit more experience might not want to entrust their home directory, including painstakingly created start files, to a strange set-up tool – they can instead simply call up the window manager xfwm, which will start the entire environment automatically. In that case XFce will not create the start files itself; existing configuration files are not touched, provided they weren't created by XFce. The Gtk themes in particular will then have only a limited function. XFce is thus very easy and safe to try out – except that updating from a version before 3.8 means getting rid of old configuration files. Anyone who wants to explore the newly created desktop world and gets stuck would be well advised to consult the concise but adequate Users' Guide at /usr/(local/)share/xfce/help/help.html. Unfortunately this has not been updated since version 3.7, but it is still valid. But let's slowly feel our way through this desktop environment:

Striking ...

...is the word for the main panel on the bottom edge of the screen. This is not – as is the custom on many other desktops – firmly anchored in position, but can be moved anywhere on the desktop by grabbing the striped areas with the mouse. The little button on the far left ends the X session, in the same way as the Quit button in the central area; both do their duty, but never without asking for confirmation first. The really small button minimises the main panel, as if it were a completely normal program (which is basically what a panel is). A single mouse click on an icon starts the associated program – and some icons lead to drop-down menus, from which additional programs can be started. One nice feature is the lower bar in these drop-down menus: it can be used to tear off a menu permanently and position it freely on the desktop. New to the panel is the option of selecting the number of icons and menus (a maximum of 12 is provided, which should be enough for any user profile). Equally, both the size of the icons and the display of the menus can be selected in three stages, making XFce more flexible than ever and marking a major step towards the declared objective:

Figure 2: The main panel – including ripped off menu

that of being able to work unhindered in a modern environment even on less powerful computers.

Set up

Each start entry can be configured by right-clicking on it. If, on the other hand, the main panel itself is to be configured, the icon with the colour palette helps: this is where the XFce set-up is hiding. Unfortunately it is not yet possible to configure the entire desktop from here, and so a few more tools – such as those for system sounds and background graphics – can be found in the menu over the icon for the mouse settings.

Following suit ...

...is allowed in the first tab of the set-up dialog, bearing the title "Palette" – in this case meaning themes. The Load ... button allows you to choose one of the 73 themes supplied. Also of interest in the colour settings is the entry Apply colors to all applications – this makes basic settings via the X resources, which are used by many X programs (but not by KDE or Gnome programs; for the latter the Gnome solution is also used in XFce). Anyone also wishing to use the XFce themes for Gtk applications (such as Gnome software) should either have created their own start files via xfce_setup (as described above), or have deleted or renamed ~/.gtkrc. This only leaves out the KDE applications. The eight colour zones in the set-up dialog offer an ingenious system for displaying your own taste in handling the colour palette – anyone who has ever tried to make their own Gtk theme any other way will certainly appreciate these splashes of colour.

The second tab concerns XFce itself. If you want to use your own colour settings for each desktop (and no background graphics), activate the first menu item Repaint root Window of Workspace; the control button with the heading Panel Layer ensures on request that the main panel cannot be drowned out by other applications so easily.

Figure 3: Themes on request for the Navigator

Figure 4: XFce's standard window frames

Figure 5: Windows in the Mofit design

Window master

It only starts to get interesting in the third tab, which concerns the behaviour and appearance of the window manager xfwm. Both Click to focus and Auto raise are on offer – the latter unfortunately without a configurable time delay (the only failing of this window manager). Other features sparkle. For example, there is a choice of three window decorations – XFce, Mofit (not a typing error!) and Trench – which range from the project's own look via the classic Linux desktop to the flair of an Apple Macintosh. When windows are moved, they dock with other windows and with the edge of the screen, which makes it easier to position a window. The handling of the virtual desktops is also really nice in this window manager. To toggle between them, there are several options:

● via the XFPager (by default in the top left corner),
● via the main panel,
● with a click on the middle mouse button,
● or else by deliberately pushing the mouse beyond the edge of the screen.

If a window is intended to follow the journey of the mouse, it can simply be pinned in place with the circle symbol in the toolbar. Another nice feature is that each virtual desktop can be assigned its own background. And if there is a shortage of space, a window can be minimised not just to a traditional icon, but also onto its toolbar – either via another button in the toolbar or by a right mouse click on the toolbar. KDE and Gnome menus can be found with the left mouse button on the free desktop, while the middle mouse button always keeps the functions of the active toolbar to hand. And if, despite everything, an application can no longer be found on the desktop (or if it has been covered by a hasty Auto raise), the right mouse button can help.

Figure 6: Trench Look

Figure 7: System sounds

...and its subordinates

The last tab in the set-up dialog integrates the additional tools of XFce. If you'd rather be sparing with resources, you can completely deactivate the system sounds there (and if you change your mind about this later, simply start them – like any other tool in XFce – manually from the main panel). Things are somewhat trickier when it comes to the Backdrop Manager, by means of which graphics instead of colours (or colour gradients) can be set as the desktop background. Unfortunately it cannot cope with virtual desktops, and so it is only possible to set one graphic for everything. So if, when swapping over to another virtual desktop, you suddenly come across a single colour, the box marked Repaint root Window of Workspace in the set-up dialog should be deactivated.




Figure 8: Backdrop Manager

Workhorse

Another important little helper is the file manager XFTree, included as standard. When you start it, you may be disappointed by its lean appearance, yet at second glance this is a distinct advantage: it can do drag and drop, and this means that not only can you use the two file lists familiar from numerous Norton Commander clones, but also open a third file list in the form of a third window – ideal for tidying up. The concept also corresponds to the latest trend: only one directory tree is shown, in which the files are incorporated (and not just a file list with the directories in the current path). Drag and drop does not, however, function between the windows of the file manager. Instead, a text file can be dragged onto the editor icon of the main panel in order to open it. Since XFce likes to use current standards, drag and drop also functions from a few other file managers – such as from Gnome Midnight Commander or KDE's Konqueror. XFTree does not appear to be completely finished when files are opened via the file name extensions: if a program other than the preconfigured one is to be used here, this has to be stated manually in the configuration file ~/.xfce/xtree.reg.

Figure 9: Copying between two XFTree windows

A good nose

You won't need Dr Watson in XFce, but once the journey of adventure through the leftovers of past computer activities does begin, there is a handy search tool at your disposal in XFGlob. The search can be narrowed down not only by path or file name filters, but also by file content or file type, as well as a number of other options.

Figure 10: XFGlob offers a wide range of search functions

Session management

XFce is meant to be an environment – and so far we can recognise the usual features of one, such as the consistent look and feel (achieved by the use of GTK and overarching themes) and drag and drop (for which first and foremost the file manager is responsible), in addition to a well co-ordinated collection of tools. Nor is session management missing: it is always active, and unnoticed, in XFce. If, when shutting down XFce, an XTerm window is left open, it is opened again in the same place at the next start. Everything is still where it was left, making an autostart function pointless. XFce is a great success – as things stand now, the desktop user lacks for nothing. It has speed and stability combined with clean problem solutions and a huge dollop of comfort. It's a long time since XFce was a secret tip, even if the installation routines of many distributions would allow you to think so. ■
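XFGlob's combination of a file name filter with a content match corresponds to a familiar shell pattern; a rough command-line equivalent (using throwaway demo files, not real leftovers) looks like this:

```shell
# Throwaway demo files standing in for "leftovers from past activities".
mkdir -p ./docs
echo 'XFce rocks'   > ./docs/notes.txt
echo 'nothing here' > ./docs/other.txt

# Name filter (*.txt) plus content match (XFce), XFGlob-style:
find ./docs -name '*.txt' -exec grep -l 'XFce' {} +
```

XFGlob wraps this kind of search in a GUI and adds filters such as file type on top.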



GNOMOGRAM

Gnome News and Programs

GNOMOGRAM BJÖRN GANSLANDT Each month we sift through Gnome news and tools and present you with a digest of all things Gnome.

A Gnome in the Sun

Sun is now offering Gnome 1.4 packages for its own operating system, Solaris. At present Sun is still presenting the packages as a preview, but eventually Gnome will supersede the antiquated CDE. As a member of the Gnome Advisory Board, however, Sun's activities are not limited to porting onto Solaris. Sun is also heavily involved in making Gnome accessible to disabled people, and is collaborating on compilation, documentation, session management and GConf. Not least, the office package OpenOffice (formerly StarOffice) from Sun is being adapted to Gnome. Another project involving Sun is the verification of the Gnome 2.0 architecture according to Sun's own criteria. To this end, the 20 Questions questionnaire will be applied to the individual components of the system in order to improve stability and quality.

Gnome 2.0 Schedule

The first schedule for Gnome 2.0, which should be completed in December, can be found at gnome.org. But there is some doubt as to whether the target date can be met: it doesn't exactly inspire confidence when the developers argue among themselves over Gnome 2 – although it certainly wouldn't be the first time a Gnome release was delayed.

Ximian and Swarmcast Miguel de Icaza, co-founder of Ximian, has announced that he would be interested in integrating Swarmcast into the Red Carpet package manager. Swarmcast is a system from Opencola,

Lightspeed Even if it’s not possible to accelerate objects larger than elementary particles to the speed of light, we can still simulate the optical effects that would arise if it were. Lightspeed (Figure 1) can import 3D objects from Lightwave or 3D Studio and bring them to various virtual speeds. This sometimes gives rise to effects similar to those with sound – the Doppler effect, for example. The object also shortens along its axis of motion and becomes very blurred. This blurring is due to the fact that more distant points are perceived as if they were nearer. These and other effects can be activated individually, and the virtual camera can be turned and moved at will in space. Since the real-time display is imprecise, especially when it comes to the Doppler effect, Lightspeed also allows scenes to be exported to the raytracer BackLight, which works with greater precision. To display objects Lightspeed needs OpenGL or Mesa and GtkGLArea, which allows it to embed OpenGL as a GTK widget.

Figure 1: The Enterprise – not yet up to warp one



Gramps As soon as it encompasses more than a couple of generations, genealogy becomes very complicated, both in terms of research and in processing the data. But processing can be simplified considerably by programs such as Gramps. Gramps creates a gzipped XML database in which individuals and their family relationships are recorded. Information such as family photos and genealogical sources can also be managed. Inside this database it is possible to search and to classify according to specified criteria – such as the Soundex codes formerly applied by the US government. But what makes the program really interesting are the numerous reports which can be generated from the database. In addition to a simple family tree and a complex family graph, Gramps can also render the genealogy as a Web page; the appearance of the pages is partly controlled by templates. Under Tools there are numerous utilities, for instance for working out the type of relationship between two people. It is also possible to verify the integrity of the database, though Gramps gives no indication of where specific errors have occurred. When selecting possible children, Gramps also offers candidates who would be older than the selected parents, which slows the work considerably, especially with large databases. Since Gramps also understands the common GEDCOM format, it’s a good program for medium-sized databases (Figure 2).

Figure 2: A little genealogy
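Soundex, mentioned above, groups names by how they sound rather than how they are spelt. The classic algorithm is compact enough to sketch (a generic implementation, not Gramps code):

```python
def soundex(name):
    """Classic American Soundex: first letter plus three digits."""
    codes = {}
    for chars, digit in (("BFPV", "1"), ("CGJKQSXZ", "2"), ("DT", "3"),
                         ("L", "4"), ("MN", "5"), ("R", "6")):
        for c in chars:
            codes[c] = digit
    name = name.upper()
    result = name[0]
    prev = codes.get(name[0], "")
    for c in name[1:]:
        digit = codes.get(c, "")
        if digit and digit != prev:        # skip runs of the same code
            result += digit
        if c not in "HW":                  # H and W do not reset the run
            prev = digit                   # vowels (no code) do reset it
    return (result + "000")[:4]            # pad or truncate to four chars
```

Similar-sounding names collapse to the same code – "Robert" and "Rupert" both become R163 – which is exactly what makes the scheme useful for classifying genealogical records with unstable spellings.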

which is famous for its free cola recipe. Swarmcast is not going to be used to supply drinks, however, but large files. In a manner similar to the peer-to-peer system MojoNation, a large file is split up into lots of small pieces, which are then downloaded separately. Everyone who owns a piece becomes part of a mesh and can obtain the missing parts from the other participants. This takes a considerable load off the main distributor, who no longer has to send the whole file to every individual. As soon as the file has been completely put together, the participant can leave the mesh.
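The split-and-reassemble principle fits in a few lines. This toy sketch shows only the bookkeeping – real Swarmcast adds networking and redundancy on top:

```python
def split(data: bytes, piece_size: int) -> dict:
    """Cut a large file into numbered pieces, as the distributor would."""
    return {i: data[off:off + piece_size]
            for i, off in enumerate(range(0, len(data), piece_size))}

def reassemble(pieces: dict) -> bytes:
    """A participant joins pieces fetched from different peers;
    the numbering makes the arrival order irrelevant."""
    return b"".join(pieces[i] for i in sorted(pieces))
```

Because each piece carries its index, a participant can collect them from any mix of peers, in any order, and still rebuild the original file.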

Smoothed Text in Gtk+ 1.2 That anti-aliasing is possible under Gtk+ 1.2 was demonstrated recently by the patch at chez.com. But however beautiful the associated screenshots may be, the patch is not compatible with the internationalisation support in Gtk+ 1.2. For this reason Gnome users will have to keep straining their eyes over hard edges until the release of Gnome 2.0. Gnome 2.0 will be based on Gtk+ 2.0, which brings anti-aliasing and greatly improved internationalisation by means of Pango.

Gnome support To make it easier for new programmers to get into Gnome, the Gnome-Love mailing list has been set up, on which many relatively simple tasks, as well as support, are offered. The Gnome To-Do system has also been revived, and should fill up rapidly – especially with respect to Gnome 2.0. Another way of becoming familiar with the architecture of Gnome is to correct a few bugs: apart from Bugzilla, a Bugsquad mailing list has been set up to this end. ■

URLs
Sun’s Gnome site: www.sun.com/gnome/
Twenty Questions: developer.gnome.org/dotplan/archreview/20questions.html
Gnome 2.0: mail.gnome.org/archives/gnome-hackers/2001May/msg00221.html
Swarmcast proposal: slashdot.org/articles/01/05/23/2136226.shtml
Swarmcast site: www.swarmcast.com
MoJo Nation: www.mojonation.net
Antialiasing patch: www.chez.com/alex9858/gtkaa/
Gnome Love mailing list: mail.gnome.org/mailman/listinfo/gnome-love
Gnome To Do list: gnome.org/todo/
Gnome Bug list: bugzilla.gnome.org
Gnome Bugsquad list: mail.gnome.org/mailman/listinfo/gnome-bugsquad
Genealogy: gramps.sourceforge.net
LightSpeed: fox.mit.edu/skunk/soft/lightspeed/
BackLight: www.anu.edu.au/Physics/Searle/Downloads.html
Mesa: www.mesa3d.org/
Gtk GL Area: www.student.oulu.fi/~jlof/gtkglarea/
■



KIDS

Linux for Kids The end of the long vacation

SUMMER ROUND-UP RICHARD SMEDLEY

Whilst work to get Free Software into British schools seems stalled, progress is being made around the world. Here we highlight some projects old and new.

Letter 11 The K Desktop Environment (KDE) is on a roll at the moment, with praise for KDE 2.2 and KOffice 1.1 and a clear roadmap to KDE 3.x – a port to Qt 3. As the project continues to gather momentum it is no surprise that many of the developers have turned their attention to the needs of younger users. The KDE edutainment project will create educational software based around KDE. Recognising that KDE is rather deficient in software for children generally (though there are notable exceptions – see this column in Linux Magazine 10), the project will develop KDE educational software for children aged three to 18. Although just launched, the project has an active mailing list, a newsletter, columns on educational philosophy and diverse applications in development including, in a firm stance against the dumbing down of education, KLatin. Eventually the KDE Edutainment package will be released with the rest of KDE.

There is perhaps no need to rehearse the practical arguments here for stable, economical software that can run on older hardware. It is a sad reflection on the nation if we need to make the case for open source code for schools and colleges and the desirability of immersing young minds in a community built on cooperation, never mind the advantages to taxpayers of avoiding foreign proprietary software.

k12ltsp The author is currently installing cable around his house to upgrade his poorly functioning home network. This will lead to the configuration of the children’s computers as thin clients, using packages from the Linux Terminal Server Project (LTSP). The clients (X servers) boot from floppy or a ROM on the network card. They get an IP address from a server running bootp or DHCP. Then a TFTP request

Info
KDE edutainment: http://edu.kde.org/
IRC: server irc.openprojects.net, channel #kde-edu
K12ltsp: http://www.riverdale.k12.or.us/linux/k12ltsp.html
FSF Europe’s pages: http://www.fsfeurope.org/education/education.html
OFSET: http://www.ofset.org/information/legal/manifesto.html
FSF and WEF: http://savannah.gnu.org/projects/edu
Free in Italy: http://www.libresoftware-educ.org/en/carteItalieen.html
In France: http://www.libresoftware-educ.org/en/carteFranceen.html
Blue Linux: http://www.bluelinux.org/
Tux4kids: http://www.geekcomix.com/tux4kids/goals/


is sent to download a kernel from the server. The kernel takes control, mounts a root filesystem and starts the init process. An X server is loaded and a login prompt is displayed. The principal advantage is in administration – 20 or 200 machines are now managed as one. Other advantages include vastly reduced hardware requirements (and improved reliability) and consequent cost and energy savings. This has prompted some thoughts on current efforts at X terminals for schools, particularly the progress being made in the USA by the K12ltsp project. Look out for a feature in the next couple of months.
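On the server side, the sequence above is driven by the DHCP and TFTP services. A minimal sketch of what the DHCP half might look like – every address, path and filename below is an invented placeholder, so consult the LTSP documentation for real values:

```conf
# /etc/dhcpd.conf (sketch): hand each thin client an address,
# the location of its root filesystem, and a kernel to fetch over TFTP.
subnet 192.168.0.0 netmask 255.255.255.0 {
    range 192.168.0.100 192.168.0.150;              # pool for thin clients
    option root-path "192.168.0.1:/opt/ltsp/i386";  # NFS root to mount
    filename "/lts/vmlinuz";                        # kernel sent via TFTP
}
```

The client's boot ROM or floppy does the rest: it takes the offered address, fetches the named kernel and hands over control, exactly as described above.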

Free schools The Free Software Foundation Europe (FSF Europe) has been working with The Organization for Free Software in Education and Teaching (OFSET) to promote Free Software in French schools. The French government now finds itself joining Argentina and Mexico in considering Linux in its educational establishments. The other big project for FSF Europe is representation at the World Education Forum (WEF), to be held October 24-27, 2001, in Porto Alegre, Brazil.

Blue penguins Matt Jezorek of Blue Linux has boldly proposed to pack all the “Educational Software games and such for younger students” into a single new distro, as well as “put together a second version that will contain administrative software for the schools.” Take a look at their site to see how they are progressing. Meanwhile the developers of Tuxtyping have launched Tux4kids to promote “quality educational software released under Free Software or other OSI-certified licenses.” Until next month, if you would like an educational excuse to waste some time, try Simutrans on the cover CD – a freeware Railroad Tycoon-type game covering all modes of transport and the resources needed to run them. ■



OUT OF THE BOX

Master of Formats

PRETTY PICTURES CHRISTIAN PERLE

Out of the box takes a look at the best Linux tools around and recommends programs that we feel are either indispensable or unduly ignored. This issue, we focus on XnView. Yet another image viewer? But there’s already xv, gtksee, display and many more. And yet it’s worth taking a look at XnView by Pierre Gougelet, as this program can handle more image file formats (currently 214 of them) than any other under Linux. Since a few of these formats are not Open Source (CorelDRAW Preview for example), XnView is unfortunately not under the GPL, but is only freeware for non-commercial use.

What’s new Although XnView has long existed for a whole range of other operating systems – including such exotica as TOS (Atari) and the slowly dying OS/2 – it is a newcomer to the Linux domain. To list all the options of the program would burst the seams of this column, so we’ll just concentrate on the highlights. But before XnView graces our screens, we must first install it. The author offers the program for download as an RPM package and as a tar archive at http://perso.wanadoo.fr/pierre.g/. Regardless of the package format, the lesstif library in version 0.91.4 or higher must be installed. The XnView RPM package can be installed (as root) with the command rpm -Uvh XnView-lesstif.i386.rpm. If you are not using an RPM-based distribution, it is slightly more trouble:

tar xzf XnView-x86-unknown-linux2.x-lesstif.tgz
cd XnView-1.17-x86-unknown-linux2.x-lesstif
su (enter root password)
csh install
exit

If csh is not installed on your computer, there is still the option of installing the program with the Bash script instxnv.sh included on the cover CD. Copy this script into the XnView-1.17-x86-unknown-linux2.x-lesstif directory and start it with sh instxnv.sh.

Perspectives Using an xterm, KDE or Gnome terminal, start the program with xnview &. At first there is nothing to see but a little menu bar. After selecting File/Browse... from the menu, things start to look a little different: an interface modelled on the file manager of a certain Redmond software manufacturer emerges – the browser. Figure 1 shows the overview of the /home/chris/pics directory. How directory contents are presented can be adjusted in the View/View As menu. Additional options such as sorting, size of icons and arrangement of the browser elements are also found in the View menu. All program settings are stored in the file ~/.xnviewrc.

Not just reading The program can also write most of the image

Figure 1: Galaxies at a glance



Figure 6: Automatic thumbnails for the Web

Figure 2: Automatic conversion

Figure 7: Result on the Web page

Figure 3: Left, the original and right with gamma correction

formats known to XnView. When you hold down the right mouse button over an image file icon, a context menu appears from which you select Convert.... Alternatively, you can press [Ctrl+U]. In the following dialog window (Figure 2) you can define the destination format and optional processing steps (Advanced Operations... button) such as gamma correction, blurring or colour reduction. Gamma correction (Change Gamma) serves to adapt the brightness distribution in the image for different monitors or other output media. The brightest and darkest colours remain the same – only the variation between them changes, depending on the gamma value. If a value of less than 1 is selected, the result is darker than the original; with a value of more than 1 it becomes brighter. In Figure 3 the image has been brightened with a gamma value of 1.8. For blurring (Blur) you can specify a value between 0 and 100. Figure 4 shows an enlarged section after blurring: the edges of the text have been rounded off as a result. An example of reduction to 16 colours (Convert to Colors) is shown in Figure 5, again as an enlarged section. The conversion function is not restricted to individual images. As can be seen in Figure 2, you can turn whole directories of image files into another format at a single stroke. So in the example, all files in the directory /home/chris/samba/galaxies are converted into the PNG format and saved to the directory /home/chris/local.
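XnView’s exact transfer curve isn’t documented here, but gamma correction conventionally follows the power law output = 255 × (input/255)^(1/gamma), which matches the behaviour described: the endpoints stay fixed, values above 1 brighten the mid-tones and values below 1 darken them. A minimal sketch:

```python
def gamma_correct(value: int, gamma: float) -> int:
    """Apply gamma correction to one 8-bit channel value.

    Black (0) and white (255) are unchanged; gamma > 1 brightens
    the tones in between, gamma < 1 darkens them.
    """
    return round(255 * (value / 255) ** (1.0 / gamma))
```

Applying this to every pixel of every channel is all a batch gamma step needs to do.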

Automatic for the Web Another function of XnView worth mentioning is the automatic creation of Web pages with image

Figure 4: Blurring with Blur factor 100

overviews (thumbnails) as HTML tables. This function is reached via Tools/Web Page... on the menu or the key combination [Ctrl+G]. Figure 6 shows the setting options, such as the number of rows and columns in the table, the size and format of the thumbnails, and the page title. Sections of a complete Web page are shown in Figure 7 (Netscape was used). The file names under the thumbnails are links to the original files. If one table is not enough, XnView spreads the overview over several pages, which are connected to each other by links. ■
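The thumbnail-table idea is simple to sketch. The markup below is a generic illustration – the file names and the thumb_ prefix are invented, not what XnView actually emits:

```python
def thumbnail_table(files, columns=4):
    """Build an HTML table in which each cell links a thumbnail to its
    original file, filling `columns` cells per row."""
    rows = []
    for i in range(0, len(files), columns):
        cells = "".join(
            f'<td><a href="{f}"><img src="thumb_{f}"></a><br>{f}</td>'
            for f in files[i:i + columns])
        rows.append(f"<tr>{cells}</tr>")
    return "<table>\n" + "\n".join(rows) + "\n</table>"
```

Paginating the overview, as XnView does, just means calling this once per batch of files and linking the resulting pages together.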

Figure 5: Atari ST says hi: 16 colours

GPL: The GNU General Public License. This is a software licence allowing the program to be passed on, on condition that the source code always remains available. It is also permissible, and expressly desired, that users make their own improvements to the software and publish them again. Linux itself is under the GPL.
RPM: Using the Red Hat Package Manager, software packages can be neatly installed and uninstalled.
tar: The tape archiver is the standard archiving program in the UNIX world. tar archives are also called tarballs and are usually compressed using gzip – hence the file endings .tar.gz and .tgz.
Lesstif: A free (under the GPL) reimplementation of the Motif library, which provides menu and dialog elements for X Window programming.
~: The tilde is shorthand for the home directory of the current user; ~user designates the home directory of the user user.
PNG: Portable Network Graphics, a graphics format which mainly owes its existence to the licence problems of the compression procedure used in the popular GIF format. Instead, PNG uses the unpatented compression of gzip. ■


COMMUNITY

POSIX COMPLIANT

A new GPLed desktop OS

FAITH IN ATHEOS RICHARD SMEDLEY

As mentioned in the first article in this series, there are many UNIX-like OSs out there, developed for many purposes. Those who wish to move UNIX to the desktop – and avoid the burden of the X Window System – will be curious to look at AtheOS, a GUI-based, POSIX-leaning operating system (OS) with a modular kernel and a 64-bit journalled filesystem.

A self portrait of the AtheOS server

It is often said that most Free Software projects are written to scratch a coder’s particular itch. In the case of AtheOS creator Kurt Skauen, that itch was to create a free desktop OS. Skauen felt that GNU/Linux and the BSDs, with their mishmash of toolkits (Gtk, Qt, Motif/Lesstif, Xt, FLTK), failed on several levels to give the end user a consistent and easy-to-use environment. The GUI has been an integral part of AtheOS right from the beginning. Originally written, like the kernel, in C, the GUI was reworked in C++ as Skauen learned the language, to meet the project’s aim of an object-oriented API.

AtheOS Web browser in action

Kernel

Of course, the user interface (UI) is important for the end user, but it is the kernel and the API that will interest the developer. The OS builds on firm foundations: a pre-emptible, multithreaded kernel coded with SMP in mind, for high-end workstations. Threads are scheduled independently of whether they are inside the kernel; as well as improving SMP performance, this reduces scheduling latency on all machines and results in the GUI feeling extremely responsive (dare one say Amiga-like?). The kernel has been called microkernel-like. It is not a true microkernel, but its construction is extremely modular – there are no device drivers built in. At boot time the required device drivers are loaded by GRUB, and any driver can be loaded at runtime, including block devices or file systems. In keeping with the AtheOS philosophy, drivers are written to a well-defined API; there is no need to learn everything about the kernel before you dive in and have a go.

File under buzzword

AtheOS version of XSpringies

There has been coverage of journalled filesystems in this publication and elsewhere as the four contenders to replace Linux’s ext2 (ReiserFS, ext3, JFS and XFS) shape up. AtheOS, like BeOS, was designed from conception to do more than this. The 64-bit journalled filesystem supports user attributes (a MIME-type specification or a custom data stream – an icon associated with the file, for example). Indexing of file attributes allows extremely fast file look-up. Applications can monitor files and directories for changes without any need for polling, thanks to node watching within the kernel.

History The three BSDs, GNU/Hurd, Linux and the other major open source operating systems have developed over time with the input of a large number of people. AtheOS has been largely the spare-time work of one hacker. Kurt Skauen takes pride in preferring coding to talking about how such-and-such a feature could be implemented. A clear single vision has produced fairly clean code and good APIs, both at the device driver level and for application programming. Since a TCP/IP stack went into the kernel and the www.atheos.cx website began running on AtheOS, the project has been picked up by many of the geek sites on the Web. Interest is growing, and the project may now be faced with how to deal with conflicting ideas about direction; so far new developers seem happy to follow Skauen’s lead. The TCP/IP stack is currently undergoing heavy improvement, resulting in better connections for those on high-latency links. All the code is licensed under the GNU GPL – indeed, a complete set of GNU tools are every bit as useful for developing AtheOS as they were ten years ago for Linux.

No browser conflicts W3m was a fairly early port (most UNIX command line applications compile with little or no alteration), but for a graphical environment a modern graphical Web browser was needed. Impressed with the clean design of KHTML – the HTML renderer used in Konqueror – Skauen has ported it to AtheOS. Calls to port KDE (and Qt) fall on deaf ears, though: AtheOS, with its high-level API and OS-implemented GUI features, demands the consistency of its native toolkit. KHTML has been ported to use native AtheOS widgets: each widget (checkbox, radio button et al.) either uses a native AtheOS widget directly, or is presented with a Qt-like widget by a thin class written to wrap the AtheOS widget. KHTML is not multithreaded and a total rewrite would have been difficult. Fortunately a workaround was available: ABrowse, the browser that uses KHTML, synchronises all browser windows with the same mutex, giving the impression of a single thread of execution and thus enabling most of the KHTML code to run unchanged.
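The single-mutex trick is a general one: if every window thread takes the same lock around its work, code written for a single thread of execution never runs concurrently. A generic Python sketch (not AtheOS or ABrowse code):

```python
import threading

render_lock = threading.Lock()   # the one mutex every "window" shares
log = []                         # stands in for non-thread-safe state

def window_thread(name):
    for step in range(3):
        with render_lock:
            # Inside the lock we may safely call code that was written
            # for a single thread of execution, such as KHTML.
            log.append((name, step))

threads = [threading.Thread(target=window_thread, args=(w,))
           for w in ("win-a", "win-b")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The cost is that only one window renders at a time, but the payoff – reusing a large single-threaded codebase unchanged – is exactly the trade-off described above.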


Multithreading The AtheOS project has found itself welcoming many former BeOS developers, which has led to some debate on the merits of pervasive multithreading. The problem under BeOS, also famous for its multithreading, was that developers were forced to use multithreading and the BeOS locking mechanism. In itself this wouldn’t have been so bad but for various bugs and limitations – including the limited number of threads that an application may spawn, a limit which greatly troubled the BeOS port of Mozilla. Another limitation was the BLooper messaging system which BeOS uses, in particular the limits on message queueing – which would have been problematic even under a single-threaded OS. Under AtheOS, threads can communicate through message ports (most common), shared memory, POSIX signals, semaphores, named and anonymous pipes, ptys and TCP/IP – and probably body language as well. An AtheOS application may spawn approximately 16.2 million threads, processes, message ports, global semaphores, memory areas or any other objects carrying a global ID. For threads, the task-switching architecture of the Intel-compatible CPUs which AtheOS uses limits this further to around 8,000, though this limit could easily be removed. In real-world use, multiple threads may slow down processing, but they leave the user interface feeling more responsive; for a desktop OS this benefits the end user.
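Of those mechanisms, message ports behave much like a thread-safe queue: a sender posts messages and a receiver blocks until one arrives. A generic sketch, with Python’s queue standing in for an AtheOS message port:

```python
import queue
import threading

port = queue.Queue()   # stand-in for a message port
results = []

def worker():
    while True:
        msg = port.get()        # block until a message arrives
        if msg is None:         # sentinel value: shut down cleanly
            break
        results.append(msg.upper())

t = threading.Thread(target=worker)
t.start()
for text in ("hello", "world"):
    port.put(text)              # any thread may post to the port
port.put(None)
t.join()
```

Because the port serialises delivery, the sender and receiver never share mutable state directly – which is what makes message passing the least error-prone of the options listed.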

AtheOS newbie install AtheOS is still at an early stage of development and lacks a friendly installer – or indeed the ability to read CD-ROM drives. Nevertheless, what has been done is impressive enough, and for the curious we have included it on the cover CD. Some PCs will not be compatible, but if you have a video card from Matrox, nVidia or S3 you will be able to really appreciate the graphical environment. On all the machines I tried I had to use the vesa driver, so video was very slow. Do I have to remind you to back up anything important before installing an experimental OS onto your hard drive? Unless you already have an AtheOS File System (AFS) partition somewhere on your PC, you will need to create a FAT (DOS) partition and copy the base installation base-atheos-0.3.xx.tgz to it. Take three blank floppy disks and copy the following files to them:

atheos-0.3.x.boot.01
atheos-0.3.x.boot.02
atheos-0.3.x.data.01

After booting from the floppies, AtheOS drops you into a Bash shell inside a terminal emulator (aterm). Typing help will list the commands available.

Buzzword-compliant AtheOS boasts a number of desirable features for a geek OS: a 64-bit journalled filesystem; a kernel with a built-in TCP/IP stack and support for SMP; a fair degree of POSIX compliance – it will run most UNIX CLI tools; a client-server GUI protocol using the native messaging system; and fine-grained multithreading. If that doesn’t tempt you to have a look, nothing will.




If you have yet to prepare a partition for AtheOS, type DiskManager & at the shell to invoke the AtheOS partitioning tool. If you have a partition set aside, you can use DiskManager to modify the partition size and change its type to AFS, the native AtheOS file system. Now mount the FAT file system where you copied the base install package (in the following case hda0):

bash-2.03$ mkdir dos
bash-2.03$ mount /dev/disk/bios/hda/0 dos

Note the hierarchical structure of /dev, with disk block drivers under /dev/disk. AtheOS reads disk data via the BIOS, as this is the only disk driver finished so far. AtheOS 0.35 improves on 0.34 in this driver and seems better behaved with my (old and flaky) hardware. Now we come to formatting our AtheOS partition:

bash-2.03$ format /dev/disk/bios/hda/1 afs MyAtheOSPartition
bash-2.03$ mkdir afs
bash-2.03$ mount /dev/disk/bios/hda/1 afs

Please make sure that it is definitely hda1 (for example) that you want to format before typing the above. Now unpack the base installation:

bash-2.03$ cd /afs
bash-2.03$ tar -xvpzf /dos/atheos-base-0.3.xx.tgz

You will now find yourself with an /atheos directory containing the OS and a /boot directory containing GRUB. We must now configure this bootloader to boot AtheOS. It would be a good idea to play around with GRUB under Linux first if LILO is your normal bootloader, and to make boot floppies for any other OS that you have installed. Edit the /afs/boot/grub/menu.lst file – you will find jed, an emacs-like editor, at the command line. For the above configuration you need menu.lst as follows:

title AtheOS
root (hd0,1)
kernel /atheos/sys/kernel.so root=/dev/disk/bios/hda/1
module /atheos/sys/drivers/fs/afs
module /atheos/sys/drivers/dev/disk/bios

This tells GRUB, through the “root (hd0,1)” command, to boot partition number 2 on the first drive, and where to find the kernel, the boot block-device driver and the boot-FS driver on that partition. Save and exit, and you are ready to reboot with the floppy disk to install the bootloader. Before GRUB starts loading AtheOS, hit <ESC> and then type “C” to enter the GRUB shell, from which you can install GRUB. Type root (hd0,1) to tell GRUB where to find the config file and the second-stage loader. To install in the master boot record (MBR), type setup (hd0). Alternatively, type setup (hd0,1) to put GRUB on the AtheOS partition – you will then need to point your current boot loader at it.

Info
AtheOS homepage: http://www.atheos.cx
AtheOS Developer’s mailing list: http://lists.sourceforge.net/lists/listinfo/atheos-developer
GRUB: http://www.gnu.org/software/grub/grub.html
Brent Newhall’s site (hosts third-party applications): http://www.kamidake.org
Other AtheOS sites: http://mnemo.nu/
Interviews and reviews of AtheOS: www.benews.com, www.osnews.com and www.slashdot.org
■

Congratulations. Now reboot to be welcomed by the login prompt. The default root password is “root” and the default user is guest (password “guest”). Change them in /etc/password. Once up and running, installing software is fairly straightforward: the binaries each unpack to their own folders. Emacs is the included editor, and ports of Apache, PHP and Python mean that some will already find this a usable OS. The idea of a GUI-based UNIX and the amount of work already implemented have generated considerable interest and led to a number of third-party applications appearing. Now check out the websites for more information and have fun.

The Future AtheOS is a work in progress. It will be some time before it is ready for general use, though it might be hitting desktops before the Fresco-based Berlin display server reaches maturity on UNIX. Nevertheless, it is a fascinating project in itself, and anyone interested in taking their first steps towards kernel hackerdom would be well advised to start with some much-needed device drivers for this OS. At the moment most of the advanced features of AtheOS are not fully implemented. If you have an app to write or port, now would be a good time to get involved and grow your project with the OS. One day you may be porting AtheOS apps to Gtk on Linux. ■



BRAVE GNU WORLD

The monthly GNU Column

BRAVE GNU WORLD GEORG CF GREVE

This month, we feature a rather wide variety of topics, finishing with a semi-experimental item that has been developed with Bernhard Reiter.

TINY


A minimal GNU/Linux distribution is the goal of the TINY project. It was started by Odile Bénassy; the team also consists of Jean-François Martinez, Mathieu Roy and Roger Dingledine. The acronym TINY stands for “Tis Independence N’Yet”, a pun derived from “Independence Linux”, Jean-François Martinez’s main project. The project goes back to the personal experience of a relative of Odile, who tried to introduce GNU/Linux in her school, and to the idea of giving developing countries the chance to participate in the information age and letting them profit from Free Software. Keeping hardware requirements as low as possible was necessary in order to achieve this; the current minimum is a 386 DX 33 without hard disk. Developing countries have many other difficulties, and the missing technical infrastructure poses a problem: there are often no Internet connections, floppy disks do not survive the climate, and very often people do not even have electricity. But help programmes exist to get electricity and phone lines into remote areas, and Odile has talked to scientists and physicists helping with such programmes on a voluntary basis. TINY is based on Slackware 4.0 and uses glibc2 and kernel 2.2; the licence for the distribution is the GNU General Public License. Unlike some minimal distributions, TINY is

completely usable and ready for everyday use. The distribution can be installed successfully, as the messages on the home page (available in six languages) show, but a longer and better maintained application list is needed. The project has been suspended due to the other commitments of the volunteers, and feedback from developing countries is also still lacking. So the current team would like to turn the project over to another group of people, to whom they would give every help and support. If you’re interested in helping others to help themselves, TINY might provide a good basis; if you are simply looking for a minimal distribution, TINY is worth a glance.

GNU TeXmacs The GNU TeXmacs project works on a Free Software scientific WYSIWYG text editor. As the name suggests, Joris van der Hoeven, author of GNU TeXmacs, was inspired by GNU Emacs and LaTeX. GNU TeXmacs is not a LaTeX front end but an independent project. The inspiration from LaTeX came in the form of typesetting quality and capabilities for typesetting mathematical expressions – an area in which LaTeX undoubtedly offers the best solution. GNU TeXmacs also uses the TeX fonts and has import/export filters for TeX/LaTeX documents. Emacs inspired GNU TeXmacs’ extensibility. GNU



TeXmacs in Russian showing font sets

TeXmacs is written in C++ with Guile/Scheme as its extension language; the user interface and the editor itself can be customised and extended with Guile commands. GNU TeXmacs also allows the user to perform scientific calculations directly, through interfaces to Maxima, Pari GP, GTybalt, Yacas, Macaulay 2, MuPAD and Reduce. An interface to Scilab should be usable soon, and adding more interfaces is relatively easy. Combined with the planned extension towards becoming a full XML editor, GNU TeXmacs offers interesting possibilities for things like interactive mathematical documents on the Internet. Thanks to professional typesetting quality, good anti-aliasing of the TeX fonts, the possibility of structured documents and the potential for dynamic macros and style files, GNU TeXmacs has a lot to offer the scientific user in particular. The project is heading towards a 1.0 version; Joris van der Hoeven, Andrey Grozin, Thomas Rohwer and others have been working on it for about four years now. Current problems are some incompletely implemented features, LaTeX filters that still offer room for improvement, and documentation that is still too terse. The immediate plans are to get rid of these problems and release version 1.0; spreadsheet support and the XML/HTML extensions are the next steps. Long-term plans involve porting to non-UNIX platforms and developing technical drawing capabilities. Help is welcome in any form: from documentation, translation, filter writing, ports to other platforms and Gnome support, to making GNU TeXmacs more widely known and used.

COMMUNITY

[Figure: Coloured formulae fonts]

CD-ROM Control

Since the small projects have been neglected a bit, it is time to feature one of them here. CD-ROM Control by Paul Millar is a small applet to control CD-ROM drives. It was written in Tcl/Tk, with a small part in C. Besides a status display, it makes mounting, unmounting and ejecting a CD-ROM as easy as a mouse click. The GUI can be either Tk or GTK+. The special feature of CD-ROM Control is autostart, which allows the user to automatically start their favourite graphical file manager, Web browser or audio application when a CD is inserted. Even if the integrated desktops offer part of this functionality, not all window managers do, which is why this project might be interesting to some.

[Figure: TeXmacs showing formulae and embedded images in the WYSIWYG display]

Saxogram

Saxogram by Matt Dunford is probably the most unusual project covered this issue. The name derives from the relatively unknown Danish historian Saxo Grammaticus, whose chronicles provided Shakespeare with the source material for Hamlet. Saxogram allows the user to create a vocabulary list for documents in foreign languages, in order to make learning a new language easier. Like so many people before him, Matt learned Latin and Greek, which very often required looking up every third word. One day, he discovered a Latin dictionary on the Web, which inspired him to begin working on Saxogram.

13 · 2001 LINUX MAGAZINE 93

[Figure: CD-ROM Control under Gnome]

Saxogram parses a document for words and looks them up automatically in a dictionary. The output collates the words found, together with their explanations. Saxogram is quite successful in dealing with conjugated and declined words correctly.

The availability of dictionaries is a sore point: free ones are rare. Although the Internet Dictionary Project concentrates on this problem area, its dictionaries are not yet stable enough for use. The LEO online dictionary is of limited help, as it can only translate between English and German. As far as Greek dictionaries go, Matt hasn't found a single one and would very much appreciate being pointed to one.

The program was written in Python and is released under the GNU General Public License. Its main problem is execution speed, due to the many regular expressions and disk accesses; speed was simply not a consideration when the application was written, and addressing this oversight is high on the task list for further development. Currently supported are German, Latin and a little Italian, while the working language is English. Adding further languages is another development goal, as is the creation of a GUI. Most important for further development are more testers and dictionaries, so anyone interested should get in touch.
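The core of such a tool is simple to sketch. The following Python fragment is a minimal, hypothetical illustration of the Saxogram approach; the three-entry dictionary and the function name are invented for this example, and the real program additionally handles conjugated and declined forms and reads full dictionary files:

```python
import re

# A toy dictionary: three invented entries standing in for the real
# dictionary files Saxogram reads from disk.
DICTIONARY = {
    "amicus": "friend",
    "liber": "book; free",
    "puella": "girl",
}

# Precompiled pattern for extracting words from the document.
WORD = re.compile(r"[a-zA-Z]+")

def vocabulary_list(text):
    """Collate every known word in `text` with its explanation."""
    words = set(w.lower() for w in WORD.findall(text))
    return {w: DICTIONARY[w] for w in sorted(words) if w in DICTIONARY}

entries = vocabulary_list("Puella amicus est. Liber, puella!")
for word, gloss in entries.items():
    print(f"{word}: {gloss}")
```

Precompiling the word pattern and holding the dictionary in memory, as above, is also the obvious first remedy for the execution-speed problems the project reports.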

GNU libiconv

GNU libiconv is the character set conversion library of the GNU Project; through the iconv() function it offers programs the ability to convert documents between different character sets.

A few words about the background: traditionally, a character set contains 256 values, each of which represents a letter. Thinking about Asian languages, or local specialities like umlauts and the euro sign, makes it obvious that 256 values are not enough to represent every character on this planet. So people in different language areas created modified character sets containing their local letters. Because of this, the letter a given value stands for has become ambiguous: it depends on the encoding. In order to internationalise a program properly, it must therefore be able to convert between these encodings; this is a basic requirement for all text-processing programs.

Ulrich Drepper tried to create a standard solution for this problem about two years ago in the glibc, but because of portability problems that solution has not reached systems beyond the glibc. This resulted in a splintering of the character set conversion libraries. Bruno Haible seeks to change this with libiconv.

Libiconv supplies the iconv() function in the same way as the glibc 2.2 does. It is portable, fast and self-contained, so it can be used to supply the iconv() functionality on all systems without the glibc 2.2. Authors can now use iconv() without fearing for the portability of their programs. Authors of mail programs in particular should do this, because a lot of mail clients still don't handle the MIME extensions correctly. Like the glibc, libiconv is covered by the GNU Lesser General Public License, so it can be linked with proprietary programs if need be. This should hopefully solve the conversion problems for everyone. Libiconv also has an impressive transliteration feature: a character that does not exist in the target character set can be approximated by one or more similar characters if the "//TRANSLIT" suffix is requested.
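Libiconv itself is a C library, but the problem it solves is easy to demonstrate. This Python sketch shows the ambiguity of 8-bit character sets, and a rough stand-in for the "//TRANSLIT" approximation using Unicode decomposition (the byte value and sample string are just illustrations, not part of any libiconv API):

```python
import unicodedata

# The same byte means different letters in different 8-bit character
# sets -- exactly the ambiguity a conversion library must resolve.
raw = bytes([0xE4])
western = raw.decode("latin-1")   # 'ä' under ISO 8859-1
russian = raw.decode("koi8-r")    # 'Д' under KOI8-R

# Converting between character sets is a decode/encode round trip.
# A character missing from the target set needs a fallback; this
# decomposition trick crudely approximates iconv's "//TRANSLIT".
def to_ascii_translit(text):
    decomposed = unicodedata.normalize("NFKD", text)
    return decomposed.encode("ascii", "ignore").decode("ascii")

print(western, russian)
print(to_ascii_translit("Müller kämpft"))   # Muller kampft
```

The real //TRANSLIT feature is richer than this: glibc and libiconv use dedicated transliteration tables, so for instance the euro sign can become "EUR" rather than simply being dropped.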

Mozart/Oz

Mozart/Oz is a rather interesting development platform. A complete description of all its aspects would be too long to cover in any depth here, so we'll merely try to provide an overview to give developers an idea of what it's all about. Mozart was started in 1991 in the European ACCLAIM project and has been developed cooperatively under an X11-like license. Although the license is certainly not optimal, it does qualify as Free Software.

Mozart is a development platform for intelligent, distributed applications. Distributed computing in particular is a great strength of Mozart, as it makes the network transparent. Additionally, it supports multiple paradigms: concurrent programming through lightweight threads (several thousand threads per application are possible), mobile agents and more. The Oz virtual machine is portable and runs on almost all Unix derivatives as well as MS Windows. For the user interface it supplies an object-oriented library with a high-level, well-integrated interface to Tcl/Tk.

The project is usable; the biggest problem now is that developers need to unlearn bad habits, and some things are still being worked on. Plans for the future include improving reliability, security and network transparency, as well as adding more tools. The Mozart Consortium could use some help: they especially need a volunteer for a Windows IDE, since the current IDE is based on GNU EMACS, which is not to everyone's taste. A port to Macintosh is also in the works, but this is proceeding rather slowly and help would be welcome. If you are interested in taking a closer look at Mozart/Oz, a visit to their homepage, http://www.mozart-oz.org, is recommended.

Info
- Send ideas, comments and questions to Brave GNU World: column@brave-gnu-world.org
- Homepage of the GNU Project: http://www.gnu.org/
- Homepage of Georg's Brave GNU World: http://brave-gnu-world.org
- "We run GNU" initiative: http://www.gnu.org/brave-gnu-world/rungnu/rungnu.en.html
- TINY GNU/Linux distribution homepage: http://tiny.seul.org/de/
- GNU TeXmacs homepage: http://www.texmacs.org
- LyX homepage: http://www.lyx.org
- CD-ROM Control homepage: http://sourceforge.net/projects/crcontrol
- Saxogram homepage: http://saxogram.sourceforge.net/
- Online Latin dictionary: http://king.tidbits.com/matt/LatinDictReadMe.html
- Internet Dictionary Project: http://www.june29.com/IDP/
- LEO English/German dictionary: http://dict.leo.org
- GNU libiconv homepage: http://clisp.cons.org/~haible/packages-libiconv.html
- Mozart/Oz homepage: http://www.mozart-oz.org
- Sound & MIDI Software for GNU/Linux: http://sound.condorow.net

Free Software and 3D

This feature was initiated by Bernhard Reiter. As the German representative of the FSF Europe and co-founder of Intevation GmbH, a company which works only with Free Software, he spends a lot of his time on Free Software. 3D modelling is a very interesting and important topic, which is also why the last FSF Award went to Brian Paul for his work on the Mesa 3D graphics library. But there are still no fully developed Free Software modelling tools available. This is probably due to two causes. First, there is no good overview Web page about Free 3D software; very often, not even related projects seem to be aware of each other. Second, proprietary products available at a low price provide an effective development roadblock: users are attracted by the low price and do not realise that their choice makes it impossible to maintain the software in the future. As Bernhard says, "This is a classic example that pragmatism and compromises in terms of software freedom slow down a whole software area."

Even a non-programmer can use networking to focus development efforts, as Dave Phillips has demonstrated with his Sound & MIDI Software for GNU/Linux page. A comparable Web page for 3D modelling with Free Software would be a substantial contribution to this area.

So much for this issue of Brave GNU World. We hope to have provided some interesting ideas and, as usual, we ask you to mail comments, questions, ideas and interesting projects to the usual address, maybe even some Free Software from the 3D area. ■

