
More Letters

Corel Linux Statement

The talk continues about Corel's beta testing phase for Corel Linux and I wanted to send you our response to the criticisms that have arisen over the past few days. Hopefully this will clarify our beta testing phase and our commitment to the open source community. Please give me a call if you have any questions. Thanks.

A Time For Clarification

Corel wishes to clarify some issues that have arisen concerning the Beta testing phase of Corel LINUX.

The Beta testing phase for Corel LINUX, and the related Corel Beta Testing Agreement, is in no way intended to contravene the terms of the GNU GPL or any other applicable Open Source license agreement. The restrictions on reproduction and distribution of the Beta version of Corel LINUX contained in the Beta Testing Agreement are intended to apply only to those components of Corel LINUX that were independently developed by Corel without the use of Open Source software.

Corel has decided to restrict the reproduction and distribution of these independently-developed components because further testing of these features is required before distributing them to a wider audience. As these added features will bear the Corel brand name, we wish to ensure that they are of the high quality that people expect from Corel.

In order to rectify some of the confusion surrounding the Beta testing program, Corel will be amending its Beta Testing Agreement to further delineate between Beta testers' use of Open Source code components and Corel-developed code. As planned, beta testers will receive the source code to both the Open Source components and Corel's modifications to them within the next week to 10 days, and are free to distribute these modifications in accordance with the terms of the applicable Open Source license agreement.

Corel will not be providing the source code to those portions of the Beta version of Corel LINUX that were independently developed by Corel without the use of Open Source software. As stated previously, once these features are deemed to be ready for general release, Corel will make the source code available under an Open Source license, the terms of which will be disclosed at a later date.

Corel remains committed to supporting the Open Source community, and has been working closely with the Linux community for the last few years. Corel has also contributed a significant amount of work to the open source Wine Project, which allows 32-bit Windows applications to run on Linux, and has worked to make Linux more accessible to mainstream users via Corel LINUX.

Do you know about Corel's involvement with Linux? Visit us at http://linux.corel.com

—Judith O'Brien, juditho@corel.ca


Article Suggestion

You print, from time to time, a comparison of Linux distributions. You regularly print full reviews of commercial products for Linux (hardware, software, books, etc.). You print mini-reviews in your 'Focus on Software' column, which seem to be mainly of development or connectivity tools, and in your 'New Products' column, which seem to be mainly of commercial products.

The point I am trying to make is that apart from a few commercial desktop/productivity products you do not mention desktop software for Linux.

Your magazine, which I have been reading for at least 5 years, seems to cater almost exclusively to information systems and development people. I believe that it *must* address the desktop/productivity users as well.

Microsoft DOS/Windows became widely used because it started off as a desktop/productivity system. It only entered the information systems realm because enterprise managers decided to save money by taking advantage of the already-trained user base.

Many potential Linux users are already using some sort of MS or Apple system. They have a *lot* of both money (to purchase) and time (to learn) invested in productivity software. They may have a significant amount of money invested in games for themselves or their children. They may also have money invested in educational software for their children. Many would like to switch to a more powerful and stable system.

The thing that is holding them back is *cost*. Yes, Linux is free (sort of) and they already have the machine. However, a machine with Linux loaded and running like a top doesn't help them do what they need done. They need software! Most of the people who want to switch to Linux cannot afford the cash to replace all their software (e.g., productivity, games, educational). Even if they can afford the cash, many of them have been using DOS/Windows or Apple/Macintosh applications for a long time and therefore have a large number of data files, which are more valuable than the cash cost of switching. They cannot, or do not want to, incur the *cost* (in terms of time) of re-creating their data files. Unless they can get software which (1) is *very* inexpensive, (2) can load their data files, and (3) can export data files which others who have not switched can read, they simply will *not* switch.

The main idea is that they want to keep the cost of switching, both in money and in time/effort, as low as possible. Once they have switched, slowly over time, they would be willing to spend money on software again.

—Milton Hubsher, milton@CAM.ORG


To Mock a Pronouncing Bird: Re Linux Journal article, September 1999, p.14

Hey, kids, let's change some of the C language! Yeah, sure, let's just decide to replace some words with our own words ... what?? What do you mean we can't do that?

I'm sure it's obvious to everyone that a few people can't simply change a portion of a widely agreed language or standard. What stuns me is how the computer community—so savvy about its own standards—widely ignores the existing standards of English grammar and pronunciation. Your September article surveying pronunciations of Linux recorded on the Web, while an interesting survey of contemporary pronunciation, demonstrates this common ignorance. But it did contain one valuable revelation.

English pronunciation has rules, just as any computer language does. Although imperfect, they are necessary to help avoid the language descending into chaos, and to help readers know how to pronounce words. In Standard English pronunciation the second vowel “u” makes the first vowel “i” long. Thus in the word “line” we do not say “linn-ee.” (Indeed, in this example, the “n” is doubled to indicate that the vowel is a short “i” pronounced “ih,” so if we were discussing “Linnux” the first vowel would be short.)

The article's revelation that Linus Torvalds was named after the American scientist Linus Pauling puts the final nail in the coffin for the “Linn-ucks” camp. Linus has always been pronounced “Line-us” because the name was absorbed into English and became pronounced according to the rules of standard English pronunciation. No one ever called Dr. Pauling “Linn-us.” Even in its original language, the name was NEVER pronounced with a short “i”, as “Lih.”

Readers should note that Linus Torvalds, Maddog and several others from the early days adhere to their own pronunciation which I have not seen noted elsewhere. As clearly heard at LinuxWorld during their speeches, they actually pronounce Linux as “Linnix,” not “Linnux.” This may result from the OS's origin in the OS Minix and the “ix” in Unix. (Note that we do not say “Oo-nix.”)

The manner in which this debate has proceeded is significant. At LinuxWorld, vendors were heard loudly mocking attendees who innocently used their English language background and correctly pronounced the OS as “Line-ucks.” Along with the author of your article, many seem to feel that pronunciation is simply “received,” meaning that the way they hear many individuals pronounce it makes it right. This is obviously not true. Although English is a living language, we are not freed from the duty of occasionally weeding the verbal garden.

A worse example is the common mispronunciation of the word “router” as “raowter.” The word comes from the French phrase “en route,” pronounced “on root.” Until the mid-1980s, the standard American and English pronunciation of this word was “root,” as evidenced by the pronunciation of the popular US television show “Route 66.” When Local Area Networks became common technology, introduced by companies populated with engineers from the American South and West, the origin of the word and its correct pronunciation were displaced by the Southwestern regional pronunciation, which has now infested even television shows such as Star Trek: Voyager (“Re-raowt power to the shields!”). To adopt this pronunciation is to deliberately bury the history of the phrase. (Clearly the show's producers do not realize how bizarre this pronunciation sounds outside the American Southwest.)

It is understandable that many individuals with a strong engineering and computer background are weak in English, just as many with a background in languages are weak in mathematics. One can sympathize with their plight, since standard English pronunciation is infrequently or poorly taught these days. But it is impossible to sympathize with the authors of many key documents in our industry, such as its foundation documents: these are full of spelling and grammatical errors, yet each came from an academic or commercial organization sufficiently large and prosperous to afford editing and proofreading services. Had they understood the errors of their linguistic ways, these scientists would surely have behaved differently.

So this is how the debate has proceeded—without background, investigation or scholarship. Your article concludes that the web survey of pronunciations is “inconclusive” and that the pronunciation “Linn-ucks” appears to be the winner. Winner? Correct pronunciation is no contest. At least those in the industry responsible for using the English language to discuss the languages of computers should accept their responsibility to make this clear.

It's “Line-ucks,” and you've helped prove it. :-)

—Malcolm Dean, malcolmdean@earthlink.net


LJ Photos

I've been a Linux Journal subscriber for a little under a year now, and first of all let me say that I always love the magazine; it's one of very few I will consistently read cover-to-cover, and the design is great: attention-grabbing covers and a clean, easy-to-navigate look throughout the articles. But I must complain about the photography. I've been a photographer longer than a Linux user, and it seems that a majority of the photos I see alongside the copy are nearly lifeless - the colors are unsaturated and the tonal range is flat. What prompted me to write this comment, though, was the cover of this month's issue - the figure is distractingly lit and the right side of his face is completely in shadow. The magazine deserves something better to mirror what's inside, and it needs something more professional if it is to compete with Cindy Crawford (or, perhaps, Lara Croft) beside it on the newsstand.

I realize that my own experience no doubt exaggerates what I notice, but in comparison to the excellent design surrounding the photography, I figured that someone needed to call it to the editorial staff's attention. Perhaps LJ needs to consider adding a dedicated photo editor to the staff. I for one, though I love the magazine the way it is, would love it exponentially more if the portraiture and incidental shots had the same crisp, clean look of the graphics and the colorful glow of the screen shots they accompany.

Please keep up the great work!

Regards, Nate
—Nathan P. Willis, npw@acm.acu.edu


Coming of Age

I agree with Mark H Runnels (LJ Sept 99, page 6, Letters) that if we want to make Linux popular with the masses, a distribution is needed that turns the computer into a Linux Appliance. I hope that Red Hat, Caldera and the other distributions that try to achieve this will succeed soon.

I do not agree with Mr. Runnels that this will be Coming of Age. An appliance produces the same result independent of the knowledge of the user. There is another use of computers, as an enhancement to your brain, where the result depends on the knowledge of the user. This is a very important use of computers.

Consider the new chip that finds the DNA of a sample. A person who is not able to spell DNA can, using an appliance with this chip, obtain the DNA of any number of samples. Such a person will not be able to correlate these DNAs with other characteristics of the samples. A person with knowledge of DNA, correlations, computers, etc. is needed to put such information onto a computer and find why one sample is from a person susceptible to flu, while another is from somebody immune. Linux Journal is full of articles of this type.

For this reason, I hope that Slackware and the other distributions that leave you free will prosper and continue to give us that choice. Notice that with the death of DOS in all its mutations, Linux is the only system that permits you to do with a computer what you want, the way you want. This is why I say “Linux makes you free.”

—Dr. Michael E. Valdez, mikevald@netins.net


Coming of Age

I largely agree with most of Mark Runnels' letter - I still run Windows myself, mainly because that's what we get given at work and I'm familiar with it. But if I had to recommend a PC to my mother, it would be an Apple iMac, not a Wintel machine, because it comes closest to computing's holy grail: “plug it in - it works”. I've just spent the best part of a week's spare time trying to get a new Wintel machine at home to work, so how any total beginner manages I don't know. Linux isn't yet for beginners, but it's robust enough that it could be if suitably packaged. A “plug it in - it works” Linux distribution is really what is needed to get the system into the mass market.

—Edward Barrow edward@cla.co.uk or edward@plato32.demon.co.uk


LJ Sept 1999

I have been getting LJ for quite a while now and look forward to every issue. Of course I never send a message when I am happy, just when I am not, so take this in the context of a usually very happy customer.

In the “Cooking with Linux” section in the Sept 1999 issue, you state that the articles in the section are all about “fun things we can do with Linux”. I was very eager to read the article on Multilink PPP, because it is something I have been wanting to get working on Linux for ages, without success. However, I was disappointed: the article had basically nothing to do with Linux; in fact, as far as I can remember, it did not even mention it. There were no follow-up references to the Internet where I could download drivers, or pointers to web sites with more information on the use of MLPPP with Linux.

It was a good article, but I don't know why it was included. If I want to read about different network protocols, which is an interesting topic, I can read general computer magazines. I hope LJ keeps its Linux focus and does not start to import irrelevant articles.

Well, there you have it; I feel better now. It's the only thing I have been unhappy about in all my issues, so you are doing something right. Also, thanks for the T-shirt I got when I went to Linux World Expo (I came all the way from Australia). Out of all the T-shirts I got, the LJ one with “Geek by nature - Linux by choice” is the one I like to wear the most.

Cheers

—Rodos, rodney@haywood.org


Linux in the Future

I have to say that reading the letter from Mark Runnels in issue 65 made me feel that at least I was not alone in my struggles. I have been trying for some months now to switch over from Windows 98 to Linux, and I have finally succeeded in getting a Linux system working on my computer, thanks to Mandrake and a 16-bit sound card I removed from an old 486 computer. I am now struggling to get the software I need installed, let alone working. So far I have given up on Siag (it just would not compile at all), on Star Office (due to the huge download required and no guarantee that it would install after that), and finally on KDE Office, because it requires Qt 2.0, which after hours and hours of trying I have not been able to install due to apparent syntax errors during the 'make'. I for one would be only too happy to buy a copy of an office system in a box that I could install without having to compile and hunt for missing files.

Linux in my opinion is an excellent system. I have found that the Internet facilities are second to none, and being able to use KDM for file transfer has been unbelievably reliable. I have also had only minor system failures, which could well have been my fault, but unlike Windows the recovery was painless. I feel now that it is time for the software writers to decide: are you doing this as a pure academic exercise, or do you want your software used? Packaged, ready-to-install software seems the most successful way of distributing for Linux, so how about you all coming to an agreement that this is not a bad system and supplying all software this way? Otherwise I really can't see a future for Linux as a system for the masses. While I consider myself a newbie with Linux, I have been involved with computers for 25 years now, having done a stint as a programmer myself, so I feel that if I am having problems there must be others who have tried and probably given up.

The final request I would like to make is to hardware manufacturers: would it really be so difficult or expensive to write drivers for Linux? Then we would be able to use all your fancy hardware, and we would look forward to new devices being released instead of having to use older hardware just to get things to work.

—Stephen Chilcott, f.chilcott@xtra.co.nz


Your column “Is KDE the Answer?”

Dear Phil,

I read your column “Is KDE the Answer?” in LJ 66 (Oct. 99) and was disappointed to find some major inaccuracies in it which I would not have expected in a publication that is so close to the Linux community.

You write:

Most distributions have adopted KDE as their default and, as far as I know, all distributions include KDE.

That is not correct. Debian currently does not include it in Debian GNU/Linux.

Debian proper (“main”) consists of 100% free software, as per the Debian Social Contract (http://www.debian.org/social_contract). In addition to “main”, we have “non-free” and “contrib” sections on our FTP servers; “contrib” consisting of free software that is dependent on non-free software (like KDE was with Qt 1).

At one time KDE was available from the Debian FTP sites in the “contrib” section. Discussion of the licensing issues regarding KDE resulted in its removal from the Debian FTP sites in October 1998. This was reported in a Debian news article (http://www.debian.org/News/1998/19981008) which also contains the analysis of the issue.

The glitch in KDE development was the non-open licensing of Qt.

That is only part of the issue. If it were just this, Debian could provide KDE binaries in the “contrib” section.

The real problem, which still exists with Qt 2.0, is in the interaction between KDE's license (the GPL) and Qt's, both the non-free Qt1 license and Qt 2's QPL.

For details on it, please see the news item. The core of it is that the GPL imposes requirements on the licensing of libraries a GPL-ed application depends on (with an exception for major system components like e.g. the C library on a proprietary Unix); Qt's license isn't compatible with these requirements.

The GPL doesn't impose restrictions on use, thus it is perfectly fine to download KDE source and compile it on one's own system. It does impose restrictions on redistribution of binaries, which are a problem in the KDE case. The Debian project honours software licenses, and has honoured KDE's GPL by ceasing to redistribute KDE binaries.

[...] With Qt 2.0, it is a non-issue. The Qt people saw the problem and created a new license that addresses it.

It addresses only Qt's freedom issue. The QPL (http://www.troll.no/qpl/) under which Qt 2 is released is indeed a free license according to the Debian Free Software Guidelines (http://www.debian.org/social_contract#guidelines), for which Troll Tech is to be applauded. Debian's Joseph Carter <knghtbrd@debian.org> has provided extensive input during the development of the QPL which has made it a better license from the free software perspective. Joseph also explained the GPL interaction issue to Troll Tech. Regrettably, Troll Tech have chosen not to make the QPL a GPL-compatible license. Thus, although Qt is now free, Debian still can't redistribute KDE binaries.

Reportedly, the KDE project have now recognised the issue and are switching away from Qt 1 to Qt 2 and from the GPL to a different, Artistic-like, license that does not have bad interaction with the QPL. Once this is done, KDE can be included in Debian proper (“main”).

Most distributions have adopted KDE as their default

Debian is about free software and choice. We don't really have a default window manager, and I suspect that in similar vein we won't have a default desktop environment.

In the meantime, KDE packages in .deb format have been made available outside the Debian project, both by KDE and by the KDE Linux Packaging Project (http://kde.tdyc.com).

I'd appreciate it if you could correct these inaccuracies in a follow-up.

Regards,

—Ray, J.H.M. Dassen, jdassen@wi.LeidenUniv.nl


“Floppy Formatting” Letter in Oct '99 issue of LJ.

I read that letter with interest and was surprised your editorial staff did not catch the writer's last suggestion, namely:

    chmod 777 /usr/bin/fdformat
    reboot
reboot? Reboot? What for?
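
For the record, permission changes take effect the instant chmod returns, as a quick check shows; no reboot enters into it:

    chmod 777 /usr/bin/fdformat   # new mode is in effect immediately
    ls -l /usr/bin/fdformat       # shows rwxrwxrwx right away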

—Glenn G. D'mello, glenn@zaphod.dmello.org


Film Stuff

I just got my latest issue of Linux Journal, and I was very interested in the story on the Houdini port. I hope you'll continue to run stuff like that.

It seems to me that one of the big Linux stories that you guys are covering pretty much on your own is the growth of Linux in the film industry. I remember reading about the Alpha farm they used to render Titanic in your magazine as well.

I'd really like to see a broader view of the topic, an article that seeks out people in the film industry who are using Linux and tries to take the measure of the trend. Something that would talk to the Alias and Houdini people and the guy who worked on the Alpha, and whoever else is working with Linux. Why are they using it? How do the ports work? When you use Linux farms to render frames in movies, do you write the code from scratch? Who does that stuff? Did they have trouble selling it to their bosses? Do they use Linux at ILM?

I know you're looking for writers, but even if you're interested in it, I wouldn't know how to proceed, so I don't know if it's realistic. I don't know anyone in the film industry, and I wouldn't know who to call. But if you're interested and can put me in touch with people who could open some doors, I'd love to take a shot at it.

In any event, I hope you'll keep up the excellent work. I enjoy your magazine very much.

—alex, alex@crawfish.suba.com


Letters

I have just received LJ No. 66 (Oct 99) and read Mr Hughes's article about KDE. I was not pleased with what I read. Specifically, he has the arrogance to presume, first, that only the GNOME programmers can fix KDE and, second, that it is a bug that KDE defaults to a European paper size.

The last point was particularly irksome as for years we Europeans (can someone who is British say that?) have had to put up with incorrect spellings and the wrong date format, to name but two “bugs”.

It is just this kind of americocentric view that has led me to decide not to renew my subscription to LJ.

—AFL, A.F.Lack@city.ac.uk


Copy Editing

It's ironic that Harvey Friedman (Oct. LJ, pp. 116-117) complained that “maddog” Hall needs a better copy editor, when someone at your establishment allowed him to say “... after first determining whether the computer is comprised of hardware on which Linux can run.”

This misuse of “comprise” (when “compose” is meant) is a solecism that I find particularly irritating. If you have a list of troublesome words and phrases that you grep for in each incoming manuscript, please add “comprise” to it.

—Andrew Young, aty@sciences.sdsu.edu


About “Is KDE the Answer?”

Dear Mr Hughes,

I am probably about the 999th person to protest to you about your editorial “Is KDE the Answer?” in the most recent LJ. But I hope I can get your ear nonetheless.

You start the editorial with two questions: “Is looking like MS Windows good or bad?” and “Is KDE a better answer than GNOME?”

You spend half the editorial arriving at an answer to the first question with which I agree: “It depends.” Certainly some people, fresh in out of the wilderness, will like to have a Windows-like interface, for a while anyway, whereas others will want to put childhood behind them. Fortunately, everyone can have what suits them.

But of course it is the second question that is bothersome. It reminds me of a similar question: “Is emacs better than vi?” Haven't we all learned that this kind of question is silly and irritating?

Don't you value competition? Don't you see the benefits that ensue? To name just one example: would Troll Tech have modified their license for Qt if they, and the KDE community, hadn't felt GNOME's breath at their backs?

To try to pick a “winner” between KDE and GNOME is like trying to pick a winner between emacs and vi. It is IMHO silly and, worse, unproductive. Let's have (at least) two wonderful editors, and let's have (at least) two wonderful windowing environments. That is the way for Linux and Open Source to make progress. That is what will benefit us consumers the most.

For the record, I am a charter subscriber to LJ, which I love, and: my window manager of choice is fvwm.

Best wishes,

—Alan McConnell, alan17@wizard.net


A Little Disappointed

Although you publish very nice articles about Linux-related issues, I must say that I am a bit disappointed by your journal lately. Over the years, the number of articles about free software, such as fancy graphical tools, libraries, interpreters and compilers, and about Linux internals has steadily declined, while the number of articles about commercial applications on Linux has steadily increased. In the last issue I had to search for information between the advertisements and product reviews. It may be good business for your journal, but I am not interested in this commercial stuff and I do not want to pay for reading about it. I will start looking for another source of information.

Sincerely,
—Fulko van Westrenen, F.C.v.Westrenen@tue.nl


Letter re. October 1999 Issue

I read with interest the article about 'dungeon games', more commonly called 'roguelikes'. While the article is fairly accurate for a one-page potted summary, it seems odd that it doesn't mention that ADOM - unlike the other games - is not free software; the source code is not even available for inspection, let alone modification. Linux Journal is, after all, a magazine about a free software operating system.

In my teens, I was lucky enough to use a Sun 3/60 with a copy of Hack on; not just a compiled binary, but also the source code. Not only was there this fascinating game to play, but you could change the way it worked, provided you could work with this thing called UNIX. I was hooked; and (while clearly it's his choice) I think Thomas Biskup is doing the roguelike community (especially those who use platforms to which ADOM has not been ported) a grave disservice by keeping his source code closed.

I'd also like to take issue with Phil Hughes' call to support KDE. There are two very real problems with having to pay to develop proprietary applications that use Qt. One is a moral one: if KDE becomes the de facto GNU/Linux desktop, people will have to pay Troll Tech when what they want is to develop for GNU/Linux (but not necessarily Qt) - and I don't see why Troll should get that money when it is actually being paid in order to work with an OS which is almost all the work of people other than Troll. The second is a more practical one: GNU/Linux will be more of a success if we preserve the current situation, where it costs nothing to develop for it. I have no problem with proprietary applications competing fairly - if Corel or Applix can write such a good word processor that I would rather pay $50 for it than get StarOffice or AbiWord for free, good for them - but a charge of $1,000 per developer is not the way to attract those applications.

—David Damerell, damerell@chiark.greenend.org.uk


KDE vs Gnome

I certainly don't want to get into the middle of declaring that KDE is better or worse than Gnome, particularly since I prefer WindowMaker over either one. The truth is, WindowMaker, in my opinion, is the least of the evils.

I'm probably older than Phil Hughes, and with over a million lines of C and assembler behind me, I'm not an easy convert to point-and-click. However, my observations may be of value.

1. We shouldn't endorse any particular program for the sake of providing a unified look for novices. If they want mediocrity instead of programs that have evolved because of their usefulness, then there's another OS which has plenty of what they're looking for.

2. Incorporating `applications' inside of window managers is the kind of monolithic failure that is too evident in the `other' OS. That is, if I have to run a specific window manager to use a file manager, I'll probably do what I do now - command line it or write a script.

3. Focusing on glitz, with its code bloat and execution crawl, may catch a few eyes, but it will never replace solid functionality - anyone remember the early versions of LapLink? It looked like the text application it was, but worked with great ease and was lightning fast.

4. A window manager that requires one to learn how it works, instead of providing a learning experience about how computers, programmers, and users interact, is like giving people food instead of teaching them to grow their own.

So is it a good idea to all jump on KDE? I think not, even if a clear case could be made for its immediate superiority. I think that we need to stay the course—Linux isn't just about presentation with some directed and lasting appearance. Linux is an ongoing evolution. Each can choose to use those parts of it which are consistent with one's ability and experience.

In the meantime, I'll probably keep using a clutter of xterms until a `real' window manager appears—nxterm -T `pwd` &.

Sincerely,

—David Smead, smead@amplepower.com


Comments on October 1999 Issue

A few comments on a few of the articles in the October issue:

The KDE editorial: very well written, especially as many of us can attest to the concept of “more sophisticated idiots”. However, I disagree with the point that seems to be made, that the KDE and GNOME groups should pool their efforts to create a unified environment. Although a dedicated KDE user, I like Linux for the choice it offers me. I can choose KDE or GNOME, depending on which I prefer. I usually introduce newbies to KDE first, as I feel it makes the transition from Microsoft a bit easier.

The article on writing CD-ROMs: I wonder why there was no mention of the -J option when running mkisofs. The Joliet extensions to ISO9660 are an alternative to the Rock Ridge extensions (the -r option). Although they were a Microsoft creation, they are still important, depending on how the created CD will be used. If the only purpose is backing up a Linux machine, Rock Ridge is better; however, if the CD will be shared among multiple machines, Joliet should be looked at closely.
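
For reference, the two sets of extensions can coexist on a single disc. A minimal sketch - the paths, image name and cdrecord device/speed below are placeholders for your own setup:

    # build an image carrying both Rock Ridge (-r) and Joliet (-J) extensions
    mkisofs -r -J -o backup.iso /home/jason/data
    # burn it; find your burner's bus,target,lun with cdrecord -scanbus
    cdrecord -v speed=2 dev=0,4,0 backup.iso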

However, excellent issue... I usually find at least a few articles in each issue that interest me (and I read all of them anyhow), but this issue is so far the best I have received.

—Jason Ferguson, jferguson3@home.com


KDE and GNOME

I'm just a home office desktop Linux user. I use Linux in my work, to maintain my church's Web site, and for other personal stuff (including playing games). I'm not a programming wizard and don't make my living in the world of bytes and cyberspace. But I really love Linux and have been using it since I installed Slackware back when the kernel version was around 1.0.6. (Winter of 1994-95, I *think* it was.)

But I don't get it. What is the big push to settle on one desktop?

Frankly, I'm not too worried about it, because anyone who's ever used Enlightenment (and, from what I've been able to glean, there are lots of us who have) is never going to be happy with KDE. But I do not like being told by the “powers that be” (including the publisher of my favorite Linux rag) that I'm some sort of turncoat for not understanding the absolute necessity, for the continued health of the Linux movement, of shutting down all GNOME development this very afternoon.

I've used KDE a lot. I prefer (and use) many of the KDE applications, but, as a desktop, it's mostly boring. And anyway, if I wanted “powers that be” to make such decisions for me, I'd be running some Microsoft offering.

—Maury Merkin, merkin@his.com


Chez Marcel

Chez Marcel's article (Linux Journal, Sept. 1999) on French cooking inspired me to share some recipes of my own. The cooking metaphor is not new to computing; Donald Knuth, in his foreword to “Fundamental Algorithms”, confesses he almost used “Recipes for the Computer” as its title. Without stirring the metaphor too vigorously, Marcel's article gives me the opportunity to share two items of interest and give them the needed cooking flavor.

For some time, I've been concerned about what I regard as the overuse or misuse of two programming constructs: temporary files (and the variables that name them), and nested decisions and loops in the flow of control.

To continue the cooking analogy, these two may be thought of, respectively, as inconsistent or lumpy sauce, and uneven temperature. Realizing that we chefs like to work on one another's recipes, let's see what happens when we apply these ideas to Chez Marcel's recipe, “Check User Mail”.

Before I'd read Marcel's article, my style of programming used the tool metaphor. While not much of a chef, I now prefer the cooking metaphor, as it connotes more of a learning and sharing model, which is what we do in programming.

Marcel's recipe is an excellent starting point for my school of cooking, as his recipe is complete all by itself and offers the opportunity to visit each of the points once. First, here is a copy of his recipe, without the comment header.

for user_name in `cat /usr/local/etc/mail_notify`
do
   no_messages=`frm $user_name |
      grep -v "Mail System Internal Data" |
      wc -l`
   if [ "$no_messages" -gt "0" ]
   then
      echo "You have $no_messages e-mail message(s) waiting." > /tmp/$user_name.msg
      echo " " >> /tmp/$user_name.msg
      echo "Please start your e-mail client to collect mail." >> /tmp/$user_name.msg
      /usr/bin/smbclient -M $user_name < /tmp/$user_name.msg
   fi
done
This script isn't hard to maintain or understand, but I think the chefs in the audience will profit from the seasonings I offer here.

A by-product of my cooking school is lots of short functions. There are those who are skeptical about adopting this approach. Let's suspend disbelief for just a moment as we go through the method. I'll introduce my seasonings one at a time, and then put Chez Marcel's recipe back together at the end. Then you may judge the sauce.

One of the languages in my schooling was Pascal, which, if you recall, puts the main procedure last. So, I've learned to read scripts backwards, as that's usually where the action is anyway. In Chez Marcel's script, we come to the point in the last line, where he sends the message to each client. (I don't know Samba, but I assume this will make a suitable function):

   function to_samba { /usr/bin/smbclient -M $1; }
This presumes smbclient reads from its standard input without another flag or argument. It is used as “to_samba {user_name}”, reading the standard input and writing to the Samba client.

And, what are we going to send the user, but a message indicating they have new mail. That function looks like this:

   function you_have_mail {
      echo "You have $1 e-mail message(s) waiting."
      echo " "
      echo "Please start your e-mail client to collect mail."
   }
and it is used as: you_have_mail {num_messages}, writing the message on the standard output.

At this point, you've noticed a couple of things. The file names and the redirection of output and input are missing. We'll use them if we need them. But let me give you a little hint: we won't. UNIX (and Linux) was designed with the principle that recipes are best made from simple ingredients. Temporary files are OK, but Linux has other means to reduce your reliance on them. Introducing temporary files does a few things: it forces you to invent names for them, it adds steps to create, write, read and finally remove each file, and it leaves litter behind when a script fails partway through.

Therefore, we seek to save ourselves these tasks. We'll see how this happens in a bit.

A key piece of the recipe is deciding whether or not our user needs to be alerted to incoming mail. Let's take care of that now:

   function num_msg { frm $1 | but_not "Mail System Internal Data" | linecount; }
This is almost identical to Marcel's code fragment; we'll deal with the differences later. The curious among you have already guessed: this function is used as num_msg {user_name}, returning a count of the number of lines.

What does the final recipe look like? All of Chez Marcel's recipe is wrapped up in this one line of shell program:

   foreach user_notify  `cat /usr/local/etc/mail_notify`
And that's exactly how it's used. This single line is the entire program, supported, of course, by the functions, or recipe fragments, we have been building. We peeked ahead, breaking with Pascal tradition, because, after looking at some low-level ingredients, I thought it important to see where we are going at this point. You can see the value of a single-line program: it can now be moved around in your operations plan at will. You may serve your users with the frequency and taste they demand. Note that, at this point, you won't have much code to change if you wanted to serve your A-M diners at 10-minute intervals beginning at 5 after the hour and your N-Z diners on the 10-minute marks.

So what does “user_notify” look like? I toiled with this one. Let me share the trials. First I did this:

   function user_notify { do_notify $(num_msg $1) $1; }
thinking that if I first calculated the number of messages for the user and supplied that number and the user name to the function, then that function (do_notify) could perform the decision and send the message. Before going further, we have to digress. In the Korn shell, which I use exclusively, the result of the operation in the expression $( ... ) is returned to the command line. So, in our case, the result of “num_msg {user_name}” is a number, 0 through some positive number, indicating the number of mail messages the user has waiting.
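
For instance (“marcel” here is just a stand-in user name):

   count=$(num_msg marcel)   # count now holds marcel's waiting-message total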

This version of user_notify would expect a “do_notify” to look like this:

   function do_notify { if [ "$1" -gt "0" ]; then notify_user $2 $1; fi; }
This is OK, but it means yet another “notify” function, and even for this one-line fanatic, that's a bit much. So, what to do? Observe: the only useful piece of information in this function is another function name, “notify_user”. This is where culinary art, inspiration and experience come in. Let's try a function which looks like this:
   function foo { if [ "$X" -gt "0" ]; then eval $*; fi; }
This is different from the “do_notify” we currently have. First of all, $X is not an actual shell variable; here the X stands for “let's see which argument number is best to use for the numeric test”. And the “eval $*” performs an evaluation of all its arguments. And here's the spice that gives this whole recipe its flavor! The first argument may be another command or function name! A remarkable, and little-used, property of the shell is the ability to pass command names as arguments.

So, let's give “foo” a name. What does it do? If one of its arguments is non-zero, it performs a function (its first argument) on all the other arguments. Let's solve for X. It could be any of the positional parameters, but to be completely general, it probably should be the next one, as it's the only other one this function ever has to know about. So, let's call this thing:

   if_non_zero {function} {number} ....
Using another convenient shorthand, it all becomes:
   function if_non_zero { [ $2 -gt 0 ] && eval $*; }
and we'll see how it's used later. With this function, user_notify now looks like:
   function user_notify { if_non_zero do_notify $(num_msg $1) $1; }
and is used: user_notify {user_name}. Note the dual use of the first argument, which is the user's name. In one case, it is a further argument to the num_msg function, which returns the number for that user; in the other case, it merely stands for itself, but now as the second argument to “do_notify”. So, what does “do_notify” look like? We've already written the sub-pieces, so it's simply:
   function do_notify { you_have_mail $1 | to_samba $2; }
At this point, we have (almost) all the recipe ingredients. The careful reader has noted the omission of “but_not”, “linecount”, and “foreach”. Permit me another gastronomic aside. Ruth Reichl, until recently food editor of the New York Times, is now the editor of Gourmet magazine. One of the things she promises to do is correct the complicated recipes so frequently seen in its pages. For example, “use 1/4 cup lemon juice” will replace the paragraph of instructions on how to extract that juice from a lemon.

In that spirit, I'll let you readers write your own “but_not” and “linecount”. Let me show you the “foreach” you can use:

      function foreach { cmd=$1; shift; for arg in $*; do eval $cmd $arg; done; }
A slightly more elegant version avoids the temporary variable name:
      function foreach { for a in $(shifted $*); do eval $1 $a; done; }
which requires “shifted”:
      function shifted { shift; echo $*; }
The former “foreach”, to be completely secure, needs a “typeset” qualifier in front of the “cmd” variable. That's another reason to avoid the use of named variables. This comes under the general rule that not every programming feature needs to be used.
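
Speaking of writing your own, here is one possible reading of “but_not” and “linecount”, matching the way “num_msg” uses them - filter away the matching lines, then count what remains:

      function but_not { grep -v "$*"; }
      function linecount { wc -l; }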

We need one final “Chapters of the Cookbook” instruction before putting this recipe back together. Let's imagine by now that we are practicing student chefs and we have a little repertoire of our own. So what's an easy way to re-use those cooking tricks of the past? In the programming sense, we put them in a function library and invoke the library in our scripts. In this case, let's assume we have “foreach”, “but_not”, and “linecount” in the cookbook. Put that file, “cookbook”, either in the current directory or, more usefully, somewhere along your PATH. Using Chez Marcel's example, we might expect to put it in, say, /usr/local/recipe/cookbook, so you might do this in your environment:

   PATH=$PATH:/usr/local/recipe
and then, in your shell files, or recipes, you would have a line like this:
    . cookbook    #  "dot - cookbook"
where the “dot” reads, or “sources” the contents of the cookbook file into the current shell. So, let's put it together:
# -- Mail Notification, Marty McGowan, mcfly@workmail.com, 9/9/99
#
  . cookbook
# -------------------------------------------- General Purpose --
function if_non_zero { [ $2 -gt 0 ] && eval $*; }
function to_samba { /usr/bin/smbclient -M $1; }
# --------------------------------------- Application Specific --
function num_msg  { frm $1 | but_not "Mail System Internal Data" | linecount; }
function you_have_mail  {
   echo "You have $1 e-mail message(s) waiting."
   echo " "
   echo "Please start your e-mail client to collect mail."
}
function do_notify   { you_have_mail $1 | to_samba $2; }
function user_notify { if_non_zero do_notify $(num_msg $1) $1; }
#
# ------------------------------------------ Mail Notification --
#
  foreach user_notify  `cat /usr/local/etc/mail_notify`
In closing, there are a few things that suggest themselves here. “if_non_zero” probably belongs in the cookbook; it may already be in mine. And also “to_samba”. Where does that go? I keep one master cookbook for little recipes that may be used in any type of cooking. Also, I keep specialty cookbooks for each style that needs its own repertoire. So, I may have a Samba cookbook as well. After I've done some cooking, I might, in a new recipe, find the need for some fragment I've used before. Hopefully, it's in the cookbook. If it's not there, I ask myself, “is this little bit ready for wider use?” If so, I put it in the cookbook, or, after a while, other little fragments might find their way into the specialty books. So, in the not-too-distant future, I might have a file called “samba_recipes”, which starts out like:
# --------------- Samba Recipes, uses the Cookbook, Adds SAMBA --
. cookbook
# -------------------------------------------- General Purpose --
function to_samba { /usr/bin/smbclient -M $1; }
This leads to a recipe with three fewer lines, with “. cookbook” replaced by “. samba_recipes” at the start.

Let me say just two things about style: my functions either fit on one line or they don't. If they do, each phrase needs to be separated by a semi-colon (;); if not, a newline is sufficient. My multi-line functions close with a curly brace on their own line. Also, my comments are “right-justified”, with two trailing dashes. Find your style, and stick to it.

In conclusion, note how we've eliminated temporary files and variables; nor are there nested decisions or flow control. How was this achieved? Each of these is now an “atomic” action. The one decision in this recipe, “does Marcel have any mail now?”, has been encapsulated in the “if_non_zero” function, which is supplied the result of the “num_msg” query. Also, the looping construct has been folded into the “foreach” function. This one function has simplified my recipes greatly. (I've also found it necessary to write a “foreachi” function, which passes a single argument to each function executed.)
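
A sketch of the shape “foreachi” might take - where “foreach” applies one command to a list of arguments, “foreachi” applies a list of commands to one argument:

   function foreachi { arg=$1; shift; for cmd in $*; do eval $cmd $arg; done; }

(As with “foreach”, a “typeset” on the variables would make it more secure.)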

The temporary files disappeared into the pipe, which was Unix's (and Linux's) single greatest invention. The idea that one program might read its input from the output of another was not widely understood when Unix was invented. And the temporary names disappeared into the shell functions' arguments. The shell function, which is very well defined in the Korn shell, adds greatly to this simplification.

To debug in this style, I've found it practical to add two things to a function to tell me what's going on in the oven. For example:

   function do_notify   { comment do_notify $*
       you_have_mail $1 | tee do_notify.$$ |  to_samba $2
       }
where “comment” looks like:
      function comment { echo $* 1>&2; }
Hopefully, the chefs in the audience will find use for these approaches in their recipes. I'll admit this style is not the easiest to adopt, but soon it will yield recipes of more even consistency, both in taste and temperature, and a programming style that will expand each chef's culinary art.

—Marty McGowan, mcfly@workmail.com
