
And Then There Was One…

First, a correction. While developing search engines for the TidBITS Search Engine Shootout, some entrants sent more than one URL as they changed configurations, or temporarily used different servers as test machines. The URL we gave last week for Glen Stewart’s WarpSearch entry was such a temporary location, set up only for the duration of the Shootout. You can check out WarpSearch reliably at the following URL:

<http://associate.com/innovative/Glen_Stewart/About_WarpSearch.html>

Last week in TidBITS-379 we introduced you to all the entrants and promised we’d make a decision this week. It hasn’t been easy. Of our 11 entrants, all of whom submitted excellent entries, four stood out.

The Criteria — We had hoped that one of the entries would obviously rise to the top, but we had no such luck. So, we came up with some refined criteria for comparing our top four entries. These criteria are:


  • Ease of use for the end user
  • Searching power for the end user
  • Ease of setup and maintenance for us
  • Searching speed
  • Setup cost
  • "Hit by a bus" survivability (I’ll explain this later)
  • Overall accuracy of results

We are aware that some of these entries rely on software that hasn’t yet shipped in final form, so we’re sure some of the comments below can easily be addressed by the developers. We’ve tried to take that flexibility into account, but overall, we judged what we saw.

Also keep in mind that we didn’t evaluate these search engines for which is generically the best. We instead chose which would be the best solution for TidBITS. That’s likely to be different from anything you may want a search engine to do, so if you want to build your own Mac OS-based search engine, you should investigate these technologies more closely (and check last week’s article for others that might suit your purposes).

Ease of Use — Obviously, a search engine should be as easy to use as possible, because otherwise people will avoid it. This criterion is often at odds with the next one, which rates searching power, since the more options a search engine offers, the more complex its interface and results list inevitably become. Ethan’s Phantom-based entry has more options on its main search page than the rest, lowering its ease of use slightly. Some of us like AltaVista’s interface, and familiarity on the Web is a good thing, so Ole and David’s Frontier and FileMaker entry gets points for providing both simple and advanced search forms. Curt’s Apple e.g. entry and Scott’s WebServer 4D entry have dead simple interfaces, which is good.

All our entrants provide results at the article level (and Ethan gets extra points for breaking out MailBITS separately), although Curt links to the article within the full issue rather than breaking the articles out as individual files. Curt’s technique forces people to download a full issue each time but provides context around the article in question and makes it easy to scan other articles in the same issue. Ole and David straddle the fence by breaking the articles out and also pointing into the full issue on our Web site, which is good for an independent search engine, but less important for something we’d run ourselves.

A final part of the ease of use criterion is the results page. The results should be attractive, easy to scan quickly, and sorted well. Ole and David score points for their homage to AltaVista but display results newest first, whereas Ethan and Curt both take advantage of relevance sorting. Ethan’s results list unfortunately includes the text from the navigation bar in the summary text, but that’s probably easily rectified. Scott’s results page sorts chronologically (relevance sorting is slated for a later release) and uses a simple table with the issue number and article title, but no summary text, which makes it more difficult to determine which article you might want. I suspect that’s fixable.

Both Ethan and Curt include a field for a new search in the results list, and Ethan puts the search terms in the field. Apple e.g.’s option to find similar documents is more flexible than Phantom’s, since you can select multiple articles by clicking multiple More checkboxes, whereas you can only find documents similar to a single hit in Phantom’s results list.

Although we’re splitting hairs here, since all four are easy to use, we give the ease of use award to Curt Stevens and Apple e.g. for the combination of a simple interface and a clear and attractive results list.

Ease of Use: Curt Stevens and Apple e.g.



Searching Power — Sometimes you want to find information that’s not easily identified with a word or two. For that, you need additional flexibility and power in the search engine. You may know roughly when an article was published, or you may know how a word starts or how it sounds but not how to spell it properly. Ethan’s Phantom-based entry wins hands down when it comes to searching power, which is the trade-off for losing a bit on simplicity of interface. Phantom provides Boolean searching, phonetic searching, word stemming, searching within certain HTML tags, and some level of date range searching. Ole and David’s Frontier/FileMaker entry offers an advanced search that provides Boolean searching, title searches, issue number searches, and date range searches, which are quite useful. Curt’s Apple e.g. solution and Scott’s WebServer 4D entry offer little in the way of this sort of flexibility, although you can throw parts of dates (like the last two digits of the year) into the search string to improve granularity.
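We can’t see inside any of these products, but to make the power features concrete, here’s a minimal sketch in Python (the names and the crude stemming rules are our own illustration, not any entrant’s actual code) of how Boolean AND searching and word stemming might combine over an inverted index:

```python
from collections import defaultdict

def stem(word):
    """Very naive suffix-stripping stemmer, standing in for real
    stemming (e.g., the Porter algorithm)."""
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def build_index(articles):
    """Map each stemmed word to the set of article IDs containing it."""
    index = defaultdict(set)
    for art_id, text in articles.items():
        for word in text.lower().split():
            index[stem(word)].add(art_id)
    return index

def boolean_and(index, query):
    """Return the article IDs matching every (stemmed) query term."""
    terms = [stem(w) for w in query.lower().split()]
    if not terms:
        return set()
    hits = index[terms[0]].copy()
    for term in terms[1:]:
        hits &= index[term]
    return hits

articles = {
    379: "a search engine shootout with boolean searches",
    380: "searching TidBITS issues by date range",
}
index = build_index(articles)
print(boolean_and(index, "boolean searching"))  # {379}
```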

The capability to find similar documents is useful for narrowing searches. It’s provided by both Ethan and Curt via Phantom and Apple e.g., and both seem to do a good job at it. Overall, we found that Apple e.g. had a better interface for finding similar documents, but it’s not enough to compete with Phantom’s searching flexibility.
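Neither product documents exactly how it finds similar documents, but a common approach (this is only a hedged sketch, not either product’s actual method) is to treat each article as a bag of words and rank the rest by cosine similarity to the selected seed articles:

```python
import math
from collections import Counter

def vectorize(text):
    """Bag-of-words term frequencies for one article."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def find_similar(articles, seed_ids):
    """Rank the remaining articles by total similarity to the selected
    seed articles (mimicking checking several More checkboxes)."""
    seeds = [vectorize(articles[i]) for i in seed_ids]
    scores = {}
    for art_id, text in articles.items():
        if art_id in seed_ids:
            continue
        scores[art_id] = sum(cosine(s, vectorize(text)) for s in seeds)
    return sorted(scores, key=scores.get, reverse=True)
```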

Searching Power: Ethan Benatan, Frontier and Phantom



Ease of Setup and Maintenance — This category is difficult to judge, because we neither set up nor attempted to administer all of the contest entries. However, based on what we know of the tools involved and what we know of our existing tools, we can make some assumptions.

Ole and David and Ethan use Frontier to suck in new TidBITS issues, parse them into articles (and MailBITS, in Ethan’s case), and then turn them over to the database engine (FileMaker Pro and Phantom, respectively). Ole and David also use Frontier as the CGI to communicate between the Web server and FileMaker Pro, whereas Phantom acts as both the indexer and the Web server. Using Frontier offers significant flexibility, but may suffer in ease of setup – scripting solutions seldom have well-designed graphical interfaces. Similarly, although the flexibility is there, changes require programming, and although both Geoff Duncan and Matt Neuburg are capable of that, the rest of us at TidBITS aren’t. Since we’re small, we try to maintain overlapping skill sets so anyone can step in for anyone else if necessary.
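We don’t have the entrants’ Frontier scripts, but the general shape of the pipeline (split an issue into articles, then hand each one to the indexer) might look something like this Python sketch; the title-underlined-with-hyphens delimiter and the index_article callback are assumptions, not the real parsing rules:

```python
import re

# Assumed delimiter: an article title underlined with hyphens. The real
# TidBITS issue format and the entrants' Frontier scripts may differ.
ARTICLE_RE = re.compile(r"^(?P<title>[^\n]+)\n-{3,}$", re.MULTILINE)

def split_issue(issue_text):
    """Split one plain-text issue into (title, body) pairs."""
    matches = list(ARTICLE_RE.finditer(issue_text))
    articles = []
    for i, m in enumerate(matches):
        start = m.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(issue_text)
        articles.append((m.group("title").strip(), issue_text[start:end].strip()))
    return articles

def index_issue(issue_text, index_article):
    """Hand each parsed article to an indexer callback, which stands in
    here for FileMaker Pro or Phantom."""
    for title, body in split_issue(issue_text):
        index_article(title, body)
```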

Scott and Curt both look in a drop folder for new issues of TidBITS to index, which is an ideal solution for us, because it’s easy to modify our existing distribution automation to put a copy of each issue in a folder. Curt’s Apple e.g. entry is probably the best here, since we believe we can point it at our existing folder of TidBITS issues, whereas Scott’s WebServer 4D entry currently deletes the original from the drop folder after importing it. We’re sure that’s an easy thing to change if necessary.
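As a rough illustration of the drop-folder approach (a sketch with an assumed folder name and a hypothetical index_file hook, not Curt’s or Scott’s actual code), a simple poller need only notice new files and index them, leaving the originals in place:

```python
import os
import time

DROP_FOLDER = "TidBITS Issues"  # assumed folder name
POLL_SECONDS = 60

def watch(index_file):
    """Index each file that appears in the drop folder, leaving the
    original in place (unlike the current delete-after-import step)."""
    seen = set()
    while True:
        for name in os.listdir(DROP_FOLDER):
            path = os.path.join(DROP_FOLDER, name)
            if name not in seen and os.path.isfile(path):
                index_file(path)  # hypothetical indexer hook
                seen.add(name)
        time.sleep(POLL_SECONDS)
```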

Ease of Setup and Maintenance: Curt Stevens and Apple e.g.



Speed — Overall, we didn’t notice that any of the entries were particularly slow, and speed wouldn’t have entered our consciousness in a big way if it hadn’t been for Scott Ribe’s WebServer 4D entry. Everyone else seemed roughly similar (and since there are lots of variables in how fast something works on the Web, we ignored occasional differences), but Scott’s entry was blindingly fast, so much so that I ended up using it a few times in the last few weeks because I knew it would be the quickest to send results back. There’s not much else to say about this criterion, but wow!

Speed: Scott Ribe and WebServer 4D



Cost — Again, it’s difficult to estimate the cost of setting up one of these search engines since we already have some of the necessary equipment and software. For those of you interested in setting up a similar server from scratch, we’ll rough out the costs as we understand them.


  • Scott’s entry requires the $295 WebServer 4D from MDG, and he said that he hopes to sell the custom text indexing extension he’s writing for this purpose for somewhere in the $100 to $200 range. It achieves its blinding speed on a Quadra 800 with a PPC upgrade card, which is about as slow as Power Macs get, so CPU power isn’t much of an issue, nor is disk space or speed. RAM is useful though, and Scott recommends a system with 48 MB.

<http://www.mdg.com/>


  • Ethan’s entry uses Maxum’s Phantom running in stand-alone mode, so it doesn’t even require an additional Web server. Phantom is the major cost at $395, although Ethan’s setup also uses the free Frontier and the free Eudora Light (for reports). Currently, Ethan’s entry runs on a 32 MB PowerBase 180 from Power Computing.

<http://www.maxum.com/Phantom/>

<http://www.scripting.com/Frontier/>


  • Ole and David’s entry uses the free Frontier, Chris Hawk’s free Quid Pro Quo as the Web server, and Claris’s FileMaker Pro, which costs roughly $200. To avoid buying FileMaker Pro, Ole and David say that you could use their Frontier suite with other databases. Ole and David’s entry was hosted on two separate machines; the main one we pointed at turned out to use a 68040 and 20 MB of RAM, so hardware shouldn’t be a problem for their solution.

<http://www.scripting.com/Frontier/>

<http://www.socialeng.com/>

<http://www.claris.com/products/claris/filemakerpro/filemakerpro.html>


  • Curt’s entry uses Apple e.g., which is free, although it does require a Web server such as StarNine’s WebSTAR, which we use, or the free Quid Pro Quo. It’s running on an Apple Workgroup Server 8150/110 with 40 MB (10 MB for Apple e.g.). That’s a 100 MHz PowerPC 601 – not a particularly fast machine. The bottom line comes down to the fact that if you have a Power Mac, you wouldn’t have to spend any money to get Apple e.g. up and running.

<http://cybertech.apple.com/apple_eg.html>

Cost: Curt Stevens and Apple e.g.



Hit by a Bus — As I noted before, TidBITS is a small organization, and as with any small organization, we worry about what TidBITS would do if something terrible (such as being hit by a bus) were to happen to one of us. As such, we avoid situations where any one of us is the only person who could perform an important task – if that person were to die in a freak gardening accident, that task would be difficult to continue. So, in thinking about which search engines to adopt, we considered the ramifications of the hit by a bus scenario for each one.

Curt’s Apple e.g. entry would seem to be the obvious winner, but for one wee problem: it’s currently a custom job. Curt works at Apple on Apple e.g., and he modified Apple e.g. to understand that TidBITS issues have more than one article in them. So, unless Curt’s custom changes are rolled into the public version of Apple e.g. and maintained (which is the plan), Curt becomes our weak link. And, given Apple’s recent troubles, Apple e.g.’s future in general is something of a question mark.

Scott’s entry suffers some of the same problems, given that the bulk of the work is his custom text indexing extension, which is currently hard-coded to certain aspects of TidBITS. If we were to change something about our format, and Scott had been abducted by space aliens, we’d be in trouble. Also, although WebServer 4D is obviously performing well, MDG is a small company in what can be a hard market.

Interestingly, although we marked Ethan and Ole and David’s entries down slightly for ease of setup because they’re based in large part on Frontier, they both do better in this category because of that. Frontier may not be the sort of thing that some of us have ever been able to wrap our heads around, but many people know it and could help in case of emergency. Ethan also uses Phantom, and Maxum seems like a solid company that is unlikely to disappear or drop Phantom. Ole and David rely on FileMaker Pro, and given that it’s the most popular database on the Macintosh, it’s a good bet that it will be around forever with plenty of people who know how to use it.

Ethan edges out Ole and David by a hair here, if only because he seems to rely on Frontier a little bit less, which means finding someone who could fix a problem in his code would be slightly easier.

Hit by a Bus: Ethan Benatan, Frontier and Phantom



Overall Accuracy — There’s nothing worse than not being able to find something you know exists thanks to some quirk in a search engine. Geoff Duncan was a software tester in a previous lifetime, and he briefly hammered on all of the entrants with deliberately stressful and unusual searches. I’ll let him report on which ones fared well.

Fortunately, the four final entrants all provide essentially correct and functional search results. Simple targeted tests for known items – the word "emporia," for instance, which until now only appeared in one TidBITS issue – worked correctly in all engines; similarly, Boolean functions plus issue and date restrictions appeared to function correctly where they were offered. Stress tests for large (or huge) results lists and simultaneous queries were also handled properly. However, some more complex (or more naive) queries occasionally generated mixed hits or unexpected results lists. After isolating the search engines’ behaviors, I tried to figure out how quirks might impact real users.
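I ran these tests by hand, but the same idea automates easily. Here’s a sketch of a known-item check; the search URL and the q query parameter are invented for illustration, not any entrant’s actual interface:

```python
import urllib.parse
import urllib.request

SEARCH_URL = "http://search.example.com/search.cgi"  # hypothetical endpoint

def known_item_test(query, expected_fragment):
    """Fetch a results page and check that a known article appears."""
    url = SEARCH_URL + "?" + urllib.parse.urlencode({"q": query})
    with urllib.request.urlopen(url) as response:
        page = response.read().decode("utf-8", errors="replace")
    return expected_fragment in page

# For instance, the word "emporia" should turn up its one known article:
# assert known_item_test("emporia", "some expected article title")
```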

Both Curt with Apple e.g. and Ethan with Phantom sort search results by perceived relevance, which proves both a strength and a weakness. On one hand, both tend to let the most appropriate articles float to the top of a results list, which is obviously useful. However, relevancy ranking also tends to break down with (perhaps unwittingly) vague queries. Apple e.g. casts a wide net, routinely finding more than 100 matches for simple queries ("RAM Doubler review"): the top-most matches are fine, but subsequent matches can appear random at first glance while still carrying comparatively high relevancy scores. Phantom, conversely, throws away the chaff: the same query turns up just three items, the first of which is right on target, and the other two of which mention all the terms but (appropriately) have single-digit relevancy. Phantom does a similarly good job narrowing down results with other generally phrased searches.
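We can’t see either engine’s ranking internals, but the behavior matches how simple relevance scoring tends to work. In a TF-IDF-style scheme (sketched below; not either product’s actual algorithm), a vague query matches every document containing any term, so whether you get a wide net or just the wheat depends on where low-scoring matches are cut off:

```python
import math
from collections import Counter

def tfidf_scores(query_terms, docs):
    """Score each document against the query. docs: {doc_id: word list}."""
    n = len(docs)
    df = Counter()  # document frequency of each term
    for words in docs.values():
        for term in set(words):
            df[term] += 1
    scores = {}
    for doc_id, words in docs.items():
        tf = Counter(words)
        score = sum(tf[t] * math.log(n / df[t]) for t in query_terms if df[t])
        if score > 0:
            scores[doc_id] = score
    return scores

def rank(scores, cutoff=0.0):
    """A generous cutoff casts a wide net (100+ hits); a strict one
    throws away the chaff, leaving only a handful of strong matches."""
    return sorted((d for d, s in scores.items() if s > cutoff),
                  key=scores.get, reverse=True)
```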

Neither Scott’s nor Ole and David’s entry offers relevancy ranking; instead, both sort results from most to least recent. However (and this is probably fixable), Ole and David’s entry sometimes returns duplicate hits in early TidBITS issues, with some early hits appearing at the top of the results list, then repeated later in correct sort order. More often than not, trying to access these duplicated entries returns an error. Scott’s entry doesn’t suffer from result duplication, but it does ignore URLs, which (judging from TidBITS email) are frequently sought items.
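The duplicate-hit quirk looks like something a post-processing pass could fix; here’s a sketch that keeps only the first occurrence of each hit while preserving sort order (the hit-as-dict-with-URL shape is an assumption about the results data):

```python
def dedupe_hits(hits):
    """Keep only the first occurrence of each result, preserving the
    original sort order. Each hit is assumed to be a dict with a 'url'."""
    seen = set()
    unique = []
    for hit in hits:
        if hit["url"] not in seen:
            seen.add(hit["url"])
            unique.append(hit)
    return unique
```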

So, although Apple e.g. provides more advanced features for finding articles similar to ones in a results list, for pure accuracy and relevancy of results, I give the nod to Phantom.

Accuracy: Ethan Benatan, Frontier and Phantom



Quantitative Ratings — As a final method of differentiating the search engines, I asked everyone at TidBITS to list these four search engines in order of overall preference. I figured that would help capture any intangibles that might have slipped through the criteria above. I then took the ratings and assigned points: one point for a first choice, two for second, three for third, and four for fourth. I next added the points for each entrant and ranked the entrants accordingly, lowest score first (as in the cross-country races I ran in high school and college); a sketch of the arithmetic follows the list. With five people voting, the scores could range between 5 and 20. Here’s how it came out:


  • 6 points: Curt Stevens and Apple e.g.
  • 13 points: Ethan Benatan, Frontier and Phantom
  • 14 points: Ole Saalmann and David Weingart, Frontier and FileMaker
  • 17 points: Scott Ribe and WebServer 4D
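The arithmetic here is just a low-score-wins rank sum. In this sketch, the ballots are invented to reproduce the totals above; they aren’t our actual votes, which we didn’t publish:

```python
# Five ballots, each listing entrants from first choice to fourth.
# Illustrative only: invented to reproduce the published totals.
ballots = [
    ["Apple e.g.", "Phantom", "FileMaker", "WebServer 4D"],
    ["Apple e.g.", "FileMaker", "Phantom", "WebServer 4D"],
    ["Apple e.g.", "FileMaker", "Phantom", "WebServer 4D"],
    ["Phantom", "Apple e.g.", "WebServer 4D", "FileMaker"],
    ["Apple e.g.", "WebServer 4D", "FileMaker", "Phantom"],
]

scores = {}
for ballot in ballots:
    for points, entrant in enumerate(ballot, start=1):  # 1st = 1 point, 4th = 4
        scores[entrant] = scores.get(entrant, 0) + points

for entrant in sorted(scores, key=scores.get):  # lowest total wins
    print(scores[entrant], entrant)
# 6 Apple e.g. / 13 Phantom / 14 FileMaker / 17 WebServer 4D
```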

Quantitative Ratings: Curt Stevens and Apple e.g.



And in the End… I feel terrible having to single out a winner. All four entrants have done a fabulous job. Scott knocked our socks off with the raw speed of his search engine – keep an eye out for when he releases the commercial version of his text indexing extension. Ethan showed how he could use Frontier to enhance Phantom’s already impressive capabilities. Ethan also says he’s looking for work soon – someone give this man a job! Ole and David wanted to make sure Frontier got the exposure it deserves, and they put together a great resource despite not knowing each other and living on different continents. They’re a tribute to the spirit of the Internet. Curt wanted to show what Apple’s free Apple e.g. could do, and frankly, Apple can use all the impressive technology demonstrations it can muster.

In our eyes then, they’re all winners. But, we don’t need to run four separate search engines ourselves, so we plan to implement Curt’s Apple e.g. solution first because, all other things being equal, it seems to be the easiest to merge into our existing setup. Should we run into problems, we’ll next test both Ethan’s and Ole and David’s solutions. It will probably be easier to try Ethan’s solution, since it doesn’t have to integrate with our existing Web server. However, Ole and David’s solution might dovetail nicely with some other work that Geoff is doing with keyword indexing. The final option would be Scott’s WebServer 4D solution solely because it involves acquiring, installing, and learning several new pieces of software. There’s no overall problem in that, just the reality of how much time and bandwidth we have to learn new things.

Thanks again to all of our entrants!

