I have been eagerly awaiting WolframAlpha, the uber-hyped knowledge engine (or whatever the hell it's supposed to be, rather than a search engine). I was keeping a relatively open mind, but figured it would probably suck. I was ready to be proved wrong, but from what I've messed with so far, I haven't been.
Here it is.
Have a spin yourself. First the good things, which are pretty sparse. It does pretty well with comparison, especially of geographic and temporal landmarks. It seems to aggregate information mostly from sources like the CIA World Factbook and Wikipedia. I guess it's a little more efficient than opening up two browser windows and comparing two regions or cities side by side, but how often do I want to do that?
On the bad side, its language-parsing abilities simply blow. It can't handle clauses or changes in word order very well at all. I don't expect it to elegantly parse natural language, but I do expect it to do at least as well as Ask Jeeves (which wasn't very well).
On some sample queries about things I'm personally interested in, it couldn't tell me how many Android users there are, or even give an estimate of how many iPhone users there are. It didn't know how many licks it takes to get to the center of a Tootsie Pop. If I just typed in "lafayette louisiana" it gave me some decent summary statistics, but if I asked it any specific question about the area, e.g., how many restaurants are in lafayette, louisiana, it couldn't answer. I thought this was supposed to be a major feature of the engine: its ability to derive information that isn't simply transparently available in its sources.
Ah well...what about math? It's great at that, right?
Well, I've been working a lot with a particular function used in engineering and neural modeling called the Naka-Rushton function. WolframAlpha hadn't heard of it. It also couldn't parse "logistic sigmoid" or "logsig". WTF?
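For the curious, neither function is obscure or hard to write down yourself. Here's a quick Python sketch of both (the parameter names r_max, c50, and n are my own choices, not anything WolframAlpha would know about):

```python
import math

def naka_rushton(c, r_max=1.0, c50=0.5, n=2.0):
    """Naka-Rushton contrast-response function:
    R(c) = r_max * c^n / (c^n + c50^n).
    At c = c50 the response is exactly half of r_max."""
    return r_max * c**n / (c**n + c50**n)

def logistic_sigmoid(x):
    """Logistic sigmoid: 1 / (1 + e^(-x)),
    which MATLAB's Neural Network Toolbox calls logsig."""
    return 1.0 / (1.0 + math.exp(-x))
```

Ten lines of code, and yet the shiny new knowledge engine draws a blank on the names.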
Color me not impressed. Let me know when it gets to beta so I can see if it sucks any less.