Intelligent Systems Report

May 1998, Vol. 15, No. 5

Challenges for the computational intelligence field:
WCCI '98: a report from Anchorage



A decade ago, when the neural network industry was still very much in its infancy, a frequently heard lament was that the technology was much sounder and more advanced than most people thought; the problem was, all the success stories were at companies that refused to have their applications discussed in public, for fear of losing a perceived competitive advantage. The handful of neural network companies in the late 1980s were all small firms relying on government contracts and venture capital to buoy them along until the day they could introduce development tools into the marketplace. And yet, despite the anonymity of the major developers and the surreal silence in terms of real-world success stories, word-of-mouth anecdotes passed along through various channels were leading some hype-minded analysts to predict a neural network industry worth a billion dollars or more within a few years.

Flash forward to the late 1990s, and you'll find a hauntingly familiar atmosphere surrounding the evolutionary computation field (also referred to in some circles as genetic algorithms, after the technology that has been successfully embodied into software tools, or the loftier catch-all term artificial life). The only real difference between neural networks then and evolutionary computation now is that nobody is predicting a billion-dollar industry for genetic algorithms... which is actually a good thing. Without the mind-numbing push for product, evolutionary programmers will have the rare luxury of continuing their work at its own pace, rather than at the whims of Madison Avenue. Still, as with all technologies, results will need to be forthcoming soon.

Neural networks, evolutionary computing and fuzzy logic can all be lumped under the general phrase "computational intelligence," and for only the second time in this decade an entire conference was devoted to the research efforts of all three groups. More than 1,000 attendees gathered at the second World Congress on Computational Intelligence (WCCI '98), an every-four-years event held this time in the scenic though quite off-the-beaten-track city of Anchorage, Alaska. Despite the remoteness of the location, it was the best-attended show of its kind since the last WCCI, held in 1994 in Orlando (a city quite a bit easier to get to, and boasting a considerably larger variety of outside activities). Virtually every neural network, fuzzy logic and evolutionary computation researcher of any note or notoriety was at the show.


Just scratching the surface
Of the three technologies, neural networks for better or worse have had the most promotion and are thus the best known. However, according to Paul Keller, with Battelle's Pacific Northwest Laboratory, the neural network field is still very much in its infancy, and is characterized by:

• simple models with little correlation to biology;

• toy applications;

• implementations on serial computers.

"We've just barely scratched the surface as to the potential of the field," Keller said. The raw processing power of personal computers will approach the level of the human brain in 30-40 years, if Moore's Law holds true; the question is, in what ways exactly will computers rival humans?
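Keller's 30-40 year figure can be sanity-checked with back-of-envelope arithmetic. The specific numbers below (a doubling time of 18 months, roughly 10^16 synaptic operations per second for the brain, roughly 10^9 operations per second for a late-1990s PC) are common rough estimates of the era, not figures from his talk:

```python
import math

# Rough assumptions (not from Keller's talk): brain ~1e16 ops/s,
# late-1990s PC ~1e9 ops/s, Moore's Law doubling every ~18 months.
years_per_doubling = 1.5
gap = 1e16 / 1e9                      # ~10-million-fold gap to close
doublings = math.log2(gap)            # ~23 doublings needed

print(round(doublings * years_per_doubling))  # prints 35
```

A result of about 35 years lands squarely inside the 30-40 year window cited above.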

Keller stated that the biggest challenge for the neural network field is to focus on hope, not hype. "We need to get the technology into use, and get the public used to neural networks," he said. "We also need to keep a long-term vision of physiologically-motivated information processing going." Exploiting parallel processing computers will be an important vehicle for furthering the potential of neural networks.

Wlodzislaw Duch, with Nicolaus Copernicus University in Poland, believes that the chief technical challenge for neural network developers is to produce "something useful," i.e., a real-world application that tackles hard problems, such as natural language understanding or machine reasoning. Developers need to focus on integration issues, such as incorporating statistical, pattern recognition, machine learning and various logical (fuzzy, probabilistic, etc.) approaches into neural network solutions.

"The neural network community lacks standards," Duch pointed out, "and that makes it hard for somebody to compare features from one system to another." He suggested that neural network proponents learn to cooperate and co-exist with the rest of the artificial intelligence community, rather than living in isolation on their own islands.


The CAM-Brain Project
One of the most promising developments on the horizon, according to Duch, is the CAM-Brain, an artificial brain project undertaken by Japan's ATR Brain Builder Group to build an evolved neural network module containing a billion artificial neurons, and able to control the behaviors of a life-sized robot kitten, by the year 2001. The neural network model implemented in the CAM-Brain is called CoDi (for Collect and Distribute).

CoDi is a 3-D cellular automata-based model implemented with field-programmable gate array (FPGA) devices. These devices, according to CAM-Brain's developers, are fully and partially reconfigurable, and feature "a coprocessor architecture with data and address bus access in addition to user inputs and outputs, and allow the reading and writing of any of the internal flip-flops through the data bus."
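The collect-and-distribute idea can be sketched in a few lines. The following is a deliberately tiny 2-D illustration with invented cell types, neighborhood and threshold; the real CAM-Brain model is a 3-D cellular automaton evolved and run on FPGA hardware:

```python
# Toy 2-D sketch of a CoDi-style "collect and distribute" update.
# Cell types, neighborhood, and firing threshold are invented for
# illustration only.
NEURON, DENDRITE = "N", "D"

def step(grid, signals, threshold=2):
    """One synchronous update: each signal is distributed to the four
    neighboring cells; dendrite and neuron cells collect what arrives;
    a neuron 'fires' once its collected input reaches the threshold."""
    collected = {}
    for (x, y), strength in signals.items():
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]):
                if grid[ny][nx] in (DENDRITE, NEURON):
                    collected[(nx, ny)] = collected.get((nx, ny), 0) + strength
    fired = [(x, y) for (x, y), s in collected.items()
             if grid[y][x] == NEURON and s >= threshold]
    return collected, fired

# A neuron flanked by two dendrites: two unit signals converge and it fires.
grid = [list("DND")]
collected, fired = step(grid, {(0, 0): 1, (2, 0): 1})
```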

The robot kitten, called Robokoneko, requires roughly 10,000 neural modules to implement, and will be designed to perform such tasks as: walking on its four legs; standing on three legs; manipulating objects with its front leg(s); jumping onto objects; curling up; lying down; moving its tail; purring when stroked; etc. Robokoneko will include sound generation and detection; heat and touch sensors; and robotic vision; and it will weigh roughly 1 to 2 kg. The actual intelligence of the kitten will be generated off-body, via modem, by the artificial brain.

The first stage of the project, the CAM-Brain, should be completed by mid-year 1998, while 1999 will be devoted to integrating the Robokoneko and the artificial brain.


IJCNN: Back from the dead
One of the most encouraging byproducts of the WCCI show (which was actually three separate programs under one umbrella title) was the revival of the International Joint Conference on Neural Networks (IJCNN). In the late 1980s and early 1990s, the IJCNN programs represented a cooperative effort between the two main trade associations for the field: the IEEE's Neural Network Council and the International Neural Network Society (INNS). The conferences, which initially generated a great deal of media interest and featured relatively large vendor exhibitions, helped fuel the fires of the fledgling neural network field. While some of the attendant hype surrounding the conferences probably led to some unreasonable expectations that could never be fulfilled, the IJCNN shows were basically a win/win proposition: the conference organizers were able to attract an international gathering of all the leading lights in the field, while attendees knew every year there would be one must-see conference to go to.

All that ended circa 1993, when in-fighting between the INNS and the IEEE led to a dissolution of their joint conference, and each group went its separate way, aiming for the biggest piece of pie from a group that basically only wanted one pie, not two. After several fallow years, the realization finally dawned on the show organizers that one joint conference a year is probably enough for everybody, leading to the IEEE's resurrection of the IJCNN conference at WCCI '98, sponsored with the cooperation of the INNS. Next summer, the INNS will take its turn with IJCNN '99, to be held in Washington, D.C.


Applying fuzziness to the real world
The fuzzy logic field is unique among the various intelligent technologies in that it has one clearly and easily identified personage to rally around: Lotfi Zadeh of the University of California, Berkeley. Zadeh has been referred to as the "father of fuzzy logic" so many times now that it's probably redundant to even mention it; what is not redundant, though, is his untiring push to continue furthering the reach of the field he helped to found more than 30 years ago.

Zadeh speaks more often these days about soft computing rather than pure fuzzy logic. Soft computing, he explained, "is a consortium of fuzzy logic, neurocomputing, evolutionary computing and probabilistic computing," and he sees it as a key enabler of various recognition technologies. He described a number of emerging applications that embody soft computing, such as:

• Omron's MailJail software, which uses fuzzy logic rules to filter out unwanted junk e-mail.

• IBM's computer virus detection system, which uses neural networks to detect both known and new viruses.

• A supermarket checkout scanner that uses scent sensors to identify fruits and vegetables.

• NCR's biometric identification system for ATM cash machines, which uses iris recognition.

• A molecular breath analyzer that can detect diseases such as hepatitis, lung cancer and stomach ulcer earlier than currently used tests.
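To make the first item above concrete, fuzzy-rule scoring of e-mail might look like the toy sketch below. The features, membership functions and rule weights are invented for illustration; Omron's actual MailJail rule base is not described in this report:

```python
# Invented toy example of fuzzy-rule junk-mail scoring; not Omron's rules.

def mu_high(x, low, high):
    """Piecewise-linear membership in the fuzzy set 'high': 0 at or
    below `low`, rising linearly to 1 at `high`."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def junk_score(exclamations, caps_ratio):
    # Fuzzify each feature into a degree of membership in [0, 1].
    shouting = mu_high(caps_ratio, 0.2, 0.6)   # fraction of capital letters
    excited = mu_high(exclamations, 1, 5)      # count of '!' characters
    # Rule 1: IF shouting AND excited THEN junk        (AND = min)
    # Rule 2: IF excited THEN somewhat junk            (weight 0.5)
    return max(min(shouting, excited), 0.5 * excited)
```

Unlike a crisp keyword filter, the score degrades gracefully: a message that is only somewhat shouty gets only a middling junk score.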

In terms of cultural acceptance, fuzzy logic has caught on much more rapidly overseas than it has in the U.S., particularly in western Europe and the Far East. Representatives from European electronics giants Siemens (Germany) and SGS-Thomson (France) spoke at WCCI '98 about their work with fuzzy coprocessors and how these chips are being applied to solve real-world problems.

Siemens, for instance, is developing an electronic brake system called Brake by Wire (BBW) which the company believes could replace the hydraulic brakes currently used in most automobiles. According to Siemens, "the main advantage of BBW is fast, highly accurate and autonomous control of each wheel's braking pressure." To achieve this type of control, the company is using the 8-bit fuzzy coprocessor SAE 81C99A chip as a central controller that computes the higher braking functions and produces setpoints for the four braking pressures.
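A fuzzy controller producing a braking-pressure setpoint might be sketched as follows. The membership functions, rule base and scaling are invented for illustration; Siemens' actual SAE 81C99A rule set is not described in this report:

```python
# Invented sketch of a fuzzy brake-pressure controller; not Siemens' design.

def brake_setpoint(pedal, slip):
    """pedal in [0, 1] is driver demand; slip in [0, 1] is an estimate of
    wheel slip. Returns a per-wheel pressure setpoint in [0, 1]."""
    # Fuzzify: degree to which demand is 'high' and slip is 'high'.
    demand_hi = min(1.0, max(0.0, (pedal - 0.2) / 0.6))
    slip_hi = min(1.0, max(0.0, (slip - 0.1) / 0.3))
    # Sugeno-style rules (weighted average of constant outputs):
    #   IF demand high AND slip low THEN pressure = 1.0
    #   IF slip high THEN pressure = 0.3   (back off, ABS-like behavior)
    w1, y1 = min(demand_hi, 1.0 - slip_hi), 1.0
    w2, y2 = slip_hi, 0.3
    if w1 + w2 == 0:
        return 0.0
    return (w1 * y1 + w2 * y2) / (w1 + w2)
```

The point of the fuzzy formulation is the smooth blend between the "brake hard" and "back off" regimes as slip rises, rather than a hard threshold.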

Siemens is also testing the application of a 12-bit fuzzy coprocessor, the SAE 81C991 chip, for image processing applications, particularly in medical situations. With hospitals pushing developers for portable solutions to reduce costs while still maintaining optimum care, fuzzy logic-based imaging is seen as a key to enabling "mobile nurses" that could be used to analyze a patient at home rather than having to hospitalize them.

SGS-Thomson, meanwhile, has introduced a family of fuzzy digital coprocessors and microcontrollers called WARP (Weight Associative Rules Processor). One application of these WARP chips is a "virtual sensor" that controls the air flow in a vacuum cleaner. As Thomson representatives explained it, "To sense the air flow a temperature sensor is used, implementing a fuzzy mapping between the temperature and the air flow."

Similarly, WARP chips are being used for thermal regulation: "A fuzzy model identifies the room temperature by using data coming from the temperature sensor located on the thermal radiator, thus obtaining a fuzzy control of the real room temperature." The fuzzy processor then implements different rule sets for the temperature sensor, the system model and the control algorithm.
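The "virtual sensor" idea of inferring air flow from a temperature reading can be illustrated with a minimal fuzzy mapping. The breakpoints and rule outputs below are invented; WARP's actual rule base is not reproduced in this report:

```python
# Invented sketch of a fuzzy virtual air-flow sensor; not SGS-Thomson's rules.

def ramp(x, a, b):
    """Membership rising linearly from 0 at a to 1 at b."""
    return min(1.0, max(0.0, (x - a) / (b - a)))

def airflow_from_temp(temp_c):
    """A hotter motor implies less cooling air flow (e.g. a clogged hose),
    so temperature acts as a proxy for the air flow we cannot measure."""
    hot = ramp(temp_c, 40.0, 70.0)
    cool = 1.0 - hot
    # Sugeno rules:  IF cool THEN flow = 1.0 ;  IF hot THEN flow = 0.2
    return (cool * 1.0 + hot * 0.2) / (cool + hot)
```

The fuzzy mapping interpolates smoothly between the two rules, which is exactly what makes one cheap temperature sensor usable in place of a dedicated flow sensor.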


A need for compelling answers
For the evolutionary computation (EC) field, one of the biggest challenges is identifying the direction in which the field should be pointed. For instance, Xin Yao of the University of New South Wales (Australia) asked, "If evolutionary computation is the answer, then what is the question?" He observed that genetic algorithms cannot at present solve any problems that no other technique can solve. The best EC can claim right now is that it can solve problems more reliably and efficiently than some other techniques. Those are the areas where researchers and developers need to focus their attention.
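For readers new to the field, the genetic algorithms under discussion boil down to a simple loop of selection, crossover and mutation. The sketch below evolves bit strings toward the all-ones string (the classic "one-max" exercise); the population size, rates and fitness function are arbitrary textbook choices, not anything presented at WCCI:

```python
import random

def evolve(bits=16, pop_size=30, generations=60, mutation=0.01, seed=1):
    """Minimal generational GA on the one-max problem. Returns the best
    fitness before and after evolution."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)             # one-max: count the 1 bits
    pop = [[rng.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    initial_best = max(map(fitness, pop))
    for _ in range(generations):
        def select():                          # binary tournament selection
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        nxt = [max(pop, key=fitness)]          # elitism: always keep the best
        while len(nxt) < pop_size:
            p1, p2 = select(), select()
            cut = rng.randrange(1, bits)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(bits):              # per-bit mutation
                if rng.random() < mutation:
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return initial_best, max(map(fitness, pop))
```

Thanks to elitism, the best fitness can never decrease from one generation to the next, which makes the loop's progress easy to verify. As Yao notes, nothing here is beyond other optimizers; the appeal is that the same loop applies to almost any representation with a fitness function.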

Thomas Baeck with the University of Leiden (The Netherlands) spelled it out succinctly: "We need to concentrate on real-world applications that will convince industry that EC is the compelling answer to their problems." He listed some of the most promising application areas as being: telecommunications, the Internet, molecular biology and bioinformatics, consumer electronics, and robotics.

Finally, Hans-Paul Schwefel with the University of Dortmund (Germany) illustrated that, while the EC field has been around for about 40 years, only in the last few years has the field exploded with so much activity that it has become impossible to keep tabs on what everybody is doing. Nearly 50 genetic algorithm patents have been issued, and 2,000 papers on the subject are being written every year. If current trends hold true, 10 years from now we may see self-adaptive code that can solve nearly any specific problem; in fact, the more we learn about evolutionary computing, the more we should learn about the nature of real life itself. Lofty goals, true, but then again, the AI field by its very definition has always striven to scale the highest mountain peaks that the human mind can imagine.



Web Site © Copyright 1997, 1998 by Lionheart Publishing, Inc.
All rights reserved.



Lionheart Publishing, Inc.
2555 Cumberland Parkway, Suite 299, Atlanta, GA 30339 USA
Phone: 770-431-0867 | Fax: 770-432-6969
E-mail: lpi@lionhrtpub.com
Web: www.lionheartpub.com

