The Devil's Advocate

Introduction
The Devil's Objections

Introduction

I regularly get criticism from detractors who object to my arguments in favor of adopting a non-algorithmic, signal-based, synchronous software model. The following is a compiled list of their objections, each followed by my rebuttal. I will add new items to the list as they come to my attention.
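
For readers who wonder what such a model looks like in practice, below is a minimal sketch, in Python, of a signal-based, synchronous execution loop. Everything in it is my own illustration of the general idea (the Cell class, the tick function and the buffering scheme are invented for this example, not taken from any COSA document): cells react only when signaled, all reactions triggered in one virtual clock cycle take effect together, and the signals they emit are delivered only on the next cycle.

    # A minimal sketch of a signal-based, synchronous execution loop.
    # The names (Cell, tick, the buffering scheme) are my own invention,
    # not code from any COSA document.

    class Cell:
        """A reactive unit that fires when it receives a signal."""
        def __init__(self, name, action, targets=None):
            self.name = name
            self.action = action          # called once per cycle when signaled
            self.targets = targets or []  # cells to signal on the next cycle

    def tick(signaled):
        """Process one virtual clock cycle. Every cell signaled during the
        previous cycle reacts 'simultaneously'; the signals it emits are
        buffered and delivered only on the next cycle, so the timing of
        every reaction is deterministic."""
        next_buffer = []
        for cell in signaled:
            cell.action(cell)
            next_buffer.extend(cell.targets)
        return next_buffer

    # A two-cell chain driven for three cycles.
    done = Cell("done", lambda cell: print("done fired"))
    start = Cell("start", lambda cell: print("start fired"), targets=[done])

    buffer = [start]
    for cycle in range(3):
        buffer = tick(buffer)

In this sketch, signal delivery is the only communication between cells, so the order in which cells are processed within a cycle cannot affect the outcome; that is the sense in which the model is deterministic.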

 

The Devil's Objections

 
1. Hardware is more reliable than software because correcting flaws in hardware is very difficult and expensive, so they get it right the first time.
 
Correcting flaws in mission-critical software is equally expensive. Just ask any manufacturer who has had to recall thousands of products due to a defect in the software. Ask NASA or the FAA how expensive and dangerous malfunctioning software can be. Mission- and safety-critical software goes through the same stringent testing as hardware. The fact remains that algorithmic software is still more prone to failure than hardware, regardless of how careful the designers and testers are.
 
2. Hardware has just as many bugs as software. Just look at the errata sheets for a new chip.
 
Nobody is claiming that there are no bugs in hardware. The claim is that almost all bugs in hardware are found, corrected and documented during the testing process. Once released, an integrated circuit will almost never fail except for physical reasons. By contrast, there are almost always hidden bugs in released software that the quality control process invariably fails to catch during testing. In addition, most hardware bugs are due to physical defects introduced during manufacturing or to bad physical layout.
 
3. Hardware is more reliable than software because it is less complex.
 
Not true, for two reasons. First, if one compares hardware and software of roughly equal complexity, the hardware is invariably orders of magnitude more stable than the software. Second, when most people talk about hardware, they think of a single IC chip or function. They overlook the fact that a chip is more comparable to one or more subroutines or objects in a software application. A hardware system, such as a computer, usually consists of multiple chips working together in very complex ways. Combining any number of chips to form large systems is not known to increase their logical failure rate after release. Likewise, combining many functions on a single chip does not degrade the quality of the finished product. By contrast, combining subroutines to create larger programs is known to increase the likelihood of failure in deployed software systems.
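
To make the last point concrete, here is a deliberately contrived sketch of my own (the routines, names and figures are all hypothetical). Each routine below is correct against its own specification in isolation, yet composing them produces a failure that belongs to neither part.

    # Two routines, each correct against its own specification, invented
    # for this illustration.

    counter = {"widgets": 10}   # what the software believes is in stock

    def reserve(n):
        """Spec: never let the counter go negative. Correct in isolation."""
        if counter["widgets"] >= n:
            counter["widgets"] -= n
            return True
        return False

    def restock_if_low():
        """Spec: when the counter drops below 3, reset it to 10 on the
        assumption that a replenishment order is placed elsewhere.
        Correct in isolation."""
        if counter["widgets"] < 3:
            counter["widgets"] = 10

    # Composed, the two routines ship 16 widgets from a warehouse that
    # only ever held 10: the counter silently drifts away from physical
    # stock because the replenishment assumption is never met. The bug
    # belongs to the combination, not to either routine.
    shipped = 0
    for order in (4, 4, 4, 4):
        if reserve(order):
            shipped += order
        restock_if_low()
    print(shipped)   # 16

Testing each routine against its own specification would never expose this; only the composition fails, which is exactly what makes large algorithmic programs so much harder to certify than the sum of their parts.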
 
4. The brain is asynchronous, not synchronous as you claim.
 
This is not supported by research in neurobiology. One of the most amazing aspects of the brain that has come to light in the last half century is the existence of synchronizing oscillations mostly in the 10 to 120 Hertz range.
 
5. Contrary to your claims, the human brain is a very unreliable system. It continually makes mistakes, creates memories of events that never happened and often makes irrational decisions.
 
Unlike our current computer systems, the brain is self-correcting; that is, it uses a trial-and-error process to modify itself. Making and correcting mistakes is what it is programmed to do. Faulting a child for falling or running into obstacles while learning to ride a bicycle is like faulting a chess program for not playing tic-tac-toe. Barring a physical failure, the brain always does what it is programmed to do, flawlessly, even if the result turns out to be a mistake.

In order to survive in an uncertain and chaotic environment, the brain uses a technique known as pattern completion to fill in missing or unknown information. This mechanism makes it possible for us to understand a garbled conversation in a noisy room or recognize a partially occluded face. It also makes it possible for animals to recognize danger in the wild even when the predator is hidden from view. Certainly, the mechanism often leads to false assumptions, but this must not be equated with failure on the part of the brain. This is the way it is supposed to work; anything else would lead to extinction. As an aside, our future intelligent robots will behave in a very similar manner. The super-rational, logical and unerring Mr. Spock is a modern myth.
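
Pattern completion has a well-known engineering analogue, the associative memory. Below is a minimal Hopfield-style sketch of my own (the stored patterns are arbitrary, and the example is an illustration of the principle, not a biological model): given a corrupted copy of a stored pattern, the network settles back onto the original, filling in the damaged units from the intact ones.

    import numpy as np

    # A minimal Hopfield-style associative memory. The stored patterns
    # are arbitrary; this illustrates pattern completion, nothing more.

    stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                       [1, 1, 1, 1, -1, -1, -1, -1]])

    # Hebbian weights: each stored pattern reinforces its own correlations.
    W = sum(np.outer(p, p) for p in stored).astype(float)
    np.fill_diagonal(W, 0)

    def complete(pattern, steps=5):
        """Repeatedly pull every unit toward the consensus of the others."""
        s = pattern.astype(float)
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1        # break ties deterministically
        return s.astype(int)

    # The first stored pattern with two units flipped.
    noisy = np.array([1, 1, -1, -1, 1, -1, 1, -1])
    print(complete(noisy))       # prints [ 1 -1  1 -1  1 -1  1 -1]

When such a memory is handed a pattern that happens to lie nearer the wrong attractor, it confidently completes the wrong one; that is a false assumption, not a malfunction, which is precisely the distinction drawn above.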

 
6. What about the famous Pentium FDIV bug? Isn't that a case of hardware failing after release?
 
No. The Pentium floating-point unit did exactly what it was supposed to do, which was to fetch a value from a location in an on-chip table. It just so happened that the table was wrong. The Pentium division bug is a perfect example of blaming hardware for a fault in the embedded software: for whatever reason, the quality control department had failed to test a portion of the design. The promise of the synchronous model is not to eliminate design mistakes, although it can go a long way toward that goal. The promise is this: once a design is tested to behave a certain way, it will continue to behave in the same way, barring a physical failure. One cannot fault a chess program for not playing tic-tac-toe.
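
The distinction between faulty logic and a faulty table is easy to demonstrate. The sketch below is a toy of my own invention and has nothing to do with the Pentium's actual divider: the logic is deterministic and does exactly what it was built to do, but one hypothetical table entry was populated incorrectly and testing never exercised it.

    # Toy illustration of the FDIV failure mode: correct, deterministic
    # logic consulting a table that holds one wrong entry. This is not
    # the Pentium's actual divider; the table and values are invented.

    # Supposedly holds 1/n for n in 1..8, but the entry for 7 is wrong.
    RECIPROCALS = {1: 1.0, 2: 0.5, 3: 1 / 3, 4: 0.25,
                   5: 0.2, 6: 1 / 6, 7: 0.25,   # <-- should be 1/7
                   8: 0.125}

    def divide(a, b):
        """Deterministic table-driven division: a * (1/b)."""
        return a * RECIPROCALS[b]

    print(divide(14, 2))   # 7.0 -- correct, like the vast majority of operands
    print(divide(14, 7))   # 3.5 -- wrong, but reproducibly so: the logic
                           # faithfully returned what the table contained

The logic never misbehaves; every wrong answer traces back to the one bad entry, and once that entry is corrected, the design behaves the same way forever after, barring a physical failure.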
 
Send all comments to Louis Savain

© 2006 Louis Savain

Copy and distribute freely