Showing posts with label Biology.

Thursday, November 18, 2010

Basic vs Applied Research - Linear vs Nonlinear Models

I was reading a review in IEEE Spectrum of Henry Petroski's "The Essential Engineer: Why Science Alone Will Not Solve Our Global Problems", and found several references to the linear model of how research is being conducted in the U.S.:

"Part of the problem, he [Henry Petroski] says, is the linear model of technological progress: Basic research spawns applied research, which in turn fuels technological development. This model is wrong historically, and it undervalues the creative juices of good engineering ... This linear model became ascendant in U.S. science policy after 1945 when Vannevar Bush submitted his famous report, “Science—the Endless Frontier”, which enhanced the prestige (and funding) of basic research at the expense of applied work."

I wrote in one of my previous posts about linearity and nonlinearity in biology, engineering, and financial systems, and about how linear models (a linearized blood-vessel system in biology, the waterfall model in engineering) have a negative impact on the systems where they appear. The review of the book carries the same underlying message: moving away from the linear model of basic research toward the more practical approach of applied research (or development projects) yields greater engineering advances.

Wednesday, November 12, 2008

Self-Healing Hulls

In the November 2008 issue of IEEE Spectrum, there is an article about self-healing hulls. The carbon-fiber composite hull of a yacht can heal itself to a considerable degree after a collision, and the healing can be improved by running a small electric current through the material. The work is being done by Eva Kirkby, a graduate student at EPFL. The main idea is that the composite consists of carbon fibers and epoxy; the problem is that on impact these materials tend to separate internally, causing cracks parallel to the surface of the material. To counter this, the material is infused with hundreds of very small bubbles filled with liquid monomer plus small particles of catalyst; when a crack breaks the bubbles open, the monomer meets the catalyst and hardens, sealing the crack. To keep the concentration and size of the bubbles to a minimum, Kirkby incorporated into the composite wires of a shape-memory ("smart") alloy, one that returns to its initial shape after being deformed when it is heated by passing an electric current through it. Great idea!

Dilbert's self-aware comic strip

In my recent posts I kept mentioning biologically-inspired concepts such as self-aware, self-healing etc. Here is a comic view of self-awareness:


I love it!

Tuesday, November 4, 2008

Controlled Chaos

In an article entitled "Controlled Chaos" in the December 2007 issue of IEEE Spectrum, the authors describe a new generation of algorithms built on the thermodynamic concept of entropy, a measure of how disordered a system is. Because malicious code changes the flow of data in the network, it also alters the network's entropy. The new malicious threat, called Storm, uses different ways to get installed on the host machine, mostly through email attachments. How do we protect the networks? The first step is to know how traffic moves around the network. Collecting such data from the nodes is possible because routers and servers can be configured to report information about the traffic: source and destination IPs, source and destination port numbers, the size of the packets transmitted, and the time elapsed between packets. Information about the routers themselves is also collected. The proposed algorithms use this information to build a profile of the network's normal behavior. The authors stress that the entire network is monitored, not just a single link.

The principle behind the entropy-based algorithms is the fact that "Malicious network anomalies are created by humans, so they must affect the natural "randomness" or entropy that normal traffic has when left to its own devices. Detecting these shifts in entropy in turn detects anomalous traffic." Once the network has established patterns, anything that departs from its normal states can be easily detected. Even if the malicious code only manifests itself by downloading pictures from the internet, the fingerprint of the network would look unusual, different from what is expected and from how the network has been used. The authors make an interesting point, namely that Internet traffic has both uniformity and randomness. A worm will alter both, making the traffic either more random or more structured. In the case of the 2004 Sasser attack, the information entropy associated with the destination IP addresses rises suddenly, indicating an increase in the randomness of traffic destinations caused by the scanning that the infected machines initiate as they look for new victims. At the same time, the entropy associated with the source IP addresses suddenly drops, indicating a decrease in randomness as the already infected computers initiate a higher than normal number of connections. The conclusion is that the network enters a new, previously unseen internal state, which makes the anomaly easy to detect.
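
To make the idea concrete, here is a minimal sketch (my own illustration, not the authors' actual algorithm) that computes the Shannon entropy of the source and destination IP distributions in a traffic window. Scanning behavior raises the destination entropy and lowers the source entropy, exactly the Sasser signature described above; the flow records are hypothetical.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Entropy (in bits) of the empirical distribution of `values`."""
    values = list(values)
    counts = Counter(values)
    total = len(values)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical flow records: (source IP, destination IP) pairs seen in one time window.
normal_window = [("10.0.0.1", "10.0.0.5"), ("10.0.0.2", "10.0.0.5"),
                 ("10.0.0.3", "10.0.0.6"), ("10.0.0.1", "10.0.0.5")]
# A single infected host scanning many destinations, as in the Sasser example.
scan_window = [("10.0.0.9", f"192.168.1.{i}") for i in range(20)]

for name, window in [("normal", normal_window), ("scanning", scan_window)]:
    h_dst = shannon_entropy(dst for _, dst in window)   # destination-IP entropy
    h_src = shannon_entropy(src for src, _ in window)   # source-IP entropy
    print(f"{name} window: H(dst) = {h_dst:.2f} bits, H(src) = {h_src:.2f} bits")
```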

The Storm worm I mentioned at the beginning works in some respects like other worms: new code is placed on the computer (because the user clicks on some attachment), which makes it join a botnet. However, there are distinct differences between older worms and Storm. One of them is the way it gets the user to click the attachment, such as using a clever subject line or attachment name tied to hot topics currently in the news: elections, hurricanes, major storms, and so on. Most importantly, Storm hides its network activity. It first looks at what ports and protocols the user is using. If it finds a P2P program such as eMule, Kazaa, or BitComet, it will use that program's port and protocol to do its network scanning. Storm will also look at which IP addresses the P2P program has communicated with and will communicate with those, instead of new IP addresses that would trigger its detection. Furthermore, Storm does not spread as fast as it can, because it has a dormant and a walking mode: it gathers information for a short period, then goes quiet. It is very interesting that Storm actually tailors its behavior to the pattern of network usage. How do we detect Storm? The worm will still alter the network entropy. For example, during its active period the host computer sends many emails, which is unusual for normal use, and the port used is not 25. All of these are hints that something is wrong inside the network.
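
As a crude illustration of the kind of hint mentioned at the end, the function below (my own sketch with hypothetical field names, not code from the article) flags hosts that send an unusually large number of mail-like messages or send mail on a port other than the standard SMTP port 25.

```python
def suspicious_mail_senders(flows, mail_threshold=100):
    """`flows` is a list of dicts with hypothetical keys:
    'src' (sending host), 'dst_port', and 'is_smtp' (payload looks like email)."""
    mail_ports_by_host = {}
    for flow in flows:
        if flow["is_smtp"]:
            mail_ports_by_host.setdefault(flow["src"], []).append(flow["dst_port"])

    flagged = set()
    for host, ports in mail_ports_by_host.items():
        too_many = len(ports) > mail_threshold          # unusually high email volume
        odd_port = any(port != 25 for port in ports)    # email not going over port 25
        if too_many or odd_port:
            flagged.add(host)
    return flagged

# Example: one host pushing mail over a P2P port it borrowed from a file-sharing program.
example = [{"src": "10.0.0.7", "dst_port": 4662, "is_smtp": True}] * 150
print(suspicious_mail_senders(example))   # {'10.0.0.7'}
```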

A great article! Nothing short of what I have come to expect from IEEE Spectrum.

Tuesday, October 14, 2008

Why is self-healing in computer systems important?

Reading about the malfunction that rendered the Hubble Space Telescope silent, I realized how important it is for systems to have the self-healing property that our human body has and, in a broader sense, to be autonomic. This implies that a system knows itself well enough to possess detailed knowledge of its components, their status, and their internal and external connections. If a system does not have information about a specific component, it cannot control it; hence the importance of knowing itself.

Furthermore, reconfiguring itself based on the environment is also a desired property. Most importantly, such a system would need to heal itself without the intervention of human experts. Why do we need such a self-healing characteristic in computer systems? The Hubble Space Telescope mentioned above is one example. The failure came from a unit that collects data and transmits it to Earth, and solving the problem means remotely sending commands to the telescope to switch its operations over to a backup unit. Why hasn't this been done automatically? Because Hubble was not designed with self-healing in mind. Would it have been that complicated for it to discover the problem on its own and find an alternative resource to continue normal operation? Self-healing really just means making use of redundant or underutilized components to take over the task of the malfunctioning element, similar to how the brain works when parts of it are damaged. The problem is that because Hubble is not 'aware' of its backup unit, that unit has not been utilized since 1990, leaving it subject to "harmful rays of the sun, extreme temperature changes during orbits and 18 years of cosmic debris".
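
As a toy illustration of what "making use of redundant components" could look like in software, here is a minimal sketch; the component names are hypothetical and have nothing to do with Hubble's real architecture.

```python
class Component:
    """A component that can fail; `transmit` raises when it is unhealthy."""
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def transmit(self, data):
        if not self.healthy:
            raise RuntimeError(f"{self.name} has failed")
        print(f"{self.name} transmitting {data!r}")

class SelfHealingTransmitter:
    """Knows its own components and reroutes work to the backup on failure."""
    def __init__(self):
        self.primary = Component("data-unit-A")
        self.backup = Component("data-unit-B")

    def transmit(self, data):
        try:
            self.primary.transmit(data)
        except RuntimeError:
            # The self-healing step: promote the redundant unit automatically,
            # without waiting for commands from human operators on the ground.
            self.primary, self.backup = self.backup, self.primary
            self.primary.transmit(data)

tx = SelfHealingTransmitter()
tx.primary.healthy = False       # simulate the failure of the primary unit
tx.transmit("science frame 42")  # work continues on the backup unit
```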

If you are interested in Autonomic Computing, IBM has a whole research project devoted to this subject. You can find out more about this here.

Monday, October 6, 2008

Linearity and Nonlinearity in Biology, Computer Science and the Financial Market

One of the questions I have, inspired by one of the graduate classes taught by Dr. Ravi Shankar at FAU, is whether we can exploit biological architectures to come up with improved computer architectures, both software and hardware. One aspect that raised my interest is the nonlinearity of different biological systems, and whether we have such examples of nonlinearity in computer science. In ecological systems, we have Robert May's bifurcation diagram shown below:




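The diagram comes from iterating May's logistic map, x_{n+1} = r * x_n * (1 - x_n). Here is a minimal numerical sketch (my own illustration) of the periodic versus chaotic regimes:

```python
def logistic_orbit(r, x0=0.2, steps=200):
    """Iterate the logistic map x -> r*x*(1 - x) and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

print(logistic_orbit(3.2)[-4:])  # settles into a period-2 cycle
print(logistic_orbit(4.0)[-4:])  # jumps around chaotically, never settling
```
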
If the population growth factor r equals 4, the population jumps around in a chaotic, nonlinear manner. The main idea of the bifurcation diagram is to show the successive period doublings that take us into chaos. We have such bifurcation, such branching, in the blood vessels as well as in the airways. The blood vessels can expand to adapt to the blood flow. Let us denote the pressure across the vessel walls with P and the volume inside with V. Plotting this system, we get the figure below:

What we get is a deterministic nonlinear system: the slope in the upper part of the curve is around 100 times the slope in the lower part. In the case of a disease like arteriosclerosis, where the vessels lose elasticity and blood pressure rises, or in the opposite case where there is too much elasticity and the pressure is again higher, the system becomes more linear:

The blood vessel system functions properly only in the nonlinear case; both linear cases have a negative impact. Do we have such linearity in system development? In the software development process, the waterfall model is the only one that has this linearity property:

The waterfall model is widely argued to be bad practice, mainly because it is impossible to complete one phase 100% before moving to the next. Using such a model implies that there are no changes in the requirements, that problems can be fully predicted, and that the design is correct before the implementation phase starts. The other development process models do not exhibit this linearity property; they are nonlinear instead. Let us take the spiral model, for example, shown below:

Spiral model (Boehm, 1988)

The diagram above can be viewed as a fractal, which is an example of a nonlinear system. Another model that exhibits nonlinearity is the iterative and incremental one, which is an essential part of models such as RUP, Extreme Programming, and Agile software development:



The iterative and incremental approach was developed in response to the linear waterfall model. The nonlinearity characteristic is what allows these models to adapt to change, to accommodate redesign, and to learn from previous iterations.

If we look at the Dow Jones Industrial Average for 2008, we can see some characteristics of chaos (another manifestation of nonlinearity), depicted in the shaded blue rectangle, that would predict the continued fall starting at the beginning of October:

SOURCE: Yahoo! Finance

Nonlinearity is present throughout this graph, but chaos really sets in when the index moves up and down at a much higher rate than it did in the past. The large change in the DJI over a small period of time is a characteristic of chaotic behavior. This is also seen if we look at the volume indicator below, which plots the number of shares traded during that period:

SOURCE: Yahoo! Finance

From September 12 to September 19, there is an almost 50% increase in the number of shares traded, and between September 19 and September 25 there is roughly a 50% decrease. Such a large rise and fall within a two-week period is chaotic and uncontrolled, and it is not seen anywhere else in the graph over the past year. Having said that, we could certainly look for such chaotic manifestations in the future to better anticipate the market.
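
As a back-of-the-envelope check of those swings, the sketch below computes the percentage change between successive readings and flags unusually large moves; the volume figures are made-up placeholders for illustration, not the actual 2008 numbers.

```python
def pct_change(old, new):
    """Percentage change from `old` to `new`."""
    return 100.0 * (new - old) / old

# Hypothetical traded volume (billions of shares) at the three dates discussed above.
readings = [("Sep 12", 4.0), ("Sep 19", 6.0), ("Sep 25", 3.1)]

for (prev_day, prev_vol), (day, vol) in zip(readings, readings[1:]):
    change = pct_change(prev_vol, vol)
    flag = "  <-- unusually large swing" if abs(change) >= 40 else ""
    print(f"{prev_day} -> {day}: {change:+.0f}%{flag}")
```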

In conclusion, we have seen several examples where nonlinearity in the form of fractals and chaos is useful and beneficial. Thinking in such terms will have a positive impact on domains ranging from computer science to financial systems.