Autonomic computing and communications applies concepts of feedback control, self-organisation, monitoring and analysis to the construction and management of large systems. These areas tackle the design and implementation of stable "self-*" algorithms as well as the overall design and analysis of adaptive properties and behaviour.
The aim of such self-* systems is to improve how systems respond dynamically to changing conditions while simplifying their construction, management and deployment.
I present two examples of such systems: one for resource provisioning in IP/WDM networks, and one for improving the performance of Optical Burst Switching (OBS) networks.
First, I present a new structure-preserving method for sampling self-similar traffic, with direct applications to network monitoring and resource provisioning in IP over WDM. Predicting the bandwidth required by upcoming traffic plays a key role in efficient and intelligent resource provisioning. To this end, I propose a periodic sampling method, called maximum-based sampling, that picks one measurement during each sampling interval of size τ. Mathematical analysis and simulation results demonstrate that maximum-based sampling preserves the self-similarity of the original traffic over many timescales. Linear Minimum Mean Square Error (LMMSE) prediction is then used to forecast the traffic.
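The abstract does not give the sampling or prediction details, so the following is only a minimal sketch of the two ideas it names: keeping the maximum measurement in each interval of size τ, and forming a one-step-ahead LMMSE forecast from the sampled series. The function names, the predictor order, and the use of least squares to solve the LMMSE normal equations are illustrative assumptions, not the author's implementation.

```python
import numpy as np

def max_sample(traffic, tau):
    """Maximum-based sampling (assumed form): keep one measurement
    per interval, the maximum over each window of size tau."""
    n = len(traffic) // tau
    return np.array([traffic[i * tau:(i + 1) * tau].max() for i in range(n)])

def lmmse_predict(samples, order=4):
    """One-step-ahead LMMSE prediction: fit the optimal linear
    predictor of x[t] from x[t-1..t-order] by solving the normal
    equations (here via least squares on the lagged design matrix)."""
    x = samples - samples.mean()
    # Column k holds lag k+1: X[i] = [x[t-1], ..., x[t-order]] for t = order+i
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    recent = x[-1:-order - 1:-1]  # most recent `order` samples, newest first
    return samples.mean() + recent @ w
```

Because each sample is an interval maximum rather than a mean, bursts survive the downsampling, which is what lets the sampled series retain the self-similar structure the prediction relies on.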
Afterwards, I present a framework that uses closed-loop feedback control techniques to improve the performance of OBS networks. In OBS networks, the Burst Loss Ratio (BLR), the ratio of lost bursts to sent bursts, is used as a performance metric. The desired BLR depends on the application using the network: some applications tolerate a higher BLR than others, and tighter control over the BLR allows higher network link utilization. The burstification rate is the rate at which bursts are injected into the OBS network. I propose a novel technique, based on classical control theory, that tunes the burstification rate to achieve the BLR the application requires. The model of the OBS network is identified empirically; its fit to the network never fell below 75%. Extensive experiments show that the proposed technique achieves promising results: the measured BLR hovers around the desired BLR, and higher utilization is observed.
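The identified model and controller design are not given in the abstract; as a minimal sketch of the closed-loop idea, assume burst loss on a bufferless OBS link is approximated by the Erlang-B blocking formula and tune the burstification rate with a discrete PI controller until the measured BLR tracks the target. The wavelength count, controller gains, and the Erlang-B plant are illustrative assumptions standing in for the paper's empirical model.

```python
def erlang_b(load, servers):
    """Erlang-B blocking probability via the standard recursion
    B(E, m) = E*B(E, m-1) / (m + E*B(E, m-1)); used here as a stand-in
    plant for burst loss on a bufferless OBS link."""
    b = 1.0
    for k in range(1, servers + 1):
        b = load * b / (k + load * b)
    return b

def pi_tune_rate(target_blr, wavelengths=8, kp=10.0, ki=20.0, steps=300):
    """Velocity-form PI loop: measure the BLR, compare it with the
    target, and adjust the burstification rate (offered load) each step."""
    rate, prev_err = 1.0, 0.0
    for _ in range(steps):
        err = target_blr - erlang_b(rate, wavelengths)
        rate = max(0.1, rate + kp * (err - prev_err) + ki * err)
        prev_err = err
    return rate
```

For a target BLR of 1% on this assumed 8-wavelength plant, the loop settles at the offered load whose Erlang-B loss matches the target, illustrating how feedback lets the source push the rate up until the application's loss budget, rather than a fixed margin, limits utilization.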