Technical Report: UTEP-CS-07-52a

Short version published in Proceedings of the International Conference on Information Technology InTech'07, Sydney, Australia, December 12-14, 2007, pp. 11-20. Full version published in International Journal of Automation and Control (IJAAC), 2008, Vol. 2, No. 2/3, pp. 317-339.


Most data processing techniques traditionally used in scientific and engineering practice are statistical. These techniques are based on the assumption that we know the probability distributions of measurement errors, etc. In practice, we often do not know these distributions; we only know the bound D on the measurement accuracy. Hence, after we get the measurement result X, the only information that we have about the actual (unknown) value x of the measured quantity is that x belongs to the interval [X - D, X + D]. Techniques for data processing under such interval uncertainty are called interval computations; these techniques have been developed since the 1950s, and many algorithms have been designed to deal with interval uncertainty.
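The interval model above can be illustrated with a minimal sketch of interval arithmetic: each measurement X with accuracy bound D becomes the interval [X - D, X + D], and arithmetic on intervals returns guaranteed enclosures of the result. The class and method names below are our own illustration, not from the report, and the sketch ignores rounding-direction issues that a production interval library must handle.

```python
# Illustrative sketch of interval computations: a measurement result X with
# accuracy bound D gives only the information x in [X - D, X + D], and
# arithmetic propagates these bounds.  (Hypothetical names; outward rounding,
# required for rigorous enclosures in floating point, is omitted.)

class Interval:
    """Closed interval [lo, hi]."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    @classmethod
    def from_measurement(cls, X, D):
        # Measurement result X with accuracy bound D: x in [X - D, X + D].
        return cls(X - D, X + D)

    def __add__(self, other):
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # [a, b] - [c, d] = [a - d, b - c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        # [a, b] * [c, d]: min and max over the four endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"


# Two measured quantities, results 10.0 and 3.0, each with accuracy bound 0.1.
x = Interval.from_measurement(10.0, 0.1)
y = Interval.from_measurement(3.0, 0.1)
print(x + y)  # enclosure of the sum of the actual (unknown) values
print(x * y)  # enclosure of their product
```

The key property is that the actual (unknown) values always lie inside the computed intervals, whatever the (unknown) distribution of the measurement errors; this is what distinguishes interval computations from the statistical techniques that assume a known distribution.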

In many practical problems, we have a combination of different types of uncertainty: we know the probability distributions for some quantities, intervals for other quantities, and expert information for yet other quantities. It is therefore desirable to extend interval techniques to situations in which, in addition to intervals, we also have partial probabilistic and/or expert information. We provide an overview of related algorithms, results, and remaining open problems, and we emphasize the following three application areas: computer engineering, bioinformatics, and geoinformatics.

tr07-52.pdf (266 kB)
Original file: CS-07-52

tr07-52a.pdf (206 kB)
Updated version: UTEP-CS-07-52a