
Computing with Reals

Some History

Reals - how they might have developed
Thumbnail History of Reals
Constructive Mathematics
Alan Turing (1936)
Recursion Theory (1936-)
Digital Computers
Rice (1954)
Marvin Minsky (1967)
Errett Bishop (1967)
Dana Scott (1970-)
Wiedmer (1976, 1980)
Weihrauch (1985, 1995)
Other Recent Work

I have myself been interested in this area for at least 20 years (or since the publication of Minsky's book, depending on what you count as interest). It has never got high enough on my priority list for more than the odd tinker. Recently my interest was revived because I was thinking about theorem provers capable of automatically doing computations in real analysis, and for this purpose the ability to do exact computations with reals is essential (IMO).

The following is my best shot at an outline of the history of the subject. I have no doubt whatever that the story is incomplete, and it's probably wrong too. If you know better, spare me a few minutes to help brush up or fill in the picture.

Constructive Mathematics

The story surely must start here. It's not easy to say where constructive mathematics starts, historically, since some say that maths always was constructive until non-constructive techniques appeared in the 19th century. However, the emergence of intuitionism, the first explicitly constructive doctrine, as a distinct thread in mathematical philosophy is a phenomenon of the early twentieth century. I believe that intuitionism was the first brand of constructivism to emerge, and it has been followed by many others.

Constructive mathematics is distinguished by a particular concern for the computational content of mathematics, and often by a rejection of classical mathematics and classical logics. Classical mathematics is founded in Cantor's set theory, which builds up to (and beyond) the complex analysis widely exploited in science and engineering. Some of the criticisms levelled by intuitionists and other constructivists against classical mathematics, at a time long before the invention of digital computers (or even Turing machines), are not so very far from the reservations which a scientifically oriented software engineer might have about the use in science of number systems which simply aren't implementable computationally.

To those of us who were born after the invention of the digital computer, and who learned about recursion theory in our youth (perhaps even those who escaped the recursion theory, but couldn't escape computers), the writings of the intuitionists may appear unnecessarily obscure. Recursion theory provides a basis for an understanding of computability without too great an exposure to philosophy.

Alan Turing, 1936

As far as using computers for real computations is concerned, it looks like Turing had this idea first. And that was before computers were invented, so it's pretty good going!

Computable reals are defined in Turing's first classic paper [Turing36], where he uses (what we now call) universal Turing machines to show that there exist unsolvable problems in elementary number theory. In this paper Turing defines computable reals as those whose decimal expansion is computable by Turing machine. Pretty soon he spotted that this is not so good (not what we would now call an admissible representation) and came out with a correction in [Turing37], where he went for convergent sequences of nested intervals with rational end-points (which is an "admissible" representation).
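For concreteness, here is a minimal Haskell sketch of the second representation as I understand it (my own illustration, not Turing's formulation): a computable real is given by a program which, for each n, yields the nth of a sequence of nested rational intervals whose widths shrink to zero.

    -- A computable real, sketched as a total function from a precision
    -- index to a rational interval.  Successive intervals are nested and
    -- their widths tend to zero, so their intersection pins down one real.
    type CReal = Int -> (Rational, Rational)

    -- The square root of 2 by interval bisection: each step halves the
    -- width of the previous enclosing interval.
    sqrt2 :: CReal
    sqrt2 n = iterate halve (1, 2) !! n
      where
        halve (lo, hi) =
          let mid = (lo + hi) / 2
          in if mid * mid <= 2 then (mid, hi) else (lo, mid)

Asking for sqrt2 20, say, yields an interval of width 2^-20 known to contain the square root of 2; the point of the representation is that any finite amount of computation delivers honest, if coarse, information about the number.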

Recursion Theory

From the works of Turing (and Church, Gödel, Kleene...) emerged recursion theory, constructive mathematics de-mystified.

A precise mathematical understanding of computation arrived well in advance of the digital computer. It appeared with the notion of an effective procedure, which was introduced by logicians in the 1930s for the purpose of demonstrating the unsolvability of certain classes of mathematical problem. At that time several independently constructed accounts of what constituted an effective method proved to be equivalent, including three which have all played significant roles subsequently in computer science. Church's lambda calculus, Turing's machines, and Post's production systems (closely related to Chomsky's generative grammars, which feature in the theory of languages and automata) were all shown to be equivalent, and are all computationally equivalent to abstract digital computers with unbounded (but finitely utilised) memories.

This provides a basis for discussing effective number systems independently of any philosophical stance on the nature of mathematics, and permits the subject to be addressed in the context of classical mathematics rather than as an alternative to classical mathematics. It is the notion of effectiveness established by Church, Turing and Post which is the object of the work we are concerned with here.

For many years after Turing's first papers, recursion theory was concerned primarily with computation over the natural numbers, and little work was done on computation over the reals. Theoretical studies in this area now seem to be reviving, though practical implementations still receive much less attention.

Digital Computers

Finally, the wherewithal. With the invention of digital computers it became possible in principle to do something which had previously been impracticable: to realise computationally all those parts of classical mathematics which are susceptible of such realisation.

Instead we were sidetracked by floating point.

Rice 1954

Rice published a paper entitled Recursive Real Numbers. I haven't located a copy yet.

Marvin Minsky, 1967

Minsky has material on computable reals in his book Computation: Finite and Infinite Machines ([Minsky67]), but he neither identifies adequate representations nor considers effective operations. So I guess this is actually a step backwards from Turing, except that even to mention computable reals in a popular book is something!

Errett Bishop, 1967

Bishop's Foundations of Constructive Analysis is generally regarded as a milestone, enunciating the broader view of the mathematics realisable constructively which had become accepted by the cognoscenti of the time. Bishop's work is claimed to be an account of constructivism which is moderate in the sense that the results he derives are all true in classical analysis as well as in constructive mathematics (whereas a more radical treatment would also establish constructive results which appear to contradict classical ones). This encourages the view that constructive mathematics has the same subject matter as classical mathematics, but has more rigorous standards of proof resulting in the establishment of fewer results. This view is nonetheless difficult to sustain.

Bishop's definition of a real number (a sequence of rationals converging in a rather slow but clearly prescribed way) is not so far off beam from a classical perspective, but this view may neglect a significant difference in what these two schools would accept as a sequence.
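Here, for concreteness, is a minimal Haskell sketch of the definition as I read it; the names (BishopReal, addB and so on) are my own, not Bishop's. A real is a "regular" sequence of rationals, and operations such as addition preserve regularity by sampling their arguments more deeply.

    -- A Bishop-style real, sketched as a sequence of rationals indexed
    -- by n >= 1.  The sequence is required to be "regular":
    --     |x m - x n| <= 1/m + 1/n   for all m, n >= 1,
    -- so x n lies within 2/n of the limit.  (Nothing in the type
    -- enforces regularity; it is an obligation on whoever builds the value.)
    type BishopReal = Integer -> Rational

    -- A rational is a real via the constant sequence (trivially regular).
    fromRational' :: Rational -> BishopReal
    fromRational' q _ = q

    -- Addition: sampling each argument at depth 2n keeps the sum regular,
    -- since (1/2m + 1/2n) + (1/2m + 1/2n) = 1/m + 1/n.
    addB :: BishopReal -> BishopReal -> BishopReal
    addB x y n = x (2 * n) + y (2 * n)

Nothing in the type says where the function comes from: a classical reader will admit any function from indices to rationals, a constructivist only those given by some rule, and that is exactly the difference alluded to above.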

Nevertheless it is worth considering what kind of analysis we get if we place no constraint on the sequences involved apart from the convergence requirement; Bishop himself does not explicitly impose any further constraint.

This question bears directly on the possibility of realising good computational capabilities in a proof-based mathematical tool supporting a classical treatment of analysis (such as HOL).

Dana Scott, 1970-

Scott surely must get a mention here. However, I dug out the few papers of his I have, and didn't find anything that seemed likely to help a hacker.

Domain theory was invented in order to give "mathematical" accounts of the semantics of programming languages, and therefore there is some connection with effectiveness. Scott uses continuity, perhaps as a mathematically nicer concept closely related to effectiveness, and his early innovation was the introduction of lattices. Connections between effectiveness and domain theory are explicitly studied in, for example, the work of Mike Smyth on Effectively Given Domains, from about 1975. Connections between effectiveness and topos theory appear in Martin Hyland's The Effective Topos and later in Rosolini's doctoral dissertation Continuity and Effectiveness in Topoi.

This is pretty rarefied stuff, and there's lots more of it, at best tenuously connected with the practicalities of implementing and exploiting effective real analysis (so far as I can see, which isn't very far in this direction!). If anyone can give me a story for the layman on the relevance of this research to the hacker who just wants software for doing maths, I would be very grateful for the enlightenment.

See also: [Gianantonio93] and Domain Theory and Real Number Computation.

Wiedmer 1976, 1980

The first paper I have discovered on this topic, which appears to talk about doing computations over computable reals, is Exaktes Rechen mit reellen Zahlen (Exact Computation with Real Numbers) by Edwin Wiedmer. From what little I can understand of this paper (my comprehension of German is tiny), he is past the starting point, at least in having a viable representation over which the basic operations are computable, and probably in having algorithms for doing a good range of computations.

This report was probably produced early in his research, which later resulted in a doctoral dissertation and a paper entitled Computing with Infinite Objects. Since the latter is in English, the following observations are based upon it.

Wiedmer affirms (what he claims was even then "well known") that addition is not effective over reals represented by their decimal expansions, but that it is effective if signed decimal expansions are used. In a signed decimal expansion the significance of each position is the same as in a normal decimal expansion, but the permitted digits range from -9 to +9 rather than the usual 0 to 9.
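To see why the signed digits matter, here is a minimal Haskell sketch of signed-decimal addition (my own illustration, assuming streams of digits after the point which represent numbers in (-1,1); it is not taken from Wiedmer). With ordinary decimal expansions the first digit of 0.333... + 0.666... can never be committed to after inspecting finitely many input digits; with signed digits each output digit needs only one digit of lookahead.

    -- A real in (-1,1) as an infinite stream of signed decimal digits,
    -- each in -9..9; the value is the sum of digit_i * 10^(-i).
    type SignedDec = [Int]

    -- Split a digit-wise sum s in [-18,18] into a carry in {-1,0,1} and
    -- a remainder in [-8,8], so that s = 10*carry + remainder.
    split :: Int -> (Int, Int)
    split s
      | s >=  9   = ( 1, s - 10)
      | s <= -9   = (-1, s + 10)
      | otherwise = ( 0, s)

    -- Add two signed-decimal streams.  The result is an integer carry in
    -- {-1,0,1} (the whole-number part of the sum) paired with a stream
    -- for the fractional part.  Digit i of the output depends only on
    -- digits i and i+1 of the inputs, so the computation is productive.
    addSD :: SignedDec -> SignedDec -> (Int, SignedDec)
    addSD xs ys = (head carries, zipWith (+) remainders (tail carries))
      where
        sums       = zipWith (+) xs ys
        pairs      = map split sums
        carries    = map fst pairs
        remainders = map snd pairs

For instance, adding the streams for 0.5 and 0.5 (a 5 followed by zeros) yields carry 1 and the zero stream; adding two copies of 0.333... yields carry 0 and 0.666..., each digit emitted after a bounded amount of work.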

He also connects the question of whether addition is effective with topological criteria. A connection between effectiveness and continuity had previously been recognised. Wiedmer connects criteria of continuity with the adequacy of representations of real numbers.

Wiedmer identifies the class of effectively computable functions over reals with the class of functions described by intuitionists as constructive.

Weihrauch 1985, 1995

See [Weihrauch85], [Weihrauch95] and [Weihrauch95b].

Other Recent(ish) Work

I have extracted these from the recent discussion in alt.lang.functional and elsewhere, and include them for the sake of up-to-date coverage even though I haven't yet been able to look at any of them. That means I really don't know whether any of this work passes muster against my basic criteria (let alone anyone else's).

Vernon Lee, Jr., and Hans-J. Boehm, Optimizing Programs over the Constructive Reals, PLDI 90 (SIGPLAN Notices 25,6), pp. 102-111.

There is more discussion in the theses of Vernon Lee (Rice University) and Valerie Menissier-Morain (Paris VII, in French).

1986 Lisp and Functional Programming conference, pp. 162-173; SIGPLAN 1987 Symposium on Interpreters and Interpretive Techniques.

Boehm, H. and Cartwright, R., "Exact Real Arithmetic: Formulating Real Numbers as Functions". Chapter 3 of D.A. Turner, ed., _Research Topics in Functional Programming_, Addison-Wesley, 1990.

The paper "Vuillemin's Exact Real Arithmetic" by David R. Lester may be of help. It appeared in the Proceedings of the Glasgow Functional Programming Workshop 1991, published by Springer-Verlag in the Workshops in Computing series. From the acknowledgements given in the paper, I assume that David wrote an implementation in Orwell.

Apparently:

The classical approach to arbitrary-precision real and complex arithmetic (floating-point numbers with mantissa of any size) is implemented in many computer algebra systems and the CLISP Common Lisp implementation.

The lazy digit sequence approach (similar to lazy power series) has, for example, been implemented by Michael Stoll in Common Lisp, 1988.


created 1996/1/14 modified 1997/11/30