Types as values

One intuitively expects to be able to declare variables of type "set of something". Allowing such types implies that sets may be elements of sets, which in turn implies that one ought to allow types to be values.
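As a rough illustration, here is a minimal sketch in Python (not the language under discussion; the names SmallInt and numeric_types are invented for the example) of what it means for a type to be a value: the type is an ordinary set object that can be assigned to a variable, tested for membership, and stored inside another set.

    # A "type" modelled as an ordinary set value (illustrative only).
    SmallInt = frozenset(range(-128, 128))   # the type "set of small integers"

    # Because the type is a value, it can be stored in a variable ...
    t = SmallInt

    # ... used as the target of a membership test ...
    assert 42 in t

    # ... and itself be an element of a set, i.e. a set of types.
    numeric_types = {SmallInt, frozenset({True, False})}
    assert SmallInt in numeric_types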

Of course, if a programming language allows very large sets, such as Int, to be regarded as values, its implementation must resort to a form of lazy evaluation: one would not wish the compiler to generate code that fully evaluates a set like {x ∈ Int | odd(x)} when the set is only used as the target of a membership test.
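A minimal sketch of such a lazy representation, again in Python and with invented names (LazySet is not from the source): the set is stored as its characteristic predicate, so a membership test evaluates the predicate at a single point and the set itself is never enumerated.

    # A set represented lazily by its characteristic predicate.
    class LazySet:
        def __init__(self, predicate):
            self.predicate = predicate          # function: x -> bool

        def __contains__(self, x):
            # Membership costs one predicate evaluation; the
            # (possibly infinite) set is never materialised.
            return self.predicate(x)

    # {x ∈ Int | odd(x)} as a value:
    odds = LazySet(lambda x: isinstance(x, int) and x % 2 == 1)

    assert 7 in odds
    assert 8 not in odds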


