One intuitively expects to be able to declare variables of type ``set of something''. Allowing such types implies that sets may be elements of sets, which in turn implies that one ought to allow types themselves to be values.
Of course, if a programming language allows very large sets, such as Int, to be regarded as values, its implementation must resort to a form of lazy evaluation: one would not wish the compiler to generate code that fully evaluates a set like Int when the set is only used as the target of a membership test.
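The idea can be sketched as follows: represent a set not by enumerating its elements but by its membership predicate, so that even a very large set such as Int exists as a value without ever being materialized. This is a minimal illustrative sketch, not any particular language's implementation; the names `LazySet`, `Int`, and `SetsContaining0` are invented for the example.

```python
class LazySet:
    """A set represented intensionally by a membership predicate."""

    def __init__(self, contains):
        self._contains = contains

    def __contains__(self, x):
        # Membership is decided by calling the predicate; the set's
        # elements are never enumerated or stored.
        return self._contains(x)


# The set of all integers, treated as an ordinary value.
Int = LazySet(lambda x: isinstance(x, int))

# A set whose elements are themselves sets: all sets containing 0.
SetsContaining0 = LazySet(lambda s: 0 in s)

print(3 in Int)                # membership test, no enumeration
print("three" in Int)
print(Int in SetsContaining0)  # a set occurring as an element of a set
```

Note that the last test works precisely because sets are values here: `Int` is passed to the predicate of `SetsContaining0`, which in turn performs a membership test on it, all without either set ever being fully evaluated.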
Prof Herman Venter