To: info-hol@edu.ucdavis.clover
Subject: LOOSE DEFINITIONS
Date: Tue, 03 Jan 89 16:01:59 -0800
From: Phil Windley <windley@edu.ucdavis.cheetah>
Sender: windley <windley%edu.ucdavis.cheetah@edu.ucdavis.clover>
Status: RO


[This message is from Roger Jones.  I'm posting it because of some network
problems.  Please let me know if you are having any kind of problems with
the info-hol mailing list.  --phil--]

LOOSE DEFINITIONS

Concerning Mike's proposal on the loose definitions:

(i) Deferment of proof obligation

In addition to wanting to make loose definitions we want to be able to defer
proof obligations.  This is so that type checking and full formal review of
specifications can be completed before any proofs are attempted.  This process
can extend over more than a year in the development of security models.  Our
temporary implementation of the loose definition facility allows definitions
to be introduced of the form:

        "?x.P x |- P a"

which can subsequently be converted into theorems of the form:

        "|- P a "

if P can be proved consistent.
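
The idea can be modelled in miniature (a Python sketch, purely
illustrative -- the names Thm, loose_def and discharge are inventions
here, not HOL's ML interface): a theorem is a pair of hypotheses and a
conclusion, a loose definition carries its existence caveat as a
hypothesis, and the caveat is removed once it is proved outright.

```python
# Toy model: theorems as (hypotheses, conclusion) pairs of strings.
from dataclasses import dataclass

@dataclass(frozen=True)
class Thm:
    hyps: frozenset   # outstanding caveats (deferred proof obligations)
    concl: str        # conclusion

def loose_def(defining_prop: str, exists_prop: str) -> Thm:
    """Introduce "?x. P x |- P a": the definition carries its caveat."""
    return Thm(frozenset({exists_prop}), defining_prop)

def discharge(caveated: Thm, consistency: Thm) -> Thm:
    """Given "|- ?x. P x", convert "?x. P x |- P a" into "|- P a"."""
    if consistency.hyps:
        raise ValueError("consistency theorem must be caveat-free")
    if consistency.concl not in caveated.hyps:
        raise ValueError("theorem does not match the caveat")
    return Thm(caveated.hyps - {consistency.concl}, caveated.concl)

d = loose_def("P a", "?x. P x")                 # "?x. P x |- P a"
t = discharge(d, Thm(frozenset(), "?x. P x"))   # "|- P a"
```

Type checking and review can proceed on d immediately; the discharge
step can happen a year later without reworking the specification.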

With your proposal, the closest we could get to this behaviour would be to use
the definition:

        "|- ?x.P x => P a"

which is less tidy.

We are also gradually automating an increasing number of existence proofs.  It
would be nice to have easy things proved automatically, but it does take extra
time.  One possibility would be to have a definitional mechanism which allows
a tactic to be supplied for the proof of consistency, e.g.

const1 : string -> term -> tactic -> thm

const1 `a` "P a" tac;;

would then store the definition "|- P a" if tac proves "|-?x. P x" and would
otherwise store "?x. P x |- P a".

Use of FAIL_TAC during initial development of specifications would give speed.
Later substitution of a super existence prover would enable proofs to be
completed on a reload.
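
A minimal Python sketch of this behaviour (the string manipulation is a
crude stand-in for real term operations, and the names are invented for
illustration only):

```python
# A "tactic" here is modelled as a predicate on the goal string.

def FAIL_TAC(goal: str) -> bool:
    """A tactic that proves nothing -- cheap during early development."""
    return False

def const1(name: str, prop: str, tac) -> tuple:
    """Return (caveats, conclusion): the clean "|- P a" if tac proves
    the existence goal, otherwise the caveated "?x. P x |- P a"."""
    goal = "?x. " + prop.replace(name, "x")   # crude generalisation step
    if tac(goal):
        return (frozenset(), prop)
    return (frozenset({goal}), prop)

const1("a", "P a", FAIL_TAC)         # caveated: FAIL_TAC proved nothing
const1("a", "P a", lambda g: True)   # caveat-free: the tactic succeeded
```

The second call stands in for the "super existence prover" substituted
later: same definitions, reloaded with a stronger tactic.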

This could alternatively be done by always storing the existence caveat and
having a special routine to remove them:

        remove_caveat : string -> string -> theorem -> theorem

remove_caveat  `-` `a` "|-?x. P x";;

would replace "?x. P x |- P a" by "|-P a".


None of this is essential, but it would be helpful to us.

(ii)    duplication of information and types

Current definition mechanisms involve a certain amount of repetition.  The
most economical mechanism, and the one most sympathetic to a Z-like view of
specifications, involves a definitional mechanism (which need not be
primitive) taking a single term as a parameter.  We have the procedure
"const_def" for this purpose:

const_def : term -> thm

The term here is a pair.  The first element is a signature: a tuple of
variables whose names are the names of the constants to be introduced.  The
second element of the pair is the defining property.

Thus:
        const_def "a:type, P a"

would introduce the same constant as before.  In our current system the consistency
proof is attempted and an existence caveat is stored automatically if it
fails.  This could be done using a tactic which could be modified by
assignment or by procedure.

        set_existence_tactic : tactic -> void

defaulted to FAIL_TAC.
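
A sketch of this variant, again in illustrative Python with invented
names and strings standing in for terms:

```python
def FAIL_TAC(goal: str) -> bool:
    """Default: prove nothing, so caveats accumulate cheaply."""
    return False

_existence_tactic = FAIL_TAC

def set_existence_tactic(tac) -> None:
    """Replace the tactic used for all subsequent consistency proofs."""
    global _existence_tactic
    _existence_tactic = tac

def const_def(term: tuple):
    """term is a pair (signature, property), e.g. ("a:type", "P a").
    The consistency proof is attempted with the current tactic, and an
    existence caveat is stored automatically if it fails."""
    signature, prop = term
    name = signature.split(":")[0].strip()
    goal = "?x. " + prop.replace(name, "x")   # crude generalisation
    if _existence_tactic(goal):
        return (frozenset(), prop)
    return (frozenset({goal}), prop)

const_def(("a:type", "P a"))          # caveated under FAIL_TAC
set_existence_tactic(lambda g: True)  # stand-in for a stronger prover
const_def(("a:type", "P a"))          # now caveat-free
```

Assignment of a stronger tactic and a reload is all that is needed to
clear the accumulated caveats.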


(iii) Multiple definitions

We wanted multiple definitions, but the introduction in our system of the
loose definitions coincided with a fix to the polymorphic definition problem
and it was not so obvious how to check the validity of a multiple constant
definition.

I think the only simple test is to insist on all the constants introduced in
any one definition being polymorphic in all the type variables occurring in
the defining predicate.  This may be unduly pessimistic for recursive type
definitions, but on reflection I cannot think of any examples in which this
should not be the case (except where the constants are not fully
interdependent, e.g.  "(a:*, b:**), T").
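
The proposed test is simple enough to state as code (a sketch with an
invented representation: each constant is mapped to the set of type
variables occurring in its type):

```python
def valid_multiple_def(constants: dict, predicate_tyvars: set) -> bool:
    """Accept a multiple constant definition only if every introduced
    constant is polymorphic in every type variable of the defining
    predicate."""
    return all(tvs == predicate_tyvars for tvs in constants.values())

# Fully interdependent constants pass:
valid_multiple_def({"a": {"*", "**"}, "b": {"*", "**"}}, {"*", "**"})  # True
# "(a:*, b:**), T": neither constant mentions both type variables, so
# the (possibly pessimistic) test rejects it:
valid_multiple_def({"a": {"*"}, "b": {"**"}}, {"*", "**"})             # False
```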

(iv)

I vote for retention of the old definition mechanism, or a simplified version
of it, rather than trying to do everything with the new one.

(v)

I don't understand Paul Loewenstein's objections.  I'm sure someone will find
a way of misusing the facility but it doesn't seem to me to invite misuse, or
to be as easily misunderstood as the use of the choice function.  It is
certainly very important to our work, partly because it enables the deferment
of proof obligations.

It may be worth noting that we very rarely (I can think of no examples)
deliberately introduce two loose constant definitions with the same property.
However we still run the risk, in using the choice function instead of the
loose definition facility, that constants introduced with properties which are
not obviously identical will turn out to be provably the same.  The provability
of properties which are simply not necessarily true in the intended meaning of
the specification is highly undesirable and undermines the value of all proofs
obtained with the system.


PRIMITIVES

I don't think that @ is unsatisfactory as a primitive.  (Church has it as a
primitive also, or at least definite description.)  The thing that is weird is
to define ? in terms of @, and to bury the axiom of choice in the definition
of existence!  What's wrong with the obvious definition of existence using
universal quantification and negation?
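
For comparison, the two definitions at issue can be written out (these
are standard formulations, not quotations from the HOL sources):

```latex
% HOL's current definition buries choice in existence:
\exists \;=\; \lambda P.\; P\,(\varepsilon P)
% the "obvious" definition via universal quantification and negation:
\exists x.\, P\,x \;\stackrel{\text{def}}{=}\; \neg\,\forall x.\, \neg P\,x
```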

Church's treatment of the axiom of choice provides a clean separation.  His
"i" is just a definite description function until the axiom of choice is
added, which promotes it to a choice function.

I vote for keeping the present primitives.  If you don't like the PR aspects
of the current use of the choice function, then I would suggest changing the
definition of existence and adjusting the axioms accordingly (I haven't figured
out exactly how).


POLYMORPHIC TYPE CONSTRUCTORS

I vote for option 2, introducing all the type variables on the new
constructor.

Hope this isn't too late!

Roger Jones

