[isabelle-dev] New (Co)datatypes: Status & Plan (FYI)
Jasmin Blanchette
jasmin.blanchette at gmail.com
Tue Aug 6 01:59:24 CEST 2013
Hi Alex,
> - The package registers a datatype interpretation to generate congruence
> rules for the case combinator of each type.
I guess it would make sense to have the package do that, or is there a specific reason why it is better to do it in a function-related extension?
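For concreteness, this is roughly what that rule looks like for lists (modulo variable names it is the usual "list.case_cong"; the lemma name below is made up for illustration):

    lemma list_case_cong_sketch:
      "xs = ys ⟹
       (ys = [] ⟹ a = b) ⟹
       (⋀z zs. ys = z # zs ⟹ f z zs = g z zs) ⟹
       (case xs of [] ⇒ a | z # zs ⇒ f z zs) =
       (case ys of [] ⇒ b | z # zs ⇒ g z zs)"
      by (cases ys) auto

If I understand correctly, such rules get declared as [fundef_cong] so that recursive calls occurring under a case combinator are analyzed in the right context.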
> - The pattern splitting code and the pat_completeness method both
> retrieve the constructors for a given type, using Datatype.get_constrs.
>
> - The pat_completeness method also uses the exhaust theorem.
>
> - partial_function also destructs the case combinator and looks up
> rewrite rules for it.
>
> - The automated pattern proofs implicitly invoked by "fun" (essentially,
> the "auto" method) rely on various things in the simpset, in
> particular the distinctness and injectivity simp rules. This is the only
> situation where freeness is used.
These should be no problem at all (except that we need to think up an ML interface for the "Datatype.xxx" calls).
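For the record, here is the kind of textbook definition that exercises all of the above (nothing package-specific): "pat_completeness" needs the constructors and the exhaust theorem of nat, and the trailing "auto" relies on the distinctness and injectivity simp rules to finish the completeness and compatibility goals.

    function fib :: "nat ⇒ nat" where
      "fib 0 = 0"
    | "fib (Suc 0) = 1"
    | "fib (Suc (Suc n)) = fib (Suc n) + fib n"
      by pat_completeness auto
    termination by lexicographic_order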
> - The generation of size functions (which was moved into the Function/
> subdirectory at some point, but without any real reason) is extremely
> cryptic. This is mostly because it performs too complex a task, essentially
> reversing the nested-to-mutual conversion in this particular instance.
> It appears that this should be re-done completely based on the
> infrastructure of the new package, and it would probably become much
> simpler.
Probably. But Dmitriy told me you had a different idea on how to do this, based on well-founded relations instead of size functions? Do you remember what that could be and, if so, would you be willing to elaborate?
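Just so we are talking about the same thing, here is the status quo spelled out on a toy example (my own sketch, not package code). The size function generated for the datatype ends up as the measure in the termination proof; "fun" finds that measure automatically. A scheme based on well-founded relations would presumably register a relation per type instead of going through "size".

    datatype 'a tree = Leaf | Node "'a tree" 'a "'a tree"

    function mirror :: "'a tree ⇒ 'a tree" where
      "mirror Leaf = Leaf"
    | "mirror (Node l x r) = Node (mirror r) x (mirror l)"
      by pat_completeness auto

    (* the argument that "fun" would find on its own, spelled out:
       the generated size function serves as the measure *)
    termination by (relation "measure size") auto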
> In summary, except for the size functions which are somewhat special
> (and only used for termination proofs), the function package never uses
> any inductiveness and would happily swallow codatatypes as well.
Thanks for the overview!
> Thus, I would propose to model the concept of a "type made of
> constructors" explicitly as a declaration which other tools can look up
> and add further interpretations to (in the sense of datatype
> interpretations).
Indeed, we have the rudiments of such a concept. There's an Isar command in BNF called "wrap_free_constructors" that takes a few characteristic theorems about the constructors (exhaustiveness, distinctness, injectivity) and the case combinator (its characteristic equations) and generates all the other theorems you're used to from "datatype", except induction and recursion, as well as discriminators and selectors (on demand). The command is used internally by "datatype_new" and "codatatype".
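Concretely, for lists the input consists (modulo variable names) of the familiar list.exhaust, list.distinct, list.inject, and list.case facts:

    (* exhaust  *)  "(xs = [] ⟹ P) ⟹ (⋀y ys. xs = y # ys ⟹ P) ⟹ P"
    (* distinct *)  "[] ≠ y # ys"
    (* inject   *)  "(x # xs = y # ys) = (x = y ∧ xs = ys)"
    (* case eqs *)  "(case [] of [] ⇒ n | y # ys ⇒ c y ys) = n"
                    "(case z # zs of [] ⇒ n | y # ys ⇒ c y ys) = c z zs"

From these, the command derives the usual case splits, case congruence, nchotomy, and so on.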
There is currently no database of such "freely constructed types", but it would be easy to change this state of affairs.
> The function package would only depend on that. If I
> understand correctly, that declaration would essentially be rep_datatype
> without the induction rule.
Yes, that's pretty much what "wrap_free_constructors" does, except that it currently expects a case constant. We should probably change that slightly: It should be easy to define the case constant and its characteristic equations from the other properties.
> But, according to your schedule, these considerations become important
> only after the coming release, so I'll shut up for the moment.
It's always good to discuss things ahead of time.
> I'd like to learn about corecursion up-to, which I have not come across so far. Are there any good references for this? Or should I rather have a beer with Andrei?
Andrei would know both about the references and the beer (or the kebab -- that's where (co)datatypes are happening these days ;)).
But things are evolving quickly on this front. Andrei's corecursor up-to is getting more powerful by the day, and one promising avenue for further experiments is to reduce productivity (the criterion that ensures the well-definedness of functions constructing codatatypes) to termination. This way we could support a superset of what "function" does and reduce everything to it. This is speculative -- the best I can say for now is that it appears to work well in a non-LCF, pen-and-paper setting. We can come back to you once the dust has settled, since I presume that the idea of reducing everything to termination might be appealing to you. Incidentally, Andrei found a reference to a 2011 TCS article where they do something similar [1].
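To give a flavour (just a sketch -- the concrete syntax is still in flux, so take the command names below with a grain of salt): with a stream codatatype, plain productive corecursion requires every corecursive call to sit directly under a constructor, whereas corecursion up-to also accepts calls under previously registered operations such as the map function, and that is where termination-like arguments enter the picture.

    codatatype 'a stream = SCons (shd: 'a) (stl: "'a stream")

    (* productive: the corecursive call is guarded by the constructor SCons,
       so every unfolding step produces at least one constructor *)
    primcorec up :: "nat ⇒ nat stream" where
      "up n = SCons n (up (n + 1))"

    (* corecursion up-to would additionally accept an equation such as
         nats = SCons 0 (map_stream Suc nats)
       where the call occurs under map_stream (the map function generated
       for the codatatype) and guardedness alone no longer justifies
       well-definedness *)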
Jasmin
[1] http://www.few.vu.nl/~diem/publication/pdf/lapro-tcs.pdf