AFP entry Neural_Networks

Achim D. Brucker adbrucker at 0x5f.org
Wed Nov 19 23:37:33 CET 2025


Hi,
I did a first investigation. Overall, this part of the theory is 
intended to serve as a regression test. Thus, it worked as it should: 
acting as an early warning for subtle changes that would otherwise go 
undetected, potentially creating "bit rot".

The change is actually due to a difference in the Poly/ML version used 
in Isabelle 2025 vs. the current development version (and Isabelle 
2025-1, for that matter). Namely:

ML\<open>
  PackReal32Little.fromBytes (Word8Vector.fromList [0wxC5, 0wx10, 0wxE8, 0wx3B])
\<close>

returns a value whose printed form differs:

* Isabelle 2025:               val it = 0.007082077209: PackReal32Little.real
* Isabelle development/2025-1: val it = 0.007082077: PackReal32Little.real
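
For anyone who wants to convince themselves that only the printed 
representation changed, not the stored value, here is a minimal 
round-trip sketch (just illustrative, using only the Standard ML basis 
library; not taken from the AFP entry itself):

ML\<open>
  (* decode the four little-endian bytes and re-encode the result;
     getting the identical bytes back shows the Real32 value itself
     is unchanged -- only Poly/ML's decimal printing differs *)
  let
    val bytes = Word8Vector.fromList [0wxC5, 0wx10, 0wxE8, 0wx3B]
    val r = PackReal32Little.fromBytes bytes
    val to_list = Word8Vector.foldr (op ::) []
  in
    to_list (PackReal32Little.toBytes r) = to_list bytes  (* expected: true *)
  end
\<close>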

As the rule of thumb regarding the precision of 32-bit IEEE 754 
floating-point numbers is around 7-8 significant decimal digits, the 
new behaviour is OK. Hence, the fix to get the AFP entry working again 
is to update the theories to reflect the new printed values.
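
As a side note, the 7-8 digit figure follows from the 24-bit binary32 
significand; a back-of-the-envelope check (again just illustrative):

ML\<open>
  (* 24 significand bits correspond to about
     24 * log10(2) ~ 7.2 significant decimal digits *)
  24.0 * Math.log10 2.0
\<close>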

We will take care of this tomorrow.


Best,

Achim (and Amy)



On 19/11/2025 16:25, Lawrence Paulson via isabelle-dev wrote:
> Neural_Networks is badly broken. I took a look and the failure is in a 
> proof that tests whether a certain generated construction is equal to 
> an explicit construction. They are supposed to be literally identical 
> and are not. I think we should get back to the authors to find out why 
> it failed and how it can be made robust.
>
> Larry
