On Jan 19, 2003, Alexandre Oliva wrote:

> the minimum normal exponent should be set such that all 107
> significant bits are available (and it's 107, not 106, as I
> implemented it) for normals.

It can't possibly be 107.  IRIX must be doing something wrong in
setting LDBL_MANT_DIG to 107: each double carries only 53 bits of
precision, even counting the implicit 1s, so a pair of doubles can
hold at most 2 * 53 = 106 significand bits.  Besides, the emulation I
implemented worked correctly down to the least significant bits with
IRIX's printf, so I'm checking these bits in too, per rth's approval.
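
For what it's worth, here is a minimal C sketch of the arithmetic
behind that objection.  The printed values are just whatever the
target's float.h defines; on an IRIX-style double-double target
LDBL_MANT_DIG would report the header's (arguably wrong) 107:

    #include <stdio.h>
    #include <float.h>

    int main(void)
    {
        /* Each IEEE double has DBL_MANT_DIG (53) significand bits,
           counting the implicit leading 1.  A double-double long
           double is a pair of doubles, so its precision is at most
           2 * 53 = 106 bits -- one less than IRIX's claimed 107.  */
        printf("DBL_MANT_DIG     = %d\n", DBL_MANT_DIG);
        printf("2 * DBL_MANT_DIG = %d\n", 2 * DBL_MANT_DIG);
        printf("LDBL_MANT_DIG    = %d\n", LDBL_MANT_DIG);
        return 0;
    }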