Re: Getting greatest decimal accuracy out of G_PI
- From: Robert Pearce <rob bdt-home demon co uk>
- To: gtk-list gnome org
- Subject: Re: Getting greatest decimal accuracy out of G_PI
- Date: Sat, 3 Feb 2007 09:29:24 +0000
Hi Sergei,
On Fri, 2 Feb 2007 16:14:09 -0800 (PST) you wrote:
> You can still use an explicit cast, i.e.
>
> ((long double)G_PI)
>
> , can't you? Even without the trailing 'l' you have correctly suggested
> adding.
You can, but would it work? AIUI the C standard only requires the cast to yield a long double *type*. An unsuffixed floating constant like G_PI has type double, so the compiler is required to round it to double precision first, and only then convert that already-rounded value to long double - the extra digits are gone before the cast ever happens. The trailing 'l' (or 'L') makes the constant a long double while compiling, so it retains as much precision as the type can hold.
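
For illustration, here's a minimal sketch (the macro below stands in for
glib's G_PI, which is an unsuffixed constant; try it on a platform where
long double is wider than double, e.g. x86):

#include <stdio.h>

/* Stand-in for glib's G_PI: an unsuffixed floating constant has type
   double, so it is rounded to double precision before anything else
   can happen to it. */
#define G_PI 3.1415926535897932384626433832795029

int main(void)
{
    /* The cast merely widens the already-rounded double value. */
    long double from_cast    = (long double) G_PI;

    /* The 'L' suffix makes the constant a long double from the start,
       so the extra digits survive compilation. */
    long double from_literal = 3.1415926535897932384626433832795029L;

    printf("cast:    %.21Lf\n", from_cast);
    printf("literal: %.21Lf\n", from_literal);
    return 0;
}

On x86, where long double is the 80-bit extended type, the first line
typically goes wrong after about 16 significant digits, while the second
stays correct to the precision of the wider type.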