Re: [glib] Why does g_ptr_array_set_size take a gint instead of guint for the length parameter?
- From: jcupitt gmail com
- Cc: gtk-devel-list <gtk-devel-list gnome org>
- Subject: Re: [glib] Why does g_ptr_array_set_size take a gint instead of guint for the length parameter?
- Date: Tue, 23 Dec 2014 13:02:44 +0000
I'm looking for the rationale for using 'gint' instead of 'guint' in the
call:
g_ptr_array_set_size (GPtrArray *array, gint length);
I imagine that the use of a signed integer was an oversight at the time,
which cannot now be corrected without breaking the API. It's not worth
that.
I remember (a long, long time ago) there was a dislike for uint.
Mixing uint and int can be fiddly and produce a range of bugs, some
more subtle than others, and the extra bit of range you get is
unimportant. int-only code is usually simpler and safer. The uints
scattered through Xlib are a good example of the confusion they can
cause.
The argument the other way would be that declaring it unsigned gives
extra information about what "length" means (it's a number of things,
not a distance). I guess that point of view won out.
I agree that the inconsistency is annoying and puzzling.
John