sequence allocation
- From: Stéphane Genaud <genaud icps u-strasbg fr>
- To: orbit-list gnome org
- Subject: sequence allocation
- Date: Thu, 21 Sep 2000 20:43:08 +0200
Hi,
I can't manage memory allocation for a simple sequence (it core dumps
on the server side). My IDL is:
    module VSERV {
        interface image {
            typedef sequence<octet> bytevect;
            bytevect readpixel(out short width, out short height);
        };
    };
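For reference, the struct the IDL compiler generates for the sequence should
look roughly like this (my guess from the C mapping and the generated header;
I may have the field order wrong):

    typedef struct {
        CORBA_unsigned_long _maximum;  /* capacity of _buffer */
        CORBA_unsigned_long _length;   /* number of elements in use */
        CORBA_octet *_buffer;          /* element storage */
        CORBA_boolean _release;        /* set via CORBA_sequence_set_release() */
    } VSERV_image_bytevect;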
On the client side I have the following declarations:
    VSERV_image image_obj;
    CORBA_short width, height;
    VSERV_image_bytevect *img;
    ...
    img = VSERV_image_readpixel(image_obj, &width, &height, &ev);
    ...
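(If I read the C mapping correctly, the client owns the returned sequence and
must release it once it is done with the data, e.g.:

    if (img != NULL)
        CORBA_free(img);  /* frees the struct, and the buffer too when the release flag is set */

I mention this in case the ownership rules are part of my problem.)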
On the server side, the sequence is filled in with:
    static VSERV_image_bytevect *
    impl_VSERV_image_readpixel(impl_POA_VSERV_image *servant,
                               CORBA_short *width,
                               CORBA_short *height,
                               CORBA_Environment *ev)
    {
        VSERV_image_bytevect *retval;

        *width = 3;
        *height = 5;

        retval = VSERV_image_bytevect__alloc();
        CORBA_sequence_set_release(retval, TRUE);
        retval->_length = 2;
        retval->_maximum = 100;
        retval->_buffer = CORBA_sequence_VSERV_image_bytevect_allocbuf(2);
        retval->_buffer[0] = 41;

        return retval;
    }
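One thing I am unsure about is the allocbuf call: since the elements are
octets, should the buffer come from the octet-level allocbuf rather than the
sequence-level one, with _maximum matching what was actually allocated? My
guess would be something like this (assuming ORBit generates
CORBA_sequence_CORBA_octet_allocbuf for sequence<octet>; the second element
value is just a filler):

    retval = VSERV_image_bytevect__alloc();
    CORBA_sequence_set_release(retval, TRUE);
    retval->_maximum = 2;  /* match the number of octets actually allocated */
    retval->_length = 2;
    retval->_buffer = CORBA_sequence_CORBA_octet_allocbuf(2);  /* 2 octets, not 2 sequences */
    retval->_buffer[0] = 41;
    retval->_buffer[1] = 42;  /* arbitrary; _length claims two valid elements */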
So does anyone have a clue why it segfaults? I have run the client and
the server on the same machine so far; the client just returns with
dummy values for width and height after the segfault on the server.
Thanks.
S.G.