Allocating and freeing a char * in C++
Hi everyone, I am getting a heap error that I cannot understand.
char *c = (char *)malloc(1);

// main loop
_gcvt_s(c, 100, ball->get_X_Direction(), 10);
if (pushFont(c, (SCREEN_WIDTH - 30), (SCREEN_HEIGHT - 40), message, screen,
             font, textColor) == false)
{
    //return 1; // error rendering text.
}
// end main loop

free(c);
The code above is the only place where I use the pointer c: in _gcvt_s() and in pushFont(), which just takes a char * as its first parameter and draws text to the screen. Other than that, I never touch c. When I try to free c after the main loop (which I assume I should be doing), Visual Studio crashes with a heap corruption error.
Even with the call to pushFont commented out, I still get the error.
Can anyone explain why freeing this pointer (1 byte that I allocated on the heap) would give me heap corruption?
Finally, my main loop does a lot of other things; I am writing a Pong game with WinSock, and the rest of main is the play loop. I didn't think the rest was necessary for the post, but I will update it with the whole main loop if needed. I believe my problem is simply with my understanding of malloc() and free().
Thanks everyone,
Isn't the second parameter of _gcvt_s the maximum size of the buffer? You allocate 1 byte, but tell _gcvt_s the buffer is 100 bytes. So it happily writes up to 100 bytes into the buffer, corrupting your heap, and then free() crashes. Allocate 100 bytes if you are going to let the function access 100 bytes.
EDIT: It sounds like you need to learn how C stores and manages strings. C stores a string as individual bytes in a contiguous run of memory, followed by one extra character that marks the end of the string. This terminator has the ASCII value 0 (not the character '0', which is ASCII 48). So a string like "HELLO" takes 6 bytes to store: one for each of the 5 letters plus one for the terminator.
For _gcvt_s() to write a value into your buffer, the buffer must be large enough for the converted digits plus that terminating byte. In your call you are asking _gcvt_s() for 10 digits of precision, but you must also reserve space for the decimal point and a possible negative sign.
According to this [documentation](http://msdn.microsoft.com/en-us/library/a2a85fh5(VS.80).aspx), the headers #define the maximum required buffer size as _CVTBUFSIZE. Allocating that many bytes should take care of this problem.
According to the documentation I can find, _gcvt_s() takes a buffer and the length of that buffer as its first two arguments.
errno_t _gcvt_s(
char *buffer,
size_t sizeInBytes,
double value,
int digits
);
Your malloc()ed buffer is 1 byte long, but you tell _gcvt_s() that it is 100 bytes long. I would start looking there.
You need more than one byte to store the formatted float, so allocate a more practical length than 1 byte.
You don't need the heap here either: use a 16-byte stack buffer (slightly oversized) and pass _gcvt_s the real buffer length instead of the magic 100. De-magic your constants while you're at it.
const unsigned int cuFloatStringLength = 16;
const unsigned int cuFloatStringPrecision = 10;
char c[cuFloatStringLength];
_gcvt_s( c, cuFloatStringLength, ball->get_X_Direction(), cuFloatStringPrecision );
Then the problem should go away.