GCC and pointer overflows
In summary, the CERT advisory states:
Application developers and vendors of large codebases that cannot be audited for use of the defective length checks are urged to avoiding [sic] the use of gcc versions 4.2 and later.
This advisory has disappointed a number of GCC developers, who feel that their project has been singled out in an unfair way. But the core issue is one that C programmers should be aware of, so a closer look is called for.
To understand this issue, consider the following code fragment:
    char buffer[BUFLEN];
    char *buffer_end = buffer + BUFLEN;
    /* ... */
    unsigned int len;

    if (buffer + len >= buffer_end)
        die_a_gory_death("len is out of range\n");
Here, the programmer is trying to ensure that len (which might come from an untrusted source) fits within the range of buffer. There is a problem, though, in that if len is very large, the addition could cause an overflow, yielding a pointer value which is less than buffer. So a more diligent programmer might check for that case by changing the code to read:
    if (buffer + len >= buffer_end || buffer + len < buffer)
        loud_screaming_panic("len is out of range\n");
This code should catch all cases, ensuring that len is within range. There is only one little problem: recent versions of GCC will optimize out the second test (returning the if statement to the first form shown above), making overflows possible again. So any code which relies upon this kind of test may, in fact, become vulnerable to a buffer overflow attack.
This behavior is allowed by the C standard, which states that, in a correct program, pointer addition will not yield a pointer value outside of the same object (or just past its end). So the compiler can assume that the test for overflow is always false and may thus eliminate it from the expression. It turns out that GCC is not alone in taking advantage of this fact: some research by GCC developers turned up other compilers (including PathScale, xlC, LLVM, TI Code Composer Studio, and Microsoft Visual C++ 2005) which perform the same optimization. So it seems that the GCC developers have a legitimate reason to be upset: CERT would appear to be telling people to avoid their compiler in favor of others which do exactly the same thing.
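To see the effect for oneself, the check can be dropped into a small, self-contained program and the generated code compared at -O0 and -O2 (for example with gcc -S). The program below is only a sketch: the buffer size, function names, and test values are made up for illustration, the wraparound only actually happens where pointers are narrow enough for the addition to overflow (a 32-bit target, say), and whether the second comparison survives depends on the compiler version and flags in use.

    #include <stdio.h>
    #include <stdlib.h>

    #define BUFLEN 64

    static char buffer[BUFLEN];

    /* Validate an untrusted length the "diligent" way shown above.
     * Because buffer + len overflowing is undefined behavior, an
     * optimizing compiler is entitled to delete the second comparison. */
    static void check_len(unsigned int len)
    {
        char *buffer_end = buffer + BUFLEN;

        if (buffer + len >= buffer_end || buffer + len < buffer) {
            fprintf(stderr, "len %u is out of range\n", len);
            exit(EXIT_FAILURE);
        }
        printf("len %u accepted\n", len);
    }

    int main(void)
    {
        check_len(16);          /* within the buffer: accepted */
        check_len(0xfffffff0u); /* should be rejected; if the overflow
                                   check is optimized away on a 32-bit
                                   target, it may slip through */
        return 0;
    }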
The right solution to the problem, of course, is to write code which complies with the C standard. In this case, rather than doing pointer comparisons, the programmer should simply write something like:
    if (len >= BUFLEN)
        launch_photon_torpedoes("buffer overflow attempt thwarted\n");
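The same idea carries over to the more general case where data is being appended at some position inside the buffer. The sketch below is illustrative only (the function name and signature are not from the advisory or the article); the point is that the comparison is done on lengths, using unsigned arithmetic that never forms an out-of-bounds pointer, so there is nothing for the optimizer to discard.

    #include <stddef.h>
    #include <string.h>

    #define BUFLEN 256

    /* Append len bytes at offset *posp, refusing anything that does not
     * fit.  Returns 0 on success, -1 if the data would overflow. */
    static int append(char buffer[BUFLEN], size_t *posp,
                      const char *data, size_t len)
    {
        size_t remaining = BUFLEN - *posp;  /* *posp is always <= BUFLEN */

        if (len > remaining)
            return -1;                      /* would not fit: reject */

        memcpy(buffer + *posp, data, len);
        *posp += len;
        return 0;
    }

    int main(void)
    {
        char buf[BUFLEN];
        size_t pos = 0;

        return append(buf, &pos, "hello", 5);
    }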
There can be no doubt, though, that incorrectly-written code exists. So the addition of this optimization to GCC 4.2 may cause that bad code to open up a vulnerability which was not there before. Given that, one might question whether the optimization is worth it. In response to a statement (from CERT) that, in the interest of security, overflow tests should not be optimized away, Florian Weimer said:
Joe Buck added:
It is clear that the GCC developers see their incentives as strongly pushing toward more aggressive optimization. That kind of optimization often must assume that programs are written correctly; otherwise the compiler is unable to remove code which, in a correctly-written (standard-compliant) program, is unnecessary. So the optimization which removes pointer overflow checks seems unlikely to go away, though it appears that some new warnings will be added to alert programmers to potentially buggy code. The compiler may not stop programmers from shooting themselves in the foot, but it can often warn them that it is about to happen.