This may be someone trying to use a C11-style anonymous struct/union, or it may just indicate that the programmer is confused. Either way, an error should be flagged until and unless anonymous structs/unions are supported.
C89 restricts bit fields to (signed) int and unsigned int only, although later standards note that additional types may be supported. ORCA/C supports the other integer types as an extension.
This fixes the compco01.c test case.
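For illustration, a declaration mixing the C89-sanctioned bit field types with the extension types might look like this (hypothetical example, not from the test case):

struct flags {
    int           a : 4;   /* standard C89 */
    unsigned int  b : 4;   /* standard C89 */
    long          c : 20;  /* ORCA/C extension */
    unsigned char d : 2;   /* ORCA/C extension */
};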
These were previously allowed, with the void value treated as if it were unsigned long.
Void values are still allowed for the second and third operands of the ?: operator, but only if they are both void. This is as required by the C standard.
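A minimal sketch of the distinction (f and g are hypothetical functions returning void):

void f(void);
void g(void);

void demo(int x)
{
    x ? f() : g();   /* still allowed: both operands are void */
    x ? f() : 1;     /* now an error: void operand mixed with int */
}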
This patch should also permit the union initialization code to handle unions containing bit fields, but for the time being they are still prohibited by code elsewhere in the compiler.
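For reference, the kind of union in question would be something like the following (illustrative):

union u {
    unsigned int flag : 1;
    unsigned int word;
};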
This fixes a problem with ORCA/C conformance test C5.6.0.1.CC, which was introduced by commit bf9fa66. A slightly more involved fix was needed to preserve the correct behavior while avoiding the memory trashing fixed by that patch.
Unsigned constants in the range 0x8000-0xFFFF were erroneously being treated like negative signed values in some contexts, including in array declarators and in the case labels of switch statements where the expression switched over has type long or unsigned long.
This could lead to bogus compile errors for array declarations and typedefs such as the following:
typedef char foo[0x8000];
It could also lead to cases in switch statements not being properly matched, as in the following program:
#include <stdio.h>
int main(void)
{
    long i = 0xFF00;
    switch (i) {
    case 0xFF00:
        puts("good");
        break;
    default:
        puts("bad");
    }
}
The case label values are converted to the promoted type of the expression being switched on, as if by a cast. In practice, this means discarding the high bits of a 32-bit value to produce a 16-bit one.
Code requiring this is dubious and would be a good candidate for a warning or a lint error, but it's allowed under the C standards.
The following code demonstrates the issue:
#include <stdio.h>
int main(void)
{
    int i = 0x1234;
    switch (i) {
    case 0xABCD1234:
        puts("good");
        break;
    default:
        puts("bad");
    }
}
This also fixes a logic error that may have permitted other conversions to be improperly omitted in some cases.
The following program demonstrates the problem (should print 211):
#pragma optimize 1
#include <stdio.h>
int main(void)
{
    unsigned int i = 1234;
    long l = (unsigned char)(i+1);
    printf("%li\n", l);
}
This resulted from the addition of the signed-to-unsigned comparison optimization. Specifically, it calls TypeOf for the expressions on each side of the comparison, and this did not handle function calls. That support has now been added, and will give the proper return type for direct and indirect calls to C functions. The IR for tool calls doesn't include the return type (just the number of bytes), so we return cgVoid for them. This is OK for the present use case.
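As an illustrative sketch (the names here are assumptions), the optimization now has to determine the type of a comparison operand like this call:

unsigned char getByte(void);

int inRange(void)
{
    return getByte() < 200;   /* TypeOf must report the call's return type */
}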
Without this fix, an expression of the form "0 * exp" would be reduced to simply "0" unless exp contained a function call; other side effects of exp (such as assignments or increments) would be removed.
A similar issue could occur with additions that use the same expression on both sides of the "+": after optimization, it would only be evaluated once. I think the cases addressed here are all undefined behavior under the C standards, so the old behavior wasn't technically wrong, but the new behavior is still less confusing.
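A minimal illustration of the dropped side effect, in the style of the earlier examples (should print 6 0):

#pragma optimize 1
#include <stdio.h>
int main(void)
{
    int x = 5;
    int y = 0 * x++;   /* x++ must still take effect */
    printf("%i %i\n", x, y);
}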
Specifically, this ensures that the depth-first numbering of basic blocks starts from 1, which is what ReachingDefinitions expects. Without this fix, reaching definitions would not be computed correctly for functions that contain unreachable basic blocks (including the implicit one to return at the end). This could result in invalid hoisting of operations out of loops.
This fixes the compca26.c test case.
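As a sketch, a function with an unreachable basic block inside a loop has the shape that could trigger the issue (illustrative, not the original test case):

int sum(int n)
{
    int i, total = 0;
    for (i = 0; i < n; i++) {
        total += i;
        continue;
        total = -1;   /* unreachable basic block */
    }
    return total;
}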
Specifically, convert signed word comparisons to unsigned if both sides are either unsigned byte values or non-negative constants. This is incorporated as part of intermediate code peephole optimization (bit 0).
This should alleviate some cases of performance regressions due to promoting char to int instead of unsigned int.
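For example (illustrative), both operands below promote to int, but both are known to be non-negative, so the comparison can safely be done unsigned:

unsigned char a, b;

int less(void)
{
    return a < b;   /* signed word compare convertible to unsigned */
}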
This occurred because the code to handle the function-like macro use would read ahead to the next token, which could prompt processing of a subsequent preprocessing directive in an inappropriate context.
The following example illustrates the problem (the error message would be printed):
#define A()
#define FOO 1
A()
#if !FOO
# error "shouldn't get here"
#endif
Previously, such initializations would sometimes generate a garbage value pointing up to 65535 bytes beyond the start of the string constant. (This was due to a lack of sign-extension in the object code generation.)
Computing a pointer to before the start of an object invokes undefined behavior, so the previous behavior wasn't technically wrong, but it was unintuitive and served no useful purpose. The new behavior should at least be easier to understand and debug.
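A minimal illustration (the initializer invokes undefined behavior; it is shown only to describe the old output):

static char *p = "hello" - 3;   /* formerly could yield an address far past the string */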
The latter would require more changes to the code generator, whereas this approach requires none. This is arguably less clean, but it matches other places where a byte value is subsequently operated on as a word without an explicit conversion, and the assembly instruction generated is the same.
This fixes the compca06.c test case.
Note that this generates inefficient code in the case of loading a signed byte value and then immediately casting it to unsigned (it first sign-extends the value, then masks off the high bits). This should be optimized, but at least the generated code is correct now.
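The now-correct but inefficient pattern looks something like this (illustrative):

signed char c;

unsigned char toUnsigned(void)
{
    return (unsigned char)c;   /* sign-extends c, then masks off the high bits */
}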
This fixes the compca22.c test case.
This optimization could be fixed and re-enabled, but doing so would require checking whether the stored value is ever used subsequently, which is not information that is readily available in the peephole optimization pass. It would also be necessary to check for stores to the same location within the right-side expression, which would invalidate the optimization.