These test various properties of the functions, including normal computations and edge case behavior. The edge case tests are largely based on Annex F, and not all of the behavior tested is required by the main specifications in the C standard. ORCA/C does not claim to fully comply with Annex F, but Annex F provides useful guidelines that we try to follow in most respects.
This was broken by the varargs changes in commit a20d69a211. The code was not accounting for the internal representation of the parameters being in reverse order, so it was basing address calculations on the first fixed parameter rather than the last one, resulting in the wrong number of bytes being removed from the stack (generally causing a crash).
This affected the c99stdio.c test case, and is now also covered in c99stdarg.c.
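The following hypothetical reduced example shows the kind of call that could crash; the real coverage is in the test cases named above:

#include <stdarg.h>
#include <stdio.h>

/* With more than one fixed parameter, the address calculation was
   based on the first fixed parameter ('prefix') rather than the last
   ('count'), so the wrong number of bytes was removed from the stack. */
static int sum(const char *prefix, int count, ...) {
    va_list ap;
    int i, total = 0;
    va_start(ap, count);
    for (i = 0; i < count; i++)
        total += va_arg(ap, int);
    va_end(ap);
    printf("%s", prefix);
    return total;
}

int main(void) {
    printf("%d\n", sum("total: ", 3, 1, 2, 3));   /* should print "total: 6" */
}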
When initializing (e.g.) an array of arrays of char, a string literal would be taken as an initializer for the outer array rather than for an inner array, so not all elements would be initialized properly. This was a bug introduced in commit 222c34a385.
This bug affected the C4.6.4.2.CC test case, and the following reduced version:
#include <stdio.h>
#include <string.h>
int main (void) {
    char ch2[][20] = {"for all good people", "to come to the aid "};
    if (strcmp(ch2[1], "to come to the aid "))
        puts("Failed");
}
There were several existing optimizations that could change behavior in ways that violated the IEEE standard with regard to infinities, NaNs, or signed zeros. They are now gated behind a new #pragma optimize flag. This change allows intermediate code peephole optimization and common subexpression elimination to be used while maintaining IEEE conformance, but also keeps the rule-breaking optimizations available if desired.
See section F.9.2 of recent C standards for a discussion of how these optimizations violate IEEE rules.
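For example, folding x*0.0 to 0.0 or x+0.0 to x violates those rules, as this small demonstration (not from the test suite) shows:

#include <math.h>
#include <stdio.h>

int main(void) {
    double inf = INFINITY;
    /* x*0.0 cannot be folded to 0.0: the result is NaN when x is infinite */
    printf("%f\n", inf * 0.0);
    /* x+0.0 cannot be folded to x: the result is +0.0 when x is -0.0 */
    printf("%f %f\n", -0.0, -0.0 + 0.0);   /* -0.000000 0.000000 */
    return 0;
}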
This could give incorrect results for extended-to-comp conversions of certain negative integers like -2147483648 and -53021371269120. To get a fix for the same problem with regard to long long, ORCA/C should be linked with the latest version of ORCALib (which also works around some instances of the problem at run time). There are still other cases involving code in SysFloat that has not yet been patched.
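As a minimal sketch of the long long case mentioned above (not an actual test case):

#include <stdio.h>

int main(void) {
    long double x = -2147483648.0L;   /* one of the affected values */
    long long n = (long long)x;
    printf("%lld\n", n);              /* should print -2147483648 */
    return 0;
}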
This indicates that floating-point exceptions are used to report math errors. The existing functions will still set errno in the cases where they previously did, but the new C99 functions generally will not.
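A minimal sketch of what this means, assuming the C99 <fenv.h> interfaces:

#include <fenv.h>
#include <math.h>
#include <stdio.h>

int main(void) {
    volatile double r;
    feclearexcept(FE_ALL_EXCEPT);
    r = log(0.0);                     /* pole error */
    if ((math_errhandling & MATH_ERREXCEPT) && fetestexcept(FE_DIVBYZERO))
        puts("pole error reported via FE_DIVBYZERO");
    (void)r;
    return 0;
}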
For example, declarations like the following should be accepted:
char *p[] = {"abc", "def"};
This previously worked, but it was broken by commit 5871820e0c.
Parameters declared directly with array types were already adjusted to pointer types in commit 5b953e2db0, but this code is needed for the remaining case where a typedef'd array type is used.
With these changes, 'array' parameters are treated for all purposes as really having pointer types, which is what the standards call for. This affects at least their size as reported by sizeof and the debugging information generated for them.
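For example, in the following illustration (not from the test suite), the parameter is adjusted to type int *, so sizeof reports the size of a pointer:

#include <stdio.h>

typedef int arr10[10];

void f(arr10 a) {                     /* adjusted to: void f(int *a) */
    printf("%lu\n", (unsigned long)sizeof a);   /* size of int *, not int[10] */
}

int main(void) {
    int x[10] = {0};
    f(x);
    return 0;
}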
This has the side effect of treating most parameters declared as arrays as actually having pointer types. This affects the value returned by sizeof, among other things. The new behavior is correct under the C standards; however, it does not yet apply when using a typedef'd array type.
This allows the length of the string plus a few extra bytes used internally to be represented by a 16-bit integer. Since the size limit for memory allocations has been raised, there is no good reason to impose a shorter limit on strings.
Note that C99 and later specify a minimum translation limit for string constants of at least 4095 characters.
In the new implementation, variable arguments are not removed until the end of the function. This allows variable argument processing to be restarted, and it prevents the addresses of local variables from changing in the middle of the function. The requirement to turn off stack repair code around varargs functions is also removed.
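For example, restarting argument processing, as in this minimal sketch, now works even though the arguments are read twice:

#include <stdarg.h>
#include <stdio.h>

/* processes the variable arguments twice via a second va_start */
static int sum_twice(int count, ...) {
    va_list ap;
    int i, total = 0;
    va_start(ap, count);
    for (i = 0; i < count; i++)
        total += va_arg(ap, int);
    va_end(ap);
    va_start(ap, count);              /* restart from the first argument */
    for (i = 0; i < count; i++)
        total += va_arg(ap, int);
    va_end(ap);
    return total;
}

int main(void) {
    printf("%d\n", sum_twice(2, 3, 4));   /* prints 14 */
}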
This fixes #58.
We previously ignored this, but it is a constraint violation under the C standards, so it should be reported as an error.
GCC and Clang allow this as an extension, as we were effectively doing previously. We will follow the standards for now, but if there were demand for such an extension in ORCA/C, it could be re-introduced subject to a #pragma ignore flag.
The previous limit was 4096 bytes, and trying to allocate more could lead to memory corruption. Raising the limit allows for longer string literals created via concatenation.
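As a hypothetical sketch, macro expansion can easily produce a single concatenated literal longer than the old limit:

#include <stdio.h>

#define CHUNK "0123456789abcdef0123456789abcdef"   /* 32 characters */
#define X4(s)  s s s s
#define X16(s) X4(X4(s))

/* 256 adjacent string literals concatenate into one 8192-character literal */
static const char big[] = X16(X16(CHUNK));

int main(void) {
    printf("%lu\n", (unsigned long)sizeof big);    /* prints 8193 */
    return 0;
}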
When using a string literal containing an embedded null character to initialize an array with automatic storage duration, the bytes after the first null would be set to 0, rather than to the values from the string literal.
Here is an example program showing the problem:
#include <stdio.h>
int main(void) {
    char s[] = "a\0b";
    puts(s+2);
}
The code for this was recursive and could overflow the stack if there were several dozen consecutive string literals. It has been changed to use only one level of recursion, avoiding the problem.
This is required by C95 and later; it may be set by character/string conversion functions. Note that the value of 12 conflicts with GNO's existing definition of EPERM. This should not cause much trouble, but GNO could potentially define its own different value for EILSEQ, with the GNO version of ORCALib adjusted accordingly.
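A minimal sketch of how EILSEQ might be observed, assuming the C95 <wchar.h> conversion functions are available:

#include <errno.h>
#include <stdio.h>
#include <wchar.h>

int main(void) {
    wchar_t wc;
    errno = 0;
    /* whether a byte sequence is invalid depends on the locale; on
       failure, mbrtowc returns (size_t)-1 and may set errno to EILSEQ */
    if (mbrtowc(&wc, "\xff", 1, NULL) == (size_t)-1 && errno == EILSEQ)
        puts("invalid multibyte sequence");
    return 0;
}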