Use correct symbol for minimum int64 value

The old code used SEQ_MINVALUE to get the smallest int64 value.  This
was done as a convenience to avoid having to deal with INT64_IS_BUSTED,
but that is obsolete now.  Also, it is incorrect because the smallest
int64 value is actually SEQ_MINVALUE - 1.  Fix by writing out the constant
the long way, as is done elsewhere in the code.
This commit is contained in:
Peter Eisentraut 2016-07-17 09:37:33 -04:00
parent 16e28fcec2
commit 805f2bb53f

@@ -213,10 +213,7 @@ GIN_SUPPORT(int4)
 static Datum
 leftmostvalue_int8(void)
 {
-	/*
-	 * Use sequence's definition to keep compatibility.
-	 */
-	return Int64GetDatum(SEQ_MINVALUE);
+	return Int64GetDatum(-INT64CONST(0x7FFFFFFFFFFFFFFF) - 1);
 }
 
 static TypeInfo TypeInfo_int8 = {false, leftmostvalue_int8, btint8cmp};
@@ -243,10 +240,7 @@ GIN_SUPPORT(float8)
 static Datum
 leftmostvalue_money(void)
 {
-	/*
-	 * Use sequence's definition to keep compatibility.
-	 */
-	return Int64GetDatum(SEQ_MINVALUE);
+	return Int64GetDatum(-INT64CONST(0x7FFFFFFFFFFFFFFF) - 1);
 }
 
 static TypeInfo TypeInfo_money = {false, leftmostvalue_money, cash_cmp};