Fix reading of most-negative integer value nodes

The main parser checks whether a literal fits into int when deciding
whether it should be put into an Integer or a Float node.  The parser
processes integer literals without their sign, so the most-negative
integer literal (whose unsigned magnitude exceeds INT_MAX) will not
fit into Integer and will end up as a Float node.
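To make the overflow concrete, here is a minimal sketch (not PostgreSQL
code; the function name is hypothetical) of a sign-less range check like
the one the main parser effectively performs.  Scanned without its sign,
the magnitude of the most-negative 32-bit integer, 2147483648, exceeds
INT_MAX (2147483647) and so fails the int range check:

```c
#include <errno.h>
#include <limits.h>
#include <stdlib.h>

/*
 * Illustrative only: check whether an unsigned digit string fits in int.
 * The grammar scans the digits without the sign, so the magnitude of the
 * most-negative int overflows even though the signed value would fit.
 */
static int
magnitude_fits_int(const char *digits)
{
	long		v;

	errno = 0;
	v = strtol(digits, NULL, 10);
	/* ERANGE covers platforms where long is 32 bits */
	return errno != ERANGE && v <= INT_MAX;
}
```

With this check, "2147483647" is classified as an integer, while
"2147483648" (the magnitude of INT_MIN) is not.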

The node tokenizer did this differently.  It included the sign when
checking whether the literal fit into int.  So a most-negative integer
would indeed fit that way and end up as an Integer node.

To preserve the node structure across a write/read round trip, the
node tokenizer must likewise analyze integer literals without the sign.
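The corrected classification can be sketched as follows (a standalone
illustration, not the actual nodeTokenType(); strtol() stands in for
PostgreSQL's strtoint(), and the function name is hypothetical).  The
sign is skipped before the range check, so the check agrees with the
main parser:

```c
#include <errno.h>
#include <limits.h>
#include <stdlib.h>
#include <string.h>

/*
 * Sketch of the fixed check: skip a leading sign, then range-check only
 * the digits.  Returns 1 for an Integer-like token, 0 for a Float-like
 * one, mirroring the T_Integer/T_Float decision in the commit.
 */
static int
token_is_integer(const char *token)
{
	const char *numptr = token;
	size_t		len = strlen(token);
	char	   *endptr;
	long		v;

	if (*numptr == '+' || *numptr == '-')
		numptr++;
	errno = 0;
	v = strtol(numptr, &endptr, 10);
	/* endptr still points at the end of the whole token */
	if (endptr != token + len || errno == ERANGE || v > INT_MAX)
		return 0;				/* out of int range: Float node */
	return 1;					/* fits in int: Integer node */
}
```

Under this rule "-2147483647" still classifies as an integer, while
"-2147483648" does not, matching what the main parser produces.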

There are a number of test cases in the regression tests that pass a
most-negative integer as an argument to some utility statement, so this
issue is easily reproduced under WRITE_READ_PARSE_PLAN_TREES.

Reviewed-by: Tom Lane <tgl@sss.pgh.pa.us>
Discussion: https://www.postgresql.org/message-id/flat/4159834.1657405226@sss.pgh.pa.us
Peter Eisentraut 2022-09-24 18:10:11 -04:00
parent 03bf971d2d
commit 43f4b34915

@@ -267,7 +267,7 @@ nodeTokenType(const char *token, int length)
 		char	   *endptr;
 		errno = 0;
-		(void) strtoint(token, &endptr, 10);
+		(void) strtoint(numptr, &endptr, 10);
 		if (endptr != token + length || errno == ERANGE)
 			return T_Float;
 		return T_Integer;