Determines the number of digits used when comparing numbers.

Syntax

SET PRECISION TO [<expN>]

<expN>

The number of digits, from 10 to 16. The default is 10.

Description

Use SET PRECISION to change the accuracy, or precision, of numeric comparisons. You can set precision from 10 to 16 digits.
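For instance, the following statement widens comparisons to 14 significant digits (the value 14 is chosen arbitrarily for illustration):

SET PRECISION TO 14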

SET PRECISION affects data comparisons, but not mathematical computations or data display. Computations always use full precision internally; to change the number of decimal places dBASE Plus displays, use SET DECIMALS.
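The distinction is shown in this brief sketch; the displayed values in the comments are indicative of typical behavior rather than verified output:

SET DECIMALS TO 2
? 10/3        && displays 3.33
SET DECIMALS TO 6
? 10/3        && displays 3.333333
&& Neither setting changes the value used in calculations or comparisons.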

In general, use as little precision as possible for comparisons. Like many programs, dBASE Plus stores numbers as base-2 floating-point values. This format represents fractional values such as 0.5 (1/2) or 0.375 (3/8) exactly, but only approximates others, such as 0.1 or 1/9.

The available digits must also represent the integer portion of a number, so the larger the integer portion, the less precision remains for the fractional portion. As a result, comparing values with too much precision can produce erroneous mismatches.
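The following command-window sketch illustrates the risk. It assumes that ten additions of 0.1, which is inexact in base 2, leave a small rounding error that a 16-digit comparison detects but a 10-digit comparison ignores; the exact outcome may vary between builds:

SET PRECISION TO 16
STORE 0 TO x
FOR n = 1 TO 10
   x = x + 0.1
ENDFOR
? x = 1       && may display false: the sum is only approximately 1
SET PRECISION TO 10
? x = 1       && displays true: fewer digits are compared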