We examined the presence, magnitude, and consequences of systematic and random errors caused by terminal digit preference in the measurement of the highest systolic blood pressure recorded during prenatal visits in 28,841 non-referred pregnant women who delivered between 1 January 1982 and 31 March 1990. In the overall distribution of terminal digits, 78% of readings ended in 0, 15% in even digits other than 0, 5% in 5, and only 2% in odd digits other than 5. This preference for 0 was consistent across the entire range of blood pressure values and across a variety of maternal characteristics. The relative frequency of readings at the cutoff value of 140 mmHg (i.e. the percentage of readings at exactly 140 mmHg within the surrounding range of 138-142 mmHg) was similar to the relative frequency of other values ending in 0 within their corresponding ranges. This held whether the comparison was made in the overall study sample, in a pre-selected low-risk subgroup, or in a high-risk subgroup, indicating no systematic bias at the cutoff. On the other hand, the strong tendency to record blood pressure values ending in 0 had a marked effect on the classification of hypertension. Changing the definition of hypertension from ≥140 mmHg to >140 mmHg reduced the prevalence of hypertension from 25.9% to 13.3% in the overall study sample, from 15.4% to 6.3% in the low-risk subgroup, and from 43.3% to 25.3% in the high-risk subgroup. Epidemiologic studies that compare the prevalence of hypertension across populations on the basis of routine clinical measurement of blood pressure and a single cutoff point should assess the consequences of terminal digit preference for the definition of hypertension. (ABSTRACT TRUNCATED AT 250 WORDS)
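
To make the cutoff sensitivity concrete, the sketch below tabulates terminal digits and compares the prevalence obtained under a ≥140 mmHg versus a >140 mmHg definition. The readings in the example are hypothetical and are not the study data; they merely mimic a strong preference for terminal digit 0.

```python
# Illustrative sketch only: the readings are invented example values, NOT the
# study data; they just mimic a strong terminal-digit-0 preference.
from collections import Counter

readings = [118, 120, 120, 125, 130, 130, 130, 135, 138, 140,
            140, 140, 140, 142, 145, 150, 150, 160, 120, 130]
n = len(readings)

# Distribution of terminal digits (last digit of each reading).
digit_counts = Counter(r % 10 for r in readings)
for digit, count in sorted(digit_counts.items()):
    print(f"terminal digit {digit}: {100 * count / n:.1f}%")

# Prevalence of "hypertension" under the two cutoff definitions.
prev_ge = sum(r >= 140 for r in readings) / n   # cutoff defined as >= 140 mmHg
prev_gt = sum(r > 140 for r in readings) / n    # cutoff defined as  > 140 mmHg
print(f"prevalence with >=140 mmHg cutoff: {100 * prev_ge:.1f}%")
print(f"prevalence with  >140 mmHg cutoff: {100 * prev_gt:.1f}%")
```

Because readings recorded as exactly 140 mmHg are overrepresented when observers round to a terminal 0, the two definitions can diverge substantially, as in the study's observed drop from 25.9% to 13.3%.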