In describing the first-order properties of laser speckle under polarized illumination, it is almost an article of faith that the contrast is unity. In many processing schemes, however, the contrast, defined as the quotient of the standard deviation and the mean, is calculated over a localized spatial region. In such cases, this local contrast displays a distribution of values that can depart substantially from unity. The properties of this distribution depend on the details of the data acquisition and on the size of the local neighborhood over which the contrast is calculated. We demonstrate that this local contrast can be characterized in terms of a log-normal distribution. Further, we show that the two defining parameters of this model can in turn be expressed in terms of the minimum speckle size and the extent of the local neighborhood. The performance of the model is illustrated with typical optical coherence tomography data.
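The local-contrast computation described above can be sketched numerically. The following is a minimal illustration, not the paper's exact procedure: it simulates fully developed polarized speckle (negative-exponential intensity statistics, for which the global contrast is unity), computes the contrast sigma/mu over small non-overlapping neighborhoods, and estimates the two parameters of a log-normal model from the logarithm of the local contrast values. The 7-by-7 window size, image size, and use of uncorrelated pixels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully developed polarized speckle: intensity is negative-exponentially
# distributed, so the contrast over the whole field is close to unity.
I = rng.exponential(scale=1.0, size=(252, 252))

def local_contrast(img, w):
    """Contrast sigma/mu over non-overlapping w-by-w neighborhoods."""
    h = (img.shape[0] // w) * w
    v = (img.shape[1] // w) * w
    blocks = img[:h, :v].reshape(h // w, w, v // w, w).swapaxes(1, 2)
    mu = blocks.mean(axis=(2, 3))
    sigma = blocks.std(axis=(2, 3), ddof=1)
    return sigma / mu

K = local_contrast(I, 7)  # map of local contrast values

# Log-normal model parameters estimated from ln(K): if K is log-normal,
# ln(K) is normal with these mean and standard deviation.
logK = np.log(K.ravel())
mu_ln, sigma_ln = logK.mean(), logK.std(ddof=1)

print(f"mean local contrast: {K.mean():.3f}")
print(f"log-normal parameters: mu={mu_ln:.3f}, sigma={sigma_ln:.3f}")
```

The local contrast clusters near unity but spreads around it; shrinking the window broadens the spread, consistent with the dependence on neighborhood size noted in the abstract.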