In this paper we provide both qualitative and quantitative measures of the precision of measuring integrated volatility by realized volatility for a fixed frequency of observation. We start by characterizing, for a general diffusion, the difference between realized and integrated volatility at a given frequency of observation. We then compute the mean and variance of this noise, as well as its correlation with integrated volatility, within the Eigenfunction Stochastic Volatility model of Meddahi (2001a). This model has the log-normal, affine, and GARCH diffusion models as special cases. Using previous empirical results, we show that the noise is substantial relative to the unconditional mean and variance of integrated volatility, even when five-minute returns are employed. We also propose a simple approach to capturing the information about integrated volatility contained in the returns through the leverage effect. We show that, in practice, the leverage effect does not matter.
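For concreteness, a minimal sketch of the objects involved, using standard definitions and notation that is ours rather than the paper's: with $p$ the log-price, $\sigma^2_s$ the spot variance of the diffusion, and $m$ intraday returns per period,
\[
IV_t = \int_{t-1}^{t} \sigma^2_s \, ds,
\qquad
RV_t^{(m)} = \sum_{i=1}^{m} \bigl(p_{t-1+i/m} - p_{t-1+(i-1)/m}\bigr)^2,
\]
so that the noise studied here is the measurement error $RV_t^{(m)} - IV_t$ at the chosen sampling frequency $m$ (e.g., $m = 288$ for five-minute returns on a 24-hour market).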