Beer’s law predicts that a calibration curve is a straight line with a y-intercept of zero and a slope of εb. In many cases a calibration curve deviates from this ideal behavior, as shown here.
One limitation of Beer’s law is the assumption that the radiation reaching the sample is of a single wavelength—that is, that the radiation is purely monochromatic. Even the best wavelength selector, however, passes radiation with a small but finite effective bandwidth (see here for example).
Non-monochromatic radiation always results in a negative deviation from Beer’s law because the more strongly absorbed wavelengths are attenuated faster; as concentration increases, the transmitted light is increasingly dominated by the weakly absorbed wavelengths, and the measured absorbance falls below the ideal value. The effect is smaller if the value of ε is essentially constant over the wavelength range passed by the wavelength selector. For this reason, as shown in the illustration below, it is better to make absorbance measurements at the top of a broad absorption peak, where ε changes little with wavelength.
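The negative deviation can be illustrated with a simple numerical sketch. Suppose the wavelength selector passes just two wavelengths, with equal incident powers and molar absorptivities ε₁ and ε₂ (the specific values below are illustrative assumptions, not from the text). The detector responds to the total transmitted power, so the apparent absorbance is A = −log₁₀[(P₁ + P₂)/(P₀,₁ + P₀,₂)], where each Pᵢ follows Beer’s law individually.

```python
import math

def apparent_absorbance(conc, eps1, eps2, b=1.0):
    """Apparent absorbance for a two-wavelength beam of equal incident powers.

    conc : molar concentration C
    eps1, eps2 : molar absorptivities (L mol^-1 cm^-1) at the two wavelengths
    b : path length in cm
    """
    # Each wavelength obeys Beer's law separately: P_i/P0_i = 10^(-eps_i * b * C)
    t1 = 10 ** (-eps1 * b * conc)
    t2 = 10 ** (-eps2 * b * conc)
    # The detector sees the summed power, so absorbances do not simply average.
    return -math.log10((t1 + t2) / 2)

# Compare matched absorptivities (monochromatic-like) with mismatched ones
# that have the same mean value of 1000 L mol^-1 cm^-1 (assumed numbers).
for c in (0.0001, 0.001, 0.01):
    matched = apparent_absorbance(c, eps1=1000, eps2=1000)     # equals 1000*b*C
    mismatched = apparent_absorbance(c, eps1=1900, eps2=100)   # falls below it
    print(f"C = {c:g} M: matched A = {matched:.4f}, mismatched A = {mismatched:.4f}")
```

When ε₁ = ε₂ the apparent absorbance is exactly εbC, but when they differ the curve bends below the ideal line, and the shortfall grows with concentration. This is why measuring at the top of a broad peak, where ε is nearly constant across the instrument's effective bandwidth, keeps the calibration curve closer to linear.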