Thought I would do a search on this. From this article and others like it, I take it that you are not actually changing the sensitivity of the CCD; you are boosting the signal. If I understand this correctly (and I doubt that I do), you would be wise to shoot at the native sensitivity of your sensor and post-process to get the image you want.
Can anybody explain this better, and tell me why I would want to vary ISO to something other than the native sensitivity of the sensor? Is amplification within the camera, prior to digitizing, better than doing the amplification during post-processing?
From Digital Photo Pro Mag
http://www.digitalphotopro.com/articles/2006/mayjune/isospeeds.php
ISO speed.
Image sensors have an innate “native” sensitivity, generally in the ISO 100 to 200 range. When you set a higher ISO speed, amplifiers in the image sensor’s circuitry increase the gain before sending the image data to the A/D converter to be digitized. The sensor’s sensitivity doesn’t actually increase; the camera is just amplifying the data it produces. In the process, image noise is also increased, making the image “grainier”—sort of like what happens when you “push” film speed. But generally, digital SLRs produce better image quality at higher ISOs than film, especially pushed films.
If you set a lower ISO speed than the sensor’s native sensitivity, the camera’s image processor adjusts the image data after the A/D converter converts it to digital form. In the process, the dynamic range is reduced. So it’s best to shoot at the sensor’s native ISO whenever possible.
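The advantage the article describes for in-camera gain can be sketched numerically. The rough idea: the A/D converter quantizes to whole codes, so boosting a dim signal *before* digitizing preserves fine tonal steps, while boosting afterward in post also multiplies the quantization error. This is a deliberately simplified model (a hypothetical 12-bit ADC, a made-up 8x gain standing in for raising ISO, and no read noise or photon shot noise, which matter a great deal on real sensors):

```python
import numpy as np

rng = np.random.default_rng(0)

FULL_WELL = 4096   # hypothetical 12-bit ADC: output codes 0..4095
GAIN = 8           # illustrative analog gain, e.g. ISO 800 vs. native ISO 100

# A dim scene: true (continuous) pixel values well below full scale
signal = rng.uniform(10, 30, size=100_000)

def quantize(x):
    """Model the A/D converter: round to integer codes, clip to range."""
    return np.clip(np.round(x), 0, FULL_WELL - 1)

target = signal * GAIN  # the brightness we want in the final image

# Case 1: analog gain applied before the ADC (raising ISO in-camera)
in_camera = quantize(signal * GAIN)

# Case 2: digitize at "native ISO", then multiply by the same gain in post;
# the rounding error gets multiplied along with the signal
in_post = quantize(signal) * GAIN

err_camera = np.abs(in_camera - target).mean()
err_post = np.abs(in_post - target).mean()
print(f"mean error, gain before ADC: {err_camera:.3f} codes")
print(f"mean error, gain in post:    {err_post:.3f} codes")
```

In this toy model the post-processed version carries roughly GAIN times the quantization error of the in-camera version, which is the mechanism behind "amplify before digitizing". It is only half the story: real in-camera gain also amplifies the sensor's analog noise, which is why higher ISO images are grainier even though the tonal steps are finer.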