No image processing is done at the sensor level, where the RGB/CMY photons are counted.
Though if they were really "counted" at the photosite, that would make it a digital pixel, no? ;) And sensor makers are in fact working on such photon-counting sensors.
But, as you say, the sampling is normally done at the A/D converter, which is typically off-sensor in the case of CCDs. With CMOS, however, the converter is often on the chip itself:
The success of CMOS has been driven in part by its potential to allow more functionality on the chip than was practical with CCDs. Many functions that used to require separate chips have moved on to the imaging sensor: analog to digital conversion (aka A to D, ADC or A/D), power conditioning, timing control, internal clock drivers, and a host of output interface options, from serializers to various parallel protocols.
If you are just trying to make the point that image processing is much easier to do in the digital domain, I concur wholeheartedly. In audio gear, for instance, an analog dynamics compressor/limiter may occupy 4U of rack space and weigh 20 pounds. A functionally identical DSP plug-in runs in a tiny fraction of the space, weight, and power.
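To make that analog-vs-digital point concrete, here is a minimal sketch of a feed-forward dynamics compressor in Python/NumPy. The threshold, ratio, and attack/release values are illustrative assumptions, not any particular product's design; a real plug-in would add features like a soft knee, lookahead, and makeup gain.

```python
import numpy as np

def compress(signal, threshold_db=-20.0, ratio=4.0,
             attack=0.01, release=0.1, sample_rate=48000):
    """Hard-knee feed-forward compressor: samples above threshold_db
    are attenuated so the overshoot is reduced by the given ratio."""
    # Per-sample smoothing coefficients for the gain envelope
    att = np.exp(-1.0 / (attack * sample_rate))
    rel = np.exp(-1.0 / (release * sample_rate))

    eps = 1e-12  # avoid log(0)
    level_db = 20.0 * np.log10(np.abs(signal) + eps)
    # Static curve: above threshold, cut (1 - 1/ratio) of the overshoot
    over = np.maximum(level_db - threshold_db, 0.0)
    target_gain_db = -over * (1.0 - 1.0 / ratio)

    # Smooth the gain with attack/release ballistics
    gain_db = np.empty_like(target_gain_db)
    g = 0.0
    for i, t in enumerate(target_gain_db):
        coeff = att if t < g else rel  # attack when reducing gain
        g = coeff * g + (1.0 - coeff) * t
        gain_db[i] = g
    return signal * (10.0 ** (gain_db / 20.0))
```

A few dozen lines doing what a 20-pound rack unit does, which is the whole point about the digital domain.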
Though I'm not sure this is the knowledge the OP had in mind. Maybe a good topic for a new discussion?