About a year ago the Avaneya project released software that can decode the Viking Lander tapes. A few tapes contain images that were (as far as I know) never released.
According to Wikipedia, the cameras used a movable mirror that reflected light onto 12 photodiodes for different wavelengths, scanning the scene from left to right (or maybe the other way round) at five vertical lines of 512 pixels per second. The 300° panoramas are made up of 9150 vertical lines.
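A quick back-of-the-envelope sketch of what those numbers imply, taking the figures above at face value (5 lines/sec, 512 pixels per line, 9150 lines per 300° panorama):

```python
# Scan geometry implied by the figures quoted above (all assumptions
# come straight from the comment; real camera timing may differ).
lines_per_second = 5
pixels_per_line = 512
panorama_lines = 9150
panorama_degrees = 300

scan_time_s = panorama_lines / lines_per_second        # 1830 s
azimuth_step = panorama_degrees / panorama_lines       # deg per vertical line
pixels_per_band = panorama_lines * pixels_per_line

print(f"full panorama scan: {scan_time_s / 60:.1f} min")   # ~30.5 min
print(f"azimuth step: {azimuth_step:.4f} deg/line")        # ~0.0328
print(f"pixels per band: {pixels_per_band:,}")             # 4,684,800
```

So a single full panorama sweep would take about half an hour per band, which fits the slow, line-by-line feel of the released images.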
UHF at 1 Kb/sec to the orbiter; it could also use S-band direct to Earth at a lower bandwidth. The orbiter could buffer 40 Mb. Not sure how fast its link to the DSN was.
edit: 16 Kb/sec is mentioned in there too. Example DSN doc:
ipnpr.jpl.nasa.gov/progress_report2/42-33/33C.PDF
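Rough downlink arithmetic from the numbers in this thread, assuming "Mb" and "Kb" mean megabits and kilobits and ignoring framing/overhead:

```python
# How long it would take to drain the orbiter's 40 Mb buffer at the two
# rates mentioned above. Units are an assumption (megabits/kilobits).
buffer_bits = 40e6
uhf_bps = 1e3      # lander-to-orbiter UHF rate quoted above
fast_bps = 16e3    # rate mentioned in the DSN doc

drain_fast_s = buffer_bits / fast_bps   # 2500 s
drain_uhf_s = buffer_bits / uhf_bps     # 40000 s

print(f"40 Mb at 16 Kb/sec: {drain_fast_s / 60:.0f} min")  # ~42 min
print(f"40 Mb at 1 Kb/sec:  {drain_uhf_s / 3600:.1f} h")   # ~11.1 h
```

Either way, filling or emptying that buffer is a matter of tens of minutes to hours, which makes the tape-based store-and-forward scheme described below plausible.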
My memory is that everything was first shown in grayscale, and then in color. Not sure if that was a ground-processing thing or if the lander/orbiter sent two streams. Maybe the orbiter buffered the deeper bits and sent 8-bit for near-realtime enjoyment.
Looks like the buffer was on tape - it would record at the speed of acquisition, and then when it was time for transmission it would spool the tape at whatever rate was appropriate for the quality of the radio link.
(Educated) guess: the monochrome image was simply the first wavelength band that came down. The later color image could be composed after the other bands were received.
I don't know if a single-wavelength monochrome image would look very good, especially if it were, say, blue instead of red.
Creating the color images was a lot harder than we might imagine in a world where Photoshop does CMYK separations at the click of a mouse. I'm old enough to remember the Viking landing: when the first color pictures were processed, TV reports said Mars had a blue sky just like Earth. That was quickly corrected once the color pictures were properly calibrated. Interestingly enough, you can in fact see blue in the Martian sky at dusk or dawn.
Of course it depends on the wavelength, but single-band images look a lot like any grayscale image. It's typically only when different single-band images are compared side by side that the differences really jump out, which is, of course, why they bother with multiple bands. I worked on the Lunar Reconnaissance Orbiter Camera, and was involved with image processing and calibration for the wide-angle camera, which is multiband (5 bands in visible, 2 in UV).
Samples: https://gist.github.com/jakeogh/fa995a3277d500ab59b1
https://directory.fsf.org/wiki/Avaneya:_Viking_Lander_Remast...