I keep asking these in relevant threads, but so far, no answers. I'd like to use one set of lenses and keep the 35mm depth of field regardless of what my stored image size is. So...
1. Is it technically possible to scale raw images (hopefully in-camera) and keep them "raw" — that is, preserve their raw latitude? If there are R, G, and B photosites on the sensor, shouldn't it be possible to resample and merge same-color photosite values, just as you would merge pixels in a non-raw image, and then store the result in a file laid out like a 2K raw file?
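For what it's worth, the operation described in #1 amounts to binning the Bayer mosaic: average each color's photosites separately, then re-interleave them into a half-size mosaic of the same pattern. Here's a rough sketch of the idea (the function name, the 2x binning factor, and the assumption of a 2x2 repeating CFA pattern like RGGB are mine, not anything a camera actually implements):

```python
import numpy as np

def bin_bayer_2x(mosaic: np.ndarray) -> np.ndarray:
    """Downscale a Bayer mosaic by 2x while keeping the CFA layout.

    Each of the four CFA phases (e.g. R, G1, G2, B in an RGGB pattern)
    is extracted as its own plane, box-filtered 2x2, then re-interleaved,
    so the output is still a valid half-size mosaic of the same pattern.
    """
    h, w = mosaic.shape
    assert h % 4 == 0 and w % 4 == 0, "need dimensions divisible by 4"
    out = np.empty((h // 2, w // 2), dtype=mosaic.dtype)
    for dy in (0, 1):       # row phase within the 2x2 CFA tile
        for dx in (0, 1):   # column phase within the 2x2 CFA tile
            plane = mosaic[dy::2, dx::2].astype(np.float64)
            # average each 2x2 neighborhood of same-color photosites
            binned = (plane[0::2, 0::2] + plane[0::2, 1::2] +
                      plane[1::2, 0::2] + plane[1::2, 1::2]) / 4.0
            out[dy::2, dx::2] = binned.astype(mosaic.dtype)
    return out
```

Since only same-color values are averaged, the output is still mosaic data that a raw converter could in principle demosaic later, which is the sense in which latitude would be preserved. Whether any camera does this in hardware is a separate question.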
2. Why is the RGB storage only 4:2:2? This becomes a bigger deal if the answer to #1 is no, since that would leave 1080p RGB as our only 4K-to-smaller scaling option.