Real-world, I'm not convinced there's a gigantic difference with most moving pictures at average brightness levels. I think you absolutely can see the difference with static pictures with a lot of fine detail -- the test we used at Kodak was with lots of tall vertical buildings shot in downtown Manhattan -- but with two people sitting in a room at a dinner table... uh-uh. And yet: everybody could see an improvement in films scanned at 4K and downrezzed to 2K vs. a direct 2K scan. There are often advantages to starting at a high resolution and then downrezzing, depending on how it's done.
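Part of why the 4K-to-2K path looks better is simple oversampling: averaging blocks of pixels knocks down sensor/grain noise while keeping the detail that fits in 2K anyway. Here's a toy sketch of that idea (the numbers and the 2x2 box filter are just illustrative -- real scanner pipelines use fancier filters):

```python
import numpy as np

rng = np.random.default_rng(0)

# A flat gray "scene" with additive noise, at 4K-ish resolution.
# (Illustrative values, not from any real camera or scanner.)
scene = 0.5
noise_sigma = 0.05
frame_4k = scene + rng.normal(0.0, noise_sigma, size=(2160, 3840))

# Downrez by averaging non-overlapping 2x2 blocks (a simple box filter).
frame_2k = frame_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(frame_4k.std())  # ~0.05
print(frame_2k.std())  # ~0.025 -- averaging 4 pixels halves the noise
```

Averaging four independent noisy samples cuts the noise's standard deviation in half, which is one reason a downrezzed 4K scan can look cleaner than a native 2K one.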
I think the CNET author is essentially correct, but some of his methodology is suspect, and not having any 4K material for comparison kind of renders (so to speak) his argument specious. Once we have convenient real-time 4K playback on a real 4K display, plus a side-by-side comparison with a same-sized HD display, we could do a fair test. Maybe this would be an ideal opportunity for Red to do a demo like that at NAB -- I think it's a fantastic idea if they want to convince the naysayers.
To me, chasing resolution alone is kind of like audiophiles who only care about frequency response. There are lots of factors that affect sound quality: distortion, noise, acoustics, timbre, directionality. Likewise, sharpness alone is not the biggest factor in picture quality; it's just one of many. My gut feeling has always been that exposure range is actually more important in the real world, assuming the camera and lens can at least hit some realistic specs on sharpness.