33.8 & 34.7 & 35.5 Ka
I think this is more of a technical question, but why do radar detectors have a harder time detecting 33.8 than 34.7, or even 35.5?
I've just noticed, when looking at detection ranges of the top-end models, that there are hundreds of feet of difference between 33.8 and 34.7.
So how come, in the Guys of Lidar 2006 tests, the V1 (#2 Average) detected 34.7 at 2119 feet but detected 33.8 at only 752 feet?
Why is 34.7 easier to detect than 33.8 and 35.5?
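For scale: a detector only has to hear the gun one way, so received power falls off as 1/d², and a range ratio maps to an effective sensitivity gap of 20·log10(d1/d2) dB. A quick back-of-the-envelope sketch (just arithmetic on the two figures above, nothing more from the test):

```python
import math

# One-way reception: received power scales as 1/d^2, so a range
# ratio corresponds to a sensitivity gap of 20 * log10(d1 / d2) dB.
d_34_7 = 2119.0  # feet, V1 (#2 Average) on 34.7 in the cited test
d_33_8 = 752.0   # feet, same detector on 33.8

gap_db = 20 * math.log10(d_34_7 / d_33_8)
print(f"Implied sensitivity gap: {gap_db:.1f} dB")  # ~9.0 dB
```

Nine decibels is a big gap, far more than the small frequency difference itself could plausibly account for.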
The page also says this, about a quarter of the way down:
The absolute distances mean nothing in the real world, for example these test results do not mean that "Detector A can only detect radar 100 feet away" or that "Detector A is only 100 feet better than Detector B". There is also no way to tell from this test that "radar gun A is hard to detect and radar gun B is easy to detect".
In this specific test, more foam was used for some guns than others, and the foam also has slightly different effectiveness at different frequencies. In this test, the results offer a good comparison of the relative sensitivity of one detector against another on the same band or frequency. But you cannot directly compare the absolute results to other bands or frequencies.
So in theory a 33.8 signal and a 34.7 signal should be detected at roughly the same distance, provided the gun power is the same, but like you said, real-world variables can change things quite a bit.
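The physics backs that up: at a fixed distance, free-space path loss grows only with 20·log10(f), so the spread across the whole Ka band is under half a decibel. A minimal check (assuming an idealized isotropic receive antenna, which a real detector horn is not):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss between isotropic antennas, in dB."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

d = 1000 * 0.3048  # 1000 feet in metres
for f_ghz in (33.8, 34.7, 35.5):
    print(f"{f_ghz} GHz: {fspl_db(d, f_ghz * 1e9):.2f} dB")
# 33.8 vs 35.5 differ by only about 0.4 dB, so the frequency
# itself cannot explain a 2119 ft vs 752 ft gap.
```

So the big differences in the test have to come from the detector's front end, the foam attenuation, or the scan pattern, not from propagation.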
I thought perhaps radar detectors scan 34.7 more often than 33.8 or 35.5, and that this is what gives it better detection distance.
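That guess can at least be sanity-checked with a toy sweep model. Purely hypothetically (real scan patterns and dwell times are proprietary, and these numbers are made up), suppose the detector steps through Ka in segments and happens to visit the 34.7 segment twice per cycle:

```python
import random

# Hypothetical scan cycle: (segment_GHz, dwell_ms). 34.7 is visited
# twice per cycle, 33.8 and 35.5 once each; all values are made up.
SCAN_CYCLE = [(33.8, 4.0), (34.7, 4.0), (35.5, 4.0), (34.7, 4.0)]
CYCLE_MS = sum(dwell for _, dwell in SCAN_CYCLE)

def wait_until_tuned(gun_ghz: float, start_ms: float) -> float:
    """Time from start_ms until the sweep next tunes the gun's segment."""
    t = 0.0
    for seg, dwell in SCAN_CYCLE * 2:  # two cycles always suffice
        if seg == gun_ghz and t + dwell > start_ms:
            return max(t, start_ms) - start_ms
        t += dwell
    raise ValueError("segment not in scan cycle")

for f in (33.8, 34.7, 35.5):
    waits = [wait_until_tuned(f, random.uniform(0.0, CYCLE_MS))
             for _ in range(100_000)]
    print(f"{f} GHz: mean wait ~{sum(waits) / len(waits):.1f} ms")
```

In this toy model 34.7 gets a look roughly every 1 ms on average versus about 4.5 ms for the other two segments. Against a constant-on gun that mainly cuts alert latency rather than raw range, though longer dwell on a segment can also buy sensitivity; whether any real detector actually weights its sweep like this, I don't know.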