I completely forgot about Veri's post earlier this summer, which shows Michael B's raw dBm sensitivity results for four different, more recent V1 radar detectors. When those results are averaged and used as a baseline against GOL's August straight-line maximum-detection-distance tests, they very slightly change my original conversions of GOL's detection distances into overall dBm sensitivities for each radar detector. To be thorough, I have recalculated each radar detector's dBm sensitivity based upon the average sensitivity of the four more recent V1 models as tested by Michael B, using the results as presented by Veri.
To refresh your memory, following is Veri's very nicely done sensitivity spreadsheet, based upon Michael B's reported results:
I decided to average Veri's tabulated V1 sensitivity results for just the four V1 hardware version 1.8 models (highlighted in yellow, above), since the 1.7 model was significantly inferior on 35.5 Ka and since the three V1s in GOL's August tests were obviously all version 1.8 models. So I tossed out the version 1.7 model and averaged the four version 1.8 V1s.
Here are the average V1 Ka sensitivities for the four version 1.8 models tested by Michael B and as reported by Veri:
The most important variable to nail down when converting GOL's carefully measured straight-line distances for each radar detector is an accurate baseline dBm sensitivity on each Ka frequency for one single radar detector model. I chose the V1 as the baseline dBm sensitivity reference since I was able to average the very similar results for the four V1 version 1.8 models tested by Michael B and reported by Veri.
So we now have a baseline which should be accurate to within +/-2dB for each Ka radar frequency. Following is my revised graph of the results of my calculations converting GOL's distance measurements into raw dBm sensitivities for each radar detector model:
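The conversion itself is simple: since received radar power falls off with the square of distance, a detector's sensitivity relative to the V1 baseline is 20 times the log of the distance ratio. Here is a minimal Python sketch of that calculation (the function name and the sample numbers are mine for illustration, not GOL's actual data):

```python
import math

def sensitivity_dbm(distance_ft, ref_distance_ft, ref_sensitivity_dbm):
    """Convert a straight-line detection distance to a dBm sensitivity,
    using the averaged V1 figures as the baseline reference.
    Received power falls with the square of distance, so a detector that
    reaches twice as far is 6dB (4x) more sensitive."""
    return ref_sensitivity_dbm + 20 * math.log10(distance_ft / ref_distance_ft)

# Illustrative example: a detector reaching half the V1's distance on a band
# where the V1 baseline is 110dBm would score about 104dBm.
print(round(sensitivity_dbm(5000, 10000, 110.0), 1))  # 104.0
```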
My 3dB-wide rankings from "below average" to "superb" in the above chart are based on my personal experience testing radar detectors over the last 20 years. Note that the various automotive magazines which tested radar detectors over the years considered anything less than 98dBm to be poor performance; I consider anything less than 95dBm to be poor performance, which is close to the criteria those magazines used.
Why am I graphing each radar detector's sensitivity in terms of dBm (or more accurately, dBm/cm^2)? Because this scale is the industry-standard method for measuring radar sensitivity in the lab, and because for the past 20 years, Car & Driver, Road & Track, and MotorTrend have considered any radar detector with a sensitivity less than around 95dBm to be not even worth considering.
Stuff you should know about dBm sensitivity measurements...
We all know that a radar detector which is 3dB more sensitive is actually twice as sensitive to radar. Likewise, a radar detector which is 6dB more sensitive is four times as sensitive. 10dB more translates to 10 times greater sensitivity to radar, 20dB translates to 100 times, and 30dB translates to 1000 times.
Looking at my graph's 35.5 Ka sensitivities, the Escort 9500i or Valentine V1 is more sensitive than the RMR C430 by almost exactly 30dB. That means the 9500i or V1 is approximately 1000 times more sensitive to 35.5 Ka radar than the RMR C430! Since received radar power obeys the inverse square law, we take the square root of 1000, which is approximately 32. This means the RMR C430 must be roughly 32 times closer to the 35.5 radar gun than an Escort 9500i or V1 in order to detect it, which is what GOL's distance measurements also show. It also means the RMR C430 is essentially dead to 35.5 Ka compared to the best radar detectors. If you were using a RMR C430, you could throw a rock at the patrol car by the time you finally detected his 35.5 radar!
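The arithmetic in that paragraph is easy to verify in a couple of lines of Python (purely illustrative):

```python
import math

delta_db = 30                            # 9500i/V1 advantage over the C430 at 35.5 Ka
power_ratio = 10 ** (delta_db / 10)      # 1000x more sensitive to radar power
distance_ratio = math.sqrt(power_ratio)  # inverse square law -> ~32x the range
print(power_ratio, round(distance_ratio, 1))  # 1000.0 31.6
```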
Here is a chart showing how differences in dB relate to sensitivity and detection distance. It should help you easily interpret what my industry-standard dBm graph is showing:
1dB = 1.3X sensitivity = 1.1X detection distance
2dB = 1.6X sensitivity = 1.3X detection distance
3dB = 2X sensitivity = 1.4X detection distance
4dB = 2.5X sensitivity = 1.6X detection distance
5dB = 3.2X sensitivity = 1.8X detection distance
6dB = 4X sensitivity = 2X detection distance
10dB = 10X sensitivity = 3.2X detection distance
20dB = 100X sensitivity = 10X detection distance
30dB = 1000X sensitivity = 32X detection distance
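Every row of that chart follows from the standard decibel formulas: the sensitivity ratio is 10^(dB/10), and by the inverse square law the detection-distance ratio is 10^(dB/20). A short Python snippet (illustrative) reproduces the chart:

```python
# Reproduce the dB-difference chart: sensitivity ratio is 10^(dB/10),
# and by the inverse square law the distance ratio is 10^(dB/20).
for db in (1, 2, 3, 4, 5, 6, 10, 20, 30):
    sens = 10 ** (db / 10)
    dist = 10 ** (db / 20)
    print(f"{db}dB = {sens:.1f}X sensitivity = {dist:.1f}X detection distance")
```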
Following are my original spreadsheet entries, with formulas shown in separate columns, showing how I arrived at the results for each Ka frequency plotted in the above chart.
Ka 33.8 spreadsheet:
Ka 34.7 spreadsheet:
Ka 35.5 spreadsheet:
How accurate are my results? They should be accurate to within +/-2dB. Indeed, my calculated dBm sensitivity results for the RX-65, STi, 8500 X50 and 9500i all appear to match Veri's tabulated results within 2dB.
As another comparison, let's look at the FCC photo which shows the internals of the STi that was submitted for FCC testing and certification:
Note the numbers written in pencil atop the radar horn. From the front end of the horn toward the rear, the numbers are:
X band -- 119
K band -- 116
35.5 Ka -- 108
34.7 Ka -- 113
33.8 Ka -- 112
The Ka numbers agree very well with my spreadsheet and graphs for the STi. I calculated:
35.5 Ka -- 109.6
34.7 Ka -- 113.8
33.8 Ka -- 112.5
My results are within 2dB of the numbers which Bel wrote on the STi that they sent in for FCC certification testing.
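As a quick sanity check, here is that comparison in a few lines of Python (values copied from the lists above):

```python
# Compare my calculated Ka sensitivities with the numbers Bel penciled
# on the radar horn of the FCC-certification STi.
calculated = {"35.5": 109.6, "34.7": 113.8, "33.8": 112.5}
fcc_pencil = {"35.5": 108, "34.7": 113, "33.8": 112}

for band in calculated:
    diff = abs(calculated[band] - fcc_pencil[band])
    print(f"{band} Ka: off by {diff:.1f}dB")
    assert diff <= 2.0  # every band agrees within 2dB
```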
In conclusion, if GOL's distance measurements were done carefully (and I am sure that they were), then my spreadsheet and bar graphs should be accurate to within +/-2dB or better.