I remember in the good ole days (when Bel and Escort were rivals), Bel actually used to list the sensitivity of their radar detectors. Typically it was something like -99 dBm on K band and -114 dBm on X band. Escort never published their actual figures.
What I would like to see is a series of tests that actually measured the sensitivity of various radar detectors at each frequency across the band.
For those familiar with audio, it would be like the frequency-response graphs and data published for high-end speakers.
Another thing to add would be a chart of how long each detector takes to respond to a signal. Unlike the old days, today's detectors take time to respond because they are sweeping through the various radar bands looking for a signal.
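To make the scanning point concrete, here is a minimal back-of-the-envelope sketch. The band list and per-band dwell times below are invented for illustration (real detectors do not publish these numbers); the point is just that a sweeping receiver has a worst-case latency equal to a full pass through every band it monitors.

```python
# Hypothetical sketch: worst-case response time for a sweeping detector.
# Band names and dwell times are assumed values, not measured specs.
DWELL_MS = {
    "X": 40,
    "K": 40,
    "Ka-lo": 60,
    "Ka-mid": 60,
    "Ka-hi": 60,
}

def worst_case_response_ms(dwell_ms: dict) -> int:
    """A signal that appears just after its band was checked must wait
    while the detector sweeps every band and comes back around."""
    return sum(dwell_ms.values())

print(worst_case_response_ms(DWELL_MS))  # 260 under these assumed dwell times
```

Even small per-band dwell times add up, which is why a response-time chart would be a meaningful column in a test report.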
Of course you would still need the various real-world tests too. But those tests should be conducted at two speeds: first in a car traveling 5-10 mph, then in a car doing 80-100 mph.
I believe if someone were to run those kinds of tests, it would be clear who the top manufacturer is. Unfortunately, with the amount of variability in the typical tests I see run, I think all the top detectors fall within the same margin of error.
I wager if one took 10 Escort X50s and 10 Bel RX65s, they would be very difficult to distinguish from each other just by looking at the test data.
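That wager is really a statistics question: do the two samples differ by more than their spread? A quick way to check is Welch's t statistic. The alert-distance numbers below are made up purely for illustration, not real test data for either detector.

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic; |t| near zero means the two samples
    are hard to tell apart given their internal variability."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

# Hypothetical alert distances (feet) for ten units of each model --
# invented numbers for illustration only.
x50  = [5200, 5350, 5100, 5400, 5250, 5150, 5300, 5280, 5180, 5330]
rx65 = [5240, 5310, 5160, 5380, 5210, 5190, 5290, 5260, 5220, 5300]

t = welch_t(x50, rx65)
print(round(t, 2))  # |t| well below ~2: the difference is within the noise
```

With unit-to-unit spread like this, the samples overlap almost completely, which is exactly the "same margin of error" situation: the test data alone cannot crown a winner.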