  1. #1
    Newcomer
    Join Date
    Nov 2005
    Posts
    38

    Default The way detector tests should be run

    I remember in the good ole days (when Bel and Escort were rivals), Bel actually used to list the sensitivity of their radar detectors. Typically it was something like -99 dBm on K band and -114 dBm on X band. Escort never gave their actual data.

    What I would like to see is a series of tests that actually measure the sensitivity of various radar detectors at each frequency across the band.
    For those familiar with audio, it would be like the graphs and data on high-end speakers.

    Another thing to add would be a chart of the time it took to respond to a signal. Unlike the old days, today's detectors take time to respond to a signal because they are actually surfing through the various radar channels looking for a signal.

    Of course you would still need the various real world tests too. But those tests should be conducted at two speeds. First they should be run in a car traveling 5-10 mph. Then they should be run in a car doing 80-100 mph.


    I believe if someone were to run those kinds of tests it would be clear who the top manufacturer is. Unfortunately, with the amount of variability in the typical tests I see run, I think all the top detectors fall within the same margin of error.

    I wager if one took 10 Escort X50s and 10 Bel RX65s, they would be very difficult to distinguish from each other just by looking at the test data.

  2. #2
    Advanced Member
    Join Date
    Feb 2005
    Location
    Southern MA
    Posts
    10,127

    Default

    Why at 80-100 mph? Radar detectors are a safety device...not a device that is for you to speed. There will be no difference in the detection range no matter the speed.

    Sensitivity is tested by SML the proper way, with a stopwatch and a radar gun.


    There are differences between the BEL RX65 and the Escort X50: the Bel has a slightly larger antenna and better Ka sensitivity.


    Radar Detectors-V1 & BEL v995
    Laser Jammer-Laser Interceptor Quad
    GPS Camera Locator-Cheetah C100
    GPS Nav-Garmin nuvi w/Trapster
    CB Radio-Galaxy DX-949 w/Wilson 500
    Scanner-RS Pro-96

  3. #3

    Default Re: The way detector tests should be run

    Quote Originally Posted by tktm
    I remember in the good ole days (when Bel and Escort were rivals), Bel actually used to list the sensitivity of their radar detectors. Typically it was something like -99 dBm on K band and -114 dBm on X band. Escort never gave their actual data.

    What I would like to see is a series of tests that actually measured the sensitivity of various radar detectors at each frequency in the bandwidth.
    For those familiar with audio, it would be like the graphs and data on high-end speakers.

    Another thing to add would be a chart of the time it took to respond to a signal. Unlike the old days, today's detectors take time to respond to a signal because they are actually surfing through the various radar channels looking for a signal.

    Of course you would still need the various real world tests too. But those tests should be conducted at two speeds. First they should be run in a car traveling 5-10 mph. Then they should be run in a car doing 80-100 mph.


    I believe if someone were to run those kinds of tests it would be clear who the top manufacturer is. Unfortunately, with the amount of variability in the typical tests I see run, I think all the top detectors fall within the same margin of error.

    I wager if one took 10 Escort X50s and 10 Bel RX65s, they would be very difficult to distinguish from each other just by looking at the test data.
    Two posts and this guy is already my favorite forum member. It's been a while since I've read anything this insightful here. If you leave, I will hunt you down and kill you.


  4. #4
    Newcomer
    Join Date
    Nov 2005
    Posts
    38

    Default

    Quote Originally Posted by crazyVOLVOrob
    Why at 80-100 mph? Radar detectors are a safety device...not a device that is for you to speed. There will be no difference in the detection range no matter the speed.

    The test would be run at two speeds to highlight differences in the quality/speed of the band scanning. Doing the first test at a slow speed should give an indication of the sensitivity of the detector (measured as distance from the source). Running the same test at higher speed should give an indication of the quality of the channel surfing the detector has been programmed to do. Ideally, you should get a warning at the very same distance in both the high- and low-speed tests. But you most likely won't, especially on the "lower risk" frequencies.

    And remember, just because the tests are done at high speed does not mean that you should drive at high speed. Cars are tested under all sorts of driving speeds and conditions, but that does not mean that you should drive like that outside the testing facility.



    Sensitivity is tested by SML the proper way, with a stopwatch and a radar gun.
    There is a good amount of inherent error in that method, especially if you're trying to measure something like POP. They should probably do all their tests at low speed to reduce that error.

    There are differences between the BEL RX65 and the Escort X50: the Bel has a slightly larger antenna and better Ka sensitivity.
    I wager that the typical testing protocols, like the one you mentioned above, are sloppy enough that you will not be able to determine a consistent winner without a large margin of error.


    If you really want to determine the real champ, the testing has to be done in a lab with exact measurements. Then you do a real world test. As long as there is not a HUGE difference in the real world tests, the lab tests will give you the real picture.

  5. #5
    Advanced Member
    Join Date
    Dec 2004
    Location
    Michigan
    Posts
    7,509

    Default

    Welcome to the board...

    Quote Originally Posted by tktm
    I remember in the good ole days (when Bel and Escort were rivals), Bel actually used to list the sensitivity of their radar detectors. Typically it was something like -99 dBm on K band and -114 dBm on X band. Escort never gave their actual data.

    What I would like to see is a series of tests that actually measured the sensitivity of various radar detectors at each frequency in the bandwidth.
    For those familiar with audio, it would be like the graphs and data on high-end speakers.
    I have always said this too!
    This would be great. They should publish their sensitivity, just like scanner manufacturers do.

    Tests like that have been done before, though not recently. Here's one test (it was funded by Beltronics though):
    http://www.beltronics.com/pdfs/gtri.pdf
    (I have the results around somewhere)

    I believe one was also done many years ago by Car & Driver.
    The reason that this isn't done often is that the test equipment to do this for microwave frequencies is horrendously expensive. But I would think it could be leased or something? Would be nice to see this though.

    Quote Originally Posted by tktm
    Another thing to add would be a chart of the time it took to respond to a signal. Unlike the old days, today's detectors take time to respond to a signal because they are actually surfing through the various radar channels looking for a signal.
    This would be great too (but certainly not using the same "stopwatch method" used by SML!).
    It is also a common misconception that response is directly related to sweep rate (even by some test sources). To a point it is limited by the sweep rate, but not much. Most of the high-end detectors complete their sweep of all bands in a few tenths of a second or less. What limits response time is that most of the detectors incorporate a filtering scheme to prevent falsing, where a radar signal has to be present for a certain amount of time before the alert is reported. This is ultimately what determines response time. Michael B tested a few detectors here:
    http://www.radardetector.net/viewtop...=asc&start=240
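    Jim's point, that the anti-falsing filter rather than the sweep rate dominates response time, can be sketched with made-up numbers (both the 0.2 s sweep period and the 1.5 s confirmation window below are assumptions for illustration, not measured values for any detector):

    ```python
    # Hypothetical model of detector response time. All numbers are invented;
    # real sweep periods and filter windows vary by model.

    def alert_delay_s(sweep_period_s: float, confirm_window_s: float) -> float:
        """Worst-case delay from signal onset to alert: the sweep may have just
        left the band (wait up to one full sweep before sampling resumes), then
        the anti-falsing filter requires the signal to persist for the whole
        confirmation window before the alert is reported."""
        return sweep_period_s + confirm_window_s

    # A fast sweep (0.2 s) vs. a sweep twice as slow, both behind a 1.5 s
    # anti-falsing filter: doubling the sweep time changes the total delay
    # only slightly, because the filter dominates.
    print(alert_delay_s(0.2, 1.5))  # 1.7
    print(alert_delay_s(0.4, 1.5))  # 1.9
    ```

    Under this model, halving the sweep rate costs you 0.2 s out of roughly 1.7 s total, which is why sweep rate alone is a poor predictor of response time.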


    Quote Originally Posted by tktm
    Of course you would still need the various real world tests too. But those tests should be conducted at two speeds. First they should be run in a car traveling 5-10 mph. Then they should be run in a car doing 80-100 mph.
    Unless I am missing something, I don't think this would have any merit in comparing the performance of detectors.
    Bottom line, the point at which each detector first acquires a signal will not change at all based on the speed of the vehicle. Why would you believe otherwise???
    I'm sorry but I don't follow your reasoning here.

    Quote Originally Posted by tktm
    I wager if one took 10 Escort X50s and 10 Bel RX65s, they would be very difficult to distinguish from each other just by looking at the test data.
    In general, you might be correct, as long as the RX-65 is run in "International" mode. In USA mode, I think it has a distinct Ka advantage over the X50. Maybe the next time someone does a "lab test" they'll try this.

    Jim

  6. #6

    Default

    Quote Originally Posted by jimbonzzz
    Quote Originally Posted by tktm
    Of course you would still need the various real world tests too. But those tests should be conducted at two speeds. First they should be run in a car traveling 5-10 mph. Then they should be run in a car doing 80-100 mph.
    Unless I am missing something, I don't think this would have any merit in comparing the performance of detectors.
    Bottom line, the point at which each detector first acquires a signal will not change at all based on the speed of the vehicle. Why would you believe otherwise???
    I'm sorry but I don't follow your reasoning here.

    Jim
    If the "sweep rate" is significantly poor, then this would show up to a greater extent at higher travel rates (i.e. more road covered before an alert was registered). I think this is what was implied by tktm, anyhow.

  7. #7
    Newcomer
    Join Date
    Nov 2005
    Posts
    38

    Default

    Quote Originally Posted by jimbonzzz
    Welcome to the board...

    Unless I am missing something, I don't think this would have any merit in comparing the performance of detectors.
    Bottom line, the point at which each detector first acquires a signal will not change at all based on the speed of the vehicle. Why would you believe otherwise???
    I'm sorry but I don't follow your reasoning here.
    Thanks,

    Think of it this way: if the car is driving very slowly, then even if the radar detector takes a long time to alert you (say because of a slow sweep, a low-priority sweep, or because it's trying to determine if it is a 'real' signal), that slowness to RESPOND to a signal won't have much of an impact on the test. Picture the extreme case of this kind of testing: you drive the car 5 feet forward, stop, and wait 10 seconds to see if you get an alert; if not, you drive another 5 feet forward and wait another 10 seconds. If you keep doing this, you will discover the farthest distance from the radar source at which the detector alerts. This, in effect, will be a measure of sensitivity.

    Next you run the test with the car traveling at a high rate of speed (the faster the better). Now the time it takes for the detector to alert you does play a part: if the detector takes a good amount of time to analyze (or find) the signal, you will be significantly closer to the radar source than you would be in the slower testing method. The difference between the fast measurement and the slow measurement would give an indication of how well 'programmed' the detector is.
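    The relationship between the two measurements described above can be written down directly. In this sketch, `static_range_ft` is what the slow step-and-wait test measures and `response_s` is the detector's total time-to-alert (both names and the 500 ft / 2 s example values are illustrative assumptions):

    ```python
    MPH_TO_FPS = 5280.0 / 3600.0  # 1 mph = ~1.467 ft/s

    def high_speed_alert_ft(static_range_ft: float, response_s: float,
                            speed_mph: float) -> float:
        """Distance from the radar source at which the alert actually fires:
        the static detection range minus the ground covered while the
        detector was still sweeping/analyzing the signal."""
        return static_range_ft - speed_mph * MPH_TO_FPS * response_s

    # A detector that alerts at 500 ft in the slow test but needs 2 s to
    # respond only warns you about 265 ft out at 80 mph.
    print(round(high_speed_alert_ft(500, 2.0, 80)))  # 265
    ```

    The gap between the slow-test distance and the high-speed distance grows linearly with both the response time and the vehicle speed, which is exactly why the fast run exposes the 'programming' that the slow run hides.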

    I wager that if you did the test that way with an X50, you would probably notice a real difference between the two measurement methods with POP turned on and off, especially on frequencies where it is not specifically looking for POP.

    Even on my 8500 I have noticed this tendency. There always seems to be a bit of lag between when the radar hits and when the detector alerts. The same is true on the backside: the detector continues to alert for a bit even after the threat has been turned off. I suspect they do this to present the alerts to the user in a more "pleasant" manner. Unfortunately, making the alert more stable also sort of robs the high-end user of detailed information.

  8. #8
    Newcomer
    Join Date
    Nov 2005
    Posts
    38

    Default

    Jim

    If the "sweep rate" is significantly poor, then this would show up to a greater extent at higher travel rates (i.e. more road covered before an alert was registered). I think this is what was implied by tktm, anyhow.
    Bingo.

    I'm not completely familiar with the jargon you guys use here, so it's a bit more laborious for me to explain. But Jim has got the gist of what I was trying to convey.

  9. #9
    Advanced Member
    Join Date
    Dec 2004
    Location
    Michigan
    Posts
    7,509

    Default

    OK, I completely understand what you are talking about now.
    Such as in this hypothetical situation:

    One detector might be able to detect radar 500 feet from a radar source at "slow or stopped" speed, but it takes two seconds to respond. At 80 MPH in those two seconds, the vehicle has travelled 235 feet closer to the source. The driver gets an alert 265 feet from the source.

    Another detector doesn't detect radar until it is 400 feet from a radar source at "slow or stopped" speed, but it responds in only 1/2 second. At 80 MPH in that 1/2 second, the vehicle has travelled 59 feet closer to the source. The driver gets an alert at 341 feet from the source.

    So, in the above scenario, although the first detector is capable of detecting the radar farther away from the source, at 80 MPH the second detector actually alerted the driver farther from the source even though it is less sensitive, because the "filter delay" is less.


    This is definitely something that should be taken into consideration. I don't think it necessarily needs to actually be tested though, since it can simply be calculated in all cases, once you know how long it takes the detector to respond. You could even chart it for different speeds. Someone could take SML's current data and plot it like that if they wanted to.
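    The two hypothetical detectors above can be charted across speeds exactly as suggested (the 500 ft / 2.0 s and 400 ft / 0.5 s figures come from the scenario in this post; everything else is arithmetic):

    ```python
    MPH_TO_FPS = 5280.0 / 3600.0  # feet per second in one mph

    def alert_distance_ft(static_range_ft, response_s, speed_mph):
        # Distance from the source at which the driver actually hears the alert.
        return static_range_ft - speed_mph * MPH_TO_FPS * response_s

    # (label, static detection range in ft, response time in s)
    detectors = [("A", 500.0, 2.0),   # more sensitive, slower filter
                 ("B", 400.0, 0.5)]   # less sensitive, faster filter

    for speed in (0, 40, 80, 100):
        cells = "  ".join(f"{name}={alert_distance_ft(r, t, speed):6.0f} ft"
                          for name, r, t in detectors)
        print(f"{speed:3d} mph: {cells}")
    ```

    The chart reproduces the numbers above (265 ft vs. 341 ft at 80 mph) and also shows the crossover: at 40 mph the more sensitive detector A still alerts farther out, but somewhere around 45 mph the faster-responding detector B overtakes it.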

    At the higher speeds, the vehicle will also have travelled farther before the driver responds to the alert, and it will take longer to slow down to legal speeds. But all of this will vary a lot based on vehicle weight, driver response, braking, etc. This is all good stuff to consider though.

    Jim

  10. #10
    Power User
    Join Date
    Feb 2005
    Location
    In Car RamRod
    Posts
    4,001

    Default

    The FCC site posts all this sensitivity info.

 

 

