Guys, I was thinking the other day about why some manufacturers put such a long lag time on their radar detectors. By lag time I mean the following: say you are testing an RD on a particular band, and you press the trigger for a second and let it go. Some RDs will keep alerting for about 10 seconds after you release the trigger, even though the signal is no longer present! Why would they do this? Simple: they want to create the illusion that their RD is more sensitive than it really is.

So say you were testing a V1 against an RD (call it RD A) that has major lag time. The V1 picks up the faint signal and then drops it quickly, because it has less lag time. RD A picks it up and keeps it locked for 10 seconds even though the signal is technically gone, so one might conclude that RD A held onto the signal longer and more consistently and is therefore more sensitive.

The other scenario would be driving past a signal source: RD A is still alerting after you pass it, even though RD A does not have a rear antenna and the V1 does, so one might conclude that RD A is just as good at picking up signals from the rear, even though it is not. Clever, huh? I have seen so many posts where someone notices that one RD is holding onto a signal better and concludes that it is more sensitive than their other one.
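Here is a quick sketch of what I mean. The numbers (a 1-second trigger burst, a 10-second hold) are just illustrative assumptions, not real product specs, but they show how a hold timer inflates the alert time you actually observe:

```python
def alert_timeline(signal_seconds, hold_seconds, total_seconds):
    """Per-second True/False alert states for a detector that sees a
    real signal for `signal_seconds`, then keeps the alert latched
    for `hold_seconds` after the signal disappears."""
    states = []
    for t in range(total_seconds):
        signal_present = t < signal_seconds
        holding = signal_seconds <= t < signal_seconds + hold_seconds
        states.append(signal_present or holding)
    return states

# Both detectors see the exact same 1-second trigger burst.
short_lag = alert_timeline(signal_seconds=1, hold_seconds=1, total_seconds=15)
long_lag = alert_timeline(signal_seconds=1, hold_seconds=10, total_seconds=15)

print(sum(short_lag))  # alerts for 2 seconds total
print(sum(long_lag))   # alerts for 11 seconds total
```

Same signal, same front-end sensitivity, but the long-lag unit "alerts" more than five times longer, which is exactly what fools people in side-by-side tests.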