When you see a radar detector capable of "self-calibration," what exactly does this mean? I am looking for a technical explanation if anyone has one.
Thanks
Originally Posted by parasonic:
"When you see a radar detector capable of 'self-calibration,' what exactly does this mean? I am looking for a technical explanation if anyone has one."

In the case of the Belscorts, the 2nd LO is crystal-referenced, so it is quite accurate. The detector sets the 1st and 2nd LOs to specific values and then looks for the (normally unwanted) mixer products that result from mixing the two LOs. By watching for these "markers," the 1st LO can be adjusted, or "calibrated," with good accuracy.
In a detector that self-calibrates this way, there is less need to "oversweep" to ensure the police bands stay covered as the detector's oscillator drifts. Less extra sweeping yields a sensitivity gain.
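To make the sensitivity argument concrete: with a fixed sweep time, a narrower total span means more dwell time on each frequency step. The numbers below (guard-band widths, sweep time) are purely illustrative assumptions, not figures from the thread; only the idea that less oversweep means more dwell comes from the post.

```python
def dwell_per_mhz(band_width_mhz, guard_band_mhz, sweep_time_ms):
    """Dwell time (ms) spent on each 1 MHz step of the sweep.

    The sweep must cover the police band plus a guard band on each
    edge to allow for LO drift; calibration shrinks that guard band.
    """
    span = band_width_mhz + 2 * guard_band_mhz  # oversweep both edges
    return sweep_time_ms / span

# Hypothetical example: a 100 MHz wide band, 20 ms per sweep.
uncal = dwell_per_mhz(100, guard_band_mhz=50, sweep_time_ms=20)  # drifty LO
cal = dwell_per_mhz(100, guard_band_mhz=5, sweep_time_ms=20)     # calibrated LO
print(f"uncalibrated: {uncal:.3f} ms/MHz, calibrated: {cal:.3f} ms/MHz")
```

More dwell per step means more signal energy integrated at each frequency, hence the sensitivity gain Jim mentions.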
Jim
What are the frequencies of the LOs, and what IFs do you get, typically? Also, what would the sweep/scan frequency range be? (This is really helpful to my understanding!)
Typically:
1st LO = Can sweep from 14.5 GHz to 15.5 GHz
2nd LO = Can sweep from 3.994 GHz to 4.194 GHz
1st IFs = 3.188 GHz, 5.0 GHz
2nd IF = 906 MHz
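As a sanity check on the frequency plan above: a single 2nd LO setting of 4.094 GHz converts both 1st IFs down to the 906 MHz 2nd IF (4.094 − 3.188 = 0.906 on the high side, 5.0 − 4.094 = 0.906 on the low side). A short sketch, using only the numbers from the post (the injection-side reasoning is my inference):

```python
F_IF2 = 906e6                      # 2nd IF, Hz
LO2_MIN, LO2_MAX = 3.994e9, 4.194e9  # 2nd LO tuning range, Hz

def lo2_for(if1_hz):
    """Return the 2nd LO settings (within range) that convert a
    given 1st IF to the 906 MHz 2nd IF. Either high-side or low-side
    injection works, as long as |IF1 - LO2| equals the 2nd IF."""
    candidates = (if1_hz + F_IF2, if1_hz - F_IF2)
    return [f for f in candidates if LO2_MIN <= f <= LO2_MAX]

for if1 in (3.188e9, 5.0e9):
    print(if1 / 1e9, "GHz ->", [f / 1e9 for f in lo2_for(if1)], "GHz")
```

Both 1st IFs map to a 2nd LO of 4.094 GHz, which sits comfortably inside the stated 3.994-4.194 GHz sweep range.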
During self-calibration, the second mixer is bypassed and the 906 MHz IF is taken directly from the 1st mixer (in which case it is actually the first IF).
To calibrate, the 2nd LO is set to 4.025 GHz, producing a 4th harmonic at 16.1 GHz. The 1st LO is swept until an IF signal appears at 906 MHz. The micro then knows the 1st LO is at 15.194 GHz (16.1 − 15.194 = 0.906) and records the VCO tuning voltage for that frequency, at which point it is calibrated.
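The calibration arithmetic above can be sketched in a few lines. The marker and IF values come from the post; the VCO tuning curve and helper names are hypothetical, just to show how the micro might map a tuning voltage to a known LO frequency:

```python
HARMONIC = 4
LO2_CAL = 4.025e9       # 2nd LO setting during calibration, Hz
IF2 = 906e6             # fixed IF the receiver listens for, Hz

marker = HARMONIC * LO2_CAL   # 16.1 GHz crystal-derived marker
lo1_true = marker - IF2       # 1st LO frequency when the IF appears
print(lo1_true / 1e9)         # -> 15.194

def lo1_model(volts):
    """Hypothetical (drifted) VCO tuning curve: Hz vs tuning volts."""
    return 14.3e9 + 1.0e9 * volts

# Sweep the tuning voltage in 1 mV steps until the mixer product
# between the marker and the 1st LO lands on the 906 MHz IF.
v_cal = min((v / 1000 for v in range(0, 2001)),
            key=lambda v: abs((marker - lo1_model(v)) - IF2))
print(v_cal, lo1_model(v_cal) / 1e9)
```

Because the marker is derived from the crystal-referenced 2nd LO, the recorded voltage-to-frequency point inherits crystal accuracy, which is the whole point of the scheme.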
I'm so lost..............
Makes sense, Jim. But like TRC, I'm still somewhat confused as well :?
RR