Thetis 2.8.11 Rx Level Calibration

k6avp
Posts: 26
Joined: Mon Apr 10, 2017 1:41 am

Thetis 2.8.11 Rx Level Calibration

Postby k6avp » Tue Dec 15, 2020 10:24 pm

Been running Thetis on my ANAN 100B for quite a while now. Finally decided to try calibrating my panadapter and digital readout displays. Found some interesting things that I would like to see comments on. I am running 2.8.11 on the latest Win10 (64-bit). My calibration input signal is from a 10MHz GPSDO with -50dB attenuation, measured with a Fluke 50MHz DMM/scope.

The signal after the tap turns out to be -41dBm. I feed it into Ant 1 with no signals on the other antenna inputs. I set -41 in the level input box of the calibrate button and press "go". The digital readout reads -41dBm for all sample rates. The panadapter grid, and the bottom-edge left readout, read values from about -51dBm to -61dBm depending on whether the unit is running at 48k s/s or 1.5M s/s, respectively. The bottom-edge right readout reads about -31dBm to -41dBm, again depending on whether the unit is running at 48k s/s or 1.5M s/s.

With the same signal in and VFO A set to 10MHz, running the frequency calibration gives panadapter readings of the input signal (via the cursor and the bottom-edge right readout) of about 9.99800MHz to 9.999998MHz, depending on the panadapter multiplier (0.5x to 4x) and the sample rate (1.5M s/s to 48k s/s, respectively).

The AGC gain line shows on the panadapter grid about 20dB lower than its digital gain setting (to the left of the panadapter). I have found I can make the AGC gain number reasonably match the panadapter grid by entering -21dBm in the calibration level box. However, the panadapter level readings are then even farther off (~20dB farther) than when entering the true Rx signal level, while the Rx level reading stays correct.

In other words, both the panadapter frequency and level readings vary with the sample rate setting, while the digital Rx level always matches the true Rx signal input. However, the frequency and level variances are oppositely affected by sample rate. Also, calibrating at the lowest available sample rate (48k s/s) gives less variance in the panadapter readings, but does not seem to affect the Rx signal level digital display. Switching the sample rate does, however, cause the Rx digital display to read wrong (the bottom of its range, I suspect) until a modulation mode button is toggled/changed, at which point it starts reading correctly until the next sample rate change.

I take all this to mean there are probably some scaling problems with the panadapter display, while the Rx digital display works accurately. I guess it could be a 100B-specific problem. I have not checked this on mRX, or earlier Thetis, versions. Anyone else seeing/seen these things?
w-u-2-o
Posts: 5572
Joined: Fri Mar 10, 2017 1:47 pm

Re: Thetis 2.8.11 Rx Level Calibration

Postby w-u-2-o » Wed Dec 16, 2020 2:52 pm

k6avp wrote:The signal after the tap turns out to be -41dBm.
What do you mean by "after the tap"?

I feed it into Ant 1 with no signals on the other antenna inputs. I set -41 in the level input box of the calibrate button and press "go". The digital readout reads -41dBm for all sample rates. The panadapter grid, and the bottom-edge left readout, read values from about -51dBm to -61dBm depending on whether the unit is running at 48k s/s or 1.5M s/s, respectively.
Are you controlling your FFT bin width as you change from one sample rate to another? What is it set to? The setting can be found in Setup > Display > RX1.

This is important because the result of any RF power measurement is wholly dependent on how much of the signal being measured fits within the measurement bandwidth. Even a CW signal has a finite bandwidth.
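
To put rough numbers on this, here is a minimal Python sketch. The figures are illustrative, not measured from Thetis: a -41dBm signal whose energy is spread over 100Hz reads its full power only when the measurement bandwidth covers all of it.

```python
import math

# Minimal sketch: the power captured in one measurement bin, assuming the
# signal's energy is spread evenly across its own bandwidth. All numbers
# here are hypothetical illustrations, not values taken from Thetis.
def dbm_in_rbw(total_dbm, signal_bw_hz, rbw_hz):
    if rbw_hz >= signal_bw_hz:
        return total_dbm              # the whole signal fits in the bin
    fraction = rbw_hz / signal_bw_hz  # only part of the energy is captured
    return total_dbm + 10 * math.log10(fraction)

for rbw in (200.0, 100.0, 10.0, 1.0):
    print(f"RBW {rbw:6.1f} Hz -> reads {dbm_in_rbw(-41.0, 100.0, rbw):6.1f} dBm")
# 200 Hz and 100 Hz bins read the full -41 dBm; a 10 Hz bin reads ~-51 dBm
# and a 1 Hz bin ~-61 dBm -- the same kind of spread reported above.
```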

The bandwidth of the S-meter power measurement is equal to the selected receive passband. In other words it is normally fairly wide compared to most signals. Thus as long as the signal to noise ratio (SNR) is high the signal power measurement will be pretty straightforward for the S-meter.

The measurement bandwidth of the panadapter is the FFT bin width referenced above. It is typically quite small, and often far less than even the bandwidth of a CW signal from a signal generator.

For each sample rate, if you ensure that the FFT bin width is larger than the width of the signal generator signal, and assuming a good SNR of, say, better than 12dB, then both the panadapter and S-meter levels will be the same. Note that, unfortunately, the bin width changes when the sample rate is changed. So every time the sample rate is changed the bin width needs to be manually adjusted in order to keep it at the same value.
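
The arithmetic behind that is simply bin width = sample rate / FFT size. A quick sketch (the target bin width and the list of rates are just example values):

```python
# Sketch of the bin-width arithmetic: bin width = sample rate / FFT size.
# The target bin width and the list of rates are illustrative examples.
target_bin_hz = 23.434
for fs in (48_000, 96_000, 192_000, 384_000, 768_000, 1_536_000):
    n_fft = fs / target_bin_hz  # FFT size needed to hold the bin width fixed
    print(f"{fs:>9} s/s: ~{n_fft:,.0f}-point FFT for a {target_bin_hz} Hz bin")
# Doubling the sample rate doubles the bin width for a fixed FFT size, so
# the FFT size must double to keep the measurement bandwidth constant.
```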

See also viewtopic.php?f=6&t=2463

With the same signal in and VFO A set to 10MHz, running the frequency calibration gives panadapter readings of the input signal (via the cursor and the bottom-edge right readout) of about 9.99800MHz to 9.999998MHz, depending on the panadapter multiplier (0.5x to 4x) and the sample rate (1.5M s/s to 48k s/s, respectively).
This is entirely due to interpolation issues associated with the selected FFT bin width (as discussed above) and Hz/pixel resolution. If you want to convince yourself that your calibration is nearly perfect, process the received audio in an external application like Fldigi.
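
For a feel for the size of the effect, here is a small sketch of the Hz-per-pixel resolution. The display width and the span-versus-zoom relationship are assumptions for illustration, not Thetis internals:

```python
# Rough sketch of cursor-readout quantization, assuming the displayed span
# is the sample rate divided by the zoom factor and a 1920-pixel display.
display_pixels = 1920
for fs, zoom in ((1_536_000, 0.5), (1_536_000, 4.0), (48_000, 0.5), (48_000, 4.0)):
    span_hz = fs / zoom
    hz_per_pixel = span_hz / display_pixels
    print(f"{fs:>9} s/s @ {zoom}x: {hz_per_pixel:8.1f} Hz/pixel "
          f"(cursor off by up to ~{hz_per_pixel / 2:.0f} Hz)")
# At 1.5M s/s and 0.5x zoom each pixel spans ~1.6 kHz, the same order as
# the ~2 kHz frequency-readout spread described above.
```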

The AGC gain line shows on the panadapter grid about 20dB lower than its digital gain setting (to the left of the panadapter). I have found I can make the AGC gain number reasonably match the panadapter grid by entering -21dBm in the calibration level box. However, the panadapter level readings are then even farther off (~20dB farther) than when entering the true Rx signal level, while the Rx level reading stays correct.
The location of the gain line is somewhat arbitrary and not tied to any particular standard. It is merely meant as a visual cue that one can leverage for quick adjustments relative to the displayed average noise level (DANL).

Anyone else seeing/seen these things?
Everyone sees these same things when using any spectrum analyzer, which is exactly what a "panadapter" is. That is true whether the spectrum analyzer is based on a $100,000 laboratory instrument or a cheap USB stick receiver. Whether or not they are noticing them is another story :)
k6avp
Posts: 26
Joined: Mon Apr 10, 2017 1:41 am

Re: Thetis 2.8.11 Rx Level Calibration

Postby k6avp » Sat Dec 19, 2020 1:11 am

It makes sense that the panadapter level readings should be affected by the bin width being used in the FFT, and this matches what I see when rechecking while adjusting the bin width. I have not had much experience with spectrum analysis, so I did not catch the bin-width-to-signal-width measurement relationship.

I also assumed the Rx level calibration routine would scale both the panadapter and the Rx digital display to a known level input. Too bad it does not. Maybe it would make sense for the panadapter lower right readouts to show the VFO and digital display dBm-scaled readings.

Previously, I left the bin width at max when switching between sample rates. However, I ran more measurements holding the bin width at 23.434Hz for 96k s/s to 1.5M s/s and still see a pretty big difference between what the display meter reads and the grid and lower panadapter right-hand readout. I used WWV at 10MHz (vs. the GPSDO), with a +-10Hz input filter (same as before) and a Hann FFT at 23.434Hz bin width, and the readings were:

panadapter lower cursor readings @96k s/s: -49dB @0.5x, -45dB @1x, -42.5dB @4x; non-cursor right: -41dB, -39.5dB, -37.2dB
panadapter lower cursor readings @768k s/s: -57dB @0.5x, -52dB @1x, -45dB @4x; non-cursor right: -46.1dB, -42.2dB, -39.6dB
panadapter lower cursor readings @1.5M s/s: -48dB @0.5x, -51.4dB @1x, -48dB @4x; non-cursor right: -48dB, -44.9dB, -41.1dB
Rx digital display readout stayed constant at -29dBm in SigAvg (-32.1dBmFS in ADCL & ADCR)

I used a -50dB line sampler as the attenuator for my original checks, so "after the tap" meant the level going to the ANAN Ant 1 port.
w-u-2-o
Posts: 5572
Joined: Fri Mar 10, 2017 1:47 pm

Re: Thetis 2.8.11 Rx Level Calibration

Postby w-u-2-o » Sat Dec 19, 2020 2:27 pm

k6avp wrote:It makes sense that the panadapter level readings should be affected by the bin width being used in the FFT, and this matches what I see when rechecking while adjusting the bin width. I have not had much experience with spectrum analysis, so I did not catch the bin-width-to-signal-width measurement relationship.
It's really an RF power measurement thing. Spectrum analysis is just a special case of power measurement. In professional spectrum analyzers this value is often called "resolution bandwidth".

I also assumed the Rx level calibration routine would scale both panadapter and RX digital display to a known level input. Too bad it does not.
It absolutely does do that. If you use a signal generator to generate a single carrier at, for example, -60dBm, both the panadapter and the S-meter will read -60dBm after calibration, but only if the bin width setting is larger than the entire carrier bandwidth.
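
Conceptually the calibration reduces to a single dB offset. A sketch with hypothetical numbers, assuming the routine simply compares the known input to the raw reading (this is not Thetis's actual code):

```python
# Conceptual sketch of level calibration as one shared dB offset. The raw
# reading below is a made-up value for illustration only.
known_dbm = -60.0   # level typed into the calibration box
raw_dbm = -66.5     # hypothetical uncalibrated reading of that carrier

offset_db = known_dbm - raw_dbm  # applied to S-meter and panadapter alike

def calibrated(raw: float) -> float:
    return raw + offset_db

print(calibrated(-66.5))  # -60.0 -- but the panadapter only agrees if the
                          # bin width captured the entire carrier
```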

Maybe it would make sense for the panadapter lower right readouts to show the VFO and digital display dBm-scaled readings.
I have no idea what you mean by this. But I will say that trying to get the panadapter power measurements to match the S-meter is pointless. They are two different types of measurements and you don't want them to be the same! The panadapter intentionally uses a small bandwidth so that the shape and spectral content of the signal can be appreciated. The amount of power in each small bandwidth is far less than all of the power in the wider S-meter bandwidth. The S-meter makes what we call a "channel power measurement" in the RF engineering world. Apples and oranges.
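
The arithmetic makes the difference obvious. A sketch with illustrative numbers: a flat noise floor of -120dBm per 23.434Hz bin, summed across an assumed 2.4kHz SSB passband.

```python
import math

# Channel power = sum of the per-bin powers across the passband. The noise
# floor, bin width, and passband below are illustrative numbers only.
bin_dbm = -120.0
bin_width_hz = 23.434
passband_hz = 2400.0

n_bins = passband_hz / bin_width_hz           # ~102 bins in the passband
channel_mw = n_bins * 10 ** (bin_dbm / 10.0)  # sum of equal bin powers
channel_dbm = 10 * math.log10(channel_mw)

print(f"{channel_dbm:.1f} dBm")  # ~-99.9 dBm: about 20 dB above the
# per-bin level, which is why an S-meter-style reading sits well above the
# panadapter noise floor even when both are perfectly calibrated.
```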

Previously, I left the bin width at max when switching between sample rates. However, I ran more measurements holding the bin width at 23.434Hz for 96k s/s to 1.5M s/s and still see a pretty big difference between what the display meter reads and the grid and lower panadapter right-hand readout.
The lower right hand readout is broken and can't be used. Use your cursor and the left lower readout. Or change your grid to 1dB per division. When I want to make more precise measurements I do that, change to panadapter mode (instead of panafall), change to "collapse" mode, and go to full screen on the window size.
k6avp
Posts: 26
Joined: Mon Apr 10, 2017 1:41 am

Re: Thetis 2.8.11 Rx Level Calibration

Postby k6avp » Sun Jan 03, 2021 1:42 am

I think I get the operation now. Thanks for your help.

BTW, the lower right-hand amplitude readout seems to show the right value if the largest signal in the band is right on the center (VFO) frequency and is alone within the set Rx filter. Otherwise, the value is pretty far off.
