Saturday, May 23, 2009

Why did Dish Network change their signal meter?



Recently, Dish Network sent out an announcement that improvements have been made to their signal meter. The "improvement" did several things - none of which is an improvement from a troubleshooting or installation standpoint. Let us examine these changes.

 

One change was to level the signal readings across all receiver models.

 

Prior to the signal meter "improvement", each receiver model had different readings. Receivers with the same model number all showed similar signal readings. The meter showed the highest readings on the simplest receivers, like the 301. Each successive receiver showed a lower reading, with the HD/Dual/DVR receivers showing the lowest of all.

 

The signal meter readings were different for good reason. Two good reasons, actually. The readings reflected the added intrinsic noise of the receivers and they showed the greater signal demands of HDTV.

Dish meters measure signal quality/fidelity/integrity, not signal strength. The meter reflected the difference in the signal integrity between the receivers.

 

Given: The same signal STRENGTH produces less signal integrity as you add noise.

 

Each of the higher-numbered receivers showed this decrease in integrity, because additional components (the second tuner in a dual receiver, DVR hardware, and so on) added noise and degraded the signal.


And,

 

Given: The same signal STRENGTH produces less signal integrity for an HD signal than it does for SD.

 

The astounding drop to the lowest readings ever, those of the HD models, reflects the need for additional signal. (Warning: Cliff!)
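To make those two givens concrete, here is a minimal sketch of a quality-style meter that reads the margin between the signal and the receiver's lock threshold, rather than raw strength. The formula, the dB figures, and the thresholds are all illustrative assumptions, not Dish's actual meter math; the point is only that the same dish signal reads lower once receiver noise is subtracted and the HD threshold is raised.

```python
# Illustrative sketch only: a toy "quality" meter built on SNR margin above the
# lock threshold. NOT Dish Network's actual meter algorithm; all dB figures are
# assumed round numbers.

def quality_reading(carrier_to_noise_db: float,
                    receiver_noise_db: float,
                    lock_threshold_db: float,
                    full_scale_margin_db: float = 10.0) -> int:
    """Map the margin above the lock threshold onto a 0-100 meter scale."""
    effective_snr = carrier_to_noise_db - receiver_noise_db   # extra tuners/DVR hardware add noise
    margin = effective_snr - lock_threshold_db                 # distance from the "digital cliff"
    return max(0, min(100, round(margin / full_scale_margin_db * 100)))

dish_cn_db = 12.0  # the SAME signal strength from the dish in every case (assumed value)

# Simple SD receiver (301-class): little added noise, lower lock threshold.
print(quality_reading(dish_cn_db, receiver_noise_db=0.5, lock_threshold_db=4.0))  # 75

# Dual-tuner / DVR receiver: more internal noise from the extra hardware.
print(quality_reading(dish_cn_db, receiver_noise_db=1.5, lock_threshold_db=4.0))  # 65

# HD receiver: same strength, but the HD signal needs a higher threshold to lock.
print(quality_reading(dish_cn_db, receiver_noise_db=1.5, lock_threshold_db=7.0))  # 35
```

Same strength in, three different readings out: every dB of added receiver noise, and every dB of extra threshold the HD signal demands, comes straight off the displayed number.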

The effects of leveling the signal readings across models:

 

Removes the ability to compare readings across the other receivers in the home.

Removes the ability to switch places with an existing receiver to check the signal integrity of the line.

Removes the question of “Why is there plenty of signal on my SD receivers but not on my HD?”

 

A second change was to reduce the scale of the signal meter.

 

Reduction in the scale of a measurement device causes a reduction in precision. (Sad.)
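As a minimal sketch of that loss, assume (purely for illustration, since the actual scaling factors are not given here) an old meter that topped out around 125 being remapped onto a new 0-100 meter that still displays whole numbers:

```python
# Illustrative only: an assumed old full scale of 125 remapped onto a new full
# scale of 100, with the meter still showing whole numbers. Not Dish's actual figures.

OLD_FULL_SCALE = 125
NEW_FULL_SCALE = 100

def new_reading(old_reading: int) -> int:
    """Rescale an old-meter reading onto the smaller new-meter scale."""
    return round(old_reading * NEW_FULL_SCALE / OLD_FULL_SCALE)

for old in range(100, 106):
    print(old, "->", new_reading(old))

# 100 -> 80
# 101 -> 81
# 102 -> 82
# 103 -> 82   <- two different old readings now display as the same value
# 104 -> 83
# 105 -> 84
```

Roughly one old reading in five now collapses into its neighbor, so a real one-point change in signal can disappear from the display entirely.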

 

Changing the scale also did away with the one benchmark there was for digital signal meters. That benchmark is/was 70. Here again, while few remember that there WAS a standard, anyone who applied it to the average HD signal readings found across the country would see that the HD signal is pretty much ready to fall off the digital cliff!

 

Effects of reduction of scale:

Reduces the precision of measurement.

Negatively impacts troubleshooting and installation.

Confuses installers and customers.

 

Effects of changing the scale:

Loses the benchmark of 70.

Confuses installers and customers.

 

Now, of course, there is much guesswork about what counts as a good signal. (And the benchmark of 70 is still used in digital technology; it is just not used by Dish.)

 

Lastly, as part of the signal meter "improvement", Dish also saw fit to increase the latency of channel changes.

 

At present, all receivers show the same meter readings (the HD models’ lower readings), and all of them change channels far more slowly than before. (They increased the buffer size to try to accommodate their cliff-dwelling signal, but I’ll address that later.)

Why would changing the amount of time it takes to change channels matter?

 

For a troubleshooter, the first evidence of a low signal was slow channel changes. Now every receiver changes channels at the same slow rate.

 

Effects of the increased latency:

There goes another troubleshooting method.

 

So why did Dish Network change the signal meter?

 

If you are still wondering why, perhaps you should tune in for part 2.

Something smells fishy, very fishy.

 
