Saturday, April 28, 2018

Gone Streaming. Sorry, Comcast.

So sorry to hear that the cable TV industry is suffering from the growing defection to streaming media services.  See this recent Fierce Cable article. We seem to be entering a death spiral: rising cable prices encourage more of us to "cut the cable" and move to streaming, which forces cable companies to raise rates further, which drives still more defections.

You don't want to be the last one to switch over in a game like this.

We aren't the first by any means, but our sky-high bill finally got to be too much when the last of our introductory discounts disappeared.

Technically, the Comcast service in our area is very good.  Internet performance has inched up over 250 Mb/s.  Unfortunately, the monthly charge is running around $1 per Mb/s.

So we had an abundance of bandwidth and a similar abundance of channels -- most of which we never used.  The Internet bandwidth is sweet when I want to download a new Linux DVD every 6 months, but how much is that really worth?

TL;DR. We have just dropped cable video and phone service and cut back our Internet speed to 60 Mb/s -- quite enough for our small household.  These changes cut our Comcast payment by 70%!

The new system is built on a Netgear CM600 modem, an Asus RT-N66U Wi-Fi router, an Ooma Telo VoIP box, and a Roku streaming device. (Our nice Sony HDTV predates "smart TV".*) In addition, we're watching more over-the-air TV, mainly to get the PBS Newshour live.  (PBS hasn't figured out how to live stream, it appears.) In this location, we need an amplified antenna; it mostly works for us indoors, but it will need to be installed outdoors for solid performance.

The thorny issue now is how to make sense of the many streaming services.  People worry about what will be happening without "net neutrality".  The Internet is likely to fragment into walled gardens.  As others have pointed out, this already is happening in the streaming market.  Do I want Amazon Prime, Netflix, Hulu, CBS Now, etc.?  There are several providers for live streaming TV channels, too. Each of these has some interesting content.  Even if I didn't mind paying for all of them, the data management gets to be overwhelming.  There is no simple navigation or program guide I know of that crosses those boundaries.

Brave new world?  Chaos?  All of that. Glad to help the cable industry find its destiny.

* Smart TV: I worry that the "smarts" get obsolete well before the "TV" does.  Integrating them should help simplify the user experience, but the quick obsolescence is a worry.

Monday, April 09, 2018

Frequency Measurement Test

The old way: BC-221 meter
April 6, 2018 was my second attempt at the ARRL Frequency Measurement Test in which amateurs are invited to measure the exact frequency of a test signal transmitted from a central site. The first time, years ago, I came out OK with a manual procedure using my old TenTec Orion radio, carefully calibrated against the US NIST Time and Frequency Station WWV. Measurement accuracy depended on an imprecise estimate of the WWV calibration compounded by an imprecise measurement of the W1AW test signal.

This time, we've upped the ante, using the FlexRadio Systems 6500 transceiver (an SDR radio) with its GPS Disciplined Oscillator as a master frequency reference.  The reference is said to be accurate to some parts in 10^12, though we have no way to verify that number at this time.  The receiver is tuned to a known frequency just below the test signal in upper sideband mode, so that the received signal shows up as an "audio" tone.  (In this SDR receiver, there is no analog "audio" stage, since everything is digital.  That bypasses various audio measurement problems that might otherwise have cropped up.)  The fldigi software package, used in its "spectral analysis" mode, accurately measures the offset, which, combined with the known local oscillator tuning, yields a good measurement of the unknown RF frequency.  The software outputs a CSV file that records the time and best-fit frequency each second.
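The arithmetic behind each per-second sample is simple: the unknown RF frequency is the dial (LO) setting plus the measured audio offset. A minimal sketch, assuming a hypothetical dial setting and a simplified two-column CSV layout (the real fldigi export format may differ):

```python
# Sketch: convert per-second audio-offset measurements to RF frequency.
# The dial frequency and CSV layout here are illustrative assumptions,
# not fldigi's actual output format.
import csv
import io

DIAL_HZ = 14_121_000.0  # assumed USB dial setting just below the test signal

# Hypothetical per-second samples: time, measured audio tone offset in Hz
sample_csv = """time,offset_hz
12:00:01,963.41
12:00:02,963.43
12:00:03,963.42
"""

rf_hz = []
for row in csv.DictReader(io.StringIO(sample_csv)):
    # RF frequency = LO (dial) frequency + measured audio offset
    rf_hz.append(DIAL_HZ + float(row["offset_hz"]))

mean_rf = sum(rf_hz) / len(rf_hz)
print(f"mean RF = {mean_rf:.2f} Hz")  # averaged over the run
```

In USB mode the audio offset adds to the dial frequency; in LSB it would subtract, so the mode matters when reconstructing RF from the tone.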

This exercise involved transmissions on the 20, 40, and 80 meter amateur bands from K5CM in eastern Oklahoma.

Here in Branford, CT, the antenna for 20 M is a 3-element SteppIR at 40 ft pointed west. For 40 M and 80 M the antennas were dipoles oriented NW-SE, more or less.


The actual numbers as transmitted are reported on the FMT results page for 2018.

Band  F Measured (Hz)  F Actual (Hz)   Error (Hz)
20M  14,121,963.42 +/- .08*  14,121,963.34   +0.08
40M   7,064,257.09 +/- .20*   7,064,257.06   +0.03
80M   3,598,169.5**   3,598,169.73   -0.23

* Error bar quoted is 1/2 the total peak-to-peak frequency excursion in the 2 minute test transmission.
** 80M results were compromised by a data handling problem.  Precision is reduced, and an error bar could not be estimated.
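The error bar defined in the first footnote — half the total peak-to-peak excursion over the run — can be sketched directly (the sample values below are hypothetical, chosen only to illustrate the calculation):

```python
# Sketch: error bar as half the peak-to-peak frequency excursion
# across the per-second samples of a test transmission.
# These sample values are hypothetical.
samples = [14_121_963.34, 14_121_963.42, 14_121_963.50, 14_121_963.38]

error_bar = (max(samples) - min(samples)) / 2
print(f"+/- {error_bar:.2f} Hz")
```

This is a conservative spread estimate: unlike a standard deviation, it is set entirely by the two most extreme samples.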

WWV reported geomagnetic conditions Kp=2 and Ap=9 during the test.

The graphs at the right show the (almost) raw data measured on the 20 and 40 M bands.  The 20 M signal strength was quite good, touching S9+10 dB, and the measurements appear largely free of statistical noise. The major feature is a sine-like variation that presumably reflects true changes in the signal path.  I believe that the "glitch" at the left is an artifact of an initial mis-adjustment of the radio; it was left out of the average calculation. (Ignore the "even samples" tag.)

The 40 M signal was about S8, i.e. up to 16 dB weaker than on 20 M.  This probably produced much of the short-period noise on the graph. However, we might also expect the ionosphere to produce more variability on the lower frequency.

For 20 and 40, we calculate a simple average frequency, after eliminating the initial points on 20 M.  Note that we might have done better if we could weight the samples according to instantaneous signal strength.  There are two sharp dips in the 40 M data that may well have arisen from deep signal fades.  If they were eliminated, we would have a slightly higher frequency estimate, which would have increased our final error value.
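The weighting idea above can be sketched as a weighted mean, where each frequency sample counts in proportion to a signal-strength weight so that deep fades contribute little. Both the samples and the weights below are hypothetical:

```python
# Sketch: signal-strength-weighted average frequency.
# Samples and weights are hypothetical; in practice the weights might
# come from instantaneous signal power (e.g. S-meter readings).
freqs   = [7_064_257.10, 7_064_257.08, 7_064_256.60, 7_064_257.09]
weights = [1.0, 1.0, 0.05, 1.0]  # near-zero weight during a deep fade

weighted   = sum(f * w for f, w in zip(freqs, weights)) / sum(weights)
unweighted = sum(freqs) / len(freqs)
print(f"unweighted = {unweighted:.3f} Hz, weighted = {weighted:.3f} Hz")
```

Down-weighting the fade pulls the estimate up toward the well-received samples, consistent with the observation that removing the dips would have raised the 40 M frequency estimate.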

Because of the problems with the 80 M data, there is no meaningful graph to plot for that band.


The final error (Measured - Actual) is under 0.1 Hz for the two fully analyzed bands, while the 80 M error is -0.23 Hz based on fewer data points.  These results are surprisingly good, leaving relatively little room for improvement given the "noisiness" of ionospheric propagation conditions.  We might get a somewhat better measurement with a longer test run, perhaps 5 or 10 minutes or more, or if we got lucky and had a period of super-stability in the ionosphere.
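The intuition that a longer run helps can be illustrated with a toy simulation: if the per-second jitter were uncorrelated noise, the error of the mean would shrink roughly as 1/sqrt(N) with N samples. (Real ionospheric variation is partly systematic drift, as the sine-like wander above shows, so this is only the best case.) The noise level and "true" frequency here are made-up values:

```python
# Sketch: averaging error vs. run length for uncorrelated noise.
# true_f and noise_sd are hypothetical illustration values.
import random

random.seed(42)
true_f   = 7_064_257.06  # hypothetical true frequency, Hz
noise_sd = 0.20          # hypothetical per-sample jitter, Hz

def run_error(n_samples):
    """Error of the mean for a simulated run of n_samples seconds."""
    samples = [true_f + random.gauss(0, noise_sd) for _ in range(n_samples)]
    return abs(sum(samples) / n_samples - true_f)

for n in (120, 600):  # 2-minute vs. 10-minute run at 1 sample/s
    print(f"{n:4d} samples -> error {run_error(n):.3f} Hz")
```

With purely random jitter a 10-minute run should beat a 2-minute run by about a factor of sqrt(5); slow systematic wander averages out much less obligingly.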