Channel: Comcast XFINITY TV forum - dslreports.com

CM tuner quality question

So we just got the X1 package after I talked Comcast down from a $250 bill to around $205, and they threw in the X1 bundle as a 2-year, no-contract promotion. This past weekend it was all set up by the technician who came in the 11am-1pm window, and he showed me how to get into the diagnostics menu on the X1 DVR and the terminals spread throughout the house. Basically, the X1 platform uses the same frequencies your cable modem uses (IP frequencies). I see that both the X1 DVR and the Arris TM722G use the 699 MHz channel as their primary frequency, but the SNR for the two devices is quite different.

Both come off a two-way Antronix 3.5 dB splitter (one leg to the main X1 DVR, one to the TM722G modem). Downstream levels for both devices are within -1 to +1 dBmV, and upstream power is 43-43.5 dBmV for both. However, the SNR for the X1 DVR (and for all the terminals at the additional TVs) is 40-41 dB, while the cable modem's is 36-37 dB. How is this possible when both devices are on the same run through the house (the line comes into the house to a 15 dB Antronix return amp and runs all the way up to my office to that two-way splitter)? Could it be the tuner quality on the modem? After all, SNR is mostly determined by the length of the coax run and how much noise is added by amps or ingress along the way.

I understand this isn't a critical issue: as long as the cable modem has at least 33 dB for 256-QAM, I shouldn't see bandwidth problems (and I don't). But for future-proofing at higher speeds, I just want to know whether it's the cable modem or something else causing the SNR difference.
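To put numbers on the future-proofing concern, here is a quick back-of-the-envelope sketch in Python (my own illustration, not anything from a Comcast or Arris spec) that compares the SNR readings above against commonly quoted minimums. The 64-QAM and 1024-QAM figures are assumed rule-of-thumb targets, and the level-budget function is just the amp gain minus one splitter leg's insertion loss:

    # Quick back-of-the-envelope margin check. Thresholds are commonly
    # quoted rules of thumb, not Comcast/Arris spec values.

    AMP_GAIN_DB = 15.0        # return/drop amp at point of entry (from the post)
    SPLITTER_LOSS_DB = 3.5    # 2-way Antronix splitter, per leg (from the post)

    # Rough minimum downstream SNR (MER) targets, in dB.
    MIN_SNR_DB = {
        "64-QAM": 27.0,                   # assumed rule of thumb
        "256-QAM": 33.0,                  # the figure mentioned in the post
        "1024-QAM (rough target)": 37.0,  # assumed rule of thumb
    }

    def level_after_split(level_at_amp_input_dbmv: float) -> float:
        """Approximate downstream level after the amp and one splitter leg."""
        return level_at_amp_input_dbmv + AMP_GAIN_DB - SPLITTER_LOSS_DB

    def snr_margin(measured_snr_db: float) -> dict:
        """Headroom (dB) above each modulation's rough minimum SNR."""
        return {mod: round(measured_snr_db - floor, 1)
                for mod, floor in MIN_SNR_DB.items()}

    if __name__ == "__main__":
        # SNR readings reported above: X1 DVR ~40-41 dB, TM722G ~36-37 dB.
        for device, snr in [("X1 DVR", 40.5), ("Arris TM722G", 36.5)]:
            print(device, snr_margin(snr))
        # The modem still clears 256-QAM by roughly 3.5 dB, but has little
        # headroom against a ~37 dB target for denser modulation.

By that rough math, the modem's 36-37 dB reading is fine for 256-QAM today but is the first thing that would run out of margin if the plant moved to denser modulation, which is why the 4 dB gap between the two devices is worth understanding.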
