
Difference between MHz and Mbits/sec

yersinia29 · Thinker · Joined Apr 23, 2004 · Messages: 165
I'm looking at cabling and splitters that are rated 1-1000 MHz.

Is MHz interchangeable with Mbits/sec? These are both supposedly measures of bandwidth.

If a cable/splitter is rated at 1000 MHz, does that mean the maximum throughput speed is 1000 Mbits/sec?
 
No.

MHz is a measure of signal frequency
Mbits/sec is a measure of data rate

The two are related, but not equal. For instance, Wireless LAN has a bandwidth of 22 MHz, but its signal rate varies depending on the signalling used and can achieve a data rate of 1.25 Mbits/sec to 11 Mbits/sec IIRC.

Also, MHz can apply to more than one thing. It can refer to the central or carrier frequency of a signal, or to the width of the frequency range the signal occupies (its bandwidth).

You're dealing with cable, and I believe 1000 MHz is sufficient, but not having dealt with it in a while I am not sure.

Walt

Edit: I just checked. I believe the cable into your house is RG-59, which can in theory handle 2 GHz. However, I don't believe the signal that your provider shoves down the cable uses all of that. So I still think you're safe at 1 GHz, but 2 GHz is guaranteed safe.
 
No.

The frequency range of a splitter specifies its bandwidth. This is usually the frequency range over which the device will pass a signal with less than a 50% power loss (i.e., the -3 dB points).

Relating bits per second to the frequency requirement of the transmission system is more complicated.

It depends on the encoding technique used to transmit the data, the signal to noise ratio of the transmitting medium, and the bandwidth of the transmitting medium. Shannon's law is an equation which predicts the maximum possible throughput of a transmission medium given the bandwidth and the signal to noise ratio of that medium.

Take a phone line as an example. The bandwidth of a phone line is somewhere around 3000 Hertz. A standard modem can transmit data at about 25000 bits per second (without compression) through that line, given the signal to noise ratio of the telephone line. (Ignore the issue of 56K modems for now.)

That 25000 bits per second is right at the limit predicted by Shannon's law. This suggests that the encoding technique used by the modem is approximately optimal, because Shannon's law predicts that no matter what encoding technique you use, you won't be able to get better throughput than that.

Brief digression to discuss 56K modems: these improve on the 25000 bits per second by improving the signal to noise ratio, by eliminating some processing in the central office. The same trick can't be done for the uplink direction, so uplink transmission is limited to about 25000 bits per second.

It is possible to understand Shannon's law intuitively. Imagine, if you will, that a particular medium had an infinite signal to noise ratio. This would allow a system to be designed with an infinite number of voltage levels to represent an infinity of numbers. You can imagine that such a system would be able to transmit data infinitely fast. Conversely, if you had a transmission medium that had an infinite bandwidth, you could transmit data infinitely fast by just turning the signal on and off to represent your data. So clearly signal to noise ratio and bandwidth are the two parameters that are going to limit the ability of any transmission medium to transmit data.
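
If you want to put numbers on that, Shannon's law is just C = B * log2(1 + S/N). Here is a rough sketch in Python; the 3000 Hz bandwidth is the phone line figure from above, while the SNR values are assumptions, since a real line's signal to noise ratio varies:

Code:
import math

def shannon_capacity(bandwidth_hz, snr_db):
    # Shannon's law: C = B * log2(1 + S/N), with S/N as a plain power ratio
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Phone-line example from above: roughly 3000 Hz of bandwidth.
# The SNR values are assumptions; a real line varies.
for snr_db in (25, 30, 35):
    c = shannon_capacity(3000, snr_db)
    print(f"B = 3000 Hz, SNR = {snr_db} dB -> C is roughly {c / 1000:.1f} kbit/s")

At around 25 dB of SNR this gives roughly the 25000 bits per second mentioned above; a cleaner line pushes the ceiling into the low 30s of kbit/s.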
 
So is a cable rated at 1000 MHz "good enough" to support a speed of, say, 10 Mbits/sec? Does somebody know the real numbers that I can use to calculate the maximum throughput for a given MHz rating?

Sorry, I don't know what the SNR and other parameters are. The manufacturer just reports the bandwidth and none of the other stuff.
 
I've been having problems with my cable modem/internet connection.

There is one main outlet in my 3 story house. The circuit setup is thus:

1) Main outlet
2) 20 feet of coax
3) Splitter
4) 20 feet of coax
5) Splitter
6) 20 feet of coax
7) Splitter
8) 5 feet of coax
9) cable modem

When I first hooked up the cable modem, it was able to receive/transmit data just fine. However, the next morning, it suddenly stopped working. It couldn't connect to the internet.

So, I backed up the cable modem behind the first splitter (each splitter is 3 dB of attenuation, which, if I'm interpreting it correctly, represents a 50% loss in signal power). When I hooked it up here, it worked fine for 1 day. The next morning (it always works fine at night, then stops working in the morning) it stopped working again.

So I backed up the modem behind the 2nd splitter, and it worked fine for 1 day. However, the very next day, you guessed it, it stopped working again.

I'm assuming that the signal is too weak for the cable modem and that I have to plug it directly into the main outlet with no splitters in between.

My question is, why did the cable modem work at all? If the signal is too weak, why did it consistently work for 1 day and then stop all of a sudden?

Does the traffic of the surrounding network in the neighborhood affect my signal reception? This just doesn't make any sense to me.
 
Walter Wayne said:
The two are related, but not equal. For instance, Wireless LAN has a bandwidth of 22 MHz, but its signal rate varies depending on the signalling used and can achieve a data rate of 1.25 Mbits/sec to 11 Mbits/sec IIRC.

Wireless stuff nowadays is almost always 2.4 GHz. Assuming it has 802.11 compliance, this can give you throughput of 11 Mbps (for b) or 54 Mbps (for g).

yersinia29 said:
My question is, why did the cable modem work at all? If the signal is too weak, why did it consistently work for 1 day and then stop all of a sudden?

Might it be something to do with the time of day? One of my biggest annoyances with my cable is that during the middle of the day, I can't get the highest 2 channels that I receive. One is a Latin channel, which I don't mind, but the other is a subscriber channel. The cable company has given me no satisfactory explanation.

My suggestion for you is to get some drop amps set up, or stop splitting the signal so much, or use a larger splitter and a longer piece of cable.
 
Fade said:
Might it be something to do with the time of day?

My suggestion for you is to get some drop amps set up, or stop splitting the signal so much, or use a larger splitter and a longer piece of cable.

Absolutely, and this is where my thoughts are leaning. It is now Saturday night at 11 PM, and everything works fine.

However, come Monday morning, I bet it will stop working.

I thought cable internet was relatively immune to how many people are connected and that DSL was the service that suffered under that scenario.
 
I'm not sure about cable, but DSL is point to point: each subscriber has a separate pair of wires and a separate port in the phone company's office, along with their own modem. Of course, all that traffic is then concentrated onto some shared network connection like a T1 or T3, but the subscriber line is individual.

I think cable is the other way - you're basically sharing your coax with a bunch of other people, as they're all tied together at the street. It's possible somebody else has a piece of equipment that louses up the line when they turn it on, if it's badly grounded or something like that. I have a crappy VCR that makes our analog cable go fuzzy on higher channels when it's turned on...

Wiring around the extra splitters is a good plan - each of those is going to add some loss to the signal. Also, if you have any wall jacks that aren't in use, you might want to try disconnecting those at the splitters: unterminated connections can cause signal reflections at RF frequencies, which can interfere with your connection.
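
To put rough numbers on the splitter losses: dB losses simply add along the chain, and each 3 dB is roughly a halving of signal power. A quick sketch, assuming the nominal 3.5 dB per-port loss of a typical 2-way splitter (the 3 dB figure quoted earlier is the ideal split alone):

Code:
def power_fraction_remaining(db_loss):
    # Convert a dB loss into the fraction of signal power that survives
    return 10 ** (-db_loss / 10)

SPLITTER_DB = 3.5  # assumed per-port loss of a typical 2-way splitter

for n in range(4):
    total_db = n * SPLITTER_DB                 # losses in dB add along the chain
    frac = power_fraction_remaining(total_db)
    print(f"{n} splitters in line: -{total_db:.1f} dB, about {frac * 100:.0f}% of the power left")

So by the third splitter you're down to roughly a tenth of the original signal power, before even counting the coax runs.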

Regarding wiring, I think RG-6 is used for higher-frequency signals like digital satellite. It's the same impedance as RG-59, but higher quality with better shielding and lower loss over distance. Over short distances it shouldn't matter, though.
 
To answer your original question:

Megahertz is a measure of frequency. Something that says it works "up to 1 gigahertz" is saying it has a 1 gigahertz bandwidth (give or take, since we don't see any specification of how many dB down it is, etc., and I've met more than one bit of equipment that doesn't figure things in the usual way), although I'm sure it has a low-frequency cutoff, too (well, that depends on the kind of splitter, really).

Bits/second (and its scaled counterparts) are measures of information flow. Maximum bits/second is related to frequency by the Shannon channel capacity theorem.

If you want to know more, ask.
 
Fade said:
Wireless stuff nowadays is almost always 2.4 GHz. Assuming it has 802.11 compliance, this can give you throughput of 11 Mbps (for b) or 54 Mbps (for g).

The 2.4 GHz referred to above is roughly the carrier frequency for 802.11 transmission. Data throughput is not directly a function of this. For instance, they could have chosen 1 GHz for the carrier frequency and still had 54 Mbps throughput.
 
Fade said:
Wireless stuff nowadays is almost always 2.4 GHz. Assuming it has 802.11 compliance, this can give you throughput of 11 Mbps (for b) or 54 Mbps (for g).

I mentioned the bandwidth (22 MHz) as it pertains to throughput more than carrier frequency. The signal is in the 2.4 to 2.485 GHz band as you mention.

Walt
 
I just realized some of what people have been saying, especially me, may be a little misleading with respect to your question.

The cable in a cable modem system carries the entire bandwidth of cable TV programming, including the cable modem data. Shannon's law and all those other musings are a little off topic in this situation. Most US cable TV systems use a maximum frequency of about 750 MHz, although when I was in the cable TV business about 4 years ago there was a move afoot to increase that to 1.5 GHz to support more channels through the same cable. I don't know where that got to, though.

If you have a cable company that is transmitting above 750 MHz, you probably should be using splitters designed for 1.5 GHz. They cost about three times as much as the 750 MHz versions but they're still pretty cheap. If nobody has mentioned this yet, there are four-way splitters available. It is better to use one four-way splitter than to gang multiple two-way splitters together to get more than two cable TV connections.

Each US television channel takes up about 6 MHz, so the total cable bandwidth is broken up into lots of 6 MHz chunks, some of which are used for cable modems. There were two cable modem standards in common use back then. They had names like QAM 64 and QAM 256 as I recall. The lower-throughput one could deliver about 24 Mbps and the higher-throughput one about 40 Mbps, as I recall.
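
As a back-of-the-envelope check on those figures (the roughly 5 million symbols per second in a 6 MHz channel is an assumption, and framing plus error correction eat into the raw rate):

Code:
import math

SYMBOL_RATE = 5e6  # assumed symbol rate in a 6 MHz cable channel, roughly

for constellation in (64, 256):
    bits_per_symbol = math.log2(constellation)   # QAM 64 -> 6 bits, QAM 256 -> 8 bits
    raw_rate = bits_per_symbol * SYMBOL_RATE
    print(f"QAM {constellation}: about {raw_rate / 1e6:.0f} Mbps raw, before overhead")

That works out to roughly 30 and 40 Mbps raw, which is in the same ballpark as the figures above once overhead is subtracted.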

The reason that the cable modem from your viewpoint is 10 Mbps is that 10 Mbps is the throughput of the Ethernet connection between your cable modem and your computer. The actual data pumped through the RF cable is considerably faster than 10 Mbps, and that helps compensate a bit for the fact that you are sharing the cable channel with lots of other users.

On the practical side, I just spent a considerable length of time investigating a problem with my brother's computer and cable modem before I discovered that an RF router was causing enough interference to screw up the modem. This was true even when the RF router wasn't connected to anything other than power. The solution of this problem led to the solution of another mystery: my brother's cable modem connection had never worked as well after he got his new computer as it had before. It turns out that when he got the new computer he bought an RF mouse, which was also causing some interference with the cable modem, and moving the transmitter improved things immediately.
 
yersinia29 said:
My question is, why did the cable modem work at all? If the signal is too weak, why did it consistently work for 1 day and then stop all of a sudden?

Does the traffic of the surrounding network in the neighborhood affect my signal reception? This just doesn't make any sense to me.

Fade said:
Might it be something to do with the time of day?

My suggestion for you is to get some drop amps set up, or stop splitting the signal so much, or use a larger splitter and a longer piece of cable.


The amount of signal power your cable modem receives from the ISP should, and almost always does, fall into the -8 to 8 dBmV range. If this figure falls outside the -15 to 15 dBmV range, the modem will not operate at all. Rarely does the downstream power have anything to do with problems such as the ones you are experiencing.

Your problem is with loss, caused by excessive noise, resulting in too low a SNR (signal-to-noise ratio) on the downstream portion of the cable between your modem and the UBR. You do not have to understand this terminology in order to check the status of your cable modem. Just understand that for downstream, your SNR should be 30 dB or higher, and power should be between -8 and 8 dBmV.
Upstream SNR can only be measured at the UBR end of the cable. Upstream power level must be less than 55 dBmV.
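
If you want to check readings against those rules of thumb automatically, here's a small sketch (the thresholds are just the ones quoted above, nothing official, and the example readings are made up):

Code:
def check_modem(downstream_snr_db, downstream_power_dbmv, upstream_power_dbmv):
    # Compare readings against the rule-of-thumb ranges quoted above
    problems = []
    if downstream_snr_db < 30:
        problems.append(f"downstream SNR {downstream_snr_db} dB is below 30 dB")
    if not -8 <= downstream_power_dbmv <= 8:
        problems.append(f"downstream power {downstream_power_dbmv} dBmV is outside -8 to 8 dBmV")
    if upstream_power_dbmv >= 55:
        problems.append(f"upstream power {upstream_power_dbmv} dBmV is not below 55 dBmV")
    return problems or ["all readings within the suggested ranges"]

# Made-up example readings, just to show the output
for line in check_modem(28, -10, 58):
    print(line)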

Go to this web page: http://192.168.100.1/signal.html

This will display your cable modem status.

Yersinia29, test your downstream SNR under different scenarios, such as elimination of splitters, proper termination, etc. If your SNR drops to between 24 and 29 dB, you will have problems like those you described.

My suggestion: go to Radio Shack and buy (for about $40) a bidirectional cable amplifier, install it, turn it to maximum level, then test your SNR. Problem solved, most likely. Local traffic should not be an issue. If anything, changes in temperature affect SNR more than anything else.
 
These are my current parameters:


Signal to Noise Ratio: 33 dB

Downstream Power: -11 dBmV

Upstream Power: 62 dBmV
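
Checking those readings against the rule-of-thumb ranges from the previous reply (nothing official, just the thresholds quoted there):

Code:
snr_db, downstream_dbmv, upstream_dbmv = 33, -11, 62   # the readings above

print("downstream SNR of 30 dB or better:", snr_db >= 30)                   # True
print("downstream power within -8 to 8 dBmV:", -8 <= downstream_dbmv <= 8)  # False
print("upstream power below 55 dBmV:", upstream_dbmv < 55)                  # False

So the SNR itself looks fine, but both power levels fall outside the suggested ranges.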

I bought an amplifier, but I think it's the wrong kind. It doesn't say anything about being bidirectional or compatible with cable modems. It's a 4-way 10 dB UHF/VHF/FM amplifier, 50-900 MHz.

I guess this kind of amp is only for cable TV and not for cable modems.

Also, I noticed that the cabling is all RG-59, which I heard is not shielded well. How much does RG-6 or RG-11 cost?
 
Walter Wayne said:
I mentioned the bandwidth (22 MHz) as it pertains to throughput more than carrier frequency. The signal is in the 2.4 to 2.485 GHz band as you mention.

Walt

I realized this hours and hours later when I thought about it. Forgive me for conflating principles.
 
No problem. It probably needed clarification anyway, as the OP was about the frequency range of cable, and I should have been clear on all three terms (frequency, bandwidth, data rate).

And no need to ask for forgiveness.

Walt
 
