Re: What causes the 56kbps limit on dial-up internet connections?

On Sep 9, 9:13 am, Jeff Liebermann wrote:

snipped-for-privacy@excite.com hath wroth:

>> Usually, how much bps until this limit is reached?

> Look at a communications channel this way. You can get *ANY* speed through a bandwidth-limited channel, up to a given error rate. If your application requires a very low error rate to function, then you have to have a good signal-to-noise ratio, and your thruput will be fairly small. However, if you have a mess of forward error correction, small packets, and a very tolerant application, you might be able to squeeze some more thruput through the same channel.
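The error-rate versus throughput tradeoff described above is bounded by the Shannon-Hartley theorem. A minimal Python sketch (the 3.1 kHz bandwidth and 38 dB SNR figures are illustrative assumptions for a POTS voice channel, not numbers from the post):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A typical POTS voice channel: ~3.1 kHz usable bandwidth, ~38 dB SNR.
pots = shannon_capacity(3100, 38)
print(f"POTS capacity at 38 dB SNR: {pots / 1000:.1f} kbit/s")  # ~39 kbit/s
```

The result lands in the mid-30s of kbit/s, which is why analog-to-analog modems topped out around V.34's 33.6 kbit/s.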

So if I don't care about the errors, I can get whatever speed I want? What about getting 1 Gbps on dial-up if I have a baud of 1 symbol per second but 1 billion bits per symbol? Is that possible?
> I'm not going to expound on how V.90 works in detail. It gets messy fast. There are modulation schemes for increasing the base 600 baud modulation rate (bits per baud) to much higher bits/baud. Then add adaptive equalizers, echo cancellers, error detection, error correction, data compression, etc. Anything to squeeze more thruput into a rather ugly-looking POTS line. However, that's just between the user and the CO (central office). Once at the CO, everything gets converted to digital and the rules change.

Okay

>> I don't understand how you got that equation. Please clarify.

> Nope. I don't want to get into how a DS0 (digital) line works. You can get 64Kbits/sec out of a DS0 if you can use out-of-band signalling. However, if you're using in-band signalling, you're stuck with 56Kbits/sec. Even if the analog part of the puzzle can go faster than 56Kbits/sec, the digital part at the CO will limit the speed to 56Kbits/sec.

Why is the digital part limited to 56K?

>> Okay, but if only 1 baud is used, what is the maximum bits-per-baud that can be used on a phone line?

> 56Kbits/sec. The limit is NOT all from the analog part of the line. The analog modem glop gets converted to digital at the CO and that's limited to 56Kbits/sec. I could easily (well, maybe not so easily) get more than 56Kbits/sec thruput going between my house and the CO, but the digital thruput at the switch will limit thruput to 56Kbits/sec.

Can this be changed so that the digital throughput will go up to 1Gbit/sec instead of just 56Kbits/sec?

> The problem is worse when dealing with SLCs (subscriber line concentrators), where the analog-to-digital conversion is done outside the CO, such as with Pair Gain. The best you can do with those is perhaps 28.8Kbits/sec, mostly because the digital audio filter cuts off at a much lower frequency than the filters at the CO. Since most of the energy is in the higher-frequency part of the audio spectrum, the loss of the higher frequencies is fatal to higher-speed modem operation.

Okay. It seems from the link that digital pair gain is more efficient than its analog counterpart.

Reply to
Green Xenon [Radium]


Sure, just don't expect them to be correct. And if you don't care about errors why bother with the link at all? Just use random noise instead and free up your phone line.

There would be no point. It's meant to carry voice, and conversations are not going to benefit from a significantly higher data rate. If you simply want data, you know where to get it. The currently most popular incarnation over phone lines is probably DSL.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
Reply to
Robert Adsett

Of course not. You will have a 100% error rate at some point.

It isn't. The digital channel is 64Kbits/sec. Robbed-bit signaling has already been explained to you.

Not for your dial-up service. The reason has already been explained to you.

Reply to
Don Bowey

"Green Xenon [Radium]" hath wroth:

Yep. I've dealt with communications systems that initially send more errors than usable data. Viterbi decoders work that way. They reconfigure themselves on the fly and optimize their decoding characteristics based upon the channel characteristics of the moment. Comm systems designed to operate in the presence of jamming and interference also work that way: take the bits you can get through and retransmit the rest. For example, 802.11 was designed to interleave with microwave oven 60Hz interference. Although thruput drops drastically in the presence of microwave oven junk, quite a few packets get through. Some of the original space communications systems had more error-correction code being transmitted than data. Calculating the optimum ECC code, retransmission rate, and optimum baud rate is a major challenge.

Sure, no problem. 1024 QAM bursts with heavy ECC will do it just fine. However, with a probable 99.9999% error rate (I can work out the exact number later if anyone really wants it), requiring multiple retransmissions, you might be lucky and get perhaps 10Kbits/sec thruput. As a general rule, if you can get about 75% of the packets through without error, you have a workable system. Your 1 Gigabit system doesn't even come close.
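The "workable if ~75% of packets get through" rule can be illustrated with a simple stop-and-wait retransmission model (the packet size and bit error rate below are made-up illustrative values, not figures from the post):

```python
def goodput(raw_bps, packet_bits, bit_error_rate):
    """A packet survives only if every one of its bits does; failed
    packets are retransmitted, so goodput ~= raw rate * P(packet OK)."""
    p_success = (1 - bit_error_rate) ** packet_bits
    return raw_bps * p_success

# A nominal 1 Gbit/s link, 1500-byte packets, and a BER of 1e-3:
rate = goodput(1e9, 1500 * 8, 1e-3)
print(f"{rate:,.0f} bit/s of goodput")  # a few kbit/s despite 1 Gbit/s raw
```

With a high enough error rate, almost every packet needs retransmission and the enormous raw rate collapses to a few kbit/s, which is the point being made above.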

Because a DS0 channel is 64Kbits/sec by order of Ma Bell in her manifestation as Bellcore and as inscribed in voluminous ANSI and ITU specifications. So it is written, so it must be.

However, it's really a voice communications standard, which requires some borrowed in-band bandwidth for signalling. So, bit robbing reduces the usable data rate to 56Kbits/sec.
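The arithmetic behind those two numbers is simple: a DS0 is 8000 PCM samples per second at 8 bits each, and robbed-bit signaling makes the least significant bit unreliable, leaving a modem only 7 dependable bits per sample:

```python
SAMPLES_PER_SEC = 8000   # DS0 PCM sampling rate
BITS_PER_SAMPLE = 8

# Clean 8-bit channel (out-of-band signalling):
ds0_rate = SAMPLES_PER_SEC * BITS_PER_SAMPLE            # 64,000 bit/s

# Robbed-bit (in-band) signalling steals the LSB, so only 7 bits
# per sample can be trusted end to end:
usable_rate = SAMPLES_PER_SEC * (BITS_PER_SAMPLE - 1)   # 56,000 bit/s

print(ds0_rate, usable_rate)  # 64000 56000
```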

This is the 2nd time I explained this. Is there a problem?

I suppose an act of God or miracle that modifies physics might change it. Also, some rather bizarre quantum effects imply that it's possible for the data to arrive before it's sent, thus increasing the channel bandwidth. Perhaps a wormhole might help. Otherwise, I don't think there's anything that you can do to inspire Ma Bell or have her change the way the telephone network operates.

Think of it this way. Ma Bell operates a sewer system. It has big pipes, medium-size pipes, and small pipes. Amazingly, by the judicious application of acronyms, it is possible for your home drain pipe to shove more sewage at the CO (central office) switch than it can handle. Its pipe is slightly smaller than your home drain pipe. It doesn't matter how much dreck you shove down your sewer pipe; the size of Ma Bell's pipe at the CO limits your capacity. You could infinitely increase the size of your drain pipe, but Ma Bell will only pass a limited amount. If your data doesn't pass the first time, flush again later when the channel clears.

--
Jeff Liebermann     jeffl@cruzio.com
150 Felker St #D    http://www.LearnByDestroying.com
Santa Cruz CA 95060 http://802.11junk.com
Skype: JeffLiebermann     AE6KS    831-336-2558
Reply to
Jeff Liebermann

Jeff Liebermann hath wroth:

Oops. Using a 600 baud base modulation rate, 1 Gigabit/sec will require: 1*10^9 bits / 600 baud = 1.66*10^6 bits/baud. That means a constellation of 2^1,660,000 points (for scale, a mere 21 bits/baud is already 2,097,152-point QAM), which isn't going to happen over any kind of real audio channel.
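The same arithmetic, worked in Python. The required constellation of 2 to the ~1.67-million points is so large that merely counting its decimal digits is absurd:

```python
import math

bit_rate = 1e9     # target: 1 Gbit/s
baud_rate = 600    # base modulation (symbol) rate

bits_per_symbol = bit_rate / baud_rate   # ~1.67 million bits per symbol

# The constellation would need 2**bits_per_symbol points. Computing
# that number directly is hopeless, so count its decimal digits:
decimal_digits = bits_per_symbol * math.log10(2)   # ~500,000 digits

print(f"{bits_per_symbol:,.0f} bits/symbol; constellation size "
      f"has ~{decimal_digits:,.0f} decimal digits")
```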

We return you now to the Sci-Fi channel.

Reply to
Jeff Liebermann

"Green Xenon [Radium]" hath wroth:

I think I also answered that. It's because Ma Bell, when they set the specifications for the characteristics of MUX channels, groups, super-groups, and the digital technologies that they use, decided that 1/24th of a T1 is going to be a single DS0 voice channel with a bandwidth of 64Kbits/sec. These standards are not arbitrary and were (hopefully) carefully calculated to maximize the number of voice channels in the available bandwidth. This was long before modems became popular. Even today, Ma Bell will not guarantee anything faster than V.32 (9600 bit/s) speeds. Some telcos still only guarantee V.22 (1200 bit/s).
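For reference, the T1/DS0 arithmetic works out as follows: a T1 time-division multiplexes 24 DS0 channels, plus one framing bit per 193-bit frame at 8000 frames per second:

```python
DS0_RATE = 64_000      # bit/s per voice channel (8000 samples * 8 bits)
CHANNELS_PER_T1 = 24   # DS0s multiplexed onto one T1
FRAMING_RATE = 8_000   # 1 framing bit per frame * 8000 frames/s

payload = DS0_RATE * CHANNELS_PER_T1     # 1,536,000 bit/s
t1_line_rate = payload + FRAMING_RATE    # 1,544,000 bit/s, the familiar T1 rate

print(payload, t1_line_rate)  # 1536000 1544000
```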
Reply to
Jeff Liebermann

"Green Xenon [Radium]" hath wroth:

Your modem will transmogrify into a quantum black hole and suck both you and your ideas into another dimension.

This is a clone of the question you asked about 2 months ago. Same answer as before. You cannot easily modulate a low frequency carrier (1Hz) with a very high modulation frequency (1GHz). Well, actually you can do it; it's just that the results will be worthless for doing anything useful.

For starters, the short-term frequency stability (jitter) of the 1Hz carrier will need to be at least half the modulation frequency in accuracy. Otherwise, the modulation bits cannot maintain their position in the constellation diagram. That's at least 1 part in 10^22 accuracy. A good cesium atomic clock might be good for 1 part in 10^14, so this isn't going to happen.

This might be of some interest: "List of Device Bandwidths".

Reply to
Jeff Liebermann

"Green Xenon [Radium]" hath wroth:

Yes and no.

Yes, they could change out all the telephone infrastructure to support such data rates over what is now a DS0.

The fact is they just won't. There are other more practical and cost effective means to get high-speed data over twisted pair than "opening up" the voice channel bandwidth.

Reply to
Gary Tait

Actually, being able to achieve a *bit* error rate of 100% would be wonderful -- to fix it you just flip the bit!

:-)

Bit error rates of 50% are essentially the same as "random noise," of course.
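The joke has a real information-theoretic point: a deterministic 100% BER channel is lossless (just invert every bit), while a 50% BER channel carries no information at all. A quick sketch (the function names are illustrative):

```python
import random

def channel(bits, ber):
    """Flip each bit independently with probability `ber`."""
    return [b ^ (random.random() < ber) for b in bits]

data = [random.randint(0, 1) for _ in range(10_000)]

# 100% BER: every bit is flipped, so inverting the output recovers
# the input exactly -- the channel is perfectly informative.
recovered = [b ^ 1 for b in channel(data, 1.0)]
assert recovered == data

# 50% BER: each output bit is a fair coin flip, statistically
# independent of the input -- pure noise.
noisy = channel(data, 0.5)
```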

Reply to
Joel Kolstad
