
Wednesday, October 27, 2010

Computer Network Speed

Bandwidth in computer networking refers to the data rate supported by a network connection or interface. Network bandwidth is not the only factor that contributes to the perceived speed of a network, however: a lesser known but equally important element of network performance, latency, also plays a role.
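
As a rough illustration (not from the original article), the short Python sketch below estimates total transfer time as latency plus the time needed to push the bits through the link. The link parameters are made-up values chosen only to show that two connections with identical bandwidth can feel very different when their latency differs.

    # Illustrative sketch: how bandwidth and latency both affect transfer time.
    # The link parameters below are assumptions for the example, not measurements.

    def transfer_time(payload_bytes, bandwidth_bps, latency_s, round_trips=1):
        """Estimate transfer time: per-round-trip latency plus serialization time."""
        serialization = (payload_bytes * 8) / bandwidth_bps  # time to push the bits onto the wire
        return round_trips * latency_s + serialization

    # A 1 MB download over two hypothetical links with the same 10 Mbps bandwidth:
    fast_link = transfer_time(1_000_000, 10_000_000, latency_s=0.010)  # 10 ms latency
    slow_link = transfer_time(1_000_000, 10_000_000, latency_s=0.200)  # 200 ms latency (e.g. satellite)

    print(f"10 ms latency link:  {fast_link:.3f} s")
    print(f"200 ms latency link: {slow_link:.3f} s")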

What Is Network Bandwidth?

Bandwidth is the primary measure of computer network speed. It is the rating most people are familiar with: the bandwidth of a modem or an Internet service is prominently advertised on network products sold today.

In networking, bandwidth represents the overall capacity of the connection. The greater the capacity, the more likely that better performance will result. Bandwidth is the amount of data that passes through a network connection over time, measured in bits per second (bps).
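
To make the bps figure concrete, here is a small illustrative calculation in Python. The 100 Mbps rating used below is an assumed example, not a figure from the article.

    # Illustrative calculation: a bandwidth rating in Mbps tells you how many bits
    # can pass through the connection each second under ideal conditions.

    bandwidth_mbps = 100                       # assumed link rating, in megabits per second
    bandwidth_bps = bandwidth_mbps * 1_000_000

    seconds = 10
    bits_transferred = bandwidth_bps * seconds
    megabytes = bits_transferred / 8 / 1_000_000

    print(f"At {bandwidth_mbps} Mbps, up to {megabytes:.0f} MB can pass in {seconds} s")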

Bandwidth can refer to both actual and theoretical throughput, and it is important to distinguish between the two. For example, a standard dialup modem supports 56 Kbps of peak bandwidth, but due to physical limitations of telephone lines and other factors, a dialup connection cannot support more than 53 Kbps of bandwidth (roughly 5% less than the theoretical maximum) in practice. Likewise, a 100 Mbps Fast Ethernet network theoretically supports 100 Mbps of bandwidth, but this maximum cannot reasonably be achieved due to overhead in the computer hardware and operating systems.
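
If you want to see the gap between theoretical and actual throughput for yourself, a rough measurement can be made by timing a download, as in the hedged Python sketch below. The URL and the 100 Mbps rating are placeholders; real results depend on the test server, protocol overhead, and your hardware.

    # Rough sketch of comparing measured throughput against a theoretical rating.
    # The URL and rating are placeholders; substitute a large test file for a
    # meaningful result, since a tiny page is dominated by latency, not bandwidth.

    import time
    import urllib.request

    RATED_BPS = 100_000_000            # assumed theoretical link rating: 100 Mbps
    TEST_URL = "http://example.com/"   # placeholder test URL

    start = time.monotonic()
    data = urllib.request.urlopen(TEST_URL, timeout=10).read()
    elapsed = time.monotonic() - start

    actual_bps = len(data) * 8 / elapsed
    print(f"Actual throughput:  {actual_bps / 1e6:.2f} Mbps")
    print(f"Share of rated max: {actual_bps / RATED_BPS:.1%}")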

Broadband and Other High Bandwidth Connections

The term high bandwidth is sometimes used to distinguish faster broadband Internet connections from traditional dialup or cellular network speeds. Definitions vary, but high bandwidth connections generally support data rates of at least 64 Kbps (and usually 300 Kbps or higher). Broadband is just one type of high bandwidth network communication method.
