Bandwidth (computers)

In the field of computers, bandwidth refers to the maximum amount of data that can be moved through a connection. In the sense that it literally means "width of the band", it can also refer to how many bits wide a connection is. For instance, a connection that is 16 bits wide must split up a 32-bit double word in order to transmit it.
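To make that last point concrete, the short Python sketch below (illustrative only; the value and helper name are not from the article) shows how a 32-bit double word would be split into the two 16-bit pieces that a 16-bit-wide connection can carry.

def split_double_word(value: int) -> tuple[int, int]:
    """Return the high and low 16-bit halves of a 32-bit value."""
    high = (value >> 16) & 0xFFFF   # upper 16 bits, sent in one transfer
    low = value & 0xFFFF            # lower 16 bits, sent in a second transfer
    return high, low

high, low = split_double_word(0x12345678)
print(hex(high), hex(low))  # 0x1234 0x5678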

Confusion with Throughput

Bandwidth is sometimes used informally to mean throughput, the amount of data that can be transferred within a set amount of time. For instance, a USB 2.0 port is said to have a theoretical bandwidth of 480 Megabits/second, or an internet connection a bandwidth of 1.5 Megabits/second. In these instances, while throughput may be the technically accurate term, bandwidth is more commonly used.

In a synchronous system, throughput is the amount of data per transfer multiplied by the number of transfers per second. A 64-bit (8-byte) bus operating at 200 Megahertz would be capable of 1.6 Gigabytes per second of throughput.
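That arithmetic can be checked directly; the Python sketch below (an illustration, not part of the original text) applies the formula to the 64-bit, 200 MHz bus.

# Throughput of a synchronous bus = bytes per transfer * transfers per second.
bus_width_bytes = 8            # 64 bits per transfer
transfers_per_second = 200e6   # 200 Megahertz clock, one transfer per cycle

throughput_bytes_per_second = bus_width_bytes * transfers_per_second
print(throughput_bytes_per_second / 1e9, "GB/s")  # 1.6 GB/s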

Comparison with Latency

Bandwidth is often referred to as the "speed" of a connection. However, latency plays as large a role (if not a larger one) in the speed of many connections. Latency measures the amount of time required for data to cross a bus or connection. The effect of latency can be seen in a brief (very simplified) example: A user wishes to view a 4 MB file located on another computer. The user sends the request, which takes 0.5 seconds to reach the host. The host then sends the file back along the same route, at 1 MB/s. In this example, it will take the user 5 seconds to get the file: 0.5 seconds for the request to reach the host computer, 0.5 seconds for the first pieces of the file to arrive after being sent, and 4 seconds for the entire file to move through the connection.
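The timing in that example reduces to a simple sum of round-trip latency and transfer time; the Python sketch below (illustrative only, with names chosen for this example) reproduces it.

# Simplified timing from the example above: latency plus transfer time.
one_way_latency_s = 0.5    # time for the request (or the first data) to cross the link
file_size_mb = 4           # size of the requested file
bandwidth_mb_per_s = 1     # rate at which the host sends the file

total_time_s = (2 * one_way_latency_s                  # request out, first bytes back
                + file_size_mb / bandwidth_mb_per_s)   # whole file moves through the link
print(total_time_s, "seconds")  # 5.0 seconds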