Bandwidth (computing)

Bandwidth refers to the maximum rate at which data can be transferred across a given path. In computing it may be characterized as network bandwidth, data bandwidth, or digital bandwidth. In signal processing, bandwidth instead refers to the frequency range occupied by an analog signal.
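As a rough illustration of the definition, a minimal sketch (with hypothetical values) of how bandwidth relates data size to ideal transfer time:

```python
def transfer_time_seconds(data_bytes: float, bandwidth_bits_per_sec: float) -> float:
    """Ideal (best-case) time to move data_bytes over a link of the given
    bandwidth, ignoring latency and protocol overhead."""
    return (data_bytes * 8) / bandwidth_bits_per_sec

# Example: a 1 GB file over a 100 Mbit/s link.
one_gb = 1_000_000_000   # bytes
link = 100_000_000       # bits per second (100 Mbit/s)
print(f"{transfer_time_seconds(one_gb, link):.0f} s")  # 80 s
```

Real transfers take longer, since latency, congestion, and protocol overhead reduce effective throughput below the nominal bandwidth.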

1 course covers this concept

CS 149 PARALLEL COMPUTING

Stanford University

Fall 2022

Focused on principles and trade-offs in designing modern parallel computing systems, this course also teaches parallel programming techniques. It is intended for students looking to understand both parallel hardware and software design. Prerequisite knowledge in computer systems is required.
