Given a channel with particular bandwidth and noise characteristics, Claude Shannon SM ’37, PhD ’40 showed how to calculate the maximum rate at which data can be sent over it with an arbitrarily small error rate. He called that rate the channel capacity, but today, it’s just as often called the Shannon Limit.
Shannon, who taught at MIT from 1956 until his retirement in 1978, showed that any communications channel — a telephone line, a radio band, a fiber-optic cable — could be characterized by two factors: bandwidth and noise. Bandwidth is the range of electronic, optical or electromagnetic frequencies that can be used to transmit a signal; noise is anything that can disturb that signal. —MIT News, “Explained: The Shannon limit”
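For a channel limited by additive white Gaussian noise, the two factors above combine in the Shannon–Hartley formula, C = B · log₂(1 + S/N), where B is bandwidth in hertz and S/N is the linear signal-to-noise power ratio. A minimal sketch in Python, using illustrative (not measured) figures for a voice-grade telephone line:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical telephone-line figures: ~3 kHz of bandwidth, 30 dB SNR.
snr = 10 ** (30 / 10)           # convert dB to a linear power ratio (here 1000)
capacity = shannon_capacity(3000.0, snr)
print(round(capacity))          # roughly 30,000 bits per second
```

The formula gives only the ceiling; Shannon's theorem guarantees that codes approaching this rate with arbitrarily low error exist, without saying how to construct them.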
The Shannon Limit or Shannon capacity of a communication channel refers to the maximum rate of error-free data that can theoretically be transferred over the channel if the link is subject to random data transmission errors, for a particular noise level. It was first described by Shannon (1948), and shortly after published in a book by Shannon and Warren Weaver entitled The Mathematical Theory of Communication (1949). This founded the modern discipline of information theory. —Wikipedia, “Noisy-channel coding theorem”