Ken, good question!
They both measure the same thing - bitrate and bandwidth both express the amount of data transferred over time (bits per second).
In common usage, bandwidth more often refers to network speeds (100Mb/s, Gigabit, 10Gig, etc.), while bitrate refers to the bandwidth consumption of a device (e.g., this camera has a bitrate of 2Mb/s).
Generally, you can use the terms interchangeably and most people in the industry will know what you are referring to.
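Since both terms boil down to bits per second, the "data over time" point is easy to show with quick arithmetic. Here's a minimal Python sketch (the 2 Mb/s figure is just the example bitrate from above, and the function name is my own) that converts a camera's bitrate into the data it consumes per hour:

```python
# Sketch: converting a stream's bitrate into data consumed over time.
BITS_PER_MEGABIT = 1_000_000
SECONDS_PER_HOUR = 3600

def data_per_hour_gb(bitrate_mbps: float) -> float:
    """Data a stream consumes in one hour, in decimal gigabytes."""
    bits = bitrate_mbps * BITS_PER_MEGABIT * SECONDS_PER_HOUR
    return bits / 8 / 1_000_000_000  # bits -> bytes -> GB

print(data_per_hour_gb(2.0))  # a 2 Mb/s stream ~= 0.9 GB per hour
```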
Btw, people generally don't use the expression "0,950 bitrate". Also, you will rarely (in 2015) see a bitrate / stream / bandwidth under 1kb/s anyway.
IPVMU Certified | 05/10/15 12:54am
Without taking anything away from John's explanation, I would also add that bitrate and bandwidth have some nuanced distinctions that can help when deciding which word to use.
One difference is that the actual bitrate never exceeds the available bandwidth. Bandwidth is often used to mean the maximum bitrate a medium is capable of, which gives rise to the common usage John cited above regarding devices and networks.
It may also refer to a portion of that total capacity, e.g. used/unused bandwidth.
So you will typically hear:
- We purchased more bandwidth (not bitrate) for the WAN.
- The camera was streaming at a bitrate (not bandwidth) of 2 Mbps.
- The total bandwidth (not bitrate) is insufficient to support the higher bitrates (not bandwidth) created by the cameras.
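That last example - total camera bitrates exceeding the link's capacity - is really just a sum-and-compare check. A small Python sketch (all figures are made up for illustration) of how the used/unused bandwidth distinction plays out:

```python
# Sketch: comparing the cameras' combined bitrate against the link's bandwidth (capacity).
camera_bitrates_mbps = [2.0, 4.0, 6.0, 8.0]   # hypothetical per-camera bitrates
link_bandwidth_mbps = 100.0                    # hypothetical uplink capacity

used = sum(camera_bitrates_mbps)               # total bitrate actually consumed
unused = link_bandwidth_mbps - used            # remaining (unused) bandwidth

print(f"Used: {used} Mb/s of {link_bandwidth_mbps} Mb/s ({unused} Mb/s unused)")
if used > link_bandwidth_mbps:
    print("Total bandwidth is insufficient for the cameras' combined bitrate.")
```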