How Do I Explain Analog To A Millennial?

I was recently talking with a non-technical lady in her mid-20s about analog vs. digital surveillance cameras. She floored me by asking, "What's analog?"

Instinctively I thought it was an easy question to answer, but the words didn't come easily. I can point at plenty of examples of analog and digital technologies, but I was frustrated that I didn't have a neat, conceptual way to explain, "What is analog and what is digital?"

I'd like to find a good way to describe the difference without doing either of the following:

  1. pointing at something like a bulky CRT TV vs. a flat screen TV and saying, "That's the difference." This doesn't really answer the fundamental question of what it means for a device to be analog or digital. This is complicated by all the shades of gray, such as an LCD TV supporting both analog and digital signals!
  2. an overly technical lesson in analog vs digital electronics which will leave non-technical customers with glazed eyes.

Is there a neat, conceptual way to describe analog vs. digital technologies to customers? Perhaps it is too difficult unless one narrows the topic down to the use of these technologies in video surveillance cameras?

I'd greatly appreciate hearing from any wordsmiths who know how to answer this seemingly simple question.


How many surveillance system buyers are in their mid 20s? I am not trying to belittle the question, it just strikes me as not being very common.

To the extent that it happens, I would say something like, "You know the VCR at grandma's house? That's like analog," or "You know how really old YouTube videos look, like from way back in 2011? That's like analog."

Of course, I am omitting the NTSC / PAL SD analog vs HD analog, but I am presuming the question is more about the old legacy stuff.

"You know how when you watch really old YouTube videos, like from way back in 2011, that's like analog."

Old surveillance videos? Outside of that, 99% of YouTube videos have been digital since capture, 2011 same as today, no?

I see little point trying to teach a non-technical end user about the technical issues of analog. The main material difference for end users is resolution / pixel count / quality, ergo the 2011 YouTube videos that were encoded at ~240p back then.

Though simplifying the message to Digital = Hi-res = Good and Analog = Low-res = Bad, runs right smack into the objection of AnalogHD.

While it's true that the highest resolution cameras are digital, the majority of cameras a 'non-technical end-user' is likely to be considering are of HD resolution, and are rendered equally well by both technologies, so the 'main material difference for end-users' would not be resolution. Practically speaking.

Yes, I am presuming she was asking / talking about traditional SD analog.

For a non-technical end user, I doubt they care about transmission technology as much as they do about image quality, ergo the YouTube 2011 vs. YouTube 2014 comparison (where I would put MP IP and analog HD).

How many surveillance system buyers are in their mid 20s?

Probably not a lot but the lady's parents have a very successful business and have alarmingly good connections so I was keen to give good answers.

I greatly appreciate all the effort everyone has put in to answering my questions in this thread. As I read all the posts, I realized the answer I would like to have given should have focused on what difference analog and digital technologies would make to the user, rather than a more technical explanation. To that end, I believe the following general explanations might suffice:

  1. Michael Bailey's post reminded me that once a signal is stored digitally, it can be transmitted and retransmitted and won't change. It will be perfect at the receiving end. By contrast, analog signals are subject to noise and interference and will change as they are transmitted. These changes lead to a degradation in image quality.
  2. Each analog signal needs its own cable. An analog camera may need one cable for composite analog video, two cables for left and right audio, and another for command and control signals between the video recorder and the camera. By comparison, a digital transmission can simultaneously send multiple types of signals along a single cable. This is much neater and helps keep signals in sync with each other, e.g. so the audio matches the video.

While exceptions to both of these can be found, I believe they are generally true and should cover the main differences that a user might notice. Please let me know if you think I have overlooked any other important differences, or if you think this information just isn't right.
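The first point above, that a digital signal can be retransmitted without change while an analog one picks up noise at every step, can be sketched as a toy simulation in Python. The function names and noise figures here are my own invention, purely for illustration: analog levels accumulate noise at every hop, while digital bits get snapped back to clean 0/1 values after each hop.

```python
import random

random.seed(42)

def analog_relay(signal, hops, noise=0.05):
    """Re-transmit analog levels; noise accumulates at every hop."""
    for _ in range(hops):
        signal = [s + random.uniform(-noise, noise) for s in signal]
    return signal

def digital_relay(bits, hops, noise=0.05):
    """Re-transmit bits as 0.0/1.0 levels, re-thresholding at each hop."""
    for _ in range(hops):
        noisy = [b + random.uniform(-noise, noise) for b in bits]
        bits = [1.0 if v > 0.5 else 0.0 for v in noisy]  # snap back to 0/1
    return bits

original = [0.2, 0.8, 0.5, 0.9]
bits = [0.0, 1.0, 1.0, 0.0]

print(analog_relay(original, hops=20))  # generally drifts away from the original
print(digital_relay(bits, hops=20))     # identical to the input bits
```

After 20 hops the analog values have wandered away from what was sent, while the digital values are still exactly the original bits, which is the "perfect at the receiving end" behaviour described above.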

Glad I could help. I'll provide an address to send the fabulous prize that I'm sure is involved ;-)

One other point that helps explain the difference between analog and IP cameras is key: the very best 750 TVL analog cameras available are essentially 0.4MP or thereabouts. IP cameras start at 1.3MP, effectively 4x the pixels of detail (yes, John, there are many, many other variables!). New HD-TVI and HD-CVI cameras top out at 2MP in case the discussion takes that route.

The other main selling point I've found when explaining is this: analog camera DVRs typically come in 8- and 16-channel versions, and that hardware is added in these larger 'chunks' (the 17th camera is the most expensive!). Plus they are each cabled singly back to the headend.

IP cameras can be added one at a time, and connected to the LAN pretty much anywhere. Much, much more flexible and cost-effective.

Hi Michael,

Thank you for your three extra points.

The difference in resolution is a valid point, although it can get tricky, as standard definition isn't limited to traditional analog: standard-definition IP cameras exist too. Dahua have publicly responded to the idea of developing 4K cameras using HD-CVI, so the 2MP limitation may only be temporary.

The latter two points are also good ones to mention. Thanks very much! You are definitely worthy of an elephant stamp!

Elephant stamp

Sweet!

Analog = elephant stamp

Digital = time stamp ?

Hi Tedor, in breaking news, Dahua have just released an elephant with a time stamp on it. No one is quite sure what to call it.

Try this easy analogy:

Analog is a direct representation of the pictures and sound, like the way you can see a speaker wiggle when the volume gets loud, or the way a ripple moves across the water when you throw a rock in a pond. The recording, storing, transmitting and reproduction of those waves can never be 100% perfect, and they get rounded down and diluted pretty quickly. The level of detail originally captured (resolution) is limited by the equipment involved in every step of the process. These recordings are essentially stored miniatures of the original, with distortion and noise messing with their accuracy at every step of the process.

Digital, on the other hand, is a coded representation of the original, like the dots & dashes of Morse Code are an interpretation of an original communication. If it is accurate, it stays accurate. No matter how it is stored, transmitted, or reproduced, the code's on/off bits are either right or wrong, not smoothed or rounded down. There are far fewer places for noise or 'close enough' interpretations* to reduce the accuracy of the code. New technologies, whether MP3 files stored on a CD instead of grooves in vinyl albums, or MPEG files on a DVD rather than videotape, are all about holding the 'coded representation' accurate to the original rather than a miniature interpretation of the original. Digital cameras have run away with the level of detail captured by the imager because 1) it can handle the detail, and 2) it is where all the interest and investment is.

(*compression, etc, obviously bring in distortion, but that's too much information for the layman)
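The move from a continuous wave to a coded representation can be illustrated with a tiny sampling-and-quantization sketch. This is a hypothetical helper, not anything from a real camera SDK: a continuous sine wave (the 'analog' signal) is reduced to a short list of integer codes (the 'digital' representation), and from then on only those integers need to survive transmission.

```python
import math

def sample_and_quantize(freq_hz, sample_rate_hz, levels, duration_s=0.001):
    """Turn a continuous sine wave into a list of integer codes."""
    n = int(sample_rate_hz * duration_s)
    codes = []
    for i in range(n):
        t = i / sample_rate_hz
        value = math.sin(2 * math.pi * freq_hz * t)   # continuous, -1..1
        code = round((value + 1) / 2 * (levels - 1))  # map to 0..levels-1
        codes.append(code)
    return codes

# A 1 kHz tone, sampled at 8 kHz with 8 quantization levels (3 bits/sample)
print(sample_and_quantize(1000, 8000, 8))
```

The in-between detail of the wave is thrown away at the quantization step, which is the 'too much information for the layman' caveat above: the code is robust precisely because it commits to one of a small number of discrete values.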

"Longest snapchat ever, grandpa...."

Analog - Vinyl record

Digital - CD

Not perfect, but usually close enough.

Analog = Pulse of electricity

Digital (IP) = Stream of data

Yet Ethernet over copper is 'stream of data' represented using 'pulses of electricity', true?

The OP asked for a basic contrast between two technologies. I'm sure the world is chock full of weird niche crossover exceptions.

Bringing those up here is a sure way to kill any future basic questions.

Sorry, you're right, not helpful. More directly I should have said that the analogy

Analog = Pulse of electricity

may not be the best, since pulses of electricity are commonly associated with digital data, where pulse=bit*. Even non-technically, pulse does little to convey the sense of continuously variable values, like those in a sine wave, would you agree?

*Like PCM or even PWM.

Analog = wave, where the information is in the shape of the wave, it IS the message

Digital = pulse, where the info is coded, and the pulse just communicates the message

Thanks Brian, that's about as clear and simple an explanation as one could possibly make.

There are a number of analog vs. digital appliance pairs; most are now obsolete:

Record vs. CD

35mm film vs digital camera

a clock with sweep hands vs. a digital display

VHS vs. Blu-ray

Analog meter display vs. a digital display

Then you can explain signal to noise ratio vs. bit error rate.

Well, the original vidicon tube cameras formed an image and encoded it without ever digitizing it. So no A/D conversion. I can see where solid state analog cameras get into some gray areas. For an analogy, if she ever saw an old car radio with AM and FM, that was analog. Satellite radio is digital. Good luck!

Thank you Skip. That's an interesting point and there are a lot of gray areas.

No matter the examples of analog (old) and digital (new) devices, the difference boils down to this:

  • Analog
    • a fluctuating signal, with the swing of the signal representing amplitude (strength, volume, saturation, etc.) and the speed of fluctuations corresponding to frequency (pitch, color, etc.).
    • Analog means 'analogy', or a representation of something.
    • It is like a physical blueprint, and will lose detail as it gets photocopied repeatedly.
  • Digital
    • a coded record of something (music, moving pictures) where the on/off code describes something (this loud, this color, this change from before, etc).
    • It is a digitized description of something, like an original CAD file of the blueprint above.

Hi Michael,

Thank you. That's a very clear description.

Good work star

I wouldn't use analog vs. digital as a selling reference. It isn't worth the effort to explain, and really, even an analog camera today goes analog -> digital -> analog -> then gets converted to digital at the recorder, no? I use terms like traditional or standard definition. I also don't use digital video recorder or network video recorder and instead just say recorder. IMO, what the customer needs to know is how good and useful the video they are likely to get is. This can be explained without an education on the difference between analog and digital.

For the record though, being in the security field as well, we have had to explain analog vs. digital over the years as the nation's POTS system slowly fades away. The easiest way I have found to explain this is that as information is sent in an analog system, everything has to be in order. If information is lost because of noise or some other interference, analog can't figure out what is missing and must start over. With digital, this order does not matter: the receiving end is smart enough to rebuild the information and put it in order, even if it was received out of order, and it can request any missing information.

Not perfectly accurate, but it usually gets the point across.
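That reorder-and-request-retransmission behaviour can be sketched roughly like this, with sequence numbers as in a simplified transport protocol. The helper name and packet format are illustrative only, not any real networking API:

```python
def reassemble(packets, total):
    """Rebuild a message from (seq, payload) packets that may arrive
    out of order; report which sequence numbers need retransmission."""
    slots = {}
    for seq, payload in packets:
        slots[seq] = payload
    missing = [i for i in range(total) if i not in slots]
    message = "".join(slots.get(i, "?") for i in range(total))
    return message, missing

# Packets arrive shuffled, and packet 2 was lost in transit
arrived = [(3, "o"), (0, "h"), (1, "e"), (4, " "), (5, "world")]
msg, need = reassemble(arrived, total=6)
print(msg)   # 'he?o world' (gap marked until retransmitted)
print(need)  # [2] (receiver asks the sender to resend packet 2)
```

The receiver reconstructs the correct order from the sequence numbers and knows exactly which piece to ask for again, whereas an analog stream has no such bookkeeping to fall back on.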

Hi David, thank you very much for your considered thoughts.

Did you try explaining analog as a variation of a wave, and digital as a variation of 1's and 0's?

Now that there have been several decent responses to the 'basic questions', I wanted to mention a common, if slightly technical, misconception regarding A vs. D in general.

One would imagine that after soaking in the various answers and resolution analogies, Luke's friend would probably get the impression that digital has a higher informational capacity (greater bandwidth) than analog. But that's not the case at all. And Michael is right when he says:

Digital, is a coded representation of the original, like the dots & dashes of Morse Code are an interpretation of an original communication. If it is accurate, it stays accurate. There are far fewer places for noise or 'close enough' interpretations* to reduce the accuracy of the code.

but this 'coded representation' also comes at a significant price in bandwidth, precisely due to the drastic reduction of gray areas in signal semantics, i.e., today's typical bit encoding adds a lot of room in between values to reduce the error rate.

It's similar to writing a number in base 10 (what we all use) vs. base 2 (binary). For example, 15835 expressed in base 10 is simply 15835; in base 2 it's 11110111011011. Obviously base 10 is far more concise than base 2, but if you photocopied photocopies of both representations a few times, you would likely be able to reproduce the degraded base-2 digits more accurately than the degraded base-10 ones. Mainly because you only have the two choices, and if you can tell it's not a zero, then it must be a one... This is an oversimplification of course, but hopefully the general idea comes across.
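The base-10 vs. base-2 comparison above can be checked with a few lines of Python (`to_base` is just an illustrative helper): the binary form needs almost three times as many symbols, but each symbol carries only a two-way choice, which is what makes it easy to read back correctly from a degraded copy.

```python
def to_base(n, base):
    """Express a non-negative integer as a digit string in the given base."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append(str(n % base))
        n //= base
    return "".join(reversed(digits))

print(to_base(15835, 10))  # '15835' (5 symbols, each one of 10 choices)
print(to_base(15835, 2))   # '11110111011011' (14 symbols, only 2 choices)
```

Fourteen two-way symbols instead of five ten-way symbols is exactly the bandwidth-for-error-reduction trade described next.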

In general computing we value error-reduction far greater than bandwidth utilization, since a spreadsheet copied even 99.99% correctly is virtually useless. Video on the other hand, no big deal. That's why Analog transmission of HD is viable. It's also the reason that Digital can use lossy compression algorithms and get away with it.

So digital sacrifices bandwidth for error-reduction...

Hi Roger, thank you for your detailed response. I found it very interesting but I don't think I'd mention it to a non-technical customer!

I also like to explain that with analog, if you lose signal you get noise or static. We are all used to this from old cell phones, snow on TV, noise on a record. In a digital world, your cell phone drops out for a few seconds, your TV shows garbled/colored squares or loses the signal altogether. With analog, our brain could usually figure out what the data was despite all of the noise (maybe by squinting at the TV). With digital, you usually get nothing recognizable until the signal refreshes.

Kids nowadays won't understand if you say there is snow on a TV. I love all of the movies with high tech TV or CCTV that still show snow, because it is more recognizable than "NO SIGNAL".

Thanks Aaron. Snow on TV is one of those things I had taken for granted. Ahh, the good ol' days!