Understand the fundamentals of transmitting and receiving data
When discussing data in the context of data communications, we can define data as a raw collection of 0’s and 1’s, with “information” being the result of converting the data into a meaningful form. In short, data has to be interpreted, managed or converted by software instructions before it becomes information.
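To make the distinction concrete, here is a minimal Python sketch. The byte values are arbitrary; the point is that the same raw bits mean nothing until software interprets them, here as ASCII text:

```python
# Raw data: just a collection of 0's and 1's.
raw_bits = "01001000 01101001"

# Software interprets each 8-bit group as an ASCII character code,
# converting the data into meaningful information.
characters = [chr(int(byte, 2)) for byte in raw_bits.split()]
print("".join(characters))  # -> "Hi"
```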
Before data can be transmitted electronically, it has to be encoded into a language the transmission medium can carry. Signals are the electric or electromagnetic impulses used to encode and transmit data. Data travels through some medium, such as a cable or the airwaves, and the receiving node then reverses the conversion, turning the electronic pulses or waveforms back into the 0’s and 1’s that represent the original data.
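As a simplified sketch of that round trip, the example below uses NRZ-L line coding (one voltage level per bit) as the encoding “language”; real encoders are far more sophisticated, but the principle is the same:

```python
def encode_nrz(bits):
    """Sender: convert 0's and 1's into voltage levels (0 -> -1.0 V, 1 -> +1.0 V)."""
    return [1.0 if b == "1" else -1.0 for b in bits]

def decode_nrz(voltages):
    """Receiver: reverse the conversion (positive voltage -> 1, negative -> 0)."""
    return "".join("1" if v > 0 else "0" for v in voltages)

data = "10110001"
signal = encode_nrz(data)      # encode data into a transmittable signal
received = decode_nrz(signal)  # the receiving node reverses the conversion
assert received == data        # the original 0's and 1's are recovered
```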
Both the sending and receiving nodes have to understand the encoding technique, or language, that is applied during the conversions. Encoding can be proprietary, such as the language used by a DSL modem or a satellite microwave link, or universal and based on a published open standard.
The next critical concept is the difference between analog and digital signals. Analog waveforms are continuous and can take on an infinite range of values. All sounds that we hear are analog. Our eardrum and ear bones vibrate with the frequencies of the sound, then deliver nerve impulses to our brain, where the data is interpreted. Traditional telephone systems also use analog waveforms, as do broadcast radio and television signals that move through the air.
Digital signals are quite different from analog signals. They are not continuous or infinite; rather, they are discrete voltage or light pulses that represent specific 0’s and 1’s. These pulses are easily transmitted and regenerated across network media.
As with analog encoding, digital signals also have to be encoded with languages or controls that both the sending and receiving peer nodes can interpret.
No matter which signal type is used, two problems have to be managed: noise and attenuation.
Noise, unwanted electrical or electromagnetic energy that degrades the quality of signals and data, is present in every transmission. It is especially damaging to analog signals, as noise can distort and even overwhelm an analog signal wave. As an analog signal grows weaker we can amplify it, but the noise that accompanies the signal is amplified along with it.
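The small numeric illustration below makes this point; the signal, noise, and gain values are hypothetical. Amplification scales the signal and the noise by the same factor, so the signal-to-noise ratio (SNR) does not improve:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)
signal = 0.1 * np.sin(2 * np.pi * 5 * t)    # weak analog signal
noise = 0.02 * rng.standard_normal(t.size)  # noise picked up in transit

def snr_db(s, n):
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * np.log10(np.mean(s**2) / np.mean(n**2))

gain = 100                                  # amplifier gain
print(snr_db(signal, noise))                # SNR before amplification
print(snr_db(gain * signal, gain * noise))  # same SNR: amplifying did not help
```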
Digital signals have an advantage over analog signals when managing noise, as digital devices can filter and remove noise as they regenerate signals into new 0’s and 1’s. Unless the noise is excessive, digital signals usually do not suffer from noise distortion, and that is one of the biggest advantages digital signals have over analog signals.
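Here is a sketch of that regeneration idea, with illustrative voltage and noise values: a repeater thresholds the noisy pulses and re-emits clean 0’s and 1’s, discarding the accumulated noise rather than amplifying it:

```python
import numpy as np

rng = np.random.default_rng(1)
bits = np.array([1, 0, 1, 1, 0, 0, 1, 0])
pulses = bits.astype(float)                             # ideal 0 V / 1 V pulses
noisy = pulses + 0.1 * rng.standard_normal(bits.size)   # modest noise en route

regenerated = (noisy > 0.5).astype(int)  # decide 0 or 1 at the midpoint voltage
assert np.array_equal(regenerated, bits)  # noise removed, data intact
```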
Attenuation, or signal loss over distance, is a problem that affects every transmission medium. Both analog and digital signals dissipate as they move further from their signal source. Every transmission medium has a published maximum segment length over which it can carry a signal before attenuation makes the signal unreliable.
Decibels (dB) are an important unit when measuring signal strength. Decibel measurements are relative and logarithmic: they express the loss or gain of signal strength as a ratio against the original signal strength, rather than as an absolute value.
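The decibel arithmetic looks like this; the cable-loss figure in the last step is hypothetical, chosen only to show how attenuation and maximum segment length relate:

```python
import math

def db(p_out, p_in):
    """Relative power change in decibels: negative = loss, positive = gain."""
    return 10 * math.log10(p_out / p_in)

print(db(5.0, 10.0))   # -3.01 dB: roughly half the power was lost
print(db(20.0, 10.0))  # +3.01 dB: the power was doubled

# If a cable attenuates 0.2 dB per meter (hypothetical) and the receiver
# tolerates at most 20 dB of total loss, the maximum segment length is:
max_length_m = 20 / 0.2
print(max_length_m)    # 100 meters
```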
Other important values include amplitude, frequency, spectrum, bandwidth and phase. These terms define the “properties” of a signal, and they also identify what must be modified when multiple signals share one transmission medium or network.
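As a rough illustration with arbitrary values, the sketch below builds a waveform from three of these properties:

```python
import numpy as np

amplitude = 2.0    # peak signal strength
frequency = 50.0   # cycles per second (Hz)
phase = np.pi / 4  # horizontal shift of the waveform, in radians

t = np.linspace(0, 0.1, 1000)  # 100 ms of time samples
wave = amplitude * np.sin(2 * np.pi * frequency * t + phase)
print(round(wave.max(), 2))    # ~2.0, the amplitude

# Assigning each signal its own frequency band is the basis for sharing
# one medium among many signals (frequency-division multiplexing).
```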
It’s tempting to ignore the fundamental components of data transmission and signaling, but truly diagnosing and evaluating network communications begins at the bit and signal level. To be efficient at network administration you’ll need to understand the various types of data streams and encodings that your networks utilize.
As services continue to be delivered across different platforms, the encoding devices that convert signals across different media paths will face growing stress. Many other issues, such as security services, will also impact network communications and how data is encoded and decoded. We tend to look at these services and applications when trying to define performance metrics, but you will also have to consider your physical layer communications. Older, mismatched or over-utilized network equipment might actually be the source of poor application performance.
You would begin your analysis with packet-sniffing tools such as Wireshark or Microsoft Network Monitor. They are free downloads that allow you to monitor network traffic and detect network devices that are producing excessive transmissions. Wireless versions of these tools are especially important, as wireless networks generally suffer from congestion and collisions.
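Those tools do this kind of analysis interactively; as a rough script-based illustration, the sketch below uses the scapy packet-capture library (a stand-in, not one of the tools named above) to sample traffic and count packets per source, which is one way to spot an unusually chatty device. Live capture typically requires administrator or root privileges:

```python
from collections import Counter
from scapy.all import IP, sniff

# Capture a sample of traffic on the default interface.
packets = sniff(count=200)

# Count packets per source address and report the top talkers.
talkers = Counter(pkt[IP].src for pkt in packets if pkt.haslayer(IP))
for src, count in talkers.most_common(5):
    print(f"{src}: {count} packets")
```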
Detecting and eliminating noise is never an easy task. There are many quality products available for line conditioning and noise removal on power lines, but you’re going to need to make some effort to discover tools that can help you gauge and monitor noisy network connections. That said, your network switch and routing vendors will likely provide some tools and utilities.
Before you deploy a network application, you might want to use a noise simulator tool that can help you gauge the application’s ability to perform under less-than-optimal conditions.
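A hypothetical stand-in for such a tool is sketched below: it randomly flips bits in a payload so you can observe how an application’s error handling copes with corruption. The flip probability is arbitrary:

```python
import random

def inject_noise(payload: bytes, flip_probability: float = 0.001) -> bytes:
    """Return a copy of payload with each bit flipped with the given probability."""
    corrupted = bytearray(payload)
    for i in range(len(corrupted)):
        for bit in range(8):
            if random.random() < flip_probability:
                corrupted[i] ^= 1 << bit  # flip this bit
    return bytes(corrupted)

clean = b"application test message" * 100
noisy = inject_noise(clean, flip_probability=0.01)
flipped = sum(bin(a ^ b).count("1") for a, b in zip(clean, noisy))
print(f"{flipped} bits flipped out of {len(clean) * 8}")
```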
Lastly, don’t forget to regularly test and analyze your networks. It will take time to establish a benchmark, or expected performance model. Document all your connections, and be sure to update your documents and diagrams during upgrade and replacement cycles.
It’s a reality that equipment will eventually fail, so you’ll need to define your critical failure points and have a remediation plan prepared ahead of time for a quick recovery.