
Communication at its most basic is the process of transmitting information from a source to a receiver.  The rapid transmission of information over long distances and easy access to information have become vital features of the modern world.  Physics and physicists have been at the forefront of this technological revolution.

In transmitting information from source to receiver, energy is transformed from one form into another.  When we use an ordinary fixed telephone, sound waves cause a diaphragm to vibrate in a magnetic field.  These vibrations are converted into electrical impulses and transmitted along a wire to a receiver.  In the receiver the electrical impulses produce variations in a magnetic field which cause a diaphragm to vibrate and reproduce the original sound.  Energy has been transformed from sound to mechanical to electrical and then back again from electrical to mechanical to sound.

When we use a mobile phone, sound energy is converted to electromagnetic energy (microwaves – high frequency radio waves) and is transferred from source to receiver via radio transmitters.  The electromagnetic energy is then transformed back into sound energy by the receiver.

As in the examples above, when we investigate methods of communication we find that “waves” play an important part.  Current technologies associated with information transfer use waves of one form or another.  Indeed, an understanding of the nature and behaviour of waves is essential to the study of Physics as a whole.  In this topic we will examine in detail the physics of  WAVES.



A wave transports energy from one point in space to another without any net movement of matter.

There are two main categories of waves: mechanical and electromagnetic.  Mechanical waves are those that require a physical medium through which to travel eg sound waves, water waves, earthquake waves etc.  Electromagnetic waves require no medium through which to travel and thus can travel through a vacuum eg light, radio waves, gamma rays etc.  In this topic we will study both categories of waves.



There are three types of mechanical waves: Transverse, Longitudinal (or Compression) and Torsional.



Transverse waves are waves in which the particles of the medium through which the waves are travelling vibrate at right angles to the direction of travel of the wave motion, eg a wave travelling on a rope, water waves on the surface of a lake, and the S-waves of an earthquake.  Go to the following Applet page & choose the transverse wave applet from the applet menu on the top left of the page.




Longitudinal waves are waves in which the particles of the medium vibrate parallel and anti-parallel to the direction of motion of the waves, eg sound waves and the P-waves of an earthquake.  Go to the following Applet page & choose the longitudinal wave applet from the applet menu on the top left of the page.



These are waves in which the particles of the medium twist clockwise and anticlockwise about the direction of motion of the waves.  Certain types of earthquake waves (not P or S waves) are torsional.  Torsional waves are not part of the current syllabus but are mentioned here for completeness.






Consider the following transverse wave:




In the above diagram:

•  The y-axis = displacement: the distance of a particle from its equilibrium position.

•  The x-axis can represent either time or distance from a specified point within the medium.  A displacement-time graph shows the displacement of one particle of the medium as time goes by.  A displacement-distance graph shows the displacement of all particles of the medium at one instant in time.

•  A = amplitude: the maximum displacement from equilibrium of any particle.

•  Crest and trough are the points of maximum displacement above and below the equilibrium position respectively.

•  λ = wavelength: the distance between two consecutive identical points on the wave, eg between two crests or two troughs.

•  v = velocity: the speed with which the energy is being transferred in the direction of motion.


Two other important terms not shown in the diagram are:

•  Period, T: the time in seconds for one complete wave to pass a given point, or equivalently the time for any particle to make one complete vibration.

•  Frequency, f: the number of complete waves that pass a given point in one second, or equivalently the number of complete vibrations per second undergone by any particle due to the passing wave.  Frequency has units of s⁻¹, or hertz (Hz).

Clearly, T and f are reciprocals of one another, so T = 1/f.

Since a wave advances a distance of one wavelength in a time of one period, and since velocity is displacement per unit time, we have:

Velocity, v = displacement/time = λ/T = fλ, since T = 1/f.

So we have that v = fλ.  The units of v are m/s (ms⁻¹).
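These relationships are easy to check numerically.  The sketch below applies T = 1/f and v = fλ; the 440 Hz tone and the 340 m/s speed of sound in air are assumed example values, not figures from the text.

```python
# T = 1/f and v = f * wavelength, using assumed example values:
# a 440 Hz tone travelling in air at 340 m/s.

def period(f):
    """Period (s) of a wave of frequency f (Hz): T = 1/f."""
    return 1.0 / f

def wave_speed(f, wavelength):
    """Universal wave equation: v = f * lambda (m/s)."""
    return f * wavelength

f = 440.0              # frequency in Hz (assumed)
lam = 340.0 / f        # wavelength in air, metres
print(period(f))               # ~0.00227 s
print(wave_speed(f, lam))      # ~340 m/s, as expected
```

Note that the two functions are inverses of the same relationship: fixing any two of v, f and λ determines the third.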


Consider the following representation of a continuous longitudinal wave:


Note that the term compression is used to denote any area where particles of the medium have moved closer together than when they are at equilibrium.  The term rarefaction refers to any area where particles of the medium have moved further apart.

Another way to represent longitudinal waves is to draw several consecutive particles of the medium in their equilibrium positions and then draw these same particles at various times after the wave has begun to move.  In the case below, several particles are shown at equilibrium before the wave has moved through and again a little later after the wave has begun to move.

Note that by definition the amplitude, A, of the longitudinal wave is the maximum displacement from equilibrium of any of the particles.  Likewise, the wavelength is the distance between any two consecutive, identical points on the wave, in this case the centre to centre distance between two consecutive compressions.  Clearly then, the centres of compressions and rarefactions are equally spaced along the wave.







Sound is a measurable physical phenomenon and an important stimulus to humans.  It forms a major means of communication in the form of spoken language and both natural and manmade sounds contribute greatly to our environment.  The physics of waves can be used to successfully describe and explain the phenomenon of sound.

All sound waves are produced by the vibrations of particles in a medium.  For instance, in order to speak we must exhale air over vibrating vocal cords in our larynx.  The vocal cords force the air particles to vibrate in the form of a longitudinal wave and this wave moves from our throat, out through our mouth and strikes the ear drum of the person to whom we are speaking.  The eardrum is forced to vibrate with the same frequency as the longitudinal wave and these vibrations are interpreted by the brain as speech.  The human ear can perceive vibrations with frequencies between about 20 Hz and 20000 Hz.

All sound waves are longitudinal waves.  As such, all sound waves require a medium through which to travel.  Whatever the medium, sound waves progress as a series of compressions (high pressure regions) and rarefactions (low pressure regions) produced by the original vibrating source.  When a tuning fork is struck with a rubber hammer, the prongs of the tuning fork initially move towards each other.  This produces a compression of the air molecules between the prongs and a corresponding rarefaction outside the prongs.  As the prongs move apart, a rarefaction is produced between them and a compression outside them.  As this motion continues, the air molecules vibrate with the same frequency as the tuning fork and transfer sound energy from the tuning fork to the listener via a series of collisions.  The air molecules themselves do not undergo any net movement but vibrate about their equilibrium positions.



SPEED OF SOUND: The material on the speed of sound is NOT EXAMINABLE in the current syllabus but has been left on this site as extension material.

The universal wave equation v = fλ is used to calculate the speed of a sound of known frequency and wavelength.  The speed of sound is constant in a particular medium at a constant temperature.  However, the speed of sound varies with the medium through which the sound travels.  At 20 °C the speed of sound in air is 340 m/s while in pure water it is 1440 m/s.  Why the difference?

In a liquid, where the molecules are much closer together than in a gas, the vibrations are passed on more quickly.  An individual molecule does not have to move as far to push its neighbour.  So, the speed of sound is higher in a liquid than in a gas and by the same argument, higher still in a solid.

The speed of sound waves in various media can be related to the density and elasticity of the medium.  In general, the speed of sound in a medium is given by:

v = √(E / d)

where d = density of the medium and E = the bulk modulus of elasticity of the medium (the higher E, the more difficult the medium is to compress).  Clearly, the equation above suggests that sound travels fastest in media which are least compressible and which have low density.

At first glance this statement seems to disagree with our assertion that sound travels faster in solids than in liquids and faster in liquids than in gases, since in general gases have lower densities than liquids and liquids have lower densities than solids.  This apparent contradiction is resolved when we examine the relative elasticity values of the different states of matter.  In general, solids have much higher values of elasticity than liquids, which in turn have much higher values of elasticity than gases.  In fact, it is the elasticity values that have the main effect on the speed of sound in materials.  Thus, the above equation supports our assertion that sound travels faster in solids than in liquids and faster in liquids than in gases.  The equation shows, however, that where two materials have similar elasticity values, the relative densities of the two materials will determine which provides the higher speed of sound.  For example, sound travels almost three times as fast in helium as it does in air at the same temperature, mainly due to the much lower density of helium.
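As a rough numerical check of v = √(E/d), the sketch below uses commonly quoted bulk modulus and density values for water and air; the numerical figures are assumptions for illustration, not values from the text.

```python
import math

def sound_speed(E, d):
    """v = sqrt(E / d): E = bulk modulus (Pa), d = density (kg/m^3)."""
    return math.sqrt(E / d)

# Illustrative, commonly quoted values (assumptions):
v_water = sound_speed(2.2e9, 1000.0)   # ~1480 m/s, near the 1440 m/s quoted
v_air   = sound_speed(1.42e5, 1.2)     # ~344 m/s, near the 340 m/s quoted
print(round(v_water), round(v_air))
```

Water is about 800 times denser than air, yet sound travels faster in it because its bulk modulus is larger by a factor of over ten thousand, which is exactly the point made in the paragraph above.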

When the medium under consideration is air, it is possible to express E in terms of the undisturbed air pressure, P, since it is this pressure that supplies the restoring force which returns the air molecules to their equilibrium positions.  The speed of sound in air is then:

v = √(P / d)

where d = air density and P = air pressure (a measure of the elasticity of the air).  It is interesting to note that in air an increase in pressure causes a corresponding increase in density, so the ratio of P to d remains constant.  Since P/d can be shown to be proportional to the temperature (in kelvin), the speed of sound in air is actually proportional to the square root of the kelvin temperature.




The pitch of a sound (how high or low it is) depends on its frequency: the higher the frequency, the higher the pitch.  For a sound of a single definite frequency, like that produced by a tuning fork, the pitch corresponds directly to the frequency.  However, for a complex sound such as a chord played on a piano, the pitch is not so easily defined, since the sound contains several waves of nearly equal amplitude but different frequencies.

The loudness of a sound depends upon the amplitude of the wave that produces it.  The greater the amplitude, the louder the note, because more energy is used to produce a larger amplitude.  The relationship between loudness and amplitude is not a simple one.  The term volume is sometimes used instead of loudness.




It is usually more difficult to draw a longitudinal wave than a transverse one.  This is because for a longitudinal wave, the particle displacements lie in the same direction as the wave travels.  So, it is often convenient to represent such a wave as a transverse wave equivalent.  This is accomplished by simply using a vertical axis to represent the longitudinal displacements of the particles from equilibrium.  Longitudinal displacements to the right are represented as vertical displacements upwards.  Longitudinal displacements to the left are represented as vertical displacements downwards.

In the diagram that follows, a longitudinal wave and its transverse wave equivalent are shown together.  The numbers at the top indicate the longitudinal displacements (in cm) of the particles from their indicated rest positions at an instant in time; minus means to the left, plus to the right.  The numbers at the bottom indicate the corresponding vertical displacements (in cm) used to produce the transverse wave equivalent; minus means down, plus means up.  Note that the compressions and rarefactions in a longitudinal wave are NOT analogous to the crests and troughs in a transverse wave (in spite of the Syllabus stating otherwise).  The compression and rarefaction centres of the longitudinal wave occur at positions of zero displacement of the particles and therefore correspond to the zero displacement points of the transverse wave.  The points on the longitudinal wave where the particle displacement from equilibrium is maximum correspond to the crests and troughs of the transverse wave equivalent.







When any wave strikes the boundary between the medium in which it is travelling and a different medium, three phenomena occur.  Some of the wave is transmitted across the boundary into the new medium.  Some of the wave is reflected (turned back) into the medium through which it has just come.  Some of the wave is absorbed by the boundary.  The extent to which any of these happen depends on the nature of the wave, the media and the boundary.  Let us examine the reflection of sound.

Reflection occurs when a wave incident on a boundary is forced to return into the medium in which it was originally travelling.  In the diagram below an incident sound wave strikes the boundary surface at X and is reflected along the line shown.  Note the use of rays, lines with arrows, to show the direction of travel of the waves.



It can be shown experimentally and theoretically that the reflection will obey the following laws:

1.       The incident ray, normal and reflected ray are in the same plane; and

2.       The angle of incidence, i, is equal to the angle of reflection, r.

These findings together are called the Laws of Reflection and apply to both longitudinal and transverse waves.  Note that a wave incident on the boundary surface with an angle of incidence of zero degrees (i = 0o) will be reflected back along the same line.
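In vector form, the second law can be written r = d − 2(d·n)n, where d is the incident ray direction and n the unit normal to the surface.  A minimal 2-D sketch of this rule (the function name and the example vectors are illustrative assumptions):

```python
def reflect(d, n):
    """Reflect direction vector d off a surface with unit normal n (2-D).
    Implements r = d - 2(d.n)n, which enforces angle i = angle r."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

# A ray travelling down-and-right hits a horizontal surface (normal up):
print(reflect((1.0, -1.0), (0.0, 1.0)))   # (1.0, 1.0): equal angles
# Normal incidence (i = 0) is reflected straight back along its path:
print(reflect((0.0, -1.0), (0.0, 1.0)))   # (0.0, 1.0)
```

The second call reproduces the note above: a wave striking the boundary with i = 0° returns along the same line.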

When a sound wave is reflected back to its source, it is known as an echo.  Echoes are used in a wide variety of applications.  Sonar (SOund Navigation And Ranging) is a method of finding the depth of water or the size and shape of objects under the water by sending out ultrasonic (> 20000Hz) pulses and measuring the time of travel and angle of return of the echoes.  Ultrasound is used in medicine to produce images of internal body organs and babies in the womb and in industry to detect flaws in metal.
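The sonar calculation is just the wave equation applied to a round trip: depth = v × t / 2, the factor of 2 accounting for the pulse travelling down and back.  A sketch using the 1440 m/s speed of sound in water quoted earlier (the 0.5 s echo time is an assumed example):

```python
def depth_from_echo(echo_time, v_sound=1440.0):
    """Depth (m) from echo time (s); divide by 2 for the round trip."""
    return v_sound * echo_time / 2.0

print(depth_from_echo(0.5))   # 360.0 m for a 0.5 s echo
```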






When two or more sound waves travel through the same medium at the same time they produce effects on each other.  This is called interference.

The Principle of Superposition states that when waves interfere, the total displacement of the medium at any point is the algebraic sum of the individual displacements at that point.  Note that in all the graphs that follow in this section, the horizontal axis represents time and the vertical axis represents displacement of particles of the medium from their equilibrium positions.  Examine the example below:




Graphs 1 (blue) & 2 (red) represent two sound waves of equal wavelength and frequency passing through the same medium simultaneously.  Wave 2 has 1.5 times the amplitude of wave 1.  Wave 3 (pink) represents the resultant wave produced by adding the individual displacements of waves 1 & 2 at each point.  The resultant sound that would be heard would have the same wavelength and frequency as waves 1 & 2 but would have higher amplitude and would therefore be a louder sound.  In interference terms waves 1 & 2 have interfered constructively to produce wave 3.

In the following example, Graphs 1 (blue) & 2 (red) represent two sound waves of equal wavelength, frequency and amplitude passing through the same medium simultaneously.  Note that at all points in the medium the two waves interfere with each other destructively.  The result is a series of nodes (points of zero displacement) and no sound would be heard (Graph 3 - pink).
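The Principle of Superposition is simply pointwise addition, which a few lines of Python make concrete.  The amplitudes mirror the two cases above: amplitudes 1.0 and 1.5 in phase (constructive), and equal amplitudes half a cycle out of phase (destructive).  The evaluation point t = π/2 is an arbitrary choice for illustration.

```python
import math

def resultant(t, amp2, phase):
    """Superposition: algebraic sum of two sine waves at time t."""
    return math.sin(t) + amp2 * math.sin(t + phase)

t = math.pi / 2
constructive = resultant(t, 1.5, 0.0)        # in phase: 1.0 + 1.5 = 2.5
destructive  = resultant(t, 1.0, math.pi)    # half a cycle out: 1.0 - 1.0 ~ 0
print(constructive)
print(round(destructive, 12))
```

The destructive sum comes out as zero (to floating-point precision) at every value of t, which is why no sound would be heard.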





To study sound interference effects, the waves must have the same frequencies and wavelengths.  To allow complete destruction, as in the case above, the amplitudes must also be the same.  When sound sources of different frequencies, wavelengths and amplitudes interfere, the result is just noise.  In the special case below, where the two waves have slightly different frequencies, beats are produced.





The two interfering waves travelling in the same direction have been drawn to a different scale than the resultant wave, shown below.  This complex waveform represents a beat – a periodic fluctuation in sound intensity or loudness.  The graph clearly shows a gradual increase in loudness up to a maximum followed by a gradual decrease to zero.  The pattern then repeats.  The audible beat frequency is the difference between the frequencies of the interfering waves.

fb = f2 − f1
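A quick numerical illustration of beats (the tone frequencies 440 Hz and 444 Hz are assumed example values): superposing the two tones gives a resultant whose loudness rises and falls f2 − f1 = 4 times per second.

```python
import math

f1, f2 = 440.0, 444.0   # two slightly different frequencies (assumed)

def resultant(t):
    """Sum of the two interfering tones at time t (unit amplitudes)."""
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

beat_frequency = f2 - f1
print(beat_frequency)   # 4.0 beats per second
# Midway through a beat the tones are exactly out of step, so the
# resultant is momentarily near zero:
print(abs(resultant(1 / (2 * beat_frequency))) < 1e-6)   # True
```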


Go to the following link & choose the Beats Applet from the Applet Menu at the top left of the page.  Look under the Waves heading in the menu.  Beats Applet Link.







You will recall that the two basic categories of waves are mechanical and electromagnetic.  Let us say a little about the latter.  Electromagnetic radiation consists of waves of energy that are caused by the acceleration of charged particles. Electromagnetic waves (or radiation) consist of electric and magnetic fields vibrating transversely and sinusoidally at right angles to each other and to the direction of travel of the waves.




EM waves require no medium through which to travel and thus can travel through a vacuum.  In free space all EM waves move with the same speed, c = 3 × 10⁸ m/s.


The wide range of wavelengths (and corresponding frequencies) over which EM waves exist in nature is called the electromagnetic spectrum.  This spectrum is as follows:


The cut-off wavelengths or frequencies for each of the different types of EM radiation are not precise; there is some overlap.  Some types of EM radiation can be further broken down into sub-types.  The radio wave band of the spectrum contains the AM radio communications band at its longer-wavelength end, followed by the TV band and then the radar and microwave bands at the shorter-wavelength end.  The very narrow visible light band contains all the visible colours (red, orange, yellow, green, blue, indigo and violet, in order from longer to shorter wavelength) and occupies the position between about 780 nm and 380 nm wavelength.





EM radiation has many effects and uses in everyday life.  As mentioned above, the radio band is used extensively for communications of all kinds.  The Ultra-High Frequency (UHF) band, ranging from 300 megahertz (MHz) to 3,000 MHz is used mainly for communication with guided missiles, in aircraft navigation, radar, and in the transmission of television.  FM radio stations use the Very High Frequency (VHF) band from 30 MHz to 300 MHz.  Short wave radio uses the High Frequency (HF) band from 3 MHz to 30 MHz because waves in this band are easily reflected by the Kennelly-Heaviside layer (the E-layer) of the ionosphere, allowing very long distance communication by short wave radio.  AM radio broadcasts use the Medium, Low and Very Low Frequency (MF, LF, VLF) bands from 3000 kHz down to 3 kHz.  The ionosphere also reflects these waves.  Note that the exact allocation of frequency bands varies from country to country and is usually controlled by government authorities.

Radio waves can be detected by the combination of (i) an aerial for receiving the electromagnetic waves and converting them into electrical oscillations and (ii) diodes in appropriately tuned electronic circuits in the receiver that produce an audio-frequency signal.

Microwaves, which occupy the very top of the radio wave band from 3 GHz up to 300 GHz, can pass through the ionosphere and are used in radar, space communication such as with satellites, radio and television, meteorology, microwave landing system (MLS) for aircraft, distance measuring, materials research and even ordinary old cooking.  Microwaves can be detected using a waveguide.  This is a hollow conducting tube containing a dielectric (insulator) and is used to guide UHF EM waves along its length by reflection off the internal walls.  A cavity resonator may be added to collect the energy.

Infrared radiation is heat radiation and is used in guidance systems of missiles, for linking computers in networks, as a diagnostic tool in medicine (thermography), in remote sensing aerial and satellite IR photography to search for minerals or monitor crops, in night-vision goggles, in cooking, heating, drying and so on.  IR can be detected by a thermopile or a photo transistor.

Visible light is the means by which we view the world, mainly by reflection.  It is also used in communication to transport huge volumes of information over very large distances by internal reflection of light in optical fibres.  Light waves have high frequencies and the information-carrying capacity of a signal increases with frequency, making light perfect for the job.  Light is detected by our eyes, by photo cells, cameras and light sensitive diodes.

Ultraviolet radiation is largely responsible for damage to skin and eyes exposed to sunlight for too long.  It is used in the treatment of skin complaints, for killing bacteria, for fluorescent lighting, in burglar alarms, automatic door openers and counters and a host of other applications.  UV radiation can be detected by photographic film, photovoltaic cells and by the fluorescence it causes in ZnS and other salts.

X rays are used in medicine both to supply images of internal body structures and to destroy tumours, in industry for detecting cracks in metal and in research laboratories for determining crystal structure by diffraction.  X rays can be detected by photographic plates and film, ionization of gases and by the photoelectric effect, where the X rays knock electrons out of a metal surface.

Gamma rays (γ-rays) can be used to destroy cancerous tumours, to detect flaws in metals and to sterilize equipment.  γ-rays can be detected by Geiger-Müller tubes and by photographic plates and film.





The energy carried by an EM wave is related to its frequency.  An EM wave of frequency f has an energy E given by Planck’s Law: E = hf, where h = Planck’s constant (6.63 × 10⁻³⁴ J s).  (As an aside, this law forms the basis of Quantum Theory.)  A quick look at the EM spectrum diagram shows that γ-rays (high frequency) are the most energetic EM radiation and that radio waves (low frequency) are the least energetic.
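Planck’s Law turns the “high frequency means high energy” statement into simple arithmetic.  The two frequencies below are assumed illustrative values for a radio wave and a γ-ray, not figures from the text.

```python
h = 6.63e-34   # Planck's constant, J s (value as quoted in the text)

def photon_energy(f):
    """Energy E = h * f of an EM wave of frequency f (Hz), in joules."""
    return h * f

E_radio = photon_energy(1.0e6)    # a 1 MHz radio wave (assumed value)
E_gamma = photon_energy(1.0e20)   # a gamma ray (assumed value)
print(E_radio, E_gamma)
print(E_gamma / E_radio)   # the gamma photon is ~1e14 times more energetic
```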

Another frequency related characteristic of EM radiation is its penetration power through the Earth’s atmosphere.  EM radiation of different frequencies is scattered, reflected and absorbed by different amounts in the atmosphere.  Of the EM radiation that falls on Earth from space, only the visible and radio bands make it all the way to the ground without much attenuation taking place on the way down.  Some low frequency ultraviolet radiation and some regions in the infrared are able to traverse the atmosphere but other frequencies of EM radiation are completely blocked.  For all intents and purposes most of the UV and all of the X-ray and gamma-ray wavebands of the EM spectrum are effectively filtered out by the atmosphere well before they reach the ground.

It is useful to know how the intensity of EM radiation varies with distance from the source.  Intensity is defined as the rate of energy transfer per unit area normal to the direction of travel of the wave at any given point.  It can be shown experimentally, that the intensity I, of light falling on a surface varies inversely with the square of the distance d, between the source and the surface.  That is, if the distance between the source and the surface doubles, the illuminance (the intensity of illumination on the surface) decreases by a factor of 4.  This relationship is called the inverse square law and applies only where the distance is large compared with the size of the source.

For example, if a surface receives 1 lux of light at a distance of 2 metres from a source, and the surface is then moved to 4 metres from the source, it will receive (2/4)², or 1/4, lux of light.

It can be further shown that this inverse square law applies to all EM radiation, not just to light.  Therefore, in general, for EM radiation:

I ∝ 1/d²

In words, I is proportional to the reciprocal of the square of the distance d.
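The inverse square law is most easily applied as a scaling rule: moving from distance d1 to d2 multiplies the intensity by (d1/d2)².  The sketch below reproduces the 1 lux worked example given earlier.

```python
def scaled_intensity(I1, d1, d2):
    """Intensity at distance d2, given intensity I1 at d1, using I ∝ 1/d²."""
    return I1 * (d1 / d2) ** 2

# 1 lux at 2 m; doubling the distance to 4 m quarters the intensity:
print(scaled_intensity(1.0, 2.0, 4.0))   # 0.25 lux
```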





Modulation is the process of impressing one wave system upon another of higher frequency.  Audio-frequency (AF) waves such as speech and music from a tape or microphone must be combined with radio-frequency (RF) carrier waves in order to be transmitted over the radio. Either the frequency (rate of oscillation) or the amplitude (height) of the carrier waves may be modified in a process called modulation. The AF waves enter the modulator and interact with the carrier to determine either the amplitude of the carrier wave (amplitude modulation – AM) or the frequency of the carrier wave (frequency modulation – FM).  The modulated carrier wave can then be transmitted to its destination.  Once it is received, the modulated carrier wave is fed into a decoding device or de-modulator that extracts the original AF wave from it.

Let us examine frequency modulation as an example.  In this type of modulation the frequency of the carrier wave is varied above and below its unmodulated value by an amount that is proportional to the amplitude of the modulating signal and at a frequency equal to that of the modulating signal.  The amplitude of the carrier wave remains constant.

As an example, the instantaneous amplitude of a frequency modulated wave in which the modulating signal is sinusoidal may be represented by:

e = Em sin[2πFt + (ΔF/f) sin(2πft)]

where Em = amplitude of the carrier wave, F = frequency of the unmodulated carrier wave, ΔF = the peak variation of the carrier wave frequency away from F caused by the modulation, and f = frequency of the modulating signal.  Note that this example is simply meant to emphasize that there is a clearly defined mathematical process behind signal modulation.  You do not have to remember or even be able to use such equations in this course.  An example of frequency modulation is shown below.  The waveforms are not drawn to scale.
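For a single sinusoidal modulating tone, the FM signal can be written e(t) = Em·sin(2πFt + (ΔF/f)·sin(2πft)).  The sketch below evaluates this expression with assumed illustrative values and confirms numerically the defining feature of FM: the frequency varies but the amplitude never exceeds Em.

```python
import math

Em = 1.0      # carrier amplitude (constant in FM)
F  = 1000.0   # unmodulated carrier frequency, Hz (assumed)
dF = 100.0    # peak frequency deviation, Hz (assumed)
f  = 50.0     # modulating signal frequency, Hz (assumed)

def fm(t):
    """Single-tone FM: e(t) = Em sin(2*pi*F*t + (dF/f) sin(2*pi*f*t))."""
    return Em * math.sin(2 * math.pi * F * t
                         + (dF / f) * math.sin(2 * math.pi * f * t))

# Sample one modulation cycle; the envelope never exceeds Em:
peak = max(abs(fm(i / 100000.0)) for i in range(2000))
print(peak <= Em)   # True: frequency varies, amplitude does not
```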




Compared with amplitude modulation, frequency modulation has several advantages.  The FM signal is much less susceptible to electrical interference than AM, and a properly tuned receiver can take advantage of its larger frequency range and dynamic range to reproduce high-fidelity sound.  Also, FM signals are broadcast in the VHF band, and such waves are not reflected by the Earth’s ionosphere.  This means that FM signals can only travel about as far as the horizon, which has the advantage of reducing interference between distant stations, and coverage is therefore more stable than with AM.

The same modulation processes outlined above are used with microwaves and visible light to transmit information from one place to another.  Narrow-band frequency modulation is the most common mode of transmission for the microwave signals used with mobile phones.  Each call is assigned a carrier wave unique to the transmitter from which it is sent.  Frequency-modulated radar can determine the distance to a moving or stationary object.

Optical glass fibres are rapidly becoming common features of communications systems around the world.  Visible light is used as the carrier of information in optical fibres.  Light can be amplitude or frequency modulated and then transmitted over huge distances with little loss in intensity.  It should be noted, however, that analogue systems such as AM or FM, where the signal consists of a continuously changing pattern, are not the primary transmission modes in fibre optic systems.  Despite the huge bandwidth available, it is almost impossible to handle large numbers of channels (conversations) with acceptably low levels of distortion.  A digital system, in which information is transmitted as a series of on-off pulses (pulse modulation), is used for high volume transmission of information through optical fibres.

Just as an aside, it is interesting to ask why we need carrier waves at all.  In radio transmission, you could theoretically transmit radio signals at audio frequencies.  However, because the wavelength of electromagnetic waves at audio-like frequencies is huge, and the frequency of a radio transmitter dictates the size of the antenna and the power requirement, you would need a very big antenna and a very big power supply to do this.  So, we've learned to transmit at higher "carrier" frequencies, modulating either the amplitude or frequency of the carrier signal with our audio and subtracting the carrier at the receiver end.  (Basically, for an antenna, the lower the frequency to be transmitted or received, the larger the physical size of the antenna.  For example, a VHF half-wave dipole will be about three times the size of a UHF dipole.)





As we have seen, a large portion of the EM spectrum is used for communication purposes.  However, since each particular type of communication medium, AM radio, FM radio, TV and so on, requires a certain minimum range of frequencies to ensure successful transmission, an obvious problem arises.  The EM spectrum used for communication purposes has a finite range.

The technical name for the range of frequencies that an EM signal occupies on a given transmission medium is bandwidth.  So, for example, a typical VHF-FM radio broadcast signal has a bandwidth of about 200 kHz (0.2 MHz), while a typical analogue television broadcast video signal has a bandwidth of 6 MHz.  In Australia VHF-FM radio stations are allocated a 200 kHz bandwidth between 88 and 108 MHz. So the available radio channel frequencies are 88.1 MHz, 88.3 MHz and so on up to 107.9 MHz.  Obviously there is a limit to the number of channels available and therefore to the number of FM radio stations that can broadcast a signal.
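The channel count follows directly from the band limits and the per-station bandwidth.  A sketch of the Australian VHF-FM arithmetic described above:

```python
band_low, band_high = 88.0, 108.0   # MHz, the VHF-FM broadcast band
bandwidth = 0.2                     # MHz allocated per station

n_channels = round((band_high - band_low) / bandwidth)
# Channel centres sit half a bandwidth in from the band edge:
channels = [round(band_low + bandwidth / 2 + i * bandwidth, 1)
            for i in range(n_channels)]
print(n_channels)                  # 100 possible stations
print(channels[0], channels[-1])   # 88.1 107.9
```

The same division explains why the 6 MHz analogue TV signal consumes the equivalent of thirty FM radio channels of spectrum.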

The same problem exists for all forms of communication that make use of EM radiation transmitted through the atmosphere or free space.  A government authority strictly controls access to the available bandwidths in each particular band of the spectrum (AM, FM, TV, mobile phones, microwave, etc) and competition for bandwidth allocation is intense.  From time to time people or organizations that can no longer demonstrate efficient & effective use of their allocated bandwidth are not re-allocated that bandwidth when their license comes up for renewal.

Research scientists are constantly trying to expand the range of the EM spectrum that can be used for communication purposes.  For instance, much work is being done at present on carrier frequencies in the millimetre wave region, which lies between microwaves and the far infrared.

In passing, it should be noted that this bandwidth limitation does not apply to hard-wired systems such as digital cable and fibre optic systems.  Available bandwidth in such systems can be expanded, in principle without limit, simply by installing more cable.

Return to The World Communicates Contents List



LIMITATIONS OF EM WAVES FOR COMMUNICATION - This section is no longer required by the Syllabus but has been left here as extension material.

All EM waves suffer attenuation (reduction in intensity) as they pass through the atmosphere or through other materials.  Solar flares, released during the Sun's 11-year sunspot cycle, can disrupt radio communications.  Both AM and FM radio signals are unavoidably vulnerable to all sorts of distortion, which restricts their ability to carry information without degradation; AM signals are particularly susceptible to electrical interference.  Hence the move from analogue to digital techniques for encoding information.  Digital processing, which breaks a continuous signal down into a sequence of binary code, can resist distortion and convey far more information.  The use of visible light signals in the form of laser light within the Earth's atmosphere is limited by weather conditions: fog, snow, rain and smoke can absorb and scatter light from the signal.  Hence the push towards the use of optical fibres for visible light signals.  Inside an optical fibre there is virtually no attenuation of the signal.

Although short wave radio waves in the High Frequency (HF) band can be transmitted over large distances around the world by reflection off the ionosphere, shorter wavelength radio signals (VHF, UHF, SHF) cannot in general be sent over the horizon (as seen from the aerial).  Also, FM reception is hindered by the signal from transmitters reflecting from solid objects such as buildings, trees, and mountains.  On a clear, dry night, electromagnetic wave signals such as the microwaves used in radar can be sent to the moon and back with greater fidelity than from one end of Sydney to the other.  So, for long distance transmission of FM radio, TV signals and microwaves, a series of relay stations is required at regular intervals (about every 40 km for microwaves).

Other limitations involve the amount of information that can be carried using the various forms of EM radiation.  The information-carrying capacity of a signal increases with frequency.  Clearly, communication systems using optical fibres to carry visible light signals provide the best information carrying capacity.

There is also research at present into the health risks presented by the constant use of mobile phones, which transmit and receive microwave signals, and into the effects on health of people living close to FM and microwave relay transmitters.

Return to The World Communicates Contents List




The laws of reflection, as stated in the section on reflection of sound, apply to EM waves as well and will not be re-stated here.  The only further comment required is to stress that when EM waves reflect from a plane surface, they may suffer a π phase change.  Sound waves do not.

Two particles that move in step with each other on a wave, that is have the same displacement and move in the same direction at the same time, are said to be in phase.  If two particles A and B are simultaneously located at the top of crests on the same wave, they are in phase.  As A moves back down to equilibrium and then down to a trough, so too does B. 

On reflection from a plane surface, EM waves undergo a 180° (π) phase change if they strike the surface from the side of lower optical density (eg light travelling in air and reflecting off glass).  That is, a crest striking the surface is reflected as a trough, and likewise a trough becomes a crest.  This does not happen with sound waves.  For instance, a compression striking a plane surface is reflected as a compression.

Examples of the use of reflection of EM waves in the transfer of information are many.  Reflection of short wave radio waves by the ionosphere and the internal reflection of light through optical fibres have already been mentioned.  Another example is radar (RAdio Detection And Ranging), which locates distant objects by the reflection of microwaves.  Pulses or continuous waves of microwaves are broadcast, reflect off a distant object and the reflections are picked up by a receiving aerial.  The distance and direction to the object are given by the direction of the receiving aerial and the time between the transmission of the wave and the reception of its reflection.  The transmitting and receiving aerials can be made to rotate to scan an area.  The reflected pulses are recorded by a cathode ray tube circularly scanned in synchronization to produce an echo map of the scanned area.
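The ranging step described above amounts to a time-of-flight calculation: the pulse travels out and back, so the one-way distance uses half the measured delay.  A minimal sketch (the 200 microsecond echo delay is an invented illustrative figure):

```python
C = 3.0e8  # speed of EM waves in a vacuum (and, very nearly, in air), m/s

def radar_range(echo_delay_s):
    """Distance to a target from the round-trip time of a radar pulse.

    The pulse covers twice the target distance, hence the factor of 2.
    """
    return C * echo_delay_s / 2

# A reflection received 200 microseconds after transmission:
print(radar_range(200e-6))  # ~30000 m, i.e. a target about 30 km away
```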

Other examples of the application of reflection include:

•      A plane mirror – usually consists of a coating of metallic silver at the back of a flat sheet of glass.  Reflections from this surface produce images of objects in front of the mirror.  These images are called virtual images, since the rays of light reaching our eyes do not actually come from the point where we see the image.  See Diagram (a) below.

•      Parabolic reflectors – parabolic concave mirrors that focus parallel beams of light to a single point.  They are used in solar furnaces, reflecting telescopes, car headlights and many other applications.  See Diagram (b) below.

•      Diverging mirrors – convex mirrors that cause parallel beams of light to spread apart.  The image is always upright and smaller than the object, which allows the observer to see a wide-angle view.  They are used to help people see "around corners" in driveways and shops and as rear view mirrors on trucks and buses.  See Diagram (c) below.





Doubtless we have all seen examples of the light bending properties of water.  Recall the experiment where a ruler is placed in a beaker of water and appears to bend upwards.  The end of the ruler in the water appears to be higher in the water than it actually is.  The reason for this is that the light reflecting from the end of the ruler bends down towards the water surface as it passes from water to air.  When it enters our eyes, the light appears to have come from a position in the water above the actual position of the end of the ruler.  This bending of light rays as they pass from one medium to another is called refraction.

For the rest of this section we will use light as an example of EM waves.  The velocity of light in a medium depends on the optical density of the medium.  The higher the optical density, the lower the velocity of light.  Water is more optically dense than air and so the velocity of light in water is lower than its value in air.  It is this difference in the velocity of light in different media that causes the light to bend as it passes across the boundary between two media.

In the following diagram several wavefronts (lines of crests) of light are shown travelling towards the boundary between two media of different optical density.  Their direction is shown by the ray (arrowed line) at right angles to the wavefront.  The waves have a velocity v1 in medium 1 and a velocity v2 in medium 2.  Note that we will assume that the optical density of medium 1 is less than that of medium 2 and therefore that v1 > v2.  The waves strike the boundary at an angle of incidence i, measured as always from the normal to the boundary around to the incident ray.  As the waves move across into medium 2, they slow down and therefore their direction changes.  As indicated by the ray, the waves bend towards the normal and are transmitted into medium 2 with an angle of refraction r, measured from the normal to the refracted ray.

Notice also that because the velocity has decreased as the waves pass from medium 1 to medium 2, so too has the wavelength.  This happens because the frequency of a wave remains the same as it passes across the boundary between two media.  Therefore, from v = fλ, since v decreases and f remains constant, λ must decrease.
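A quick numerical check of this wavelength change (assuming light of frequency 5 x 10^14 Hz and the common textbook value v = c/1.33 in water; both are assumptions, not figures from the text):

```python
C = 3.0e8  # speed of light in a vacuum, m/s

def wavelength(v, f):
    """Wavelength from v = f * lambda."""
    return v / f

f = 5.0e14  # frequency of the light in Hz; unchanged across the boundary
lam_air = wavelength(C, f)           # wavelength in air (~vacuum)
lam_water = wavelength(C / 1.33, f)  # slower medium -> shorter wavelength

print(lam_air)    # 6e-07 m (600 nm)
print(lam_water)  # ~4.5e-07 m: the wavelength decreases, the frequency does not
```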



The relationship between the velocities of light in the two media and the angles of incidence and refraction is given by Snell's Law:

    sin i / sin r = v1 / v2

It can be shown that the ratio of the velocity of the wave in medium 1 to the velocity of the wave in medium 2 is a constant.  This constant is called the relative refractive index for waves travelling from medium 1 into medium 2 and is a measure of the amount of bending of the waves that occurs as the waves move from medium 1 into medium 2.

Every material has a specific refractive index (μ) value.  This is called the absolute refractive index of the material and is defined as the index of refraction of light going from a vacuum into the medium in question.  A more complete statement of Snell's Law can then be written as:

    1μ2 = sin i / sin r = v1 / v2 = μ2 / μ1

where 1μ2 = the relative refractive index for waves moving from medium 1 into medium 2, μ1 = the absolute refractive index of medium 1 and μ2 = the absolute refractive index of medium 2.
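As a worked example of applying Snell's Law numerically (the indices μ = 1.00 for air and μ = 1.33 for water are standard textbook values, assumed here rather than taken from the text):

```python
import math

def angle_of_refraction(mu1, mu2, i_deg):
    """Solve mu1 * sin(i) = mu2 * sin(r) for r, with angles in degrees."""
    sin_r = mu1 * math.sin(math.radians(i_deg)) / mu2
    return math.degrees(math.asin(sin_r))

# Light passing from air (mu = 1.00) into water (mu = 1.33) at 45 degrees:
r = angle_of_refraction(1.00, 1.33, 45.0)
print(round(r, 1))  # 32.1 - the ray bends towards the normal, as expected
```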

Return to The World Communicates Contents List





Let us outline how the change in velocity that a wavefront experiences as it passes across the boundary between two media of different optical densities causes the wavefront to change its direction of travel and therefore bend.  Consider the following diagram that shows a single plane wavefront striking the boundary between two media at point W.



Wavefront WY strikes the boundary and moves into a medium in which its velocity is reduced.  From W, the wavefront travels to X in the same time t as Y takes to reach the boundary at Z.  Since the wavefront moves more slowly in the second medium, WX = v2t is shorter than YZ = v1t, so clearly the new wavefront XZ cannot be parallel to WY.  The change in velocity of the wavefront has caused the wavefront to bend.  (Indeed, since sin i = YZ/WZ and sin r = WX/WZ, dividing gives sin i / sin r = YZ/WX = v1/v2, which is Snell's Law.)

Return to The World Communicates Contents List





Clearly, if a ray of light travels from a slower (more optically dense) to a faster (less optically dense) medium, as in the earlier example of the ruler in the beaker of water, the ray of light bends away from the normal towards the boundary surface.  As the angle of incidence increases from zero, there comes a case where the angle of refraction is 90°, ie the refracted ray travels along the boundary between the two media.  This angle of incidence is called the critical angle.  Any ray having an angle of incidence greater than the critical angle is totally reflected back into medium 1.  This phenomenon is called total internal reflection and plays an important role in several areas of physics, particularly in communication technology such as the transmission of light through optical fibres.

From Snell’s Law we can write:

    μ1 sin ic = μ2 sin 90°

and therefore that:

    sin ic = μ2 / μ1 = 1 / μ1

where ic = the critical angle and medium 2, the faster medium, is a vacuum or air.  Note that μ for a vacuum is defined as 1, while μ for air is close enough to 1 for most purposes.

Total internal reflection can only occur for light passing from a more optically dense medium to a less optically dense one.  Typical critical angles include 49° for water, 42° for crown glass and 24° for diamond.
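The quoted critical angles can be recovered from sin ic = 1/μ using standard textbook absolute refractive indices (μ = 1.33, 1.50 and 2.42 for water, crown glass and diamond; these index values are assumptions, not figures given above):

```python
import math

def critical_angle(mu):
    """Critical angle (degrees) for light leaving a medium of index mu into air."""
    return math.degrees(math.asin(1.0 / mu))

# Assumed textbook indices: water 1.33, crown glass 1.50, diamond 2.42
for name, mu in [("water", 1.33), ("crown glass", 1.50), ("diamond", 2.42)]:
    print(name, round(critical_angle(mu)))  # 49, 42 and 24 degrees respectively
```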

One application of total internal reflection is found in fibre optics.  Good quality glass of high refractive index is coated with a thin layer of glass of lower refractive index.  Light is passed into the end of the thin fibre.  Any ray of light striking the boundary between the two glass media at an angle greater than the critical angle will be totally internally reflected along the whole length of the fibre.  Light can therefore travel from one end of the fibre to the other with very little loss.  See below.


Optical fibres were developed in the 1950s and found applications in industry and medicine.  In 1970, optical fibres suitable for long distance communication were developed.
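The same critical-angle condition applies at the core-cladding boundary inside a fibre, with sin ic = μcladding / μcore.  As a rough sketch (the indices 1.48 for the core and 1.46 for the cladding are typical illustrative values, not figures from the text):

```python
import math

# Critical angle at a core-cladding boundary: sin(ic) = mu_cladding / mu_core.
mu_core, mu_cladding = 1.48, 1.46  # assumed typical values for a glass fibre

ic = math.degrees(math.asin(mu_cladding / mu_core))
print(round(ic, 1))  # ~80.6 degrees: only rays striking the wall at a more
                     # glancing angle than this are guided along the fibre
```

Because the two indices are close, the critical angle is large, so only rays travelling nearly parallel to the fibre axis are trapped.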


Return to The World Communicates Contents List




Many types of communication data are stored or transmitted in digital form:

•      Fibre optics communication data – phone calls, computer data

•      Mobile telephone calls

•      Sound and picture recordings on magnetic tape, Compact Discs (CDs) and Digital Versatile Discs (DVDs)

•      Computer data itself – the huge volume of data available on the internet, computerized records kept by businesses, banks, governments, local councils, the police and military and so on

•      Digital TV signals and Digital Audio Broadcasting (DAB) signals – DAB combines two technologies: digital sound recording and data compression

•      Communications satellites, which utilise very small aperture terminals (VSATs) to relay digital data for a multitude of business services

•      Holographic data – holograms can store large quantities of data by varying the recording angle relative to the photographic plate.  To retrieve the data, the hologram must be illuminated with a laser beam at the corresponding angles

•      "Smart" weapons – eg Tomahawk missiles can be launched over 1000 km from the target and follow precise directional instructions to reach their targets


Return to The World Communicates Contents List


The following is NOT EXAMINABLE in the current syllabus but makes interesting reading:

Many developments in technology had to take place to enable the production of communication technologies such as those mentioned above and others.  Some of the important developments would include:

•      The development of the electronic computer.  Colossus was the first fully operational electronic computer and was developed in England during World War II for use in code cracking.  It came on-line in December 1943.  ENIAC followed in 1945, constructed in the USA.  Developments in electronics, materials, miniaturization, programming languages, data compression and other areas have seen the computer rise to its present-day, indispensable role in communication technologies.  Computers are used throughout the communications industry for a huge range of applications – controlling telephone exchanges, switching mobile phone calls from one cell to another, as a tool in the digital recording of signals, controlling the operation of satellites and satellite communications, controlling complex weaponry for the military, as a rapid information storage and retrieval system and as an access tool to the Internet, to mention a few.  Developments are continuing: microminiaturization and nanotechnology promise to further compress circuits; superconductivity research may speed up circuits by lowering the electrical resistance of circuit elements; and research into quantum computers may produce enormously powerful parallel computers.

•      The invention of the transistor in 1947 revolutionized the electronics industry by replacing the large and somewhat unreliable vacuum tubes used at the time.  Transistors are used for the switching and amplification of electrical signals.  Transistors in computers perform the high speed switching operations that control the electric charges that represent information as binary code.  Electronic computers became faster and more reliable.

•      Miniaturization of whole circuits consisting of transistors and other electronic parts was accomplished in the early 1960s.  These circuits were named integrated circuits (ICs) and enabled the development of microprocessors in the 1970s.  A microprocessor is an integrated circuit that can perform the arithmetic, logic, and control functions of a computer.  During the 1970s other integrated circuits with large memory storage capacities were developed.  These advances reduced the physical size of computers and other electronic equipment.

•      Laser (light amplification by stimulated emission of radiation) devices were also first developed in the early 1960s.  Optical fibres suitable for long distance communication were developed in 1970.  When combined, these two technologies produce a system of communication that can transmit greater quantities of information than is possible via electrical pulses through copper wires in cables.  Lasers are also essential in the reading of CDs.

•      The advent in the early 1980s of digital audio recording technology, which could capture a continuous audio signal as a sequence of binary code, led to the first production of compact discs (CDs).  CDs greatly improved the storage capacity of recording media, and digital recording greatly improved the quality of the sound on playback.  When a CD is played, it spins and a laser reads the digital code recorded as tiny pits in the surface of the disc.

•      Developments in the way data is handled have assisted in the production of communication technologies.  Data compression allows for more efficient storage and transmission of data and is used in the production of CD-ROMs for computers, among other things.  For example, key-word encoding replaces frequently occurring words such as "and" or "but" with a 2-byte symbol, saving one or more bytes of storage for every occurrence of that word in a text file.

•      The development of rocket and satellite science and the production and deployment of satellites have allowed the creation of world-wide communications networks for the transmission of various types of data around the world, eg phone calls, television programs, telefacsimile data, radio and digital data.  Communications satellites are placed in various orbits and relay data between ground stations within their view.  The world-wide Global Positioning System (GPS) uses 24 satellites in six orbits.  The satellites broadcast a continuous signal containing information on the position of the satellite and the time the signal was sent.  People use special receivers to calculate their position, velocity and local time very accurately from the signals of at least four satellites.


Return to The World Communicates Contents List


Last updated:

© Robert Emery 2002 - view the Terms of Use of this site.