In order to figure out whether radiation is dangerous, we need to begin by understanding what it is made of. You may have heard of electromagnetic (EM) waves, which are electric and magnetic fields that vary in time and space. To qualify as radiation, the EM fields just need to radiate outward from a source. Two examples of EM radiation you are familiar with are cellphone and wi-fi signals, whose transmitters emit in the radio part of the EM spectrum. The concept of EM waves is a classical one, in the sense that it was well understood in the late 1800s, before the advent of quantum physics. Quantum physics fundamentally changed how we view matter and energy on small scales -- atomic and subatomic scales -- and it changed the way we think about the dangers of radiation.
Quantum physics is a discipline that grew out of a lot of theorizing and an even greater amount of experimenting on atoms and subatomic particles like electrons and neutrons. Since the 1920s, it has shown us that everything in Nature comes in teeny little lumps of energy called quanta. In other words, everything is granular at the subatomic level. This includes the EM field, whose quanta are known as photons. You can think of one big EM wave coming out of the CBC Radio One transmitters as being composed of millions and millions of individual photons, each carrying the right frequency to represent CBC Radio One: 99.1 megahertz. Other photons you see and feel every day come from the sun; visible-light photons, for example, have frequencies between about 430 and 770 terahertz.
Photons have an energy E that is proportional to their frequency f, via the equation E = hf, where h is Planck's constant. In words: a higher frequency means a higher energy, so a higher-frequency photon packs more of a punch if it slams right into you. If you are more familiar with photon wavelength λ, it is related to frequency by λ = c/f, where c is the speed of light. Now, since individual photons carry only a tiny amount of energy, each individual one hitting your skin does not hurt (though it does contribute to the tiny pressure of light). Suppose that you accumulated a lot of photon hits -- what would the dangers be?
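To get a feel for the numbers, here is a quick back-of-the-envelope calculation of E = hf for a few kinds of photons. (The wi-fi and UV frequencies are round illustrative values, not precise specs.)

```python
# Back-of-the-envelope photon energies, E = h * f, expressed in electron-volts.
PLANCK_H = 6.626e-34   # Planck's constant, joule-seconds
EV = 1.602e-19         # joules per electron-volt

def photon_energy_ev(frequency_hz):
    """Energy of a single photon of the given frequency, in electron-volts."""
    return PLANCK_H * frequency_hz / EV

# Illustrative frequencies:
print(photon_energy_ev(99.1e6))   # CBC Radio One, 99.1 MHz: about 4e-7 eV
print(photon_energy_ev(2.4e9))    # typical wi-fi band: about 1e-5 eV
print(photon_energy_ev(600e12))   # mid-visible light: about 2.5 eV
print(photon_energy_ev(1.5e15))   # ultraviolet: about 6 eV
```

Notice the enormous spread: a UV photon carries ten million times the energy of a radio photon, even though both are "just" EM radiation.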
It turns out that the answer to that question depends sensitively on how energetic the photons are. There is a distinct borderline for danger with photons, and it is called ionization. Ionization means that an atomic electron gets kicked out of the atom by a marauding photon, in a one-for-one trade. A small percentage of ionized atoms can go on to damage DNA, and a small percentage of those DNA hits can lead to cancerous cell changes.
How energetic does a photon need to be to ionize your atoms? Basically, ultraviolet or higher -- the part of the spectrum left of visible light in the picture below. It needs to be a UV photon, say from the sun, or an X-ray from a machine, or gamma-ray photon from a radioactive source. All of those kinds of ionizing EM radiation are dangerous. But a photon with a smaller frequency on the spectrum than visible light -- to the right of visible light in the picture -- is harmless. (Well, unless you are foolish enough to deliberately roast yourself in an oven by focusing zillions of infrared heat photons!)
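You can check where that borderline falls by inverting E = hf. Using hydrogen's ionization energy of 13.6 eV as a familiar benchmark (real atoms in your body vary, but the ballpark is a few eV to a few tens of eV), the threshold frequency lands squarely in the ultraviolet:

```python
# Where does the ionization borderline fall? Invert E = h*f to get f = E/h.
PLANCK_H = 6.626e-34   # Planck's constant, joule-seconds
EV = 1.602e-19         # joules per electron-volt
C = 2.998e8            # speed of light, metres per second

def threshold_frequency_hz(ionization_energy_ev):
    """Minimum photon frequency that can supply a given ionization energy."""
    return ionization_energy_ev * EV / PLANCK_H

# Hydrogen's 13.6 eV ionization energy, used as an illustrative benchmark:
f = threshold_frequency_hz(13.6)
print(f)             # about 3.3e15 Hz
print(C / f * 1e9)   # wavelength in nanometres: about 91 nm, deep ultraviolet
```

Visible light tops out around 770 terahertz, a factor of four below this threshold -- which is why your desk lamp cannot ionize you.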
The key thing that Einstein explained is why (unfocused) non-ionizing photons are harmless, even in aggregate. This was an extremely important conceptual point and it mightily confused physicists of the day. What was the confusion? Well, you might worry that a photon which has 1/100th the energy of an ionizing photon might be 1/100 as dangerous, still worth keeping an eye on. But that's not how the world works at the microscopic scale, because life is granular in the quantum realm. Even if you can muster 100 photons with 1/100th of the energy required to ionize the atom, they still won't hurt you. So the best advice physics can give you overall is this: fear ionizing radiation, not the non-ionizing kind.
In order to understand why we can relax about nonionizing radiation, even in aggregate, we must delve a little deeper into the physics and discuss the Photoelectric Effect.
Contrary to popular opinion, Albert Einstein did not win his Nobel Prize in Physics for discovering special or general relativity. He actually won it for explaining a huge experimental puzzle called the Photoelectric Effect. What is this phenomenon?
Suppose that you are interested in how light interacts with metals. Metals are a good choice for something relatively simple to study, because they have electrons that are free to move about the metal. Those electrons can then conduct electricity and heat as they move around. Suppose further that you build the following kind of apparatus for studying the interaction of light with metals.
What did this famous experiment find? If you shone ultraviolet light on a metal, you got an electric current to flow. This process liberated electrons, which distributed themselves around both the zinc plate and the gold leaf. This in turn made the gold leaf move away from the zinc plate, because of electric repulsion. (Like charges repel, like static electricity in your hair in winter.) That is already quite interesting. But what really took physicists of the day by surprise was that if you shone light of longer wavelength on the zinc plate, nothing happened at all. Nothing. No movement. No electrons liberated. No matter how intense you made the impinging light. So: why the hell not?
Well, metals are atoms, and atoms are nuclei plus electron clouds. EM radiation consists of photons -- quanta of the EM field. These photons have specific frequencies, and only photons with sufficient energy to actually knock electrons out of orbit are able to make current flow in the Photoelectric Effect. A single photon knocks a single electron out of orbit: it is a one-for-one trade.
If you have a photon of longer wavelength, then it has lower frequency, because frequency times wavelength always has to equal the wave speed. So longer wavelength photons have less energy. You might think that flooding your atom with a hundred photons, each of which has only a hundredth of the energy required to kick an atomic electron out of orbit, might be enough. Not so. The probability of being able to line up those hundred photons all on top of one another so that they made the transition as a group can be calculated, and is vanishingly small. So it never happens in the real world.
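The one-for-one rule above can be sketched in a few lines of code: an atom only counts a photon as an ionizer if that single photon clears the threshold by itself, so the total energy delivered is irrelevant. (The 13.6 eV threshold is again just hydrogen's, used for illustration.)

```python
# The quantum rule is one-for-one: a photon either clears the ionization
# hurdle on its own or it does nothing. Total delivered energy is irrelevant.
def ionizations(photon_energies_ev, threshold_ev):
    """Count how many photons can each, individually, ionize an atom."""
    return sum(1 for e in photon_energies_ev if e >= threshold_ev)

THRESHOLD = 13.6  # illustrative: hydrogen's ionization energy, in eV

one_uv_photon = [13.6]                 # a single photon at the threshold
hundred_weak_photons = [0.136] * 100   # the same total energy, split 100 ways

print(ionizations(one_uv_photon, THRESHOLD))         # 1
print(ionizations(hundred_weak_photons, THRESHOLD))  # 0
```

Both lists carry exactly 13.6 eV in total, but only the single energetic photon ionizes anything.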
What is the take-home message from the Photoelectric Effect? Quantum mechanics rules. Only UV, X-ray, and higher-frequency photons have sufficient energy to kick atomic electrons out of orbit. Lower-frequency photons can't clear the ionization energy hurdle, no matter how many photons you try to use. It's like trying to win the Olympic high jump by lining up a hundred short people, each of whom can only jump 1/100 of the required height. That doesn't cut the mustard. You only win the gold medal if you clear the bar in a single jump.
So the quantum hypothesis -- the idea that every hunk of stuff is granular, not continuous -- is enough to (a) explain the observed experimental results of the Photoelectric Effect and (b) keep you safe from cellphone/wi-fi radiation. It is insight (a) that earned Einstein his Nobel Prize.
The moral of our story is that you only need to fear ionizing radiation. It therefore makes good sense to minimize your exposure by putting on sunscreen that protects you from UV rays, and by not using tanning booths. Also, in consultation with your doctor or dentist, limit your X-rays and CT scans (which are just bunches of X-rays knitted together to make a 3D image). That's it! You don't need to expend any energy worrying about the radiation (nonionizing photons) emitted by your cellphone or wi-fi router or TV or microwave oven -- and you CERTAINLY don't need to worry about them hurting your children's brains. Their little minds can take pools and pools of (unfocused) nonionizing photons without any harm whatsoever. Just make sure to put sunscreen on their little faces. :D