Have you ever looked at something and thought you saw colors that weren’t really there? This happens because your eyes and brain work together in a unique way, different from how a camera captures images. Let’s explore how our eyes and cameras are similar and different, and why our eyes sometimes play tricks on us.
Both eyes and cameras have lenses that focus light and sensors that capture it. However, they work in different ways. A camera lens moves to focus on objects, while the lens in your eye changes shape to do the same. Most camera lenses are designed to focus different colors of light, like red and blue, at the same point. But in your eye, when red light is in focus, blue light might be a bit blurry.
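This color-focus difference comes from a basic property of lenses: a lens bends short-wavelength (blue) light slightly more than long-wavelength (red) light, because the material's refractive index is a bit higher for blue. Here is a small sketch of that effect using the thin-lens lensmaker's equation. The refractive indices and radii below are illustrative glass-like values chosen for the example, not measured properties of the human eye.

```python
# Sketch: why blue and red light focus at different depths.
# Lensmaker's equation for a thin lens: 1/f = (n - 1) * (1/R1 - 1/R2)
# The index n is slightly higher for blue light than for red, so the
# same lens focuses blue a little closer. Values are illustrative.

def focal_length(n, r1=0.02, r2=-0.02):
    """Focal length in meters for a symmetric thin lens (radii in meters)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

f_red = focal_length(n=1.51)   # red light: slightly lower index
f_blue = focal_length(n=1.53)  # blue light: slightly higher index

print(f"red focus:  {f_red * 100:.2f} cm")
print(f"blue focus: {f_blue * 100:.2f} cm")
# Blue comes to a focus slightly in front of red, so when red is
# sharp on the retina, blue is a little blurred.
```

Camera lenses correct for this by combining glass elements with different dispersions; the eye does not, and the brain compensates instead.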
Both eyes and cameras use photoreceptors to detect light. Cameras have one type of photoreceptor, with filters for red, green, and blue light layered on top. Your eyes have several types of photoreceptors. In normal light, your eyes use three types of cone cells, but in low light only one type, the rod cells, works, which is why you can’t see colors well in the dark.
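To picture how a camera's color filters are laid out: most digital cameras arrange them in a repeating mosaic called a Bayer pattern, with twice as many green filters as red or blue (roughly matching our eyes' greater sensitivity to green). This short sketch builds that mosaic; the exact layout varies by sensor, and RGGB is just the most common convention.

```python
# Sketch: the color-filter mosaic a typical camera sensor uses.
# Each photosite sees only one color; a common Bayer layout ("RGGB")
# repeats a 2x2 tile with two green filters per red and blue.

def bayer_mask(rows, cols):
    """Return a rows x cols grid of 'R', 'G', 'B' in an RGGB layout."""
    tile = [["R", "G"],
            ["G", "B"]]
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in bayer_mask(4, 8):
    print(" ".join(row))
# Half the photosites are green; the camera later interpolates
# ("demosaics") a full RGB value for every pixel.
```

Your retina needs no such filter layer, because each cone type already responds selectively to its own range of wavelengths.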
Unlike cameras, your eyes don’t need color filters because the photoreceptors already respond to different light wavelengths. The distribution of these photoreceptors is uneven in your eyes. For example, the center of your vision has no receptors for dim light, which is why faint stars disappear when you look directly at them. Also, there are few receptors for blue light in the center, so you might not notice blurry blue images. Your brain helps by filling in the missing details.
The edges of your retina have fewer receptors, so your ability to see clearly and in color decreases away from the center of your vision. There’s also a blind spot in each eye where there are no photoreceptors. You don’t notice it because your brain fills in the missing information. This shows that we see with our brains as much as with our eyes.
Because our brains are so involved in seeing, we can experience visual illusions. For example, if an image seems to be moving, it might be because your eyes are constantly making tiny movements. If your eyes didn’t move, your vision would fade, because the nerves in your retina stop responding to a still image. Your vision also briefly shuts off during larger eye movements, which is why, unlike a camera, you can’t see your own eyes move when you look in a mirror.
Cameras can capture details that our eyes might miss, zoom in on distant objects, and record exactly what they see. But our eyes are incredibly efficient, having evolved alongside our brains over millions of years. Even if we don’t always see the world perfectly, there’s something special about how our eyes and brains work together. Observing the world, even with its illusions, can be a joyful experience, and perhaps it even offers some evolutionary benefits. But that’s a topic for another time.
Try this fun experiment to understand how your eyes perceive colors differently than a camera. Gather colored paper or objects in red, blue, and green. In a dimly lit room, observe how the colors appear to your eyes. Then, take a photo with a camera and compare the results. Discuss why the colors might look different in the photo compared to what you see with your eyes.
Discover your blind spot with a simple activity. On a piece of paper, draw a small dot and a cross about 10 cm apart. Close your left eye and focus your right eye on the cross. Slowly move the paper closer to your face. At a certain point, the dot will disappear. This is your blind spot! Discuss why this happens and how your brain compensates for it.
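You can even estimate where the dot should vanish. The blind spot sits roughly 15 degrees to the side of the point you are fixating (a commonly cited average; the exact angle varies from person to person). With the dot and cross 10 cm apart, a little trigonometry predicts the viewing distance at which the dot lands on the blind spot:

```python
import math

# Sketch: estimating when the dot lands on your blind spot.
# The blind spot lies roughly 15 degrees from the fixation point
# (an approximate average; it differs between individuals).
# A dot drawn dot_separation_cm from the cross subtends an angle of
# atan(separation / distance), so it vanishes near:
#   distance = separation / tan(15 degrees)

BLIND_SPOT_DEG = 15.0      # assumed angle from fixation
dot_separation_cm = 10.0   # spacing used in the activity above

distance_cm = dot_separation_cm / math.tan(math.radians(BLIND_SPOT_DEG))
print(f"dot should disappear at roughly {distance_cm:.0f} cm from the eye")
# roughly 37 cm
```

If the dot disappears noticeably closer or farther than that for you, that is normal variation, and a nice discussion point in itself.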
Test your peripheral vision with a partner. Have them hold up different colored cards or objects at the edge of your vision while you look straight ahead. Try to identify the colors without moving your eyes. Discuss how your peripheral vision differs from your central vision and why this might be important.
Explore visual illusions to see how your brain interprets images. Find examples of optical illusions online or in books. Try to figure out why the illusions trick your eyes and brain. Discuss how these illusions demonstrate the differences between how cameras capture images and how our eyes perceive them.
Observe your eye movements with a simple activity. Stand in front of a mirror and try to watch your eyes move as you shift your gaze from side to side. Notice how you can’t see your eyes move. Discuss why this happens and how it relates to the way our vision works compared to a camera.
Transcript:
—
Watch the center of this disk. You may start to feel sleepy. Just kidding! I’m not going to hypnotize you. But are you beginning to see colors in the rings? If so, your eyes are playing tricks on you. The disk was only ever black and white. You see, your eyes don’t always capture the world like a video camera would. In fact, there are several differences due to the anatomy of your eye and the processing that occurs in your brain and its outgrowth, the retina.
Let’s start with some similarities. Both have lenses to focus light and sensors to capture it, but they behave differently. The lens in a camera moves to stay focused on an object approaching it, while the one in your eye changes shape. Most camera lenses are achromatic, meaning they focus both red and blue light to the same point. Your eye is different; when red light from an object is in focus, the blue light is out of focus.
So why don’t things look partially out of focus all the time? To answer that, we need to examine what your eye and the camera use to capture light: photoreceptors. The light-sensitive surface in a camera has one type of photoreceptor that is evenly distributed throughout. An array of red, green, and blue filters on top of these photoreceptors allows them to respond selectively to different wavelengths of light. Your eye’s retina, on the other hand, has several types of photoreceptors—usually three for normal light conditions and only one type for low light, which is why we are color blind in the dark.
In normal light, unlike a camera, we have no need for a color filter because our photoreceptors already respond selectively to different wavelengths of light. Additionally, your photoreceptors are unevenly distributed, with no receptors for dim light in the very center. This is why faint stars seem to disappear when you look directly at them. The center also has very few receptors that can detect blue light, which is why you might not notice the blurred blue image from earlier. However, you still perceive blue there because your brain fills it in from context.
Moreover, the edges of our retinas have relatively few receptors for any wavelength of light. Thus, our visual acuity and ability to see color fall off rapidly from the center of our vision. There is also an area in our eyes called the blind spot, where there are no photoreceptors. We don’t notice a lack of vision there because our brain fills in the gaps. In a very real sense, we see with our brains, not just our eyes.
Because our brains, including the retinas, are so involved in the process, we are susceptible to visual illusions. Here’s another illusion caused by the eye itself: does the center of this image look like it’s jittering around? That’s because your eye actually jiggles most of the time. If it didn’t, your vision would eventually shut down because the nerves on the retina stop responding to a stationary image of constant intensity. Unlike a camera, you briefly stop seeing whenever you make a larger movement with your eyes. That’s why you can’t see your own eyes shift as you look from one eye to the other in a mirror.
Video cameras can capture details our eyes miss, magnify distant objects, and accurately record what they see. But our eyes are remarkably efficient adaptations, the result of hundreds of millions of years of coevolution with our brains. So what if we don’t always see the world exactly as it is? There’s a certain joy to be found in observing stationary leaves waving in an illusory breeze, and maybe even an evolutionary advantage. But that’s a lesson for another day.
—
Eyes – Organs that detect light and allow us to see. – The eyes of a cat are adapted to see well in low light conditions.
Cameras – Devices that capture images by focusing light onto a sensor or film. – Cameras work similarly to eyes by using lenses to focus light and create clear pictures.
Light – Electromagnetic radiation that is visible to the human eye and is responsible for the sense of sight. – Plants need light to perform photosynthesis and produce energy.
Photoreceptors – Cells in the retina of the eye that detect light and convert it into signals for the brain. – Photoreceptors in our eyes help us see by converting light into electrical signals.
Vision – The ability to see; the sense that allows us to perceive light and interpret it as images. – Vision is crucial for many animals to find food and avoid predators.
Colors – Different wavelengths of light that are perceived by the eyes as various hues. – The rainbow shows the spectrum of colors that sunlight is made of.
Brain – The organ that processes information from the senses, including sight, and controls the body. – The brain interprets signals from the eyes to create the images we see.
Lenses – Curved pieces of glass or other transparent materials that bend light rays to focus them. – The lenses in our eyes adjust to focus on objects at different distances.
Receptors – Specialized cells or proteins that detect specific stimuli, such as light, and send signals to the brain. – Receptors in the skin can detect changes in temperature and pressure.
Illusions – Perceptions that do not match reality, often caused by the way the brain interprets sensory information. – Optical illusions can trick our eyes into seeing things that aren’t really there.