Light, Optics & Vision
Light reveals the world to us — not just through sight, but through science. It behaves as both wave and particle, travels across the universe, and bends, scatters, interferes, and reflects in ways that hold the key to understanding matter, motion, and energy. In this chapter, we explore the nature of light and vision through hands-on experiments that connect the everyday and the extraordinary.
With just a smartphone and simple materials like water, foil, or a drop of milk, you can observe phenomena once reserved for physics labs: diffraction patterns, polarization, total internal reflection, and even molecular fingerprints through Raman scattering. You’ll build pinhole cameras and telescopes, analyze color pixels and soap films, and recreate historic experiments like Herschel’s discovery of infrared light.
This is also a chapter about perception — how light interacts with materials, and how our instruments (and eyes) interpret it. Using your phone’s sensors, screen, and camera, you’ll investigate the hidden structure of crystals, track the invisible paths of polarized or scattered light, and measure how glasses, filters, and even soot affect what we can see.
Whether you’re probing the atomic structure of salt with a laser pointer or constructing an optical microphone from a Michelson interferometer, these experiments let you explore light not just as illumination, but as information. The invisible becomes visible, and the abstract becomes hands-on.
Let the physics of light sharpen your vision.
Basic Principles
Inverse-Square Law with Light (OPTI-01)
Sensors Used: Light sensor (lux meter)
What’s Measured: Light intensity as a function of distance
Description
The inverse-square law describes how light intensity diminishes with distance from a point source, and with a smartphone’s light sensor or a lux meter app, this relationship becomes remarkably tangible. Set up a small, consistent light source — such as an LED bulb or flashlight — and measure the illumination at various distances from it. As you move your phone farther away, record the lux readings at regular intervals. When you plot the intensity values against the inverse square of the distance (1/r²), the points fall on a straight line: light spreads out over ever-larger spheres as it travels, so its intensity drops in proportion to 1/r².
This experiment offers more than just confirmation of a physics equation — it’s a gateway to discussions about energy dispersion, three-dimensional space, and even how stars and galaxies emit light across the cosmos. You may also notice slight deviations from the ideal curve due to the shape of the beam or nearby reflections, which open the door to real-world complexity in optical measurement.
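The fit itself takes only a few lines. In the sketch below the distances and lux values are illustrative numbers chosen to follow the law exactly; real readings will scatter around the fitted line.

```python
# Fit lux readings to the inverse-square model I = C / r^2.
# Distances and lux values here are illustrative, not real measurements.
distances = [0.2, 0.4, 0.6, 0.8, 1.0]       # metres
lux = [500.0, 125.0, 55.6, 31.3, 20.0]      # lux-meter readings

# Plotting lux against 1/r^2 should give a straight line through the origin.
inv_r2 = [1 / r**2 for r in distances]

# Least-squares slope through the origin: C = sum(x*y) / sum(x*x)
C = sum(x * y for x, y in zip(inv_r2, lux)) / sum(x * x for x in inv_r2)

for r, measured in zip(distances, lux):
    print(f"r = {r:.1f} m: measured {measured:5.1f} lx, model {C / r**2:5.1f} lx")
```

Deviations of the measured points from the fitted line quantify exactly the real-world effects mentioned above, such as beam shape and reflections.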
Measuring Laser Light Transmission Through Colored Glasses Using a Smartphone (OPTI-02)
Sensors Used: Ambient light sensor, camera (optional), light meter app
What’s Measured: Light intensity after transmission through filters; transmission coefficient
Description
This experiment explores how different materials filter light, revealing how color, transparency, and wavelength interact in the design of optical filters and protective eyewear. Using a smartphone equipped with an ambient light sensor or a light meter app that utilizes the camera, students measure how much laser light passes through various colored filters or safety glasses. Begin with a green laser pointer and a pair of green laser safety glasses. Shine the laser through the lenses and measure the light intensity before and after. Then repeat the experiment with red and blue lasers, or with filters of different colors, to investigate how well each one blocks or transmits light of specific wavelengths.
The results make clear that optical filters are not all-purpose blockers — they are designed to attenuate particular regions of the electromagnetic spectrum. By taking a baseline measurement (without a filter) and comparing it to the filtered intensity, students can calculate the relative transmission of each material. A simple calibration step — like using known neutral density filters — can even allow for quantitative transmission coefficients to be calculated.
Beyond exploring basic optics, this experiment introduces real-world considerations in safety and instrument design, especially in fields like laser physics, photography, and spectroscopy. It also encourages precision: students quickly learn that even filters that appear dark to the eye may pass significant amounts of infrared or ultraviolet light — wavelengths that are invisible but potentially hazardous. With just a smartphone and a handful of filters or glasses, this experiment brings wavelength-selective optics into focus.
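The transmission calculation can be sketched as follows; all lux readings here are illustrative placeholders for what the light meter app would report.

```python
import math

# Relative transmission T = I_filtered / I_baseline.
# All lux readings below are illustrative, not real measurements.
baseline_lux = 850.0          # laser spot on the sensor, no filter
readings = {
    "green safety glasses": 12.0,
    "red gel filter":       430.0,
    "blue gel filter":      95.0,
}

transmission = {name: lux / baseline_lux for name, lux in readings.items()}
# Optical density is the usual log-scale figure of merit: OD = -log10(T)
optical_density = {name: -math.log10(T) for name, T in transmission.items()}

for name in readings:
    print(f"{name}: T = {transmission[name]:.3f}, OD = {optical_density[name]:.2f}")
```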
Smartphone Sextant: Measuring Angles and Distances (OPTI-03)
Sensors Used: Camera, orientation sensor (TYPE_GAME_ROTATION_VECTOR)
What’s Measured: Angular elevation and angular size; object height or distance
Description
With a sextant app or an augmented reality overlay, your smartphone becomes a precision tool for measuring angles — a digital version of the navigational instruments once used by astronomers and sailors. Point your phone’s camera at an object and use the built-in orientation sensors to measure its elevation or the angular separation between two points. Adding a crosshair or marker in the center of the display helps target specific features, such as the top and bottom of a building or the Moon’s upper and lower limb.
This tool has practical applications: if you know the real-world height of a structure (like the height between two floors, typically around 3 meters), you can estimate your distance to the building by measuring the angle it subtends and applying simple trigonometry. Conversely, if the distance is known, you can calculate the height.
In more advanced setups, students can use the phone like an astrolabe or digital theodolite — ideal for tracking celestial objects or mapping terrestrial landmarks. These angular measurements are foundational in astronomy, surveying, and optics, and with modern sensor technology, they’re now in the palm of your hand.
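The trigonometry described above can be sketched in a few lines; the storey height and the two measured elevation angles below are illustrative assumptions.

```python
import math

# Distance to a building from the angle subtended by one storey.
# The storey height and both angles are illustrative assumptions.
storey_height = 3.0                # metres, typical floor-to-floor height
angle_top = math.radians(12.0)     # elevation of the storey's upper edge
angle_bottom = math.radians(10.0)  # elevation of its lower edge

# With the phone held level, each edge sits at distance * tan(angle)
# above eye height, so the storey height is the difference of the two:
distance = storey_height / (math.tan(angle_top) - math.tan(angle_bottom))
print(f"Estimated distance: {distance:.1f} m")

# Conversely, a known distance gives the height:
height = distance * (math.tan(angle_top) - math.tan(angle_bottom))
```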
References:
[1] “iPhone Sextant Project,” https://www.instructables.com/id/ISextant-Project/
[2] “Smart measurements of the heavens,” https://www.scienceinschool.org/content/smart-measurements-heavens
DIY Pinhole Camera and Spatial Filter (OPTI-04)
Sensors Used: Smartphone camera
What’s Measured: Image formation, focal sharpness
Description
The simplest camera is just a box with a tiny hole — and with a smartphone and some aluminum foil, you can build your own pinhole camera in minutes. Cover your phone’s camera lens with a piece of foil and pierce a tiny hole using a needle. The smaller and cleaner the hole, the sharper the resulting image. When pointed at a bright outdoor scene, the camera will record an inverted image formed entirely by geometry, not lenses — a living example of the camera obscura principle.
To fine-tune the aperture, stack several layers of aluminum foil (five to ten works well) and pierce them gently with a needle. Peeling the layers apart gives you a range of slightly different pinholes, each with unique sharpness and brightness characteristics. These foil layers can be mounted on cardboard or a 3D-printed frame to create swappable pinholes, ideal for experimenting with focus depth and exposure.
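There is a sweet spot in pinhole size: too large and geometric blur dominates, too small and diffraction takes over. A commonly quoted rule of thumb, often attributed to Lord Rayleigh, puts the optimal diameter at about 1.9 times the square root of the focal length times the wavelength. The sketch below applies it with assumed numbers:

```python
import math

# Rule-of-thumb optimal pinhole diameter: d ~ 1.9 * sqrt(f * wavelength).
# Larger holes blur geometrically; smaller holes blur by diffraction.
wavelength = 550e-9    # metres, mid-visible green light
focal_length = 0.05    # metres, assumed pinhole-to-screen distance

d_optimal = 1.9 * math.sqrt(focal_length * wavelength)
print(f"Optimal pinhole diameter: {d_optimal * 1e3:.2f} mm")
```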
DIY Smartphone Microscope with Droplet Lens (OPTI-05)
Sensors Used: Smartphone camera
What’s Measured: Visual magnification, microscopic structures, subpixel geometry
Description
Transform your smartphone into a microscope using nothing more than a droplet of water. Carefully place a small droplet directly onto the phone’s camera lens, or onto a piece of transparent plastic film above it. The curved shape of the droplet acts as a convex lens, dramatically magnifying nearby objects. This simple lens allows you to examine fine textures and tiny structures — from paper fibers and sugar crystals to dust mites and mold spores.
One particularly revealing application is to use your water droplet microscope to explore a phone or computer screen. Take a photo or video of a white or colored area on the screen, then zoom in and examine the red, green, and blue subpixels that blend to produce full-color images. This hands-on activity gives you a direct view into how modern displays work and makes the concept of additive color mixing visible and tangible.
Smaller droplets increase magnification but decrease depth of focus. For a more permanent setup, you can use a clear acrylic bead or glass sphere instead of water. With nothing more than a bit of water and a smartphone, you can explore the microscopic world — and the digital one — up close.
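The claim that smaller droplets magnify more follows from the lensmaker's equation: treating the droplet as roughly plano-convex, its focal length is f ≈ R/(n − 1), so a smaller radius of curvature means a shorter focal length and higher magnification. A sketch with assumed droplet radii:

```python
# Focal length of an approximately plano-convex water droplet: f ~ R / (n - 1).
# Droplet radii are illustrative; real droplets are not perfect spherical caps.
n_water = 1.33

results = []
for radius_mm in [3.0, 2.0, 1.0]:
    f_mm = radius_mm / (n_water - 1)
    # Simple-magnifier magnification relative to the eye's near point (~250 mm)
    magnification = 250.0 / f_mm
    results.append((radius_mm, f_mm, magnification))
    print(f"R = {radius_mm} mm -> f = {f_mm:.1f} mm, M = {magnification:.0f}x")
```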
References:
[1] “Drop Magnifier,” https://www.thenakedscientists.com/get-naked/experiments/drop-magnifier
Microscopic Laser Shadow Imaging: Seeing the Invisible with Light and Shadows (OPTI-06)
Sensors Used: Smartphone camera (video or slow motion)
What’s Measured: Shadow patterns and diffraction caused by microscopic particles
Description
This elegant experiment turns a laser pointer and a drop of water into a minimalist, lens-free microscope. Place a droplet of water — ideally containing fine particles like dust, fibers, or pond water with plankton — on a transparent surface or suspended from a thin membrane. Shine a laser pointer through the drop and project the resulting light onto a white screen or directly onto your smartphone camera sensor. Tiny particles suspended in the water will cast sharp shadows or create dynamic diffraction patterns, revealing their motion and structure in real time.
By recording the laser-illuminated region using your phone’s video mode — especially in slow motion — you can capture the intricate ballet of microscopic life and matter. The variations in shadow sharpness and motion provide insights into particle size, motion through the fluid, and the nature of light-matter interaction on very small scales.
Though technically simple, this setup touches on profound concepts in optics: coherence, diffraction, and the physics of imaging. It demonstrates how coherent light interacts with small-scale objects to produce visible effects, and it opens the door to creative exploration — from fluid dynamics to microbiology — with tools as basic as a laser and a smartphone.
References:
[1] “Microscopic Laser Shadow Imaging,” The Naked Scientists, https://www.thenakedscientists.com/get-naked/experiments/microscopic-laser-shadow-imaging
Building a Simple Refracting Telescope and Smartphone Spyglass (OPTI-07)
Sensors Used: Smartphone camera (optional)
What’s Measured: Image formation, magnification, optical alignment
Description
This experiment invites students to construct a basic refracting telescope using just two magnifying glasses — one to serve as the objective lens, the other as the eyepiece. By selecting lenses with different focal lengths and holding them at the correct distance apart, students can form a magnified, inverted image of distant objects, just as Galileo did centuries ago. Adjusting the separation between the lenses brings the image into focus and illustrates the optical principles of focal length, image inversion, and angular magnification.
To extend this experiment, a smartphone can be integrated into the setup. By aligning its camera with the eyepiece, students can turn the homemade telescope into a digital viewing device — a modern spyglass. Whether capturing birds across a field or features on the Moon, this hybrid of classical optics and smartphone imaging demonstrates how even simple lenses can reveal the distant world in sharp detail. This project not only teaches geometric optics in a hands-on way but also connects historical instruments with contemporary technology.
References:
[1] “Seeing further - DIY telescope,” https://www.thenakedscientists.com/get-naked/experiments/seeing-further-diy-telescope
[2] “How to build a $40 USB spy telescope,” https://www.youtube.com/watch?v=qBKDgpI5ano&feature=related
Camera-Based Refraction Measurement - Snell’s Law (OPTI-08)
Sensors Used: Smartphone camera
What’s Measured: Apparent displacement, refractive index
Description
This simple experiment lets you estimate the refractive index of liquids using only a smartphone camera and a printed ruler. Begin by filling a clear glass or plastic cup with water and placing it in front of a ruler or grid with known spacing. Using your smartphone, take a photo through the side of the cup. You’ll observe that the ruler appears shifted or distorted behind the curved glass—an effect caused by refraction, as light bends passing through different materials.
You can analyze the apparent displacement using image editing tools or by comparing printed and photographed scales. With a known geometry and a bit of trigonometry, you can estimate the angles involved and apply Snell’s Law to calculate the refractive index of the liquid.
To extend the experiment, try replacing the water with other transparent liquids like vegetable oil, sugar water, or glycerin. Optically denser liquids — those with higher refractive indices — bend light more, leading to larger displacements. This variation offers a compelling way to explore how molecular composition affects optical behavior, and provides a hands-on demonstration of how lenses and even our own eyes manipulate light.
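The final calculation is a direct application of Snell's law; the angles below are illustrative values of the kind you would extract from the photo.

```python
import math

# Refractive index from the measured angles (Snell's law):
# n_liquid = n_air * sin(theta_i) / sin(theta_r). Angles are illustrative.
n_air = 1.0
theta_i = math.radians(40.0)   # angle of incidence, read from the photo
theta_r = math.radians(28.9)   # refracted angle inferred from the displacement

n_liquid = n_air * math.sin(theta_i) / math.sin(theta_r)
print(f"Estimated refractive index: {n_liquid:.2f}")
```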
Seeing the Invisible: Visualizing CO₂ with Light Refraction (OPTI-09)
Sensors Used: Smartphone camera (optional), laser pointer or flashlight
What’s Measured: Visual distortion due to refractive index differences
Description
This visually striking experiment reveals the presence of an otherwise invisible gas — carbon dioxide — through the bending of light. Begin by filling a fish tank or large transparent container with CO₂, either by allowing dry ice to sublimate or by generating the gas using a reaction between baking soda and vinegar. Once the tank fills with the heavier gas, shine a laser pointer or a strong flashlight across the tank in a darkened room.
Though CO₂ is colorless, it has a higher refractive index than air, and this difference bends the light as it passes through the gas layers. As the CO₂ flows and swirls, you’ll see shimmering distortions, light ripples, or refracted beam paths — subtle clues that make the gas visually detectable. Capture the effect using your phone’s camera or slow-motion video for added clarity. The result is a compelling demonstration of how physics can render the invisible visible, showing that even transparent gases can leave optical fingerprints when the right conditions — and the right light — are applied.
References:
[1] “Seeing the Invisible,” The Naked Scientists, https://www.thenakedscientists.com/get-naked/experiments/seeing-invisible
Total Internal Reflection
Total Internal Reflection with a Laser and Water Stream (OPTI-10)
Sensors Used: Smartphone camera (optional for documentation)
What’s Measured: Light path behavior, angle of incidence for total internal reflection
Description
This classic experiment turns an everyday stream of water into a living example of fiber optics. Begin by filling a clear plastic bottle with water and poking a small hole near the bottom to allow a smooth, steady stream to flow out. In a darkened room, shine a laser pointer into the bottle from above at an angle so that the beam enters the falling stream. If the angle is right, the laser beam will curve and travel along the stream, seemingly trapped inside the flowing water.
This captivating effect is known as total internal reflection. As the laser light enters the water stream, it repeatedly reflects off the interface between the water and air, never escaping—just like in a fiber optic cable. The phenomenon occurs when light hits a boundary at an angle greater than the critical angle, beyond which all light is reflected internally rather than refracted out.
Try adjusting the laser’s angle or moving the stream slightly to see how the path changes. You can even gently wiggle the bottle to create a curving stream, and observe how the laser beam faithfully follows the contour. If you record the experiment using your smartphone camera, you’ll capture a vivid, real-time demonstration of how light can be guided along a medium by reflection alone—an elegant connection between everyday materials and high-tech communication systems.
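The critical angle itself is easy to compute from the two refractive indices; for a water-air boundary it comes out near 49 degrees:

```python
import math

# Critical angle for total internal reflection at a water-air boundary:
# sin(theta_c) = n_air / n_water.
n_water, n_air = 1.33, 1.00
theta_c = math.degrees(math.asin(n_air / n_water))
print(f"Critical angle for water: {theta_c:.1f} degrees")
# Light hitting the interface at more than ~48.8 degrees from the normal
# stays trapped inside the stream, just as in a fiber optic cable.
```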
References:
[1] “Total Internal Reflection (Laser Waterfall),” https://www.youtube.com/watch?v=Z9O5xY3Z1WE
[2] “Water Fibre Optics,” The Naked Scientists, https://www.thenakedscientists.com/get-naked/experiments/water-fibre-optics
Creating a Home-Made Material That Bends Light (OPTI-11)
Sensors Used: Smartphone camera (optional, for documenting effects)
What’s Measured: Light path bending, visual distortion, qualitative refraction behavior
Description
Can you build a material that bends light in surprising ways—right at home? In this visually stunning experiment, inspired by a demonstration from The Action Lab, you explore how light travels through substances with different refractive indices, and how layering them carefully can produce dramatic optical effects.
Begin by stacking sheets of clear plastic and layering them with mineral oil, water, or other transparent fluids of varying refractive index. When light passes through this homemade “composite,” it encounters shifts in optical density at each interface. These variations cause the light rays to bend—not once, but multiple times—following a curved path through the layered medium.
The result is a lens-like material that can distort or focus light depending on your configuration. Shine a laser beam or flashlight through the layered structure and observe how the beam curves or appears displaced. Viewed from the side or through a smartphone camera, the light may even seem to “curve around” an object, evoking comparisons to gravitational lensing in astronomy.
This experiment not only offers a hands-on way to understand Snell’s Law and refraction, but also opens a door to advanced topics like gradient-index optics and artificial metamaterials. With just plastic sheets, oil, and a flashlight, students can explore how light responds to engineered optical environments — and how those principles relate to both everyday lenses and the gravitational bending of starlight in the cosmos.
References:
[1] “Crazy Material That You Can Make at Home That Actually Bends Light!” https://www.youtube.com/watch?v=968gVUAY9Mg
Diffraction & Interference Basics
Why LEDs (Even with Lenses) Don’t Show Interference Patterns (OPTI-12)
Safety in optics is crucial, especially for younger students or casual learners. Replacing lasers with a bright LED, a straw, and a lens is pedagogically sound, but it comes with an important limitation: no interference. To observe interference, as in a double-slit experiment, the light source must meet two critical conditions: coherence (the waves hold a fixed phase relationship) and near-monochromaticity (a narrow range of wavelengths).
What’s happening physically? Lasers emit coherent light: the waves are aligned in phase and frequency, making constructive and destructive interference stable and visible. LEDs emit incoherent light: the waves have random phase relationships and span a range of wavelengths, which causes interference effects to wash out and average away.
Even very bright LEDs, focused through a straw or lens, will only show basic shadow and refraction effects, not stable fringes or patterns. LEDs remain well suited to many other experiments: refraction and lens demonstrations, color mixing and absorption, total internal reflection (TIR), polarization (with filters), beam direction and ray tracing, and basic diffraction with coarse gratings.
Razor Blade Diffraction: Comparing Slit and Edge Interference (OPTI-13)
Sensors Used: Camera (photo or video capture)
What’s Measured: Fringe spacing, diffraction pattern shape
Description
Light behaves as a wave, and nowhere is this more elegantly visible than in the patterns it forms when it interacts with narrow structures. In this experiment, you’ll explore two distinct types of diffraction using common razor blades and a laser pointer: single-slit interference and edge diffraction.
Begin by arranging two razor blades very close together to form a narrow slit, just a fraction of a millimeter wide. Shine a laser beam through the slit and project the light onto a wall or screen a few meters away. You’ll observe a symmetric diffraction pattern — a broad central maximum flanked by alternating bright and dark fringes. This is classic single-slit diffraction, where light waves emanating from different parts of the slit interfere constructively and destructively. The narrower the slit, the wider the central fringe becomes, vividly demonstrating the inverse relationship between slit width and diffraction angle.
Next, switch to a single razor blade. Shine the same laser so that it just grazes the sharp edge, casting a partial shadow on your projection surface. This time, instead of symmetric fringes, you’ll see a soft, asymmetric fading along the edge of the shadow. This is edge diffraction, a form of Fresnel diffraction, where light bends around the obstacle and interferes with itself in the near field. Though subtler, this pattern still reveals the wave nature of light — particularly how it behaves in the boundary region between light and shadow.
Together, these two configurations illustrate the different ways light interacts with narrow structures. Whether passing through or around, the wave nature of light always finds a way to make itself seen.
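For the slit configuration, the fringe positions let you work backwards to the slit width: the m-th dark fringe sits at roughly x = mλL/a, so the first minimum gives a = λL/x. The numbers below are illustrative:

```python
# Infer the slit width from the single-slit diffraction pattern:
# the m-th dark fringe sits at x_m ~ m * wavelength * L / a.
# All measured values below are illustrative.
wavelength = 650e-9       # metres, red laser pointer
L = 2.0                   # metres, slit-to-wall distance
x_first_minimum = 0.013   # metres, first dark fringe from pattern centre

slit_width = wavelength * L / x_first_minimum   # m = 1
print(f"Slit width: {slit_width * 1e6:.0f} micrometres")
```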
Twin Pinhole Interference: Light as a Wave (OPTI-14)
Sensors Used: Camera (photo or video), optionally screen brightness analyzer
What’s Measured: Interference pattern, fringe contrast, coherence effects
Description
While a single pinhole yields a crisp, inverted image by channeling rays geometrically (as in OPTI-04), two closely spaced pinholes invite the wave nature of light to take center stage. This experiment recreates a version of the classic Young’s double-slit setup — not with slits, but with small circular apertures.
Using a thin sheet of aluminum foil stretched over a flat surface, gently press a needle to create two adjacent, tiny holes spaced about a millimeter apart. Alternatively, a fine kitchen sieve can serve as a source of multiple hole pairs with suitable spacing. Then shine a coherent light source, such as a laser pointer, through the foil onto a distant white surface or wall. You should observe a distinctive interference pattern — a set of alternating bright and dark fringes, spaced according to the wavelength of the light and the distance between the holes.
This pattern arises because light from the two holes travels slightly different paths to any given point on the wall. Where those paths differ by a multiple of the wavelength, constructive interference creates a bright fringe. Where they differ by a half-wavelength, destructive interference cancels the light, producing darkness.
Unlike the single pinhole of a camera obscura, which only reveals the ray-like behavior of light, this setup makes the wave behavior visible. It’s a vivid reminder that light — even in its simplest form — is never just a straight line.
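The expected fringe spacing follows from the Young's double-slit relation Δy = λL/d; the hole separation and distances below are illustrative assumptions:

```python
# Expected fringe spacing for two pinholes: delta_y = wavelength * L / d.
# Hole separation and distances are illustrative.
wavelength = 532e-9   # metres, green laser pointer
d = 0.5e-3            # metres, separation between the two pinholes
L = 3.0               # metres, foil-to-wall distance

fringe_spacing = wavelength * L / d
print(f"Fringe spacing: {fringe_spacing * 1e3:.1f} mm")
```

Halving the hole separation doubles the fringe spacing, which is easy to verify with two different needle punctures.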
References:
[1] “Diffraction patterns using sieves,” http://practicalphysics.org/diffraction-patterns-using-sieves.html
[2] “Fine cloth as a grating,” http://practicalphysics.org/fine-cloth-grating.html
Laser Diffraction to Probe Structure: Crystalline vs. Amorphous Materials (OPTI-15)
Sensors Used: Camera (photo/video), optional spectrogram or intensity analyzer
What’s Measured: Diffraction pattern structure, symmetry, and coherence
Description
By shining a laser pointer through fine powdered materials, we can reveal a hidden world of order and disorder — a visual echo of atomic arrangement. This experiment explores how the internal structure of a material — whether crystalline or amorphous — shapes the way it interacts with light.
Begin with a sample of table salt or granulated sugar, both of which have well-defined crystal lattices. Spread a small amount on a glass slide or piece of clear tape, and shine a laser pointer through it onto a nearby wall. You’ll observe striking speckle patterns or symmetric diffraction spots, caused by the regular spacing of particles scattering the coherent laser light in predictable directions. This is a simple analog to X-ray crystallography, where scientists probe crystal structure using much shorter-wavelength radiation.
Now try the same setup with an amorphous material — such as flour, chalk powder, or baking soda. The laser light still scatters, but the pattern appears diffuse and irregular. Without long-range order, there’s no consistent interference — just random diffusion.
You can also test solid materials: pass the laser through a sugar cube, crystalline calcite, or even transparent filters like diffraction gratings, CDs, or bathroom glass. Each structure scatters light in a characteristic way — from chaos to symmetry — revealing hidden geometric properties without breaking anything open.
This experiment is not only visually rich but conceptually deep. It illustrates how matter’s microscopic architecture — ordered or disordered — leaves a signature in the light it diffracts. It’s wave physics made visible, right in the palm of your hand.
Bubble Colors: Interference in Soap Films (OPTI-16)
Sensors Used: Smartphone camera
What’s Measured: Interference pattern, qualitative film thickness, color distribution
Description
Soap films are more than just playful bubbles — they’re natural canvases for the physics of light. When a thin film of soap is stretched across a wire loop or frame and illuminated by white light, it creates vivid, swirling bands of color that shift and flow in mesmerizing patterns. These colors arise from thin-film interference: light reflecting from the front and back surfaces of the film overlaps, and depending on the film’s thickness and the light’s wavelength, certain colors are amplified while others cancel out.
By observing the soap film through your smartphone camera, you can enhance the visibility of the interference patterns, capturing details the eye might miss. The ever-changing shapes and hues trace variations in the film’s thickness as gravity pulls it downward and evaporation subtly alters its structure. In areas where the film becomes extremely thin, the colors vanish altogether, revealing “black film” regions where complete destructive interference occurs.
This experiment beautifully demonstrates how light waves interact and interfere, making abstract wave principles visible and dynamic. It also draws connections to optical technologies — from anti-reflective coatings to oil slicks — and provides a visually stunning introduction to the concept of optical path differences.
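The interference condition can be turned into an estimate of film thickness. Reflection at the front (air-to-film) surface adds a half-wave phase shift while the back reflection does not, so at normal incidence strong reflection of a given color occurs when 2nt = (m + 1/2)λ. A sketch with typical values:

```python
# Film thicknesses that strongly reflect green light at normal incidence:
# constructive reflection when 2 * n * t = (m + 1/2) * wavelength.
n_soap = 1.33         # approximate refractive index of soapy water
wavelength = 550e-9   # metres, green light

thicknesses = [(m + 0.5) * wavelength / (2 * n_soap) for m in range(3)]
for m, t in enumerate(thicknesses):
    print(f"order m = {m}: t = {t * 1e9:.0f} nm")
```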
References:
[1] “Soap film,” http://practicalphysics.org/soap-film.html
DIY Spectrometer
DIY Spectrometer with CD/DVD (OPTI-17)
Sensors Used: Smartphone camera
What’s Measured: Light spectrum (qualitative), spectral lines, relative intensity
Description
Light may seem white to our eyes, but with the right tools, we can reveal the hidden rainbows within. One of the most accessible and powerful tools for this is a DIY spectrometer — and you can build one with nothing more than a cereal box, a slit, and a piece of a CD or DVD.
To begin, cut a narrow slit in one end of a dark box (like a cereal box), and position a fragment of a CD or DVD at an angle inside. This disc acts as a diffraction grating, separating light into its constituent wavelengths. Point your smartphone camera through a viewing hole toward the grating, and aim the slit at a light source. You’ll see a colorful spectrum appear on your screen.
This setup can resolve the continuous spectrum of sunlight or incandescent bulbs, and more interestingly, the discrete lines from sodium lamps or fluorescent tubes. Try photographing each spectrum and comparing them — some lights show a rainbow fade, others a series of sharp lines or missing bands. Filters placed in front of the slit can isolate portions of the spectrum, revealing how different materials absorb or transmit light.
With this simple tool, you’re essentially doing atomic spectroscopy — identifying the “fingerprints” of elements based on the light they emit or absorb. Try using it to compare natural sunlight to artificial light sources, or even starlight, if your camera and alignment are good enough. For a deeper dive, image analysis software or a spectral analysis app can help you measure line positions and intensities.
This experiment opens a vivid window into atomic physics, quantum energy levels, and the hidden structure of everyday light. All with just a phone and a CD.
Bonus Exploration: While our eyes can’t see infrared light, your smartphone camera might. Try pointing an infrared source — like a TV remote or IR LED — through the slit and observe the pattern on your screen. You may be able to spot a faint infrared diffraction pattern, especially if you use a DVD fragment (which has finer grooves). This is a fascinating way to reveal the invisible using everyday technology — a peek into the world just beyond red.
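The diffraction angles follow from the grating equation d sin θ = mλ, using the disc's track pitch as the grating spacing (nominally about 1.6 µm for a CD and 0.74 µm for a DVD):

```python
import math

# First-order diffraction angle from the grating equation:
# d * sin(theta) = m * wavelength, with the disc track pitch as d.
wavelength = 550e-9   # metres, green light

angles = {}
for name, pitch in [("CD", 1.6e-6), ("DVD", 0.74e-6)]:
    angles[name] = math.degrees(math.asin(wavelength / pitch))  # order m = 1
    print(f"{name}: first-order green deflected by {angles[name]:.1f} degrees")
```

The finer DVD grooves spread the spectrum over a much wider angle, which is why a DVD fragment gives better spectral resolution than a CD.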
References:
[1] “Diffraction on a CD,” https://physicsexperiments.eu/1704/diffraction-on-a-cd
[2] “A fresh look at light: build your own spectrometer,” https://www.scienceinschool.org/article/2007/spectrometer/
DIY Spectrometer with Grating (OPTI-18)
Sensors Used: Smartphone camera
What’s Measured: Light spectra, wavelength distribution, emission/absorption lines
Description
A compact, high-resolution spectrometer can be built using a diffraction grating and a smartphone—offering a hands-on way to explore the spectral composition of light from everyday sources. Unlike CD- or DVD-based setups, this version uses a professional or repurposed transmission grating, which yields cleaner and more stable spectra, especially when paired with a narrow slit and a light-proof enclosure.
To assemble the device, create a darkened box with a narrow vertical slit at the front. Inside the box, position the diffraction grating at an angle—typically around 45 degrees—so that incoming light is spread into its component wavelengths. Place your smartphone camera at the back of the box, aligned to record the diffracted spectrum. When a white light source, such as sunlight or an incandescent bulb, shines through the slit, the smartphone captures a band of color corresponding to the light’s spectral content.
This setup enables students to record emission lines from gas lamps, observe absorption effects using filters, and compare the spectra of different types of artificial lighting. With careful alignment and calibration—matching known spectral lines to pixel positions—it becomes possible to extract quantitative wavelength data. The experiment demonstrates how light, when passed through a diffraction element, reveals the rich structure of the electromagnetic spectrum, and how smartphones can serve as precise scientific instruments with just a bit of optical engineering.
This method offers finer resolution than CD-based spectrometers, and is especially useful for detecting narrowband emissions or subtle absorption features. Grating films can often be obtained from educational kits, online science suppliers, or repurposed from older optical equipment.
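The calibration step can be as simple as a two-point linear fit. The mercury lines at 435.8 nm and 546.1 nm, visible in most fluorescent-lamp spectra, make a convenient pair of anchors; the pixel positions below are illustrative values read from a spectrum photo.

```python
# Two-point wavelength calibration: map pixel column to wavelength
# using two known spectral lines (here, mercury lines from a
# fluorescent lamp). Pixel positions are illustrative.
known_lines = [(212, 435.8), (648, 546.1)]   # (pixel, wavelength in nm)

(p1, w1), (p2, w2) = known_lines
slope = (w2 - w1) / (p2 - p1)    # nm per pixel
intercept = w1 - slope * p1

def pixel_to_wavelength(pixel):
    return slope * pixel + intercept

# Any other feature in the photo can now be read off in nanometres:
print(f"pixel 400 -> {pixel_to_wavelength(400):.1f} nm")
```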
References:
[1] “DIY Webcam Diffraction Grating Spectrometer,” http://physicsopenlab.org/2015/11/26/webcam-diffraction-grating-spectrometer/
Experiments with DIY Spectrometer (OPTI-19)
Sensors Used: Smartphone camera
What’s Measured: Emission spectra, absorption lines, fluorescence wavelengths
Description
Once your DIY spectrometer is assembled—whether from a CD, DVD, or diffraction grating—it becomes a gateway to a wide range of real-world investigations into light and matter. This experiment focuses on exploring the composition and behavior of different light sources through atomic spectroscopy, astrophysical observation, and fluorescence analysis.
Start by observing emission spectra from gas-discharge lamps such as sodium, neon, or mercury vapor. Each element emits light at specific wavelengths, resulting in sharp, characteristic lines that appear like a barcode through your spectrometer. These “fingerprints” provide direct evidence of atomic transitions and are the basis of atomic spectroscopy—used in everything from streetlamp engineering to chemical analysis.
Next, take your spectrometer outside at night to point it at bright stars or planets. Although faint, the spectra from celestial objects contain absorption lines caused by elements in their atmospheres. With patience, long exposure settings, and image stacking techniques, you can begin to analyze the composition of stars—just as astronomers have done for over a century. This transforms your smartphone into a tool of astrophysics, demonstrating how we know “what stars are made of” without ever visiting them.
For a hands-on laboratory application, explore fluorescence using your spectrometer and a UV light source. Shine ultraviolet light on materials such as chlorophyll extract, tonic water, or certain edible oils. Fluorescent substances absorb high-energy UV photons and re-emit visible light at longer wavelengths. Your spectrometer will reveal this shift in the emission spectrum—often appearing as a bright line or band distinct from the original UV source.
These experiments showcase the power of spectral analysis to uncover hidden structures in light. From classroom demonstrations to nighttime stargazing and fluorescence testing, the DIY spectrometer transforms abstract physics into vivid, colorful, and discoverable patterns.
References:
[1] “Atomic Spectroscopy,” https://physicsopenlab.org/2015/11/30/atomic-spectroscopy/
[2] “What are stars made of?” https://www.scienceinschool.org/article/2016/what-are-stars-made/
[3] “Fluorimetry of Edible Oils,” https://physicsopenlab.org/2015/12/18/fluorimetry-of-edible-oils/
Beyond the Visible: Observing Infrared and Ultraviolet Light (OPTI-20)
Sensors Used: Smartphone camera, light sensor, thermometer, thermal camera, UV-sensitive materials or sensors
What’s Measured: Infrared and ultraviolet radiation, thermal effects, wavelength-dependent responses
Description
We often speak of light as if it ends at the colors we can see—but in reality, the electromagnetic spectrum stretches far beyond. This experiment invites students to step outside the narrow range of visible light and explore the hidden regions of infrared and ultraviolet, using nothing more than a smartphone, simple tools, and a little clever observation.
Begin with a recreation of Herschel’s historic discovery of infrared light. Direct sunlight through a prism to disperse it into a visible spectrum on a sheet of paper. Place thermometers just beyond the red edge of the spectrum, and observe how the temperature rises—proof of invisible infrared radiation. A similar setup on the violet end can be used to explore ultraviolet light, particularly if you introduce UV-sensitive materials such as beads or sensors like the GUVA-S12SD. These materials change color or output voltage when exposed to UV, revealing a world beyond human perception.
Next, use your smartphone camera to explore modern IR detection. While invisible to the eye, many smartphone cameras—especially those without strong IR filters—can detect near-infrared light. Point a remote control or IR LED at the camera, press a button, and you’ll see the emitter blink in ghostly white or purple light. This simple test makes the idea of non-visible radiation instantly real.
For a more hands-on exploration, pass sunlight through filters or prisms and place UV-reactive beads or photosensitive paper at various points to observe the effects of ultraviolet radiation. Conversely, place different materials—plastic, glass, sunscreen—between the light source and the sensor to investigate how well they block or transmit these hidden wavelengths.
Together, these activities offer a compelling window into parts of the spectrum we cannot see, but which shape everything from climate to technology. They also spark deeper discussions about how light energy varies with wavelength, and how modern tools allow us to detect what our senses cannot. As Herschel once proved with a thermometer, the edge of sight is not the edge of knowledge.
References:
[1] “Infra-red and ultraviolet radiation,” http://practicalphysics.org/waves-Infra-red.html
[2] “Beyond the visible spectrum,” http://practicalphysics.org/beyond-visible-spectrum.html
[3] “Herschel’s infra-red experiment,” http://practicalphysics.org/herschel’s-infra-red-experiment.html
Exploring Fluorescence with a UV Lamp: Identifying Hidden Properties of Materials (OPTI-21)
Sensors Used: Smartphone camera
What’s Measured: Emission of visible light under UV excitation
Description
Fluorescence is one of nature’s most visually striking phenomena — the spontaneous emission of visible light by certain materials when illuminated by invisible ultraviolet (UV) light. In this experiment, students use a handheld UV lamp, ideally one equipped with a filter to remove residual visible light, to explore this hidden world.
When UV light shines on fluorescent substances like tonic water, highlighter ink, laundry detergent, or fluorescent minerals such as fluorite and calcite, the absorbed energy excites electrons to higher states. As these electrons relax back to their ground state, they emit light in the visible spectrum, producing an unmistakable glow. This glowing fluorescence offers a direct and memorable demonstration of energy absorption and emission — and gives students a tangible feel for the quantum behavior of light and matter.
With the aid of a smartphone camera in a darkened room, students can photograph and record this process. They may even compare different brands of products to see how ingredients like whitening agents or dyes affect fluorescence. The experiment opens the door to deeper discussions about photonics, luminescence, and how this principle underpins everything from UV security markings to biological imaging and forensic science.
References:
[1] “What is Fluorescence?” https://physicsopenlab.org/2015/12/15/what-is-fluorescence/
Candle Soot and Infrared Absorption (OPTI-22)
Sensors Used: Smartphone camera (for IR detection)
What’s Measured: Infrared absorption (qualitative)
Description
This experiment offers a hands-on way to explore how particulate matter, such as soot, interacts with infrared radiation. Begin by holding a clean, cool glass slide above a candle flame for a few seconds. A thin, dark layer of soot will collect on the slide. Once cooled, shine an infrared light source—such as a TV remote control—through the sooty portion of the slide and observe the result using your smartphone camera.
Most smartphone cameras can detect near-infrared light that’s invisible to the human eye. Point the remote directly at the camera to confirm this: pressing a button will make the IR LED on the remote visibly flash in the camera preview. When you now place the sooty slide between the IR source and the camera, you’ll notice a significant dimming or complete blocking of the infrared light.
This simple but powerful demonstration reveals how soot — made of carbon nanoparticles — strongly absorbs IR radiation, which is why it’s used in industrial heat shielding and blackbody experiments. It also connects to real-world topics like climate science and atmospheric physics, where soot and aerosols play a major role in absorbing thermal radiation.
Observing Sodium’s Light Absorption (OPTI-23)
Sensors Used: Eyes, smartphone camera (optional)
What’s Measured: Light absorption and re-emission at specific wavelengths
Description
This visually striking experiment demonstrates atomic absorption and emission using sodium vapor. When a sodium flame—created by introducing sodium salts into a flame—is placed in front of a sodium streetlamp or other strong yellow sodium light source, something surprising happens: the flame appears darker, not brighter, against the background light.
This effect occurs because sodium atoms in the flame absorb the same characteristic yellow wavelength (around 589 nm) emitted by the streetlamp. Instead of passing directly through, the absorbed light is re-emitted in all directions, effectively reducing the intensity of light reaching the observer along the original line of sight. The result is a flame that appears as a dark “shadow” against the light source, even though it’s actively glowing.
This experiment offers an elegant and accessible way to explore the concept of resonance absorption—how atoms selectively absorb light at specific wavelengths corresponding to electronic transitions. It provides a real-world connection to the working principles behind atomic absorption spectroscopy, a technique widely used in chemical analysis, atmospheric studies, and astronomy to determine the composition of distant stars and gases.
References:
[1] “Dark Flames,” https://www.thenakedscientists.com/get-naked/experiments/dark-flames
Polarization
Polarization with Sunglasses (OPTI-24)
Sensors Used: Smartphone camera
What’s Measured: Light intensity as a function of polarization angle (Malus’s Law)
Description
Polarization is one of the most visually striking properties of light, and it can be explored with nothing more than a smartphone and a pair of polarized sunglasses. This experiment reveals how light waves can be filtered based on their orientation, and it offers a vivid introduction to wave optics—no lasers required.
Start by taking two pairs of polarized sunglasses, preferably the inexpensive kind used in 3D movie theaters. Hold one lens in front of your smartphone camera and slowly rotate the second lens in front of it. You’ll see the screen darken dramatically at certain angles—most notably when the two lenses are oriented at 90 degrees to each other. This occurs because polarized filters only transmit light oscillating in a specific direction, and when those directions are perpendicular, almost all light is blocked.
To go further, point your phone-camera setup at a reflective surface, such as water or a car window. Rotate the polarizing lens and observe how the brightness of the reflected light changes. This demonstrates that reflected light is often partially polarized, becoming completely polarized at one particular angle of incidence known as Brewster's angle, and that polarizing filters can reduce glare by blocking these components.
This experiment offers a hands-on exploration of Malus’s Law, which describes how the transmitted light intensity varies with the cosine squared of the angle between two polarizers. With a little effort, students can even quantify this relationship by using a light meter app or analyzing pixel brightness.
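The cosine-squared relationship can be checked numerically once angle and brightness pairs have been recorded. A minimal sketch, using synthetic brightness values that stand in for real lux or mean-pixel readings:

```python
# Sketch: testing Malus's law, I = I0 * cos^2(theta), against brightness
# readings. The 'measured' values below are synthetic example data, not
# real measurements; replace them with your own lux or pixel-brightness
# values at each polarizer angle.
import numpy as np

theta_deg = np.array([0, 15, 30, 45, 60, 75, 90])
measured = np.array([980, 915, 740, 495, 250, 70, 5])  # arbitrary units

c2 = np.cos(np.radians(theta_deg)) ** 2

# Least-squares estimate of the single free parameter I0 in I = I0 * cos^2
I0 = np.sum(measured * c2) / np.sum(c2 ** 2)

residual = measured - I0 * c2
print("fitted I0:", round(I0, 1))
print("rms deviation:", round(np.sqrt(np.mean(residual ** 2)), 1))
```

A small rms deviation relative to I0 indicates the data follow Malus's Law; systematic deviations near 90 degrees usually reveal stray light or imperfect polarizers.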
Important Note: Not all polarization filters are created equal. Most sunglasses use linear polarization, which blocks light oscillating in a specific plane. However, some glasses used in newer 3D movie systems use circular polarization, which involves a helical twist in the light wave. These will not work for this experiment. If you’re unsure, try rotating the glasses in front of a smartphone screen: if the image darkens at certain angles, you’re dealing with linear polarization.
References:
[1] “Polarized Light,” https://www.youtube.com/watch?v=gP751qpm4n4
[2] “Making a Cell Phone Camera Polarizing Filter From Sunglasses,” https://www.instructables.com/id/Making-a-Cell-Phone-Camera-Polarizing-Filter-from-/
Polarization of Smartphone Screen (OPTI-25)
Sensors Used: Smartphone camera, polarizing filters
What’s Measured: Polarization direction and transmitted light intensity
Description
Most modern smartphone screens—and many LCDs, including calculator displays—emit linearly polarized light. This subtle but important feature allows for a simple and engaging demonstration of polarization using nothing but your own phone and a pair of polarized glasses.
Begin by taking a pair of linear polarized glasses, such as the inexpensive ones given out in many 3D movie theaters. Hold the glasses in front of your smartphone screen and rotate either the glasses or the phone itself. At two angles during a 360-degree rotation, you’ll notice the screen becomes almost completely dark. These positions reveal the alignment of the screen’s internal polarizing filter—usually set around 45 degrees—to minimize conflicts with everyday eyewear.
This experiment vividly illustrates how polarization filters interact. For an even more striking version, place a polarizing filter in front of your smartphone camera such that the screen becomes black. Then introduce a second filter between the first one and the screen. As you rotate this middle filter, the screen will reappear—brightest when the angle between the filters is 45 degrees. This effect demonstrates how multiple polarization layers interact according to Malus’s Law, and can be used to reinforce the concept of wave orientation and vector components.
Important Note: Some modern 3D glasses use circular polarization rather than linear. These will not work for this experiment, as circular polarization doesn’t block linearly polarized light in the same way.
References:
[1] “Disappearing monitors - the power of polarisation,” https://www.thenakedscientists.com/get-naked/experiments/disappearing-monitors-power-polarisation
Exploring Polarization Effects in Materials (OPTI-26)
Sensors Used: Smartphone camera, polarizing filters
What’s Measured: Polarization rotation, birefringence, optical activity, reflection suppression
Description
Once you’ve explored basic polarization with sunglasses and screens, it’s time to apply that knowledge to a fascinating range of real-world materials. This experiment collects several striking effects into a single exploration of how different substances interact with polarized light.
Begin by placing a polarizing filter in front of your smartphone screen and adjusting it until the screen appears black—this is your baseline setup. Now insert different materials between the screen and the filter and observe how the polarization of light is altered.
A glass of sugar water offers one of the most classic demonstrations of optical activity. As you dissolve increasing amounts of sugar into a thick glass of water (ideally 10 cm wide), the polarization of the transmitted light slowly rotates. You’ll see the dark screen reappear through the filter as the angle of polarization changes. This effect is subtle but striking, and it scales with both concentration and path length. You’re effectively replicating an optical polarimeter—a tool once vital for sugar content analysis.
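The expected rotation can be estimated from the standard specific rotation of sucrose, about +66.5 degrees per (g/mL) of concentration per decimeter of path length at the sodium wavelength. The amounts in this sketch are example values, not a prescription:

```python
# Sketch: expected polarization rotation for sugar water, using the
# tabulated specific rotation of sucrose (~ +66.5 deg*mL/(g*dm) at
# 589 nm, 20 C). The sugar mass, water volume, and path length below
# are illustrative example values.

SPECIFIC_ROTATION = 66.5  # deg * mL / (g * dm)

def rotation_deg(grams_sugar, ml_water, path_cm):
    """Rotation angle of the polarization plane in degrees."""
    concentration = grams_sugar / ml_water  # g per mL
    path_dm = path_cm / 10.0                # path length in decimeters
    return SPECIFIC_ROTATION * concentration * path_dm

# 30 g of sugar in 250 mL of water, viewed through a 10 cm wide glass:
print(round(rotation_deg(30, 250, 10), 1), "degrees")
```

Doubling either the concentration or the path length doubles the rotation, which is exactly the scaling the experiment lets you verify.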
Next, try materials that exhibit birefringence, such as plastic wrap or transparent CD cases. Stretch a piece of saran wrap tightly over a clear jar, or simply hold a cracked CD case in the path of polarized light. You’ll observe stress patterns and vivid color fringes, thanks to how these materials split light into two different polarization modes traveling at slightly different speeds. This phenomenon, known as photoelasticity, offers a glimpse into internal stress patterns invisible to the naked eye.
You can also explore polarization by reflection. Shine polarized light at a shallow angle onto a glass surface or water and look at the reflection through your polarizing filter. At a specific angle—called Brewster’s angle—the reflected light becomes fully polarized and disappears through the filter, demonstrating one of the key principles behind polarized sunglasses and glare reduction in photography.
Finally, examine a small piece of calcite, a natural crystal known for its strong birefringence; small samples can be bought online for a few dollars. When placed between your phone and a polarizer, calcite splits a single image into two. Rotating the crystal or the filter reveals and hides these duplicate images, offering a tangible example of polarization splitting. For an added extension, try shining a laser through the crystal to explore the directional nature of its optical behavior.
Together, these variations reveal just how much hidden structure lies in the way materials shape, rotate, or split polarized light—linking wave optics to everything from geology and chemistry to industrial design and vision science.
References:
[1] “Measuring the concentration of sugar with polarized light,” https://van.physics.illinois.edu/qa/listing.php?id=42894
[2] “Photoelasticimetry,” https://physicsexperiments.eu/1951/photoelasticimetry
Faraday Effect with Olive Oil and Kerr Effect Demo (OPTI-27)
Sensors Used: Eyes, smartphone camera (optional)
What’s Measured: Rotation of polarization angle due to magnetic field
Description
This experiment explores the remarkable ability of magnetic fields to influence light, a phenomenon known as magneto-optics. In the first part, students can observe the Faraday effect using nothing more than olive oil, a strong magnet, and a polarized light source such as an LCD screen or a smartphone with a polarizing filter. When polarized light passes through a transparent medium like olive oil that is subjected to a magnetic field, the plane of polarization rotates. By placing a second polarizing filter (analyzer) after the olive oil and adjusting its orientation, students can detect and measure this rotation, which increases with magnetic field strength and the length of the light path. This effect beautifully illustrates how magnetism and light can interact in matter.
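The rotation follows the relation θ = V·B·d, where V is the Verdet constant of the medium, B the magnetic flux density along the beam, and d the path length. A rough sketch of the arithmetic; the Verdet constant used here is a placeholder for illustration, not a measured property of olive oil:

```python
# Sketch: Faraday rotation theta = V * B * d. The Verdet constant below
# is an assumed order-of-magnitude placeholder, NOT a tabulated value
# for olive oil -- look it up or calibrate it for your actual medium.
import math

V = 4.0    # rad / (T * m), placeholder Verdet constant (assumption)
B = 0.3    # tesla, roughly the field near a strong neodymium magnet
d = 0.05   # metres of oil along the light path

theta_rad = V * B * d
print(round(math.degrees(theta_rad), 2), "degrees of rotation")
```

Even a few degrees of rotation is detectable as a brightness change through the analyzer, which is why a long oil path and a strong magnet matter.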
The second part of the activity is more conceptual and introduces the magneto-optical Kerr effect (MOKE), in which the polarization of light is altered upon reflection from a magnetized surface. Although demonstrating MOKE typically requires specialized magneto-optical disks or materials, the underlying idea is accessible: light interacts with magnetically ordered materials and picks up information about their magnetization state. This effect is used in advanced optical data storage and in characterizing magnetic properties of materials.
Together, these phenomena connect electromagnetism, optics, and materials science, offering a glimpse into how magnetism can control light — not just in laboratories, but potentially in optical communication and quantum technologies.
References:
[1] “Control light with magnets and olive oil?! (Faraday effect),” https://www.youtube.com/watch?v=XhU-nNiAgtI
[2] “Control light with heat and magnets (Magneto-optical Kerr effect),” https://www.youtube.com/watch?v=UTquUbvzJII
Tyndall and Rayleigh Scattering and Raman Spectrometer
The Tyndall Effect: Visualizing Light Scattering in Colloids (OPTI-28)
Sensors Used: Smartphone camera
What’s Measured: Scattering visibility, beam profile, hue variation
Description
When a laser beam or bright smartphone flashlight is passed through a colloidal suspension—such as water mixed with a small amount of milk, flour, or soap—the beam becomes visible inside the liquid. This phenomenon, known as the Tyndall effect, demonstrates how particles too small to settle out can scatter light. Unlike true solutions like sugar water, which remain transparent, these colloids reveal the beam’s path dramatically. Filming the effect with a smartphone camera captures subtle hues and intensity shifts, especially with red or green lasers. Varying the medium—such as cloudy juice or dusty air—further highlights how suspended particles affect visibility and light transport.
References:
[1] “The Tyndall Effect,” Hackaday.io, https://hackaday.io/project/4736-the-tyndall-effect
Rayleigh Scattering and Sunset in a Jar (OPTI-29)
Sensors Used: Smartphone camera
What’s Measured: Color changes in scattered and transmitted light
Description
By dissolving only a trace amount of milk or soap in pure water, and then illuminating the mixture from the side, students can simulate atmospheric Rayleigh scattering. From the side, the scattered light appears bluish; when viewed head-on through the jar, it shifts toward orange or red. This elegant experiment mimics why the sky appears blue during the day and reddish at sunset. The setup requires only a glass jar, a side-facing beam (from a flashlight or laser), and a smartphone camera to capture the changing color gradients. It provides a vivid analog of atmospheric optics in a controlled, tabletop format.
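The color shift follows from the strong wavelength dependence of Rayleigh scattering, which scales as 1/λ⁴. A one-line comparison of blue and red light makes the asymmetry concrete:

```python
# Sketch: Rayleigh scattering intensity scales as 1/lambda^4, so blue
# light (shorter wavelength) scatters far more strongly than red.
# The two wavelengths below are representative values for blue and red.

blue_nm = 450.0
red_nm = 650.0

# How many times more strongly blue scatters than red:
ratio = (red_nm / blue_nm) ** 4
print(round(ratio, 2))
```

A factor of roughly four explains both halves of the demonstration: the side view is dominated by scattered blue, while the transmitted beam, depleted of blue, emerges orange-red.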
Polarization of Scattered Light (OPTI-30)
Sensors Used: Smartphone camera, polarizing filter
What’s Measured: Intensity modulation with filter angle, polarization axis of scattered light
Description
Scattering doesn’t just depend on particle size—it can also polarize light. In this continuation of the Tyndall setup, a polarizing filter placed at 90° to the beam reveals how side-scattered light becomes partially polarized. As the filter is rotated, the intensity of the observed beam changes, demonstrating the directional dependence of polarization. Repeating the experiment with different laser colors, or measuring at different angles, reveals subtle variations in polarization strength and orientation—offering a hands-on way to explore wave behavior and optical vector properties.
DIY Raman Spectrometer and Molecular Fingerprints (OPTI-31)
Sensors Used: Modified smartphone or webcam (IR filter removed), laser, diffraction grating
What’s Measured: Raman shift spectra, molecular vibrational modes
Description
Raman spectroscopy is a powerful analytical technique that provides insights into the molecular composition of substances by observing inelastic scattering of monochromatic light. By constructing a DIY Raman spectrometer, enthusiasts can explore this method using accessible components. Key elements include a stable laser source (commonly 532 nm), optical components to direct and focus the laser, a diffraction grating to disperse the scattered light, and a sensitive detector, such as a modified camera with its infrared filter removed. Notch or edge filters are crucial to isolate the weak Raman signals from the intense Rayleigh scattered light. Calibration tools and software, like Octave, assist in validating collected data and matching molecular signatures accurately. This hands-on project offers a unique opportunity to delve into molecular spectroscopy and understand the principles behind Raman scattering.
References:
[1] “Intro to DIY Raman Spectroscopy,” https://www.youtube.com/watch?v=tRrOdKW06sk
Michelson Interferometer
Michelson Interferometer: Measuring Tiny Distances with Light (OPTI-32)
Sensors Used: Laser pointer (light source), smartphone camera (to record fringes)
What’s Measured: Number of fringe shifts, inferred displacement
Description
The Michelson interferometer is a beautiful way to “see” the invisible — detecting motions so small that rulers can’t measure them. In this simplified experiment, students build a basic interferometer and use it to measure tiny displacements, such as pressing gently on a mirror or vibrating a speaker membrane.
A laser pointer provides a steady beam, which is split into two paths by a simple beam splitter (even a thin glass slide can work). Each beam reflects off a mirror and returns to a screen or camera, where they interfere to form bright and dark fringes. One of the mirrors is mounted on a movable surface — perhaps resting on a spring, or simply pressed with a finger or a screw.
As the movable mirror shifts — even by a fraction of a wavelength — the interference pattern changes. Each fringe that appears or disappears corresponds to a change in path length of half a wavelength. By counting the number of shifting fringes, students can measure how far the mirror moved, even if it’s just a few micrometers.
This version avoids heating or metal expansion and focuses purely on displacement. Students can try vibrating the mirror with a piezo buzzer, a tuning fork, or a gentle tap, and record the resulting fringe shifts using a smartphone camera in slow motion or video mode.
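Converting a fringe count into a displacement is a one-line calculation: each full fringe corresponds to a path-length change of one wavelength, and because the beam traverses the moving arm twice, the mirror itself moves half a wavelength per fringe. A minimal sketch, assuming a red laser pointer:

```python
# Sketch: fringe count -> mirror displacement in a Michelson
# interferometer. Each fringe = lambda/2 of mirror travel, because the
# beam goes out and back along the moving arm. The 650 nm default is a
# typical red laser-pointer wavelength, not a calibrated value.

def displacement_nm(fringes, wavelength_nm=650.0):
    """Mirror displacement in nanometres for a given fringe count."""
    return fringes * wavelength_nm / 2.0

# Counting 12 fringe shifts while pressing gently on the mirror:
print(displacement_nm(12), "nm, i.e. about",
      round(displacement_nm(12) / 1000.0, 1), "micrometres")
```

Counting a dozen fringes thus resolves movements of a few micrometres, far below what any ruler could measure.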
References:
[1] “Using a Michelson Interferometer to Measure Coefficient of Thermal Expansion of Copper,” https://physlab.org/wp-content/uploads/2016/04/Scholl_liby.pdf
Using a Michelson Interferometer as an Optical Microphone (OPTI-33)
Sensors Used: Laser pointer (light source), smartphone camera or photodiode (detector)
What’s Measured: Fringe movement caused by acoustic or thermal vibrations
Description
A Michelson interferometer is not just a precision ruler for measuring distance—it can also become a microphone made of light. In this experiment, students transform the interferometer into an optophone, detecting sound vibrations through changes in light interference.
The basic interferometer setup remains the same: a laser is split into two beams by a beam splitter, reflected by mirrors, and then recombined to form an interference pattern. But instead of a mirror on a rigid mount, one mirror is attached to a flexible diaphragm (such as a balloon membrane, a thin foil, or even a speaker cone). When a sound wave reaches this diaphragm—whether from a tuning fork, voice, or music—it vibrates slightly, moving the mirror and changing the optical path length. This causes subtle shifts in the interference fringes.
These variations can be recorded with a smartphone camera, or measured in real time with a photodiode connected to a sound visualization app or oscilloscope. The setup effectively converts sound pressure into optical signals without any wires touching the source — ideal for demonstrating principles of non-contact sensing and wave interference.
An additional variation involves placing a candle flame near one arm of the interferometer. As the hot air above the flame rises and fluctuates, it changes the refractive index of the air in that path. These microscopic changes distort the beam’s path just enough to cause visible fringe shifts — a powerful visualization of how temperature gradients and acoustic waves can influence light.
Measuring Sub-Micron Displacements with Laser Interferometry and a Piezo Speaker (OPTI-34)
Sensors Used: Laser pointer, piezoelectric actuator, smartphone camera
What’s Measured: Nanometer-scale displacement through fringe shift count
Description
This experiment reveals just how sensitive light can be to motion — even movements smaller than the width of a human hair. By combining a Michelson interferometer with a piezoelectric element, students can measure displacements on the scale of nanometers using nothing but light and a smartphone camera.
A small mirror is mounted onto a piezoelectric speaker or actuator, which is placed in one arm of the interferometer. When a voltage is applied to the piezo, it expands or contracts by a tiny, controlled amount — sometimes just a fraction of a micron. Because the laser’s interference fringes are sensitive to changes in path length on the order of half a wavelength (e.g., ~300 nm for visible light), even the smallest shift in mirror position causes visible fringe movement.
By slowly varying the voltage and recording the resulting fringe shifts, students can count how many full fringes pass and use the laser’s wavelength to determine the total change in length. Each complete fringe cycle corresponds to a path length change of one wavelength, which allows sub-micron displacements to be quantified with remarkable precision.
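Given fringe counts at several voltage steps, the piezo's sensitivity in nanometres per volt follows from a linear fit. The voltage and fringe data in this sketch are illustrative example values, assuming a 532 nm green laser:

```python
# Sketch: estimating piezo expansion per volt from counted fringes.
# The (voltage, fringe count) pairs below are illustrative example
# data, not real measurements.
import numpy as np

WAVELENGTH_NM = 532.0  # green laser pointer (assumed)

volts = np.array([10.0, 20.0, 30.0, 40.0])
fringes = np.array([3, 6, 10, 13])

# Each fringe corresponds to lambda/2 of mirror travel
displacement_nm = fringes * WAVELENGTH_NM / 2.0

# Linear fit of displacement against applied voltage
nm_per_volt, offset = np.polyfit(volts, displacement_nm, 1)
print("sensitivity:", round(nm_per_volt, 1), "nm per volt")
```

Real piezo actuators show some hysteresis, so sweeping the voltage up and then down and comparing the two fits is a worthwhile extension.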
This is not just an optics experiment — it’s a gateway into metrology, materials science, and precision engineering. The same principles underpin vibration sensors in seismology, scanning probe microscopes, and high-end industrial equipment.
Laser Diode Self-Mixing for Range-Finding and Sub-Micron Vibration Measurement (OPTI-35)
Sensors Used: Laser diode with built-in photodetector, optional photodiode circuit, smartphone or computer for signal analysis
What’s Measured: Vibration or displacement based on laser intensity modulation
Description
Self-mixing interferometry is an elegant twist on the traditional interferometer — the interference occurs inside the laser itself. In this experiment, students explore how a laser diode with an integrated photodiode can be turned into a precise tool for measuring distance and detecting tiny vibrations, all without external optics or beam splitters.
When a portion of the laser light reflects off a nearby surface and re-enters the laser cavity, it interferes with the light already oscillating inside. This changes the laser’s output power in a measurable way, producing a modulation pattern that depends on the distance and motion of the reflecting surface. By analyzing these output fluctuations — either with an oscilloscope, smartphone-based photodetector, or computer audio input — students can extract sub-micron displacement data from simple setups involving a moving mirror, vibrating membrane, or even a speaker diaphragm.
Because the measurement sensitivity is tied to the laser’s wavelength, each oscillation in the signal corresponds to a movement of half a wavelength — roughly 200-350 nanometers for visible light. Applications of this method range from non-contact vibration sensing in engineering to bio-diagnostics and materials research.
What makes this technique so compelling is its minimalism: there’s no need for elaborate interferometers. The laser, the surface, and a basic detector are enough. With some ingenuity, even consumer-grade laser diodes — such as those found in laser pointers or DVD drives — can be adapted for educational demonstrations.
References:
[1] “Laser diode self-mixing: Range-finding and sub-micron vibration measurement,” https://www.youtube.com/watch?v=MUdro-6u2Zg
Advanced Concepts
Exploring Fourier Optics and the 4F Correlator (OPTI-36)
Sensors Used: Smartphone camera or webcam (optional CCD), laser pointer
What’s Measured: Spatial frequency filtering, Fourier image transformations
Description
The 4F correlator is an elegant optical setup that demonstrates how lenses can perform real-time Fourier transforms. By placing an object (such as a transparency slide with a fine pattern) one focal length in front of a converging lens, the pattern’s Fourier transform is projected into the back focal plane. Here, spatial frequencies are separated in space: high frequencies appear farther from the center, while low frequencies cluster near the origin. A second lens performs an inverse Fourier transform, reconstructing the image at a second focal plane. If a mask or spatial filter is introduced at the Fourier plane — for example, blocking certain frequencies or directions — the output image is altered in real time, revealing how different features of the object are encoded in frequency space.
This system allows for physical manipulation of image content, enabling tasks like edge detection, directional filtering, or isolating specific components of complex patterns. It’s a powerful visual demonstration of the connection between image structure and its spectral decomposition. Using a webcam with its lens removed can help capture the processed image and reveal subtle details. The setup also lends itself well to demonstrations of spatial filtering and resolution limits — showing how even small pinholes or phase masks can dramatically reshape an image.
One fascinating aspect of Fourier optics is the presence of both amplitude and phase information in the Fourier plane. While light in this region carries a complex field — meaning it contains both the amplitude (which frequencies are present) and the phase (how those frequencies are arranged) — our eyes and cameras only detect intensity, which is proportional to the square of the amplitude (|A|²). This means that although phase is not directly visible, it is essential for reconstructing the image properly. If the phase is removed or scrambled, the result can be a completely unrecognizable blur, even if all the amplitudes are preserved. In short: amplitude tells you what’s in the image, but phase tells you where. This interplay is at the heart of optical imaging — and the 4F correlator makes it visible.
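The action of the 4F correlator can also be simulated numerically, with a fast Fourier transform standing in for each lens. This sketch builds a small synthetic striped "transparency", blocks its stripe frequency with a low-pass mask in the Fourier plane, and reconstructs the filtered image; all the parameters are arbitrary choices for illustration:

```python
# Sketch: a numerical analogue of the 4F correlator. A lens performs an
# optical Fourier transform; here numpy's fft2 plays that role on a
# synthetic striped pattern (all sizes and frequencies are arbitrary).
import numpy as np

n = 64
x = np.arange(n)
# Object: vertical stripes, a single strong spatial frequency (8 cycles)
image = np.tile(0.5 + 0.5 * np.cos(2 * np.pi * 8 * x / n), (n, 1))

# "First lens": Fourier transform into the Fourier plane
F = np.fft.fftshift(np.fft.fft2(image))

# "Spatial filter": low-pass mask in the Fourier plane. Radius 5 keeps
# the zero-frequency centre but blocks the stripes at radius 8.
yy, xx = np.mgrid[0:n, 0:n]
r = np.hypot(xx - n // 2, yy - n // 2)
lowpass = r < 5

# "Second lens": inverse transform back to the image plane
filtered = np.fft.ifft2(np.fft.ifftshift(F * lowpass)).real

# The stripes vanish; only the uniform mean brightness survives
print("contrast before:", round(image.std(), 3),
      " after:", round(filtered.std(), 3))
```

Replacing the low-pass disk with its complement (a high-pass mask) keeps only the stripes and removes the background, the numerical counterpart of edge enhancement at the Fourier plane.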
References:
[1] “Intro to Fourier Optics and the 4F correlator,” https://www.youtube.com/watch?v=wcRB3TWIAXE