Invented by Geoffrey B. Rhoads, Hugh L. Brunk, John D. Lord, Digimarc Corp
The Digimarc Corp invention works as follows:

A spectral imager is configured to capture color images in synchrony with controlled illumination by light-emitting diodes of different colors. A processor applies coupling coefficients to the sampled color images, converting the sampled pixels into spectral channels that correspond to LED color and sensor filter. The multi-spectral "spectricity" vectors generated at pixel positions are then combined with spatial information in order to classify objects such as produce.

Background for Sensor-synchronized spectrally-structured-light imaging
Since the Daguerreotype, photography has taken both natural-light ("ambient") and flash-assisted forms (read in broader terms: "human-assisted light supplementation"). The present technology concerns primarily the latter form of lighting, which we call "flash," and how it can be harnessed for what is often called "hyperspectral imaging."
In essence, by illuminating a scene with a series of brief, frame-synchronized, spectrally structured flashes, even a Bayer-pattern CMOS sensor can be turned into an imaging spectrometer with "N" bands. N is currently around 5-10 bands, but this will likely increase as the technology advances.
Any introduction to the technology must note multi-chip LEDs (see, e.g., the Edison Federal FM series, circa 2012, shown in FIG. 7) as at least the seed of just what the doctor ordered for "spectrally structured light." The core idea, and the currently preferred embodiment, is to synchronize the pulsing of different LED light sources with individual frames of a CMOS camera, creating the informational basis for N-band imagery. Other light sources can be considered, but by 2012 standards, multi-chip (or similar multi-die) LEDs are the leading candidates for this technology.
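As a concrete illustration of that synchronization, here is a minimal sketch, assuming hypothetical `camera` and `leds` driver objects (a real system would gate each pulse with a hardware strobe/trigger line rather than software timing):

```python
# Frame-synchronized LED pulsing: one LED die pulsed per captured frame.
# `camera` and `leds` are hypothetical driver objects, not a real API.
LED_BANDS = ["royal_blue", "cyan", "green", "amber", "red"]  # one die per band

def capture_band_frames(camera, leds, exposure_s=0.01):
    frames = {}
    for band in LED_BANDS:
        leds.on(band)                           # begin the pulse for this band
        frames[band] = camera.grab(exposure_s)  # expose one frame under it
        leds.off(band)                          # end the pulse before the next frame
    return frames                               # N raw frames, one per LED band
```

Each pass through the loop yields one frame whose three Bayer channels are coupled to one LED spectrum, so a five-die flash and five frames provide the raw material for N-band imagery.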
The three very well-known CIE color matching functions from 1931, and/or their orthogonally transformed counterparts, are a particularly interesting choice of "bands." With such choices, the stage is set for taking the beyond-religiously-fervent universe of color photography to its multiverse destiny, blandly referred to in this disclosure as "direct chromaticity capture."
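For reference, the textbook relationships that such "direct chromaticity capture" targets (standard color science, not anything new to this disclosure): tristimulus values are projections of the scene's spectral flux onto the 1931 matching functions, and chromaticity is their normalized form.

```latex
X=\int E(\lambda)\,\bar{x}(\lambda)\,d\lambda,\quad
Y=\int E(\lambda)\,\bar{y}(\lambda)\,d\lambda,\quad
Z=\int E(\lambda)\,\bar{z}(\lambda)\,d\lambda,
\qquad x=\frac{X}{X+Y+Z},\quad y=\frac{Y}{X+Y+Z}
```

Bands shaped like the matching functions (or an orthogonal transform of them) would thus measure chromaticity directly, rather than estimating it through a camera's R, G, and B filters.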
The bulk of the disclosure focuses on the design principles and actual realizations that can turn virtually any electronic image sensor into an imaging spectrometer via coordination with a supplemental light source. The core "how" will be clarified first. Then four essentially distinct applications will be presented and described, including A) the niche application of hyper-spectral imaging, B) the medical imaging potential of this technology, and C) the previously alluded-to, culturally volatile topic of radically improved color photography for both "digital cameras" per se and mobile-device cameras.
This disclosure has been significantly expanded in many areas beyond the initial filing.
Below are described many more configurations of systems, lighting, sensing, pixel post-processing techniques, and devices. The disclosure is not limited to any particular embodiment, but contemplates a wide range of inventive combinations. As examples, we provide source code. The signal processing described can be implemented as software instructions executed on special-purpose processors or general-purpose computing devices, such as devices with DSPs and GPUs. These software instructions can be ported to processor-specific firmware, ASICs, or FPGAs. Cloud computing services can be used to execute the software in various combinations and for different purposes (such as training, classification, and recognition).
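In that spirit, the following is a minimal illustrative sketch, not the actual provided source code, of the pixel post-processing summarized above: a calibration-derived coupling matrix relates N spectral bands to the stacked per-frame sensor samples, and (an assumption on our part) a least-squares inversion followed by L1 normalization yields the "spectricity" vector at a pixel.

```python
import numpy as np

def spectricity(samples, coupling):
    """Recover and normalize an N-band spectral vector at one pixel.

    samples  : length-M vector of pixel values stacked across LED-synchronized
               frames and color channels (e.g. M = 3 channels x F frames).
    coupling : M x N calibration matrix coupling each spectral band into each
               measurement (LED spectrum x sensor filter, per the summary above).
    """
    bands, *_ = np.linalg.lstsq(coupling, samples, rcond=None)
    bands = np.clip(bands, 0.0, None)       # reflectance cannot be negative
    return bands / max(bands.sum(), 1e-12)  # intensity-invariant "spectricity"
```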
The foregoing, along with other features and benefits of the present technology, will be more readily understood from the following Detailed Description, which proceeds with reference to the accompanying drawings.
FIG. 1 shows the upper-left rear side of a 2012-era iPhone 40, with the camera aperture 50 to the left and the small flash unit aperture 60 to the right, along with a simplified Bayer pattern of the sensor 70 depicted above the iPhone. As could be explained in a ten- or fifteen-minute discussion with Applicant's young grade-school nephews and nieces, the red apple 20 tends to light up the small red-filtered sensors of the camera, and the green apple 30 tends to illuminate the green-filtered ones.
The simplest point to remember is that lighting matters: any change in the spectral characteristics used to light a "fixed" scene will cause the digitized output of a camera sensor to differ in measurable ways. It is better to make this simple point now than later. As always, "range" is a fundamental issue; the distance between an object and a flash unit is fundamental to this technology. Practically all commercial flash photography is limited to a few meters, or maybe five or ten for certain types of photography. This technology will generally share those ranges, and we will try to touch on how "spectral fidelity" is affected by them: spectral fidelity will usually decrease with increasing range.
Concluding the initial discussion of FIG. 1: the apple has two lighting sources, the sun 10 and perhaps our flash unit 60 on the smart phone, and these may be used individually or together. There are many forms of "ambient" lighting other than the sun, and digital cameras have likewise taken the technology behind the "flash unit" to a level of sophistication and expense that is quite remarkable.
FIG. 2 continues the 101-level overview of the technology with a plot that is both generic and typical of the three spectral response profiles of a CMOS Bayer-pattern sensor. The X axis represents the rainbow from blue (400 nanometer wavelength) to red (710 nm). The Y axis is labeled "relative response"; for this summary, it can simply mean how effectively light of a given wavelength produces signal on a modern detector (as manifested in digital values after A/D conversion). These curves will be familiar to color camera designers, sensor designers, and photographers of a more technical bent. There is a great deal of subtlety and variability in these curves, and manufacturers of cameras and sensors spend much time studying and refining how they manifest themselves. As will be shown, this technology adds new and potent variability to this fairly mature and "stable" industry. Concluding the initial discussion of FIG. 2, it is worth noting that, in general, these filters have been and continue to be tuned so that digital cameras can best "match" natural colors as we see them; the curves are a close match to what color scientists call the CIE color matching functions (and their many subtle variations).
FIG. 3 adds a new green curve, labeled 90, representing an idealized "spectral reflectance" profile of a green apple, and also a red curve, labeled 100, representing the same for a red apple. Color scientists know that such curves never quite reach zero at any wavelength, and that their spectral shapes are unlikely to correspond with the Bayer filter "G" and "R" spectral shapes. Pretty unlikely indeed. How do you like them apples?
For the sake of intuition, we can imagine that, in close-ups of our Bayer-pattern sensor in a smartphone or digital camera, the green pixels will "light up" (110) where they correspond to patches viewing the green apple; likewise, the red pixels will "light up" (120) at the patches of the sensor viewing the red apple. This "lighting up" is well known to imaging engineers and others: it is a simple correlation between the innate spectral profile of an object and the spectral profile of a sensor pixel, with the resultant digital signal values being much higher where the two match. More precisely, as described in every book on color science, it is an integral multiplication of two spectral profiles, one based on the light flux from the object and the other on the spectral quantum efficiency of the pixel; this multiplication, along with well-known factors such as analog-to-digital conversion, produces the familiar digital signal outputs of pixels. (All this is probably too much for a simple summary; we are only showing that the green apple tends to light up green-filtered pixels, and the red apple red ones!)
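In equation form (our notation, not the patent's):

```latex
D_k \;=\; g\int_{400\,\mathrm{nm}}^{700\,\mathrm{nm}} E(\lambda)\,\rho(\lambda)\,Q_k(\lambda)\,d\lambda
```

where E(λ) is the spectral flux of the illumination, ρ(λ) is the object patch's spectral reflectance, Q_k(λ) is the spectral quantum efficiency of a pixel behind filter k ∈ {R, G, B}, g lumps in the analog-to-digital conversion factors, and D_k is the resulting digital value.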
FIG. 4 now presents a highly idealized "ambient" lighting source spectral curve, 130. This simple diagram is meant to show that all light sources have a spectral structure, a fact photographers learn practically in diapers. As a streetwise expression might put it: "There ain't no such thing as white light."
The second point is that this ambient white-ish illumination, which is generally unknown and ALWAYS different, will produce slightly different output values in the R, G and B pixels of the Bayer (or other) filtered pixel types. This is also well known to engineers and photographers. The detailed point of FIG. 4 is that, under this kind of illumination, the B pixels come out a little lower in digital value relative to the G pixels; the effect in this displayed example might be on the order of 20% to 30% less signal showing up in the B pixels than would otherwise show up under purely "white" illumination.
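A toy numeric check of that order-of-magnitude claim, using invented Gaussian stand-ins for the Bayer curves and a blue-depressed "white-ish" illuminant (none of these values come from the disclosure):

```python
import numpy as np

wl = np.arange(400, 701, 5, dtype=float)          # wavelength grid, nm

def bell(center, width):                          # toy spectral curve
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

bayer = {"B": bell(460, 30), "G": bell(530, 35), "R": bell(600, 35)}
ambient = 1.0 - 0.3 * bell(450, 60)               # white-ish, dipping in the blue

for name, q in bayer.items():
    loss = 1.0 - np.trapz(ambient * q, wl) / np.trapz(q, wl)
    print(f"{name}: {loss:.0%} less signal than under flat white light")
```

With these made-up curves, the B channel loses roughly a quarter of its signal while R is nearly untouched, which is the flavor of effect FIG. 4 depicts.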
FIG. 5 presents a similarly idealized but nonetheless instructive example of illumination, called here "slightly green-ish, mainly bluish," represented by curve 140: a perfectly straight line running from upper left to lower right. This figure is most interesting because it shows that the spectrum of light can be structured. As every lighting engineer knows, the raw physics of a light source can severely limit the ability to create a given spectral pattern. This perfect line between 400 nanometers and 700 nanometers is theoretically approachable (within 5-10% of it, say) with tungsten lamps and a sequence of five or ten well-chosen filters; but it is not easy to manipulate the spectrum of a tungsten bulb to do what you want. It has its own innate physics, thank you, and that is the palette we are given. Later sections will focus on modern LEDs and the many available ways to manipulate their "raw physics."
Now, let's return to FIG. 5: the labels 150, 160 and 170 point to curves depicting the effective spectral response functions of the Bayer pixels under this structured illumination. It is true that the physics behind the Bayer pixels won't change, but if you know that a certain kind of light spectrum will be used to illuminate an object or scene, you can now predict how they will respond. In English, this might be expressed as: "OK, Mr. Apple: under pure white light, my Bayer-pattern pixels would register your colors and signals exactly as designed; but under this new light, where I have modified the illumination profile, my raw pixel signals will instead behave according to the effective profiles 150, 160 and 170." The convention here is to place a prime symbol (R', G', B') on each of the three curves carried over from FIG. 2.
FIG. 6 continues the summary by showing what happens to our red apple: if we don't inform our Bayer camera that we are using funky light to illuminate the fruit, it will display the apple as yellow on the screen of the digital camera or smart phone! The yellow arises because the apple's actual reflectance spectrum has not changed since curve 100 of FIG. 3, but its "coupling" into the new spectrally-shaped response curves has: the R' channel's digital response is reduced, while the red apple's spectral curve already had some coupling with the G channel (even though it is a "red" apple), so the G' response remains comparatively strong. The resulting yellow would be described as a "dark yellow," as a matter of nitpicking. The point of FIG. 6 is one that virtually every professional photographer knows: lighting is a major factor in capturing "true" color, and "knowing" the spectral properties of the illumination is important.
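A self-contained toy demonstration of that coupling effect, with invented curves (not measured data): under the FIG. 5-style bluish ramp, the apple's G'-to-R' signal ratio roughly doubles relative to white light, pushing a naive camera's rendering from red toward a dark yellow.

```python
import numpy as np

wl = np.arange(400, 701, 5, dtype=float)        # wavelength grid, nm

def bell(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Invented stand-ins: Bayer R/G responses, a red-apple reflectance with some
# innate green coupling (curve 100), and the "mainly bluish" ramp illuminant.
q_R, q_G = bell(600, 35), bell(530, 35)
apple = 0.15 + 0.85 * bell(620, 50)
ramp = np.linspace(1.0, 0.0, wl.size)           # FIG. 5's structured light
white = np.ones_like(wl)

def signal(illum, q):                           # the integral multiplication
    return np.trapz(illum * apple * q, wl)

for name, illum in [("white light", white), ("bluish ramp", ramp)]:
    print(f"{name}: G/R = {signal(illum, q_G) / signal(illum, q_R):.2f}")
```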
FIGS. 7 and 8 can serve as the general summary for certain aspects of the technology: replace what in 2012 is either a single white LED or a dual-LED flash with a multi-LED unit, and synchronize its flashing with the captured frames of the sensor, which is usually a Bayer sensor, at least on smart phones.