14 Comments
voodoobunny - Wednesday, July 17, 2013 - link
Would this work with the cameras in the HTC One and Nokia 1020? If so, Aptina could be sitting on an IP goldmine - build their own 13MP sensors, and still license the technology to HTC and Nokia for lots o' money...
Krysto - Friday, July 19, 2013 - link
Those companies are kind of pursuing their own ways for camera quality. You will probably see something like this in the Motorola X, though.
blanarahul - Wednesday, July 17, 2013 - link
Cool! But why green? Why not make the blue pixels clear pixels?
A5 - Wednesday, July 17, 2013 - link
The human eye's sensitivity peaks in the green area of the spectrum, but I don't know enough about the tech to say how they take that knowledge and turn it into a useful image.
makerofthegames - Wednesday, July 17, 2013 - link
Green sits in the middle, frequency-wise. Red is lower-frequency, blue is higher. I imagine that makes the math easier, having it in the middle rather than at one of the ends. Although I would have to wonder about out-of-range colors - does the clear also pick up infrared or ultraviolet?
DanNeely - Wednesday, July 17, 2013 - link
P&S/DSLR cameras use separate IR/UV filters. Given the thinness requirements in mobile, if I had to guess I'd say the coatings that absorb them are applied on top of an existing element. Probably part of the lens assembly, since it's made of the same materials as a discrete filter. The filter itself is needed; otherwise you get screwed-up color balance - e.g. plant leaves tend to be bright in IR.
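The color-balance point can be sketched with a toy model: without an IR-cut filter, IR leakage adds to every channel (strongest in red for typical silicon sensors), so IR-bright subjects like foliage wash out. The leakage fractions below are invented for illustration, not measured values.

```python
# Toy model of why the IR-cut filter matters. The per-channel IR leakage
# fractions are made-up illustrative numbers, not real sensor data.
def sensed(rgb, ir, ir_leak=(0.9, 0.4, 0.3)):
    """Clamp each channel after adding hypothetical IR leakage."""
    return tuple(min(1.0, c + ir * k) for c, k in zip(rgb, ir_leak))

leaf = (0.15, 0.45, 0.10)       # a green leaf in visible light
print(sensed(leaf, ir=0.0))     # with an IR-cut filter: stays green
print(sensed(leaf, ir=0.5))     # without one: red jumps, leaf washes out
```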
DanNeely - Wednesday, July 17, 2013 - link
Because removing the green filters gets 50% of the light into the luminance channel instead of only 25%.
patrickjchase - Wednesday, July 17, 2013 - link
They replaced green instead of red or blue because G is the largest contributor to L* to begin with. Intuitively (i.e. without going into trade secret knowledge), G is therefore the channel that they can "recover" from L* while causing the least noise. L* is simply perceptually linearized Y (as in CIEXYZ), and Y is about 60% green depending on the source RGB chromaticities. Blue would actually be the *worst* choice, as Y is <10% B.
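These numbers can be checked against the standard formulas. A minimal sketch using the sRGB/Rec.709 luminance weights (the exact R/G/B split depends on which primaries you assume; with the older NTSC/Rec.601 primaries green is closer to the ~60% figure quoted above):

```python
# Rec.709 luminance weights: Y = 0.2126 R + 0.7152 G + 0.0722 B
R_W, G_W, B_W = 0.2126, 0.7152, 0.0722

def Y_from_rgb(r, g, b):
    """CIE relative luminance Y from linear RGB."""
    return R_W * r + G_W * g + B_W * b

def Lstar_from_Y(y):
    """CIE L*: perceptually linearized Y (y is relative to white, 0..1)."""
    return 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y

print(G_W)   # green alone carries ~72% of Y with these primaries
print(B_W)   # blue carries ~7% - the worst channel to reconstruct from L*
print(Lstar_from_Y(Y_from_rgb(0.5, 0.5, 0.5)))   # linear mid-gray is ~76 L*
```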
thesavvymage - Wednesday, July 17, 2013 - link
woosh
Mayuyu - Wednesday, July 17, 2013 - link
Less luminance noise in exchange for chroma noise probably.jjj - Wednesday, July 17, 2013 - link
lol, the new CEO helps get a marketing push on PC hardware sites. So is this what's in the next Nexus phone, or are they pushing this because they lost that one?
eio - Thursday, July 18, 2013 - link
Why pair clear pixels with red & blue? Shouldn't cyan & magenta be more efficient?
SeleniumGlow - Thursday, July 18, 2013 - link
Correct me if I'm wrong, but cyan and magenta are pigment/paint primary colours (i.e. you can't make them from other pigment colours). Cyan and magenta can be created from the primary light colours of RGB, but you can't create RGB from cyan and magenta light colours (or it is comparatively difficult and will need processing which you can save by sticking to RGB). And then there is luminosity: the primary pigment colours all merge to form black, whereas the primary light colours all merge to form white...
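The additive/subtractive relationship described above can be written down directly: in the idealized textbook model, each subtractive primary is simply the complement of an additive one (real inks and filters deviate from this).

```python
# Idealized additive/subtractive complement model (real pigments are
# messier - this is the textbook approximation only).
def cmy_from_rgb(r, g, b):
    """Each subtractive primary is the complement of an additive one."""
    return (1 - r, 1 - g, 1 - b)

def rgb_from_cmy(c, m, y):
    return (1 - c, 1 - m, 1 - y)

# All additive primaries at full strength merge to white light:
print(rgb_from_cmy(0, 0, 0))   # (1, 1, 1) - no ink absorbs anything -> white
# All subtractive primaries at full strength absorb everything:
print(cmy_from_rgb(0, 0, 0))   # (1, 1, 1) in CMY - full ink coverage -> black
```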
piroroadkill - Friday, July 19, 2013 - link
Seems pretty obvious to me. Not a bad idea at all.