E90Post
      01-22-2010, 11:44 AM   #1
Sara
Lieutenant General
Rep: 5791 | Posts: 17,879
Drives: A car | Join Date: Aug 2007 | Location: Nola | iTrader: (6)

I'm not blinking! I'm Asian.

When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother's Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, "Did someone blink?" No one had. "I thought the camera was broken!" Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked "bug-eyed," the messages stopped.
Wang, a Taiwanese-American strategy consultant who goes by the Web handle "jozjozjoz," thought it was funny that the camera had difficulties figuring out when her family had their eyes open. So she posted a photo of the blink warning on her blog under the title, "Racist Camera! No, I did not blink... I'm just Asian!" The post was picked up by Gizmodo and Boing Boing, and prompted at least one commenter to note, "You would think that Nikon, being a Japanese company, would have designed this with Asian eyes in mind." (See Techland's top 10 gadgets of 2009.)
Nikon isn't the only big brand whose consumer cameras have displayed an occasional - though clearly unintentional - bias toward Caucasian faces. Face detection, which is one of the latest "intelligent" technologies to trickle down to consumer cameras, is supposed to make photography more convenient. Some cameras with face detection are designed to warn you when someone blinks; others are programmed to automatically take a picture when somebody smiles - a feature that, theoretically, makes the whole problem of timing your shot to catch the brief glimpse of a grin obsolete. Face detection has also found its way into computer webcams, where it can track a person's face during a video conference or enable face-recognition software to prevent unauthorized access.
The principle behind face detection is relatively simple, even if the math involved can be complex. Most people have two eyes, eyebrows, a nose and lips - and an algorithm can be trained to look for those common features, or more specifically, their shadows. (For instance, when you take a normal image and heighten the contrast, eye sockets can look like two dark circles.) But even if face detection seems pretty straightforward, the execution isn't always smooth.
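To make that idea concrete, here is a minimal sketch of this kind of feature-and-contrast detection using OpenCV's freely available, pre-trained Haar-cascade models. It is purely illustrative, not the code inside Nikon's or HP's products, and the input file name is a placeholder.
Code:
import cv2

# Pre-trained cascade classifiers, themselves learned from thousands of example images.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

img = cv2.imread("portrait.jpg")               # placeholder input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)   # detection runs on brightness, not color
gray = cv2.equalizeHist(gray)                  # heighten contrast so shadowed features stand out

faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    roi = gray[y:y + h, x:x + w]               # look for eyes only inside each detected face
    eyes = eye_cascade.detectMultiScale(roi)
    print(f"face at ({x},{y}), {w}x{h}px: {len(eyes)} eye(s) found")
# If the contrast cues the cascade was trained on aren't there, the loop simply never runs.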
Indeed, just last month, a white employee at an RV dealership in Texas posted a YouTube video showing a black co-worker trying to get the built-in webcam on an HP Pavilion laptop to detect his face and track his movements. The camera zoomed in on the white employee and panned to follow her, but whenever the black employee came into the frame, the webcam stopped dead in its tracks. "I think my blackness is interfering with the computer's ability to follow me," the black employee jokingly concludes in the video. "Hewlett-Packard computers are racist." (See the 50 best inventions of 2009.)
The "HP computers are racist" video went viral, with almost 2 million views, and HP, naturally, was quick to respond. "Everything we do is focused on ensuring that we provide a high-quality experience for all our customers, who are ethnically diverse and live and work around the world," HP's lead social-media strategist Tony Welch wrote on a company blog within a week of the video's posting. "We are working with our partners to learn more." The post linked to instructions on adjusting the camera settings, something both Consumer Reports and Laptop Magazine tested successfully in Web videos they put online.
Still, some engineers question how a webcam even made it onto the market with this seemingly glaring flaw. "It's surprising HP didn't get this right," says Bill Anderson, president of Oculis Labs in Hunt Valley, Md., a company that develops security software that uses face recognition to protect work computers from prying eyes. "These things are solvable." Case in point: Sensible Vision, which develops the face-recognition security software that comes with some Dell computers, said their software had no trouble picking up the black employee's face when they tested the YouTube video.
YouTube commenters expressed what was on a lot of people's minds. "Seems they rushed the product to market before testing thoroughly enough," wrote one. "I'm guessing it's because all the people who tested the software were white," wrote another. HP declined to comment on their methods for testing the webcam or how involved they were in designing the software, but they did say the software was based on "standard algorithms." Often, the manufacturers of the camera parts will also supply the software to well-known brands, which might explain why HP isn't the only company whose cameras have exhibited an accidental prejudice against minorities, since many brands could be using the same flawed code. TIME tested two of Sony's latest Cyber-shot models with face detection (the DSC-TX1 and DSC-WX1) and found they, too, had a tendency to ignore camera subjects with dark complexions.
But why? It's not necessarily the programmers' fault. It comes down to the fact that the software is only as good as its algorithms, or the mathematical rules used to determine what a face is. There are two ways to create them: by hard-coding a list of rules for the computer to follow when looking for a face, or by showing it a sample set of hundreds, if not thousands, of images and letting it figure out what the ones with faces have in common. In this way, a computer can create its own list of rules, and then programmers will tweak them. You might think the more images - and the more diverse the images - that a computer is fed, the better the system will get, but sometimes the opposite is true. The images can begin to generate rules that contradict each other. "If you have a set of 95 images and it recognizes 90 of those, and you feed it five more, you might gain five, but lose three," says Vincent Hubert, a software engineer at Montreal-based Simbioz, a tech company that is developing futuristic hand-gesture technology like the kind seen in Minority Report. It's the same kind of problem speech-recognition software faces in handling unusual accents.
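As a toy illustration of the "learn the rules from examples" approach, and of why adding images does not always help, here is a hedged sketch using scikit-learn. The patch loader and all of the data are stand-ins, not anyone's actual training pipeline.
Code:
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def load_patches():
    """Stand-in loader: flattened 24x24 grayscale patches plus face/non-face labels."""
    rng = np.random.default_rng(0)
    X = rng.random((400, 24 * 24))          # replace with real labeled image patches
    y = rng.integers(0, 2, size=400)
    return X, y

X, y = load_patches()
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The classifier derives its own "rules" for what a face looks like from the examples.
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# Retraining on an enlarged or more diverse set can fix some previous misses while
# breaking detections that used to work -- the "gain five, lose three" effect above.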
And just as the software is only as good as its code and the hardware it lives in, it's also only as good as the light it's got to work with. As HP noted in its blog post, the lighting in the YouTube video was dim, and, the company said, there wasn't enough contrast to pick up the facial shadows the computer needed for seeing. (An overlit person with a fair complexion might have had the same problem.) A better camera wouldn't necessarily have guaranteed a better result, because there's another bottleneck: computing power. The constant flow of images is usually too much for the software to handle, so it downsamples them, or reduces the level of detail, before analyzing them. That's one reason why a person watching the YouTube video can easily make out the black employee's face, while the computer can't. "A racially inclusive training set won't help if the larger platform is not capable of seeing those details," says Steve Russell, founder and chairman of 3VR, which creates face recognition for security cameras.
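A rough sketch of that downsampling bottleneck is below: each webcam frame is shrunk before detection so the software can keep up, which is exactly where a few pixels' worth of facial detail can disappear. The camera index, scale factor and cascade file are assumptions made only for illustration.
Code:
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)      # default webcam, assumed present
SCALE = 0.25                   # analyze at quarter resolution to stay inside the compute budget

while True:
    ok, frame = cap.read()
    if not ok:
        break
    small = cv2.resize(frame, None, fx=SCALE, fy=SCALE)   # throw away detail before analyzing
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.1, 5)
    # A low-contrast face (dim light, dark skin against a dark background) may yield no
    # detections at this reduced resolution, even though a person watching the
    # full-resolution frame sees it easily.
    for (x, y, w, h) in faces:
        x0, y0 = int(x / SCALE), int(y / SCALE)
        x1, y1 = int((x + w) / SCALE), int((y + h) / SCALE)
        cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()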
The blink problem Wang complained about has less to do with lighting than the plain fact that her Nikon was incapable of distinguishing her narrow eye from a half-closed one. An eye might only be a few pixels wide, and a camera that's downsampling the images can't see the necessary level of detail. So a trade-off has to be made: either the blink warning would have a tendency to miss half blinks or a tendency to trigger for narrow eyes. Nikon did not respond to questions from TIME as to how the blink detection was designed to work.
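The trade-off can be boiled down to a single number. Suppose, hypothetically, a camera scores each eye by an "openness" ratio of height to width measured over just a few pixels: whatever fixed threshold it picks, it either misses half blinks or flags narrow open eyes. All of the numbers below are invented for illustration.
Code:
def blink_warning(eye_height_px: float, eye_width_px: float, threshold: float = 0.28) -> bool:
    """Warn if the eye's height-to-width ratio falls below the threshold."""
    openness = eye_height_px / eye_width_px
    return openness < threshold

samples = [
    ("wide open",       9, 24),   # openness 0.375 -> no warning
    ("half blink",      5, 24),   # openness 0.208 -> warning (correct)
    ("narrow but open", 6, 24),   # openness 0.250 -> warning (false alarm)
]

for label, h, w in samples:
    print(f"{label:16s} warn = {blink_warning(h, w)}")

# Raising the threshold catches more half blinks but flags more narrow eyes; lowering it
# does the reverse. With only a few pixels of eye to measure, no single threshold gets
# both cases right.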
Why these glitches weren't ironed out before the cameras hit Best Buy is not something that HP, Nikon or Sony, when contacted by TIME, were willing to answer. Perhaps in this market of rapidly developing technologies, consumers who fork over a few hundred dollars for the latest gadget are the test market. A few years ago, speech-recognition software was teeth-gnashingly unreliable. Today, it's up to 99% accurate. With the flurry of consumer complaints out there, most of the companies seem to be responding. HP has offered instructions on how to adjust its webcam's sensitivity to backlighting. Nikon says it's working to improve the accuracy of the blink-warning function on its Coolpix cameras. (Sony wouldn't comment on the performance of its Cyber-shot cameras and said only that it's "not possible to track the face accurately all the time.") Perhaps in a few years' time, the only faces cameras won't be able to pick up will be those of the blue-skinned humanoids from Avatar.
      01-22-2010, 12:08 PM   #2
Kensta335
Brigadier General
Rep: 161 | Posts: 3,637
Drives: 335i Coupe | Join Date: Mar 2007 | Location: So Cal 626 | iTrader: (4)

tl;dr
      01-22-2010, 12:08 PM   #3
Dan in PA
Captain
Rep: 88 | Posts: 729
Drives: 2007 E92 335i | Join Date: Sep 2008 | Location: PA | iTrader: (0)

In paragraph form:

When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother's Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, "Did someone blink?" No one had. "I thought the camera was broken!" Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked "bug-eyed," the messages stopped.

Wang, a Taiwanese-American strategy consultant who goes by the Web handle "jozjozjoz," thought it was funny that the camera had difficulties figuring out when her family had their eyes open. So she posted a photo of the blink warning on her blog under the title, "Racist Camera! No, I did not blink... I'm just Asian!" The post was picked up by Gizmodo and Boing Boing, and prompted at least one commenter to note, "You would think that Nikon, being a Japanese company, would have designed this with Asian eyes in mind." (See Techland's top 10 gadgets of 2009.)

Nikon isn't the only big brand whose consumer cameras have displayed an occasional - though clearly unintentional - bias toward Caucasian faces. Face detection, which is one of the latest "intelligent" technologies to trickle down to consumer cameras, is supposed to make photography more convenient. Some cameras with face detection are designed to warn you when someone blinks; others are programmed to automatically take a picture when somebody smiles - a feature that, theoretically, makes the whole problem of timing your shot to catch the brief glimpse of a grin obsolete. Face detection has also found its way into computer webcams, where it can track a person's face during a video conference or enable face-recognition software to prevent unauthorized access.

The principle behind face detection is relatively simple, even if the math involved can be complex. Most people have two eyes, eyebrows, a nose and lips - and an algorithm can be trained to look for those common features, or more specifically, their shadows. (For instance, when you take a normal image and heighten the contrast, eye sockets can look like two dark circles.) But even if face detection seems pretty straightforward, the execution isn't always smooth.

Indeed, just last month, a white employee at an RV dealership in Texas posted a YouTube video showing a black co-worker trying to get the built-in webcam on an HP Pavilion laptop to detect his face and track his movements. The camera zoomed in on the white employee and panned to follow her, but whenever the black employee came into the frame, the webcam stopped dead in its tracks. "I think my blackness is interfering with the computer's ability to follow me," the black employee jokingly concludes in the video. "Hewlett-Packard computers are racist." (See the 50 best inventions of 2009.)

The "HP computers are racist" video went viral, with almost 2 million views, and HP, naturally, was quick to respond. "Everything we do is focused on ensuring that we provide a high-quality experience for all our customers, who are ethnically diverse and live and work around the world," HP's lead social-media strategist Tony Welch wrote on a company blog within a week of the video's posting. "We are working with our partners to learn more." The post linked to instructions on adjusting the camera settings, something both Consumer Reports and Laptop Magazine tested successfully in Web videos they put online.

Still, some engineers question how a webcam even made it onto the market with this seemingly glaring flaw. "It's surprising HP didn't get this right," says Bill Anderson, president of Oculis Labs in Hunt Valley, Md., a company that develops security software that uses face recognition to protect work computers from prying eyes. "These things are solvable." Case in point: Sensible Vision, which develops the face-recognition security software that comes with some Dell computers, said their software had no trouble picking up the black employee's face when they tested the YouTube video.
YouTube commenters expressed what was on a lot of people's minds. "Seems they rushed the product to market before testing thoroughly enough," wrote one. "I'm guessing it's because all the people who tested the software were white," wrote another. HP declined to comment on their methods for testing the webcam or how involved they were in designing the software, but they did say the software was based on "standard algorithms." Often, the manufacturers of the camera parts will also supply the software to well-known brands, which might explain why HP isn't the only company whose cameras have exhibited an accidental prejudice against minorities, since many brands could be using the same flawed code. TIME tested two of Sony's latest Cyber-shot models with face detection (the DSC-TX1 and DSC-WX1) and found they, too, had a tendency to ignore camera subjects with dark complexions.

But why? It's not necessarily the programmers' fault. It comes down to the fact that the software is only as good as its algorithms, or the mathematical rules used to determine what a face is. There are two ways to create them: by hard-coding a list of rules for the computer to follow when looking for a face, or by showing it a sample set of hundreds, if not thousands, of images and letting it figure out what the ones with faces have in common. In this way, a computer can create its own list of rules, and then programmers will tweak them. You might think the more images - and the more diverse the images - that a computer is fed, the better the system will get, but sometimes the opposite is true.

The images can begin to generate rules that contradict each other. "If you have a set of 95 images and it recognizes 90 of those, and you feed it five more, you might gain five, but lose three," says Vincent Hubert, a software engineer at Montreal-based Simbioz, a tech company that is developing futuristic hand-gesture technology like the kind seen in Minority Report. It's the same kind of problem speech-recognition software faces in handling unusual accents.

And just as the software is only as good as its code and the hardware it lives in, it's also only as good as the light it's got to work with. As HP noted in its blog post, the lighting in the YouTube video was dim, and, the company said, there wasn't enough contrast to pick up the facial shadows the computer needed for seeing. (An overlit person with a fair complexion might have had the same problem.) A better camera wouldn't necessarily have guaranteed a better result, because there's another bottleneck: computing power. The constant flow of images is usually too much for the software to handle, so it downsamples them, or reduces the level of detail, before analyzing them. That's one reason why a person watching the YouTube video can easily make out the black employee's face, while the computer can't. "A racially inclusive training set won't help if the larger platform is not capable of seeing those details," says Steve Russell, founder and chairman of 3VR, which creates face recognition for security cameras.

The blink problem Wang complained about has less to do with lighting than the plain fact that her Nikon was incapable of distinguishing her narrow eye from a half-closed one. An eye might only be a few pixels wide, and a camera that's downsampling the images can't see the necessary level of detail. So a trade-off has to be made: either the blink warning would have a tendency to miss half blinks or a tendency to trigger for narrow eyes. Nikon did not respond to questions from TIME as to how the blink detection was designed to work.

Why these glitches weren't ironed out before the cameras hit Best Buy is not something that HP, Nikon or Sony, when contacted by TIME, were willing to answer. Perhaps in this market of rapidly developing technologies, consumers who fork over a few hundred dollars for the latest gadget are the test market. A few years ago, speech-recognition software was teeth-gnashingly unreliable. Today, it's up to 99% accurate. With the flurry of consumer complaints out there, most of the companies seem to be responding. HP has offered instructions on how to adjust its webcam's sensitivity to backlighting. Nikon says it's working to improve the accuracy of the blink-warning function on its Coolpix cameras. (Sony wouldn't comment on the performance of its Cyber-shot cameras and said only that it's "not possible to track the face accurately all the time.") Perhaps in a few years' time, the only faces cameras won't be able to pick up will be those of the blue-skinned humanoids from Avatar.
__________________
2013 BMW Z4 sDrive35i|DCT|M-Sport|iDrive|Premium Sound|Heated Seats|Comfort Access|PDC|Deep Sea Blue|Cream
[At the port of exit awaiting a shipping vessel since 8/17/12]
      01-22-2010, 12:23 PM   #4
Spec 1
Faster in the Corners
Rep: 52 | Posts: 1,062
Drives: '91 E30, '05 ZX6-R, '06 300C | Join Date: Jul 2008 | Location: Portland, OR | iTrader: (0)

Funny.
__________________
'91 M42 E30 - All sorts of goodies.
      01-22-2010, 01:51 PM   #7
MCMLXXXIX
wat
Rep: 235 | Posts: 3,406
Drives: a gr3at whit3 5hark | Join Date: Oct 2008 | Location: Oct 2008 | iTrader: (0)

Quote:
Originally Posted by Kensta335
tl;dr
lmfao +1
      01-22-2010, 02:02 PM   #8
Kiemyster
Bimmerpost Resident Marijuana Consultant
Rep: 453 | Posts: 3,197
Drives: 320i, 325xi, 335Xi, 335i, M3 | Join Date: Nov 2006 | Location: Queens/NYC | iTrader: (4)

wow omg. roflol. the pic is the equivalent of cliff notes, no need to read ahaha
__________________
"AMG What! S-Line Who? If you ain't got that M I got no respect for you!"
'06 Alpine Weiss E90 320i | '06 The Green Machine E90 325xi | '11 Alpine Weiss E90 M3 ZCP | '10 Silverstone X5M | '11 Alpine Weiss E90 335xi
      01-22-2010, 02:21 PM   #9
number335
Second Lieutenant
Rep: 20 | Posts: 254
Drives: E92 Coupe | Join Date: Mar 2008 | Location: Orange County, CA | iTrader: (0)

fuuk you Nikon! fuuk you dolphins!
      01-22-2010, 04:00 PM   #11
jpsum
Major
Rep: 264 | Posts: 1,088
Drives: 2010 TSX | Join Date: Dec 2008 | Location: New Haven area | iTrader: (4)

It's funny that Nikon is a Japanese company; you would think that their camera would at least work on Asians.
      01-24-2010, 04:29 AM   #12
Genro757
///M-Flight VP
Rep: 77 | Posts: 669
Drives: '13 Frozen Red Edition E92 M3 | Join Date: Oct 2008 | Location: NOVA/Virginia Beach, VA | iTrader: (0)

Quote:
Originally Posted by number335
fuuk you Nikon! fuuk you dolphins!
+1
__________________
2013 "Frozen Red Edition" E92 M3, M-DCT with all the 'fixin's, Akrapovic Evolution w/CF Tips, BPM Stage 2 Tune, BPM DCT Software
      01-24-2010, 05:02 PM   #14
DimSum
Banned
Rep: 202 | Posts: 2,740
Drives: SLOW | Join Date: May 2008 | Location: LITTLEBIMMER.COM | iTrader: (10)

cliff notes please?
damn.
      01-24-2010, 05:11 PM   #15
Scorpion
Major
Rep: 162 | Posts: 1,210
Drives: Nothing | Join Date: Aug 2008 | Location: Nowhere | iTrader: (0)

Haha damn
      01-24-2010, 06:33 PM   #16
Kiemyster
Bimmerpost Resident Marijuana Consultant
Rep: 453 | Posts: 3,197
Drives: 320i, 325xi, 335Xi, 335i, M3 | Join Date: Nov 2006 | Location: Queens/NYC | iTrader: (4)

Quote:
Originally Posted by DimSum
cliff notes please?
damn.
Look at the pic: Asian girl, Asian camera brand, but the blink detection still discriminates against her.
__________________
"AMG What! S-Line Who? If you ain't got that M I got no respect for you!"
'06 Alpine Weiss E90 320i | '06 The Green Machine E90 325xi | '11 Alpine Weiss E90 M3 ZCP | '10 Silverstone X5M | '11 Alpine Weiss E90 335xi