Wednesday, August 19, 2015

Programming and prejudice: Computer scientists discover how to find bias in algorithms

Software may appear to operate without bias because it strictly uses computer code to reach conclusions. That's why many companies use algorithms to help weed out job applicants when hiring for a new position.
But a team of computer scientists from the University of Utah, the University of Arizona and Haverford College in Pennsylvania has discovered a way to find out whether an algorithm used for hiring decisions, loan approvals and comparably weighty tasks could be biased like a human being.
The researchers, led by Suresh Venkatasubramanian, an associate professor in the University of Utah's School of Computing, have discovered a technique to determine if such software programs discriminate unintentionally and violate the legal standards for fair access to employment, housing and other opportunities. The team also has determined a method to fix these potentially troubled algorithms.
Venkatasubramanian presented his findings Aug. 12 at the 21st ACM SIGKDD Conference on Knowledge Discovery and Data Mining in Sydney, Australia.
"There's a growing industry around doing resume filtering and resume scanning to look for job applicants, so there is definitely interest in this," says Venkatasubramanian. "If there are structural aspects of the testing process that would discriminate against one community just because of the nature of that community, that is unfair."
Machine-learning algorithms
Many companies use algorithms in software programs to help filter job applicants during hiring, typically because sorting through applications manually can be overwhelming when many people apply for the same job. A program can do the sorting instead by scanning resumes, searching for keywords or numbers (such as school grade point averages), and then assigning each applicant an overall score.
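As a rough illustration of how such screening software might assign scores, here is a minimal Python sketch; the keywords, weights and sample resumes are invented for this example and do not come from any real hiring system.

# Hypothetical resume-screening score: keyword matches plus GPA (all values invented).
REQUIRED_KEYWORDS = {"python", "sql", "statistics"}  # assumed job requirements

def score_applicant(resume_text: str, gpa: float) -> float:
    """Return one point per keyword hit, plus GPA scaled to at most one point."""
    words = set(resume_text.lower().split())
    return len(REQUIRED_KEYWORDS & words) + gpa / 4.0

applicants = [("Knows python and sql with a statistics background", 3.6),
              ("Marketing and communications experience", 3.9)]
for text, gpa in applicants:
    print(round(score_applicant(text, gpa), 2), "-", text)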
These programs also can learn as they analyze more data. Known as machine-learning algorithms, they can change and adapt like humans so they can better predict outcomes. Amazon uses similar algorithms to learn customers' buying habits and target ads more accurately, and Netflix uses them to learn users' movie tastes when recommending new viewing choices.
But there has been a growing debate on whether machine-learning algorithms can introduce unintentional bias much like humans do.
"The irony is that the more we design artificial intelligence technology that successfully mimics humans, the more that A.I. is learning in a way that we do, with all of our biases and limitations," Venkatasubramanian says.
Disparate impact
Venkatasubramanian's research determines if these software algorithms can be biased through the legal definition of disparate impact, a theory in U.S. anti-discrimination law that says a policy may be considered discriminatory if it has an adverse impact on any group based on race, religion, gender, sexual orientation or other protected status.
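One standard quantitative yardstick for disparate impact, not spelled out in the article but common in US employment guidance, is the “four-fifths rule”: a selection rate for one group that falls below 80 percent of the rate for the most-favored group is treated as evidence of adverse impact. A minimal check, with made-up applicant counts, might look like this in Python:

# Four-fifths (80%) rule check on hypothetical hiring outcomes (counts are invented).
hired   = {"group_a": 45, "group_b": 20}
applied = {"group_a": 100, "group_b": 80}

rates = {g: hired[g] / applied[g] for g in hired}   # per-group selection rates
ratio = min(rates.values()) / max(rates.values())   # disparate-impact ratio
print("selection rates:", rates)
print("ratio:", round(ratio, 2),
      "-> possible adverse impact" if ratio < 0.8 else "-> within the guideline")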
Venkatasubramanian's research showed that a test can determine whether the algorithm in question may be biased. If the test -- which ironically uses another machine-learning algorithm -- can accurately predict a person's race or gender from the data being analyzed, even though race and gender are hidden from the data, then there is a potential for bias under the definition of disparate impact.
"I'm not saying it's doing it, but I'm saying there is at least a potential for there to be a problem," Venkatasubramanian says.
If the test reveals a possible problem, Venkatasubramanian says it's easy to fix. All you have to do is redistribute the data being analyzed -- say, the information of the job applicants -- so the algorithm can no longer see the information that could be used to create the bias.
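A simplified version of that redistribution idea is to adjust each feature, group by group, so that its distribution looks the same in every group; the sketch below does this by mapping values to quantiles of the pooled distribution. It is a rough illustration on synthetic data, not the exact repair procedure from the researchers' paper.

import numpy as np

def repair_feature(values: np.ndarray, groups: np.ndarray) -> np.ndarray:
    """Replace each value with the pooled-distribution value at its within-group
    quantile, so the repaired feature is distributed roughly the same in every group."""
    repaired = np.empty_like(values, dtype=float)
    pooled = np.sort(values)                        # target distribution to match
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        ranks = values[idx].argsort().argsort()     # within-group ranks
        quantiles = ranks / max(len(idx) - 1, 1)    # rank -> quantile in [0, 1]
        repaired[idx] = np.quantile(pooled, quantiles)
    return repaired

rng = np.random.default_rng(1)
groups = rng.integers(0, 2, 500)
feature = rng.normal(0, 1, 500) + groups            # group 1 sits higher on average
fixed = repair_feature(feature, groups)
print("group means before:", [round(feature[groups == g].mean(), 2) for g in (0, 1)])
print("group means after: ", [round(fixed[groups == g].mean(), 2) for g in (0, 1)])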
"It would be ambitious and wonderful if what we did directly fed into better ways of doing hiring practices. But right now it's a proof of concept," Venkatasubramanian says.

University of Utah. "Programming and prejudice: Computer scientists discover how to find bias in algorithms." ScienceDaily. ScienceDaily, 14 August 2015. <www.sciencedaily.com/releases/2015/08/150814193120.htm>.

New Report Casts Blame for Widespread Cyberattacks on Iranian Hackers

A new report contends that Iranian hackers stole confidential information from government agencies and major companies in 16 countries over at least the last two years. Security vendor Cylance says the ongoing attacks, which it calls “Operation Cleaver,” stole documents and wrested control of computer networks at organizations in nations including Canada, China, India, Israel, Mexico, Pakistan, South Korea, Turkey, the United Arab Emirates, and the US. The organizations were in the military, energy, transportation, telecommunications, technology, and other industry sectors. Cylance says it has evidence these intrusions were made by the same Iran-based group responsible for a 2013 attack on the US Navy computer network. Hamid Babaei, spokesperson for Iran’s mission to the United Nations, called the claims “a baseless and unfounded allegation fabricated to tarnish the Iranian government image, particularly aimed at hampering current nuclear talks.” According to Cylance’s report, the hackers used a combination of off-the-shelf and custom tools to infiltrate target computer systems. “We discovered the scope and damage of these operations during investigations of what we thought were separate cases,” said company CEO Stuart McClure. “Due to the choice of critical infrastructure victims and the Iranian team’s quickly improving skillset, we are compelled to publish this report.” Although the hackers are based in Tehran, the company said, they receive help from people in Canada, the Netherlands, and the UK. Cylance has traced the attacks to June 2012, although they may have begun as early as 2010. Cylance shared its findings with the victims and the US Federal Bureau of Investigation. (PC Mag)(USA Today)(Reuters)

Tuesday, September 30, 2014

Smart Chopsticks Function as Food-Safety Sensor

Chinese tech company Baidu is developing smart chopsticks that act as a sensor, telling users whether the food they’re eating is unsafe. The Baidu Kuaisou can detect issues such as the use of unsanitary cooking oil, a prevalent concern when eating Chinese street food. Kuaisou can also measure metrics such as food temperature, nutrients, and product expiration dates. The device connects via Wi-Fi and Bluetooth to computers that analyze its sensor data. Baidu is still testing the Kuaisou. A price hasn’t yet been announced, and the company said the product isn’t yet ready for commercial release. (BBC)(TIME)(Business Insider)(The Wall Street Journal)



Monday, September 29, 2014

Daimler Enters the Ride-Service Application Market

Daimler, the manufacturer of Mercedes-Benz vehicles, has bought two smartphone applications designed to help users obtain car services. The company’s moovel subsidiary purchased mytaxi, which lets smartphone users hail a cab, track its progress, rate the service, and pay for the ride. moovel also bought RideScout, which helps users find ways to get places using public and private transportation, including car-sharing services. Previously, moovel invested in Blacklane, a limousine-booking service. It also owns car2go and Park2gether, a service that helps users find vacant parking spaces. Daimler, which manufactures cabs, says its new applications will not disrupt the taxi industry. (Reuters)(The Wall Street Journal)



Sunday, September 28, 2014

China Preparing National Operating System

China is preparing to launch an operating system to end its reliance on imported technologies such as Android, the Mac OS, and Windows. The system, developed through the Chinese Academy of Engineering, should be released in October 2014, according to the state-run Xinhua News Agency. The desktop version is expected to be released first, followed by the mobile OS, according to the Chinese Ministry of Industry and Information Technology. The goal is to replace foreign-made desktop OSs within one or two years and mobile operating systems within three to five years. Ni Guangnan, a professor with the Chinese Academy of Engineering who heads a government development alliance for the academy, told Xinhua that a lack of research funds and too many independent developers working in different directions have hampered the effort, and that the government should direct the project. The Chinese government has been pushing the use of domestic technology to avoid what it says are the cybersecurity risks of using foreign technology. In May 2014, it banned the use of Windows 8 on government computers. (Reuters)(PC Mag)(BBC)(Xinhua News Agency)



Saturday, September 27, 2014

Verizon Settles US Privacy Complaints for $7.4 Million

Verizon Communications is paying the US government a $7.4 million settlement following an investigation into how the company notifies customers of their privacy rights before using their information for marketing, according to the US Federal Communications Commission. This is the largest fine related to phone customers’ privacy in FCC history. The agency’s investigation found that, beginning in 2006, the company didn’t provide roughly 2 million new landline telephone customers with proper privacy notices in their first bill, explaining how to opt out of having their personal information used for marketing offers. Under US law, phone companies cannot use customers’ personal data for marketing without their permission. Under the terms of the settlement, Verizon will send opt-out notices with every telephone bill. Verizon said that it inadvertently violated FCC rules, that it takes the agency’s regulations seriously, and that it has implemented measures to avoid a recurrence. (Reuters)



Friday, September 26, 2014

Sony Joins AllSeen Alliance

Sony has become the latest member of the Linux Foundation’s AllSeen Alliance, which is working on open standards and a development framework for the Internet of Things, in which everyday objects connect to the Internet and communicate with people and one another. Sony joins the group, which now has 64 members, as a Premier Member, meaning it will contribute $300,000 in the first year and $250,000 annually thereafter. Sony has not indicated what its technical contribution might include. The alliance is building on Qualcomm’s AllJoyn project. So far, it has released version 14.06 of the AllJoyn framework and is working on targeted interoperability projects such as Internet-connected interior lighting. (Datamation)(AllSeen Alliance)

