  • Apple is going to start using phone call and email metadata in an attempt to combat fraud.
  • The data will give devices a "trust score," and it could help Apple detect fraudulent transactions, reviews, and accounts.
  • The plans were spotted when iTunes quietly updated its privacy policy.

Apple will start assigning so-called "trust scores" to Apple devices in a bid to combat fraud.

First spotted by VentureBeat, a new provision has quietly appeared in the iTunes Store privacy policy:

"To help identify and prevent fraud, information about how you use your device, including the approximate number of phone calls or emails you send and receive, will be used to compute a device trust score when you attempt a purchase. The submissions are designed so Apple cannot learn the real values on your device. The scores are stored for a fixed time on our servers."

Essentially, Apple will assign devices "trust scores" based on information including phone call and email metadata. This trust score helps the company identify scammers who are using Apple's services and devices as part of their schemes.

The so-called "trust score" takes into account only usage patterns, or metadata, and it's sent to Apple when a purchase is made on the App Store.

Apple told Business Insider that the content of the communications isn't used, so Apple doesn't know whom you called or emailed or what you discussed, and that the score is used only to identify fraud, not for any advertising purposes.
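Apple hasn't published how the score is computed, but the policy's claim that "Apple cannot learn the real values on your device" suggests a local-privacy technique such as randomized response, where the device coarsens and randomizes its own counts before reporting them. The sketch below is purely illustrative, assuming hypothetical usage buckets and a hypothetical truth probability; none of these names or values come from Apple.

```python
import random

# Hypothetical usage buckets for a raw call/email count.
# These ranges are illustrative assumptions, not Apple's actual design.
BUCKETS = [(0, 10), (11, 100), (101, 1000), (1001, float("inf"))]


def bucket_index(count):
    """Map a raw count to a coarse usage bucket, hiding the exact value."""
    for i, (lo, hi) in enumerate(BUCKETS):
        if lo <= count <= hi:
            return i
    return len(BUCKETS) - 1


def noisy_bucket(count, p_truth=0.75, rng=random):
    """Randomized response: report the true bucket with probability
    p_truth, otherwise a uniformly random bucket. Over many devices the
    server can estimate aggregate usage, but it cannot be certain of
    the real value reported by any single device."""
    if rng.random() < p_truth:
        return bucket_index(count)
    return rng.randrange(len(BUCKETS))
```

In a scheme like this, the server would combine the noisy bucket with other signals to flag devices whose reported usage looks nothing like a typical consumer's, without ever seeing exact call or email counts.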

For example, earlier this year scammers in China were discovered hijacking people's Apple IDs and making purchases via an iPhone and Mac feature called "Family Sharing," Business Insider reported. While it's unclear whether the trust score directly affects scams like that one, it illustrates the kind of fraud threat that Apple faces constantly.

The update comes at a time when US lawmakers are asking Apple how it handles the personal data of its users.

Apple CEO Tim Cook has been consistently outspoken in his view that hoarding personal data is a bad thing.

"We felt strongly about privacy when no one cared," Cook said in June. "We could not see the specific details, but we could see that the building of the detailed profile on people likely would result in significant harm over time."
