
Apple Contractors Say They Hear Sensitive Information and Sexy Times Thanks to Siri



Photo: Getty

First Amazon, then Google, and now Apple has confirmed that its devices not only listen to you, but that complete strangers may review the recordings. Thanks to Siri, Apple contractors regularly hear intimate snippets of users' private lives, such as drug deals, doctor's visits, and sexual encounters, as part of their quality control duties, the Guardian reported on Friday.

As part of its effort to improve the voice assistant, "[a] small portion of Siri requests are analysed to improve Siri and dictation," Apple told the Guardian. That involves sending these recordings, stripped of Apple ID numbers, to its international team of contractors, who grade each interaction based on Siri's response, among other factors. The company also noted that these graded recordings make up less than 1 percent of daily Siri activations and that most last only a few seconds.

That's not the whole story, according to an anonymous Apple contractor who spoke with the Guardian. The contractor explained that because these quality control procedures don't weed out cases where a user has triggered Siri by accident, contractors end up hearing conversations users may never have wanted recorded in the first place. Not only that, details that could potentially identify a user accompany each recording so contractors can verify whether a request was handled successfully.

"There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data," the contractor told the Guardian.

And it's apparently pretty easy to trigger Siri by accident. Most anything that sounds like "Hey Siri" is likely to do the trick, as UK Secretary of Defence Gavin Williamson learned last year when the assistant piped up while he spoke about Syria in Parliament. According to the contractor, even the sound of a zipper can be enough to wake it. They said that of Apple's devices, the Apple Watch and HomePod smart speaker most frequently pick up accidental Siri triggers, and recordings can last up to 30 seconds.

While Apple told the Guardian that the information Siri collects isn't linked to other data Apple may hold on a user, the contractor told a different story:

"There's not much vetting of who works there, and the amount of data we're free to look through seems quite broad. It wouldn't be difficult to identify the person you're listening to, especially with accidental triggers: addresses, names, and so on."

Staff were told to report these accidental activations as technical problems, the contractor said, but there was no guidance on what to do if the recordings captured confidential information.

All of this makes Siri's pat answers to user questions look far less innocent, especially its response when asked whether it's always listening: "I only listen when you're talking to me."

Fellow tech giants Amazon and Google have faced similar privacy scandals recently over recordings from their devices. But while those companies also employ people to review voice assistant recordings, they let users opt out of having their recordings stored. Apple offers no such option in its products.

[The Guardian]