Apple’s Siri privacy promises arrive with iOS 13.2.
Users who take the plunge and install the next version of Apple’s mobile operating system will be able to delete their Siri audio recording history from Apple’s servers.
Fair is fair: it is just as reasonable for Apple to collect data to improve the Siri user experience as it is for users to be able to decide whether or not to share their recordings.

But before getting into today’s topic, it is worth noting that the Siri recordings Apple stores are audio commands encrypted to protect users’ privacy. These recordings are used to evaluate the quality of the virtual assistant and to improve Siri’s accuracy and reliability.

How to delete your Siri recording history
A couple of months ago, Apple was hit with a series of lawsuits for invading users’ privacy by inadvertently collecting data from Siri recordings.
One of the plaintiffs claimed that Apple employees working in the artificial intelligence division behind Siri “regularly listened to” confidential details of users while evaluating these audio recordings. And remember that Siri is available on iOS, iPadOS, watchOS, tvOS, and macOS, as well as on the HomePod.

The suits accused Apple of failing to make it clear to consumers that some of their Siri recordings were being collected for evaluation purposes.
The company reacted promptly, apologizing and announcing new privacy measures. It even temporarily suspended these practices and promised to introduce a tool that would let users decide whether or not to share their audio recordings. These new measures come in the form of a new option available in Settings.

This new option, called “Delete Siri & Dictation History,” appears in the Siri & Search section of Settings.
It is worth remembering that by “Dictation” Apple means both dictating messages through the virtual assistant and dictating text through the keyboard.
Below is a screenshot of what the new option to delete your Siri and Dictation history looks like:

Other new iOS 13.2 options
The arrival of iOS 13.2 brings multiple privacy features related to the Siri virtual assistant. In fact, during the iOS 13.2 setup process itself you will find a screen where you can decide whether you want Apple to collect your audio recordings.
Apple explains clearly how these audio recordings are used:
“Help improve Siri and Dictation by allowing Apple to store and review audio of your Siri and Dictation interactions on this iPhone and on any connected Apple Watch or HomePod. You can change this later in the settings for each device. This data is not associated with your Apple ID, and will only be stored for a limited period of time.”

In the Privacy section of the iOS Settings app, under Analytics & Improvements, there is also a toggle to turn off “Improve Siri & Dictation.” And finally, Apple has included a new setting in the Siri section that lets you delete your recordings from Apple’s servers entirely.
“Siri and Dictation interactions currently associated with this iPhone will be deleted from Apple servers. Data that has been collected to help improve Siri and Dictation is no longer associated with this iPhone and will not be deleted.”
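For quick reference, these are the places where the new controls live. The exact paths and wording below are based on the iOS 13.2 betas, so they could still change slightly in the final release:

- Setup: an “Improve Siri & Dictation” consent screen shown the first time you start up after installing iOS 13.2.
- Opt-out toggle: Settings > Privacy > Analytics & Improvements > Improve Siri & Dictation.
- Deletion: Settings > Siri & Search > Siri & Dictation History > Delete Siri & Dictation History.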
On the other hand, in addition to these iOS 13.2 privacy options for Siri and Dictation, Apple has also said that it is working on further changes to its human grading process in order to minimize the amount of data that Apple employees have access to.
In short…
We understand it is hard to stomach the idea that there are employees listening to your audio recordings, the ones in which you ask Siri to turn on the living room light, open an app, or look something up on Google. The truth is that the technology world is becoming increasingly hostile to privacy, slowly turning into something out of an episode of Black Mirror.
