Google Assistant has shrunk. The models that power it used to occupy some 100 GB and lived in the cloud, but Google has managed to reduce them to just half a gigabyte, a simply fantastic feat that enables something much more important than it seems: we can use the assistant and its benefits locally, with no connection to Google's cloud.

The new Google Assistant is essentially the same as always, but that ability to work locally will make everything much faster. We saw it in yesterday's Google I/O demo, which showed a much smoother interaction and surprising new milestones, with a promising side effect: Google takes care (at least a little more) of our privacy.

Faster, more local, smarter

The implications for interacting with the Google Assistant are clear: the speed of operation they demonstrated on stage allows us to use this platform much more, and much better, than we do now, and makes it more attractive than ever.

Google Assistant now makes it faster to navigate your phone by voice than finger. This will give super powers to always-in earbuds

now resides on your phone, not in the cloud, and that’s important for privacy

– Josh Constine (@JoshConstine) May 7, 2019

Some analysts commented, for example, on how this could be a real revolution for voice control, making options such as connected earbuds much more powerful than before; this change could certainly boost those scenarios.

This local operation is coupled with improved natural-language recognition, which once again lets the assistant stand out in contextual searches: we can chain related commands without repeating details already established in the first queries.

The demo video showed those advances. In it, a woman interacts with the assistant and says, "Show me pictures of Yellowstone," then, without anything else, "the ones with animals," closing with "Send them to Justin."
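Those chained commands work because the assistant keeps conversational context between queries. As a rough illustration (not Google's implementation; the class, tags, and command phrases here are invented for the example), a toy context tracker might look like this:

```python
# Toy sketch of contextual command chaining, as in the I/O demo:
# follow-up commands refine the previous result instead of starting over.
# Everything here (class name, tag scheme, phrasing) is a made-up illustration.

class AssistantContext:
    """Keeps track of the entities mentioned in the current conversation."""

    def __init__(self):
        self.results = None  # the last set of results shown

    def handle(self, command, photo_library):
        if command.startswith("show me pictures of "):
            topic = command.removeprefix("show me pictures of ")
            self.results = [p for p in photo_library if topic in p["tags"]]
            return self.results
        if command.startswith("the ones with "):
            # Follow-up: filter the previous results, no need to repeat "Yellowstone".
            extra = command.removeprefix("the ones with ")
            self.results = [p for p in self.results if extra in p["tags"]]
            return self.results
        if command.startswith("send them to "):
            recipient = command.removeprefix("send them to ")
            return f"Sent {len(self.results)} photos to {recipient}"
        return "Sorry, I didn't understand."


photos = [
    {"name": "geyser.jpg", "tags": {"yellowstone"}},
    {"name": "bison.jpg", "tags": {"yellowstone", "animals"}},
    {"name": "elk.jpg", "tags": {"yellowstone", "animals"}},
]

ctx = AssistantContext()
ctx.handle("show me pictures of yellowstone", photos)  # 3 matches
ctx.handle("the ones with animals", photos)            # narrowed to 2
print(ctx.handle("send them to justin", photos))       # "Sent 2 photos to justin"
```

The point is simply that "the ones with animals" only makes sense because the previous result set is still in memory; real assistants resolve far richer references, but the mechanism of carrying state across turns is the same idea.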

It did not end there: when her interlocutor asked when she was returning from a trip, she first looked up the information in the assistant and then answered, always by voice, in the messaging application. The jump between applications was transparent and very fast, which suggests that using your phone with your voice could soon become more comfortable and faster than using your fingers (once you get used to it and become fluent at it).

Google showed how that capability worked and promised that all those improvements (there were several more, described in a post on its official blog) would reach Pixel phones this year, probably in the fall. But how does it all work underneath?

Assistant, answer at once

The speed at which the assistant works is another fundamental factor in getting us to interact more and better with this automated platform: latency is critical when talking with machines (and with humans, of course), and you quickly get nervous if you ask a question and the answer takes longer than you expect.

To improve this latency, Google announced in March its advances in speech recognition, which is now implemented locally thanks to a model trained with a so-called recurrent neural network transducer (RNN-T).

This model is an evolution of techniques Google had used before, such as deep neural networks (DNNs), recurrent neural networks (RNNs), long short-term memory networks (LSTMs), and convolutional neural networks (CNNs).
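The common thread in the recurrent members of that family is a hidden state carried from one time step to the next, which is what lets the model process an audio stream frame by frame. A bare-bones LSTM cell, written here in NumPy purely as an illustration (real speech models are vastly larger and trained, not random), shows the mechanism:

```python
# Minimal LSTM cell: the "memory" (c) and hidden state (h) carried across
# time steps are what let recurrent models process streaming audio frames.
# Weights are random here; this only illustrates the computation, not a
# trained recognizer.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM time step: gates computed from input x and previous state h."""
    z = W @ np.concatenate([x, h]) + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
    c_new = f * c + i * np.tanh(g)                # update cell memory
    h_new = o * np.tanh(c_new)                    # new hidden state
    return h_new, c_new

rng = np.random.default_rng(1)
input_dim, hidden_dim = 3, 4
W = rng.normal(scale=0.1, size=(4 * hidden_dim, input_dim + hidden_dim))
b = np.zeros(4 * hidden_dim)

h = np.zeros(hidden_dim)
c = np.zeros(hidden_dim)
for frame in rng.normal(size=(10, input_dim)):  # 10 audio-like feature frames
    h, c = lstm_step(frame, h, c, W, b)

print(h.shape)  # (4,)
```

The RNN-T builds on cells like this, pairing an audio encoder with a label predictor so it can emit text as the audio arrives, rather than waiting for the whole utterance.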

The achievement of this model is its efficiency. The so-called search graph used in the production models that Google has relied on for speech recognition was about 2 GB, but this model reduced that size to 450 MB. A series of additional techniques then compressed the data further, keeping it accessible at the same speed, down to just 80 MB.
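One standard family of techniques for that final compression step is weight quantization: storing parameters in fewer bits than float32. The sketch below (an illustration of the general idea, not the specific methods Google describes in its paper) shows simple int8 quantization with NumPy:

```python
# Post-training weight quantization sketch: float32 -> int8 plus one scale
# factor per tensor, roughly a 4x size reduction. Illustrative only; not
# the exact compression pipeline Google used.
import numpy as np

def quantize(weights: np.ndarray):
    """Map float32 weights to int8 with a shared scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)

q, scale = quantize(w)
restored = dequantize(q, scale)

print(w.nbytes // q.nbytes)  # 4 -- int8 takes a quarter of the space
print(float(np.abs(w - restored).max()) < scale)  # True -- error within one step
```

Neural network weights tolerate this loss of precision surprisingly well, which is why quantization (along with pruning and similar tricks) can shrink a model several-fold with little accuracy cost.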

That, Google added, makes the model "compact enough to reside on a phone". Google described this technology in depth both in that post and in this study (PDF), and its results crystallized yesterday. It seems these capabilities will only be available at full capacity on the most powerful devices, but we will have to wait for their final deployment, probably in fall 2019.

Privacy (or the feeling of privacy) gains ground

A fundamental consequence of this change in how the Google Assistant works may be even more important for today's market: the perception of privacy we will have when using the service.

Until now, using the assistant meant connecting to the cloud: it was Google's servers that processed the information so the assistant could answer us. Now everything works locally, and we can use a good part of the assistant's functions without an active data connection.

For that reason alone, using the Google Assistant will feel much safer to many people than before. This is more important than it seems, especially considering the recent privacy scandals at companies like Facebook and those Amazon has suffered with its Echo family of voice assistants.

Google does the same, of course, and one of its executives admitted that "a small fraction" of the assistant's voice requests are shared with a human team at Google working to improve its artificial-intelligence systems.

Apple has long positioned itself as a privacy advocate, but this move may help Google gain ground in this area. It will be difficult considering how both companies make their money, but having things run locally, without a cloud connection, is certainly a fundamental element in changing our perception of how Google protects our privacy.

Pichai in fact stressed this point during his Google I/O appearance, and went further in an op-ed he wrote and published in The New York Times in which, without mentioning Apple directly, he affirmed that "privacy cannot be a luxury good" available only to "people who can afford to buy premium products and services."

It remains to be seen whether the assistant manages to work this way not only on Google's Pixel phones but on all kinds of Android devices and, of course, on the smart speakers that are gradually conquering our homes. Whether or not that message sinks in, the truth is that the Google Assistant has taken an important step toward getting us to trust it more.