

Unknowingly?
Ingress was quite transparent about the goal of gathering real-world data to allow the development of future technologies like self-driving and navigation.
It’s the reason why I started playing it around 2012.




OK, yes, that obviously makes sense, considering the number of these characters.


I know about it, but didn’t recognize the code. So I assumed they had encoded some text to make it harder to read, and I tried decoding it.
Turns out, if you decode it as UTF-16, it turns into a Japanese sentence (a small decoding sketch follows below):
契ȑ璝寣䇘앖噣삈
Which means (according to DeepL)
The sound of the wind rustling through the trees
And now I’m confused as to why.
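
To make the decoding step above concrete, here’s a minimal Python sketch of re-interpreting a byte sequence as UTF-16. The phrase used is a placeholder, not the actual in-game payload; it only shows the mechanics, and whether "utf-16-le" or "utf-16-be" is the right choice depends on the byte order of the original data.

    # Placeholder phrase, not the real in-game string; it means roughly
    # "the sound of the wind through the trees".
    phrase = "木々を渡る風の音"
    payload = phrase.encode("utf-16-le")   # the kind of byte sequence that might be embedded
    print(payload.hex())                   # what the 'unreadable' data looks like as raw bytes
    print(payload.decode("utf-16-le"))     # decoding the bytes as UTF-16 recovers the text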
It’s harder to pronounce internationally, which makes it a weaker global brand.
Also, in the early days of wake-word detection, the detection algorithm was actually triggered by the ‘melody’ your voice automatically creates when producing certain vocal sounds. This basically triggered a recording before going through deeper analysis to determine whether it was supposed to be an actual request (a rough sketch of that two-stage idea follows below).
For Alexa, the a-ex-a is easy to detect. For “Hey Siri” it’s basically a ‘chime bing bing’ sound in a certain rhythm. For Cortana, it’s or-a-a. But Jeeves is only a single syllable; both the J and the ‘vs’ are harder to pronounce and basically not relevant for wake-word detection. So the whole wake word is basically just “eee”, which is a bad wake word.
So… just not gold: technically it’s weaker for reliability and efficiency, and economically it’s not so great for global brand recognition.
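
As a rough illustration of that two-stage triggering, here’s a hedged Python sketch: a cheap short-time-energy gate stands in for the ‘melody’ pre-check, and a placeholder function stands in for the heavier analysis. All names, thresholds and frame sizes here are invented for illustration; real wake-word engines use far more sophisticated acoustic features.

    import numpy as np

    def frame_energy(samples: np.ndarray, frame_len: int = 400) -> np.ndarray:
        # Short-time energy per frame: a crude stand-in for the 'melody' pre-check.
        n = len(samples) // frame_len
        frames = samples[: n * frame_len].reshape(n, frame_len)
        return (frames ** 2).mean(axis=1)

    def cheap_gate(samples: np.ndarray, threshold: float = 0.01, min_frames: int = 3) -> bool:
        # Stage 1: fire if enough consecutive frames carry wake-word-like energy.
        active = frame_energy(samples) > threshold
        run = best = 0
        for hit in active:
            run = run + 1 if hit else 0
            best = max(best, run)
        return best >= min_frames

    def deep_check(samples: np.ndarray) -> bool:
        # Stage 2: placeholder for the heavier model that confirms an actual request.
        return False  # a real device would run a trained keyword-spotting model here

    def wake_word_detected(samples: np.ndarray) -> bool:
        # Only spend compute on the expensive check once the cheap gate fires,
        # which is why a wake word with a distinctive sound pattern helps.
        return cheap_gate(samples) and deep_check(samples)

    # Example: half a second of silence followed by a louder burst, at 16 kHz.
    audio = np.concatenate([np.zeros(8000), 0.2 * np.random.randn(8000)])
    print(wake_word_detected(audio))

The point of the split is the same as in the comment above: the first stage has to be cheap and robust, so a name whose sound pattern is easy to pick out makes the whole thing more reliable and efficient.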