Why Can’t Users Teach Siri about Its Mistakes?

Originally published at: https://tidbits.com/2019/08/14/why-cant-users-teach-siri-about-its-mistakes/

Apple took well-deserved flak in the press for having contractors listen to Siri conversations, including inadvertent activations where people didn’t know they were being recorded. But Adam Engst suggests that we users should instead teach Siri about its mistakes.

Too complex. First, if Siri is wrong, people will usually issue a second command right after the first one.

“Hey Siri, open the kitchen door.”

“Playing ‘Soul Kitchen’ by The Doors”

“No! Open the kitchen door!”

Siri could use an immediate second (similar) command to understand that the first command failed.

Another is if someone edits something Siri did. For example:

“Hey Siri, set up an appointment next Tuesday at 2:30 to see my dentist”

“Setting up an appointment all day Tuesday to see my dentist because my tooth is hurting”

(Edits appointment)

We could still use an opt-in mechanism to send the information back to Apple, but we no longer have to manually grade Siri. Siri knows when she misbehaves.

We could use a similar mechanism for accidental invocations. If Siri is invoked accidentally, Siri should realize this when the command is nonsense and nothing follows it, or when no command is issued at all. Siri could then send just the invoking sound (or mechanism), and nothing else, back to Apple.

It would better protect privacy and give Apple a higher percentage of data that’s relevant to work with. The chaff would already be separated from the wheat.

In addition to what @david19 suggests, I’d like to simply be able to correct a single word while dictating to Siri. Often, when sending messages via Siri, it will get one word very wrong, and I have to dictate the entire message again (thankful for the “change” command!). Invariably, Siri will get it wrong again. Annnnnnd AGAIN. Whereas, if I could say “change Beckons to because” when it starts a message “Beckons you haven’t messaged me back, I’m going to assume you’re not home”, it sure would be easier on me (while driving, usually) and perhaps help Siri’s future accuracy (I never use “beckons”, but start messages with “Because” a lot).

Additionally, I’d like to be able to review my own prior Siri submissions. I am required to use a Bluetooth headset, due to hearing impairment… and yet I have zero idea how bad my headset might sound to Siri! I know they go bad, I’ve had them go bad, though friends and clients aren’t complaining about call quality as of late. Also, being in my car, with the AC going (iPhone magnetically attached to a mount right by an air vent), is that affecting Siri? Is the microphone on my iPhone obstructed or dirty? How would I know? I have no mechanism to test such physical inputs. (I have used speakerphone and Voice Memos to record myself; it always seems crystal clear to me, but that’s usually under good environmental conditions.)

The fact that Apple saw fit to potentially allow others to hear my private requests but doesn’t see fit to provide me the tools to help myself is beyond disappointing. This company needs to re-discover the basis of “The Power To Be Your Best.”


I would opt out, for two reasons. The first is privacy concerns. Based on recent history, anything can be found out. The second reason is the same reason I took the dealer plates off my car the minute I got home. I’m not compensated for the service provided, whether it’s advertising or free tech support.

I understand we all would like to improve the technology we own and use. But Apple hasn’t offered compensation or anything else for the data it has already received from me, so the incentive to provide even more isn’t there. Even the (dubious) incentive of improved technology and software that I will undoubtedly eventually pay for in upgrades isn’t really enough.

I too have longed to be able to correct Siri. I’d like to be able to spell out a word it consistently mishears, for example. I think a quickly delivered ‘No Siri’ should prompt a correction request.

That’s basically what I mean with the “That’s wrong” command. Obviously, you’ll issue another command after that to accomplish whatever the first one was intended to do, but you have to say something specific to mark the second command as a mistake. I often say one thing, particularly when starting music on the HomePod, and then change my mind and issue another command right away.

But that reminds me, I forgot to put in something about how Siri should apologize for mistakes. Off to add that.

If you edit within the Siri screen, I could see this working, but I often see Siri make mistakes that aren’t editable there. See:

That feels like a really hard machine-learning problem to me, since Siri has to accept a wide variety of things people say as commands. I can’t even imagine how you’d identify something as “nonsense.”

Now this is fascinating. Microsoft has now been caught having humans listen to Cortana recordings, and there are more details about what the people are actually doing (and how much they make).

Siri could use an immediate second (similar) command to understand that the first command failed.

Obviously, you’ll issue another command after that to accomplish whatever the first one was intended to do, but you have to say something specific to mark the second command as a mistake.

Errors could be detected with a time limit (say 5 seconds) and a comparison between the first and second commands. If the two commands are similar, you can assume the second command is a correction of the first. Or maybe if cussing is involved in the second command.
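A rough sketch of that heuristic in Python (the function name, similarity measure, and both thresholds are my own inventions for illustration, not anything Apple actually does):

```python
from difflib import SequenceMatcher

# Hypothetical correction-detection heuristic: if a second command arrives
# within a short window and is textually similar to the first, treat it as
# a correction of the first. Thresholds are guesses, not tuned values.
TIME_WINDOW_SECONDS = 5
SIMILARITY_THRESHOLD = 0.6

def looks_like_correction(first_cmd, second_cmd, seconds_between):
    """Return True if the second command appears to correct the first."""
    if seconds_between > TIME_WINDOW_SECONDS:
        return False
    similarity = SequenceMatcher(None, first_cmd.lower(),
                                 second_cmd.lower()).ratio()
    return similarity >= SIMILARITY_THRESHOLD

# A near-repeat a few seconds later reads as a correction...
print(looks_like_correction("open the kitchen door",
                            "no, open the kitchen door", 3))
# ...but an unrelated command does not.
print(looks_like_correction("open the kitchen door",
                            "what's the weather", 3))
```

Cussing detection would just be another signal layered on top of the similarity check.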

I guess we could start the second command with “Wrong Siri” rather than “Hey Siri”.

Another is if someone edits something Siri did.

If you edit within the Siri screen, I could see this working, but I often see Siri make mistakes that aren’t editable there — See Bad Apple

You’re right that this may be difficult with third-party apps and shortcuts, but most of the work Siri does is with Apple apps, and Apple controls the OS. Apple should be able to tell when Siri creates an entry in your calendar or reminders, and when the user then goes into Calendar or Reminders and edits the very entry that Siri added.
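To illustrate the idea (purely hypothetical; the `CalendarEvent` class and the ten-minute window are made up for this sketch and are not any real Apple API):

```python
from datetime import datetime, timedelta

# Hypothetical edit-detection sketch: if the user edits a calendar entry
# shortly after Siri created it, flag the Siri result as a likely mistake.
EDIT_WINDOW = timedelta(minutes=10)

class CalendarEvent:
    def __init__(self, title, created_by_siri, created_at):
        self.title = title
        self.created_by_siri = created_by_siri
        self.created_at = created_at

def siri_likely_erred(event, edited_at):
    """Flag a Siri-created event that the user edited soon after creation."""
    return event.created_by_siri and (edited_at - event.created_at) <= EDIT_WINDOW

# The dentist example from earlier in the thread: Siri creates an all-day
# event, and the user fixes it two minutes later.
event = CalendarEvent("Dentist (all day)", True, datetime(2019, 8, 14, 14, 0))
print(siri_likely_erred(event, datetime(2019, 8, 14, 14, 2)))
```

The same pattern would apply to Reminders, Messages, or any other Apple app whose data the OS can watch.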

We could use a similar mechanism for accidental invocations.

That feels like a really hard machine-learning problem to me, since Siri has to accept a wide variety of things people say as commands. I can’t even imagine how you’d identify something as “nonsense.”

This would be the most difficult part. However, if I want to use Siri, and Siri fails, I will usually either correct Siri with a second command or manually do the task myself. If Siri is unable to understand a command, and I don’t either issue a new Siri command or do something on my phone, it’s a pretty good indication that I wasn’t trying to use Siri.
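That decision rule could be sketched like this (the inputs are assumptions invented for illustration):

```python
# Hypothetical accidental-invocation heuristic: an invocation where Siri
# couldn't act and the user then does nothing (no second command, no manual
# use of the phone) was probably not an attempt to use Siri at all.
def invocation_was_accidental(siri_acted, second_command_issued,
                              manual_activity_after):
    """Classify an invocation as accidental when it leads nowhere."""
    if siri_acted:
        return False  # Siri did something, so the user presumably meant it
    return not second_command_issued and not manual_activity_after

# Background chatter triggers Siri, nothing happens afterward: accidental.
print(invocation_was_accidental(False, False, False))
# Siri misheard, but the user immediately tried again: a real attempt.
print(invocation_was_accidental(False, True, False))
```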

I’ve worked on a lot of programs, and one thing I’ve learned is that users won’t give you feedback even if it’s fairly simple to do. They have work to do, and they’re not going to interrupt their workflow to help you out.

Many hotels want you to rate your experience when you leave. They’ll give you a one- or two-question survey. All you have to do is select from 1 to 5 how they did. There’s even a box on the counter to put that form in. How many people actually fill it in?

Or, you call customer service, and a recording asks you to stay on the line after you finish with the customer service agent to rate how they did. How many people stay on the line?

I worry that a program that depends upon users to let Apple know when Siri fails won’t work — even if it’s simply that you say Bad Siri! when Siri goofs.

It depends on how easy it is. There’s a company called FeedbackNow that has done really well with providing simple feedback buttons that let you rate things in the real world. I gave TSA in Newark a low rating on our recent flight to Switzerland since they were slow, confusing, and annoying. But the bathrooms in the Geneva airport got a good rating, and I remembered to take a photo that time. :slight_smile: So if you make feedback systems easy enough, and available at the right time, I think people will use them.

“That job must eat your brain.” But it’s a job.

I have learned to HATE Siri. She cannot understand me, and I have no particular regional accent or difficulty with others understanding me. I end up swearing like a sailor at Siri when she continues to botch my commands. ANYTHING that would make it work better for me would be an improvement!

If humans are eavesdropping on the other end, I hope they are at least making a note of every time I say “Siri, you’re being stupid” or “Siri, you’re not being helpful.”


I will often say “Thank you” to Siri if she does what I want, sometimes without even thinking about it. That could be your default “correct” phrase instead of “good job.”


My life!! Some days are wonderful.

Others are like you said, or even worse, when I say Hey Siri, remind me to check the lock when I get home….

… and she responds Ok I’ll remind you to check

It just goes downhill from there :frowning:

Diane

I have a speech disability and, unfortunately, Siri doesn’t understand a word I say. It’s almost amusing to think of how wrong Siri has gotten me at times, or how she has totally failed to reply at all, as if she were embarrassed by not understanding. It would be nice to be able to tell Siri what she (or it) got wrong, but that HAS to be done in writing, because otherwise she wouldn’t understand, and it would be like the snake eating its own tail!

Funny thing, I do that, too!

My husband and I would definitely like to find a way to let Siri know when it has been improperly invoked by his voice. It is regularly invoked by my husband’s voice on both of our iPhones, and sometimes on our iPads, too. It happens up close and far away, and sometimes even virtually. It happens when the things he says are not even close to, “Hey Siri!” It has to stop!

On the other hand, frequently, Siri can’t hear or understand me to save my life. I would love to be able to have the option to rate Siri’s accuracy (much like we are asked to rate the usefulness of voicemail transcription or Facebook translations) - on a case by case basis, not all the time.

It would be really great if Apple would obtain Nuance’s speech recognition software…

We have our ups and downs with Siri, too. One thing I haven’t figured out until today is why Siri doesn’t respond when my wife says Hey Siri, but always responds when I say Hey Siri.
Does Siri have a voice recognition function, so that it only responds when it detects its master’s voice? :smiley: Could I train Siri to recognize different voices?

I thought Siri was specific to your voice on your phone? My SO and I don’t trigger each others phones.

Diane

Interesting thought. That would explain it, and it raises four questions:

How does Siri learn my voice? I don’t recall any training session.

Could we actively train Siri to listen to my wife’s voice? My wife’s iPhone is set up with my Apple ID, the only one we want to have.

Could I retrain Siri on my voice and how?

Does that imply that the voice recognition is Apple ID-based, which would mean that on any shared device, like an iPad at home or a HomePod for that matter, only one person in the household is able to trigger Siri?