Barbra Streisand calls Tim Cook to fix her biggest gripe with Siri

Getting iPhone bugs fixed is apparently super-easy if you’re a world-famous diva.

Barbra Streisand says she recently had a huge bone to pick with Apple over the way Siri pronounces her last name. So the singer did what only Barbra Streisand could do: She dialed Tim Cook’s personal phone number.

“She pronounces my name wrong. ‘Streisand’ with a soft ‘s’ like sand on the beach — I’ve been saying this for my whole career,” Streisand told NPR in a new interview.

“And so what did I do? I called the head of Apple, Tim Cook, and he delightfully agreed to have Siri change the pronunciation of my name finally, with the next update on September 30th. So let’s see if that happens, because I will be thrilled.”

Of course, Streisand could have just taught Siri how to pronounce her tricky last name on her own, but that’s not how divas roll.

Feature Request: Give Siri access to third-party apps, and (in time) much more

I know not everyone gets along with Siri, but personally I love it. It wouldn't be an exaggeration to say that it's my default way of interacting with my iPhone, whether it's searching the web, dictating a text message, setting an alarm, setting location-based reminders, noting appointments, phoning friends, playing music, getting directions … even opening apps.

The main reason I use Siri is simply efficiency – which some of my friends insist is spelled ‘laziness.’ But really, if I can simply ask my phone to do something for me, or tell me something, why wouldn’t I? Interacting with a touchscreen and manually typing things is so last century.

It also saves time. If I’m walking down the street, I can ask Siri to do something without breaking stride. Doing the same task manually would mean either stopping or ending up walking in front of a bus. I’ve also used my Apple Watch to ask Siri to do something at traffic light stops when cycling – there simply wouldn’t be time in that situation to pull out my phone and do the same thing manually.

But powerful as Siri is, I’d like to see it be able to do more – much, much more …

My single biggest frustration with it right now is that I have all these apps on my phone that can answer lots of questions and do lots of things, but Siri has no access to most of them. If it’s an Apple app, no problem. I can ask Siri to send a text, remind me to buy milk when I pass my local shop, play some Anna Nalick, show me the photos I took yesterday and a lot more.

But what I can't yet do is ask the time of my next train home, despite having an app on my phone that can answer that question. I can't ask it to show me today's Timehop, nor can I ask it to post that to Facebook. I can't ask it to post something to a HipChat or Slack chatroom. I can't ask it to call an Uber car. I can't ask it to translate 'Where is the nearest pharmacy?' into Mandarin. I could name many other examples, but you get the idea.

Almost nobody buys an iPhone and then installs no third-party apps, so it feels odd that Siri is entirely unaware of them. An API that allows third-party apps to interface with Siri seems, at first glance, a small thing to ask.

But I recognize that the reality is very different. What I’m asking for here is non-trivial. Let’s take the train home example, and look at exactly what I’d be asking Siri to achieve.

What time is my next train home?

Let’s start with the good news. Siri knows where my home is, and it knows my nearest train station. It knows where I am now. My train app also knows which station I need to get to, and it knows when the next train is. It can tell me which platform it goes from and what time it gets me home. So far, so good. But after that, things get complicated.

First, Siri has to parse the question. While I know mileage varies, my own experience is that Siri is almost faultless at this, but that's in large part because there are a limited number of questions you can ask. The more questions we add to the list, the greater the chance of it failing to properly understand what I'm asking.

Second, Siri has to know which app is able to answer the question. That may be relatively trivial if I only have one app capable of answering it, but what if I have two or three train apps, each of which could do so? Which app does Siri query?

Third, the train app has to make sense of the query passed to it by Siri and pass the required information back to Siri.

Fourth, Siri has to be able to translate the data handed over by the app into speech – which isn’t as trivial as it sounds.
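
To make those four steps concrete, here's a rough sketch in Swift of the kind of hand-off I'm imagining. To be clear, this is pure speculation: no such Siri API exists, and every protocol, type and method name below is invented for illustration.

```swift
import Foundation

// Hypothetical sketch only: Apple ships no such API. Names and shapes are
// invented to illustrate the four steps above, not to describe a real interface.

/// Step 2: an app declares which kinds of question it can answer,
/// so Siri knows where to route a parsed query.
protocol SiriQueryable {
    /// Domains this app claims, e.g. "train-times".
    var supportedDomains: [String] { get }

    /// Step 3: the app turns a structured query into a structured answer.
    func answer(query: SiriQuery) -> SiriAnswer?
}

/// Step 1's output: Siri has already parsed the spoken question into structure.
struct SiriQuery {
    let domain: String                 // "train-times"
    let intent: String                 // "next-departure"
    let parameters: [String: String]   // ["destination": "home"]
}

/// Step 4's input: plain fields Siri can turn back into speech.
struct SiriAnswer {
    let spokenText: String
}

// Example conformance by an imaginary train app.
struct TrainApp: SiriQueryable {
    var supportedDomains: [String] { ["train-times"] }

    func answer(query: SiriQuery) -> SiriAnswer? {
        guard query.domain == "train-times", query.intent == "next-departure" else {
            return nil
        }
        // A real app would query its timetable backend here.
        return SiriAnswer(spokenText: "The next train home is the 18:42 from platform 3, arriving at 19:15.")
    }
}
```

The hard parts, of course, are the first two steps: parsing the spoken question and deciding which app's answer should be used.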

We could offer Siri a helping hand with the second step. I could tell Siri which app to ask.

Ask myTrains Pro the time of my next train home

But that's horribly clunky. Worse, I had to actually look at my iPhone to check the name of the app; I just think of it as my train app. I couldn't tell you offhand the names of half the apps on my phone, and I bet the same is true for most people. There's little benefit to using Siri if we first have to look at the phone, flip to the correct Home screen and maybe open a folder too.

But there is one very practical way we could make the task easier. The iPhone 6s prompted third-party apps to learn a new trick: 3D Touch quick actions, a very limited set of things an app can offer right from the Home screen. My train app hasn't yet learned this trick, but when it does, 'Next train home' would be the most obvious 3D Touch action.

So Siri wouldn’t have to learn to parse a massive number of new queries, only the very limited number of queries/actions available through the 3D Touch function. With that approach, it becomes a lot more practical.
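
For context, here's roughly what registering one of those Home screen quick actions looks like with the real UIApplicationShortcutItem API. The 'Next train home' action and its identifiers are my own invented example, not anything the train app actually ships.

```swift
import UIKit

// A minimal sketch of the 3D Touch quick-action API (UIApplicationShortcutItem).
// The "next train home" action and its reverse-DNS identifier are assumptions.
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Register a dynamic quick action shown when the user presses the app icon.
        let nextTrain = UIApplicationShortcutItem(
            type: "com.example.trains.next-train-home",   // hypothetical identifier
            localizedTitle: "Next train home"
        )
        application.shortcutItems = [nextTrain]
        return true
    }

    // Called when the user picks the quick action from the Home screen.
    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        let handled = (shortcutItem.type == "com.example.trains.next-train-home")
        if handled {
            // Show (or speak) the next departure here.
        }
        completionHandler(handled)
    }
}
```

A Siri hookup along the lines suggested above could, in principle, reuse exactly that small, app-declared list of actions as its vocabulary.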

But I’m not done yet. In time, I’d like Siri to be able to handle tasks like this:

Arrange lunch with Sam next week

Siri knows who Sam is, so that bit’s fine. It has access to my calendar, so knows when I have free lunch slots. Next, it needs to know when Sam has free lunch slots.

This shouldn’t be complicated. Microsoft Outlook may be one of my least-favorite apps in the world, but it has for years offered delegated access to calendars, where work colleagues are allowed to check each other’s diaries for free slots, and authorized people are allowed to add appointments. So what we need here is the iCloud equivalent.

I pre-approve contacts allowed to do this. iOS could show me my Favorites list as an initial prompt (though I’m probably not going to authorize my local cab company). Those approved contacts are then given access to my iCloud calendar at a busy/free level, without actually getting access to the data itself. My iPhone checks Sam’s iCloud calendar for free lunch slots and matches them with mine. It finds we’re both free on Wednesday so schedules the lunch.
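
Purely as a sketch of that matching step, and assuming some hypothetical busy/free feed for each approved contact (no such iCloud API exists), the scheduling logic itself is simple interval intersection:

```swift
import Foundation

// Hypothetical sketch: there is no public iCloud busy/free API.
// Given each person's busy intervals, find the first weekday where both
// are free for the whole 12:30-13:30 lunch window.

struct Interval {
    let start: Date
    let end: Date
    func overlaps(_ other: Interval) -> Bool {
        return start < other.end && other.start < end
    }
}

func firstCommonLunchSlot(myBusy: [Interval],
                          theirBusy: [Interval],
                          daysAhead: Int = 7,
                          calendar: Calendar = .current) -> Interval? {
    for offset in 1...daysAhead {
        guard let day = calendar.date(byAdding: .day, value: offset, to: Date()),
              !calendar.isDateInWeekend(day),
              let start = calendar.date(bySettingHour: 12, minute: 30, second: 0, of: day),
              let end = calendar.date(bySettingHour: 13, minute: 30, second: 0, of: day)
        else { continue }

        let lunch = Interval(start: start, end: end)
        let clash = (myBusy + theirBusy).contains { $0.overlaps(lunch) }
        if !clash { return lunch }   // both calendars are clear: book this slot
    }
    return nil
}
```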

And it could do even more. My iPhone could easily note my favorite eateries, and Sam's, and find one we both like. It could then go online to the restaurant's reservation system to make the booking. The process would then look (well, sound) like this …

Hey Siri, arrange lunch with Sam next week

Working – I’ll get back to you shortly …

Ok, I arranged lunch with Sam for 1pm next Wednesday at Bistro Union at Clapham Park

There are a few data privacy issues to figure out. In order to work out a convenient location, it would need to know where each of us will be before and after, to ensure the location is practical. So, in practice, Siri would need a little more access to Sam’s diary than just busy/free. But as long as only Siri sees location info, and we’re approving the contacts we allow such access, I think that’s acceptable.

Siri-style speech recognition: coming soon to apps and robots

Command your robot to find your cat with just your voice.

Your smart life is about to get even smarter with a new set of software development tools that will let coders add world-class speech recognition and natural language processing (the same stuff that powers Siri, Apple's personal digital assistant) to thermostats, refrigerators, apps and, yes, even robots.

The folks at Nuance have created a new system, currently in beta, to allow any company to include code with language commands that are specific to their hardware or apps. It’s called Nuance Mix, and anyone can sign in and create their own speech-recognition code to work with their apps or connected devices.

“Any developer, big or small, can come in and define a custom set of use cases,” Nuance’s Kenn Harper told Cult of Mac during a demo of the SDK. “You’re going to start talking to everything at home and work — speech is about to get more ubiquitous.”

As Nuance’s mobile director of product management, Harper’s excited about the new crop of devices coming to our homes. We’re seeing more connected thermostats, fridges, home control devices, music streaming apps and speakers, robots, entertainment, virtual reality and wearables come to market, said Harper. Most of these devices won’t even have a screen, and they’ll all need a user interface.

“For the first time,” he said, “we might see voice becoming the user interface of choice.”

When you sign up for the Nuance Mix beta as a developer, you get access to a robust set of coding tools for building speech recognition and natural language parsing tailored to your own specific hardware and software requirements. A thermostat needs a vastly different set of commands than a refrigerator does, and a home robot needs something different again.
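
To make that idea concrete, here's a toy illustration of a device-specific command set for a thermostat. This is not the Nuance Mix SDK, whose actual interfaces aren't covered here; it's just a sketch of each device defining its own small grammar and mapping recognized text to intents.

```swift
// Illustrative only: NOT the Nuance Mix SDK. A thermostat cares about a handful
// of phrasings; a fridge or a robot would define an entirely different set.

enum ThermostatIntent {
    case setTemperature(fahrenheit: Int)
    case raise(by: Int)
    case lower(by: Int)
}

/// Map an already-recognized utterance to one of this device's few intents.
func parseThermostatCommand(_ utterance: String) -> ThermostatIntent? {
    let text = utterance.lowercased()

    if let value = firstNumber(in: text), text.contains("set") {
        return .setTemperature(fahrenheit: value)
    }
    if text.contains("warmer") || text.contains("raise") {
        return .raise(by: firstNumber(in: text) ?? 2)
    }
    if text.contains("cooler") || text.contains("lower") {
        return .lower(by: firstNumber(in: text) ?? 2)
    }
    return nil   // out of scope for this device
}

private func firstNumber(in text: String) -> Int? {
    // Pull the first run of digits out of the utterance, if any.
    let digits = text.split(whereSeparator: { !$0.isNumber })
    return digits.first.flatMap { Int($0) }
}
```

So "set the temperature to 68 degrees" maps to setTemperature(fahrenheit: 68), while anything outside the thermostat's tiny grammar simply falls through.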

It will, of course, need to be highly accurate. Siri’s famous for misunderstanding our spoken commands; you can’t have this with a personal care robot, thermostat or even wearable device.

“To get a high level of accuracy,” said Harper, “we’re creating the natural language piece, but also creating a fully customized speech model for voice recognition. We optimize speech and natural language together.”

Ultimately, what this all means for consumers is that we'll talk to our devices more than ever before, just as we already do with our smartphones, which use Siri, Cortana or Google to process our requests. Imagine telling your fridge to text you whenever the temperature gets too warm, or asking your robot to dial 911 if your elderly grandmother slips and falls. The most natural way to ask for complex interactions is with our voice, and Nuance's speech recognition is best in class. It makes sense that the company would power our talky future.

Developers can sign up for Nuance Mix’s beta program now and see how it all works.

I, for one, can’t wait to call out and say, “Hey, oven. Preheat yourself to 350 degrees and order my favorite take-and-bake.”

Siri Celebrates ‘Back to the Future’ Day With Humorous Responses

Apple has honored "Back to the Future" day by updating Siri with at least ten humorous responses related to the popular movie Back to the Future Part II, released in 1989. iPhone and iPad users can invoke Siri and say "happy Back to the Future day" to receive one of the responses at random.

Today is “Back to the Future” day because October 21, 2015 is the date that Marty McFly and Dr. Emmett Brown travel to in the movie. The classic film makes several fictional predictions about the future, including hoverboards, flying cars, remote-control dog walkers and the Chicago Cubs winning the 2015 World Series.

This is Philips’ $60 bridge to connect Hue lights to Apple’s Siri-controlled HomeKit platform.

Following confirmation from Philips that it plans to support Apple’s HomeKit platform for its popular iPhone-controlled Hue lighting system, images of a bridge device that will let existing Hue products work with Apple’s platform have leaked online.

Accessory makers can't add support for Apple's HomeKit to products with software alone, but earlier this year we detailed Apple's specs for building HomeKit hardware bridges that allow existing home automation products to tap into the platform. Philips appears to be planning an imminent launch of just such a device, as images leaked online via a lighting retailer in the Netherlands that jumped the gun on the official launch.

The leaked image gives us our first look at the small, Apple TV-like "Hue Bridge" device that will allow the current lineup of Philips Hue lights to interface with HomeKit. In return, Hue users who invest in the bridge will gain access to Siri voice commands for controlling lights, and likely other HomeKit features for grouping devices, through an update to Philips' companion app.
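
For developers, once the bridge exposes Hue lights as HomeKit accessories they can be driven with Apple's standard HomeKit framework as well as by Siri. A minimal sketch, with the accessory name "Hue lamp" assumed purely for illustration:

```swift
import HomeKit

// Minimal sketch using Apple's HomeKit framework: find a bridged light by name
// and write its power-state characteristic. "Hue lamp" is an assumed name.
class LightToggler: NSObject, HMHomeManagerDelegate {
    private let manager = HMHomeManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Homes load asynchronously; this fires once they're available.
    func homeManagerDidUpdateHomes(_ manager: HMHomeManager) {
        guard let home = manager.primaryHome,
              let lamp = home.accessories.first(where: { $0.name == "Hue lamp" })
        else { return }

        // Walk the accessory's services for the light's power-state characteristic.
        for service in lamp.services {
            for characteristic in service.characteristics
            where characteristic.characteristicType == HMCharacteristicTypePowerState {
                characteristic.writeValue(true) { error in
                    print(error == nil ? "Light on" : "Failed: \(String(describing: error))")
                }
            }
        }
    }
}
```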

No word yet from Philips on an official launch for the Hue Bridge, but the retailer listed a sales price of 60 euros before pulling the listing earlier today.

We'd expect Apple to talk more about HomeKit at its press event on September 9th. We reported that Apple is planning to introduce a new Apple TV, which acts as a hub for remote access to the Siri-controlled HomeKit platform, alongside new iPhones and an official launch for iOS 9 and OS X El Capitan.

Siri speaks 7 new languages in iOS 8.3.

Siri speaks even more languages in iOS 8.3. Photo: Jim Merithew/Cult of Mac

Apple’s second iOS 8.3 beta, which was pushed out to registered developers on Monday ahead of a public release later this year, enables Siri to speak seven new languages, testers have found. It also brings more performance improvements for older iOS devices like the iPhone 4s.

The full list of new languages for Siri includes Russian, Danish, Dutch, Thai, Turkish, Swedish and Portuguese. In addition to learning these, the virtual assistant also supports more English-speaking regions, such as New Zealand.

Videos sent in to Cult of Mac by Klaus Jacobson also show noticeable performance improvements on the iPhone 4s and the iPhone 5, which will be welcome news for owners of those devices. On the iPhone 4s in particular, iOS 8 has been notoriously slow and buggy since it made its debut last fall.

It’s nice to see Apple making performance and stability improvements as well as expanding the feature set of iOS, and we’re likely to see a whole lot more of that this year. According to recent rumors, iOS 9 will be predominantly focused on making major improvements under the hood.

While we don’t have a release date for iOS 8.3, we can probably expect it to arrive this spring. As for iOS 9, we should get a preview of that at WWDC in June ahead of its arrival alongside new iOS devices this fall.

Siri lets you relive your audio misadventures.

Want to see all the songs you've found via Siri or iTunes Radio? Photo: Buster Hein/Cult of Mac

iOS 8 includes Shazam — a magical technology that gives your iPhone the power to listen to a song and tell you what it is. In the car, at a movie theater, or even at a crowded bar, you can just ask Siri, “What song is playing?” or hold your home button for a few seconds, and your iPhone will use Shazam tech to tell you exactly what song is in your environment. You can also (surprise) buy the song you just recognized via a little button in the results screen.

But what if you want to buy it later? Or remember what song was playing at the bar last night when that cute girl gave you her number? You can easily do just that with a quick trip to iTunes on your iPhone.

Tap the icon for the iTunes Store and you'll see the familiar set of featured albums and New Music. Tap the little menu button in the upper right (it looks like a bulleted list with three lines) and you'll reach a screen with Wish List, Siri, Radio and Previews tabs.

Screengrab: Rob LeFebvre/Cult of Mac

To see what you've had Siri recognize for you, simply tap the Siri tab at the top of the page. Your iPhone will list all the songs you wanted to know more about, with a handy buy button to the right of each song title. Super useful!

If you’re trying to remember what songs were playing for you on iTunes Radio, simply tap on the Radio tab at the top of the same page and you’ll get a list of all the songs played on any of your iCloud-connected devices via iTunes Radio.

Now you can relive all your audio adventures with a quick tap or two, and even contribute to your music collection with a buy in the iTunes Store. How convenient!