Your car could cheer you up after a bad day – by reading your facial expression

We are now used to cars that understand what we say, and experts predict that, in the future, they may also know how we feel – sometimes without us having to say a word.

Nearly 90 per cent of all new cars are expected to offer voice recognition by 2022. The next step for the cars of tomorrow could be to pick up on tiny changes in our facial expression as well as modulations and inflections in our speaking voice.

What’s happening at the moment?

This summer, Ford’s in-car connectivity system, SYNC 3, will enable drivers to connect to Amazon’s virtual assistant Alexa and offer 23 different languages and many local accents. By accessing cloud-based resources, cars of the future could enable even more drivers to speak their native language.

“Voice commands like ‘I’m hungry’ to find a restaurant and ‘I need coffee’ have already brought SYNC 3 into personal assistant territory,” says Mareike Sauer, voice control engineer at Ford of Europe. “For the next step, drivers will not only be able to use their native tongue, spoken in their own accent, but also use their own wording, for more natural speech.”

Apple CarPlay provides a simplified way to use the iPhone interface on a car’s touch screen, giving users access to Siri Eyes-Free voice controls, as well as Apple Maps, Apple Music, Phone, Messages, and a variety of third-party apps. Android Auto delivers Google Maps and music to a car’s screen while enabling voice controls for phone calls and messaging.

What’s next?

A research project Ford is currently running with RWTH Aachen University in Germany is using multiple microphones to improve speech processing and reduce the effect of external noise and other potential disruptions.

Nuance says that within the next two years, voice control systems could prompt us with: “Would you like to order flowers for your mum for Mother’s Day?”, “Shall I choose a less congested but slower route home?” and “You’re running low on your favourite chocolate and your favourite store has some in stock. Want to stop by and pick some up?”

What does the future hold?

Advanced systems – equipped with sophisticated microphones and in-car cameras – could learn which songs we like to hear when we are stressed and those occasions when we prefer silence. Interior lighting could also complement our mood.

“We’re well on the road to developing the empathetic car which might tell you a joke to cheer you up, offer advice when you need it, remind you of birthdays and keep you alert on a long drive,” says Fatima Vital, senior director at Nuance Communications, which helped Ford develop voice recognition for its SYNC system.

Future gesture and eye control would enable drivers to answer calls by nodding their head, adjust the volume with short twisting motions, and set the navigation with a quick glance at their destination on a map.

Meanwhile, future voice recognition systems are predicted to evolve into personal assistants that shuffle appointments and order takeaways when drivers are held up in traffic jams.

Could we get attached?

“Lots of people already love their cars, but with new in-car systems that learn and adapt, we can expect some seriously strong relationships to form,” says Dominic Watt, senior lecturer in the Department of Language and Linguistic Science at the University of York.

“The car will soon be our assistant, travel companion and sympathetic ear, and you’ll be able to discuss everything and ask anything, to the point many of us might forget we’re even talking to a machine.”

Posted by Beth Rose on 21/02/2017