Ford is also working with Carnegie Mellon University-Silicon Valley to develop improved embedded speech recognition that supports more natural language. The system relies on graphics processing unit (GPU) computing, allowing for quicker, more powerful processing. Rather than requiring a restricted set of commands, the new technology recognizes more natural speech patterns to perform in-car tasks such as hands-free phone dialing or requesting navigation.
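To illustrate the difference between a fixed command grammar and more natural phrasing, the short Python sketch below maps free-form utterances to intents. It is purely illustrative; the intent names, patterns, and the parse_utterance function are assumptions for this example and do not reflect Ford's or Carnegie Mellon's actual system.

```python
# Illustrative sketch only: a toy intent parser that accepts natural phrasing
# instead of a memorized command syntax. Names and patterns are hypothetical.
import re

INTENT_PATTERNS = {
    "dial_phone": re.compile(r"\b(call|dial|phone)\b\s+(?P<contact>.+)", re.I),
    "navigate":   re.compile(r"\b(navigate to|take me to|directions to)\b\s+(?P<destination>.+)", re.I),
}

def parse_utterance(utterance: str) -> dict:
    """Map a free-form utterance to an intent plus slots, or return 'unknown'."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            return {"intent": intent, **match.groupdict()}
    return {"intent": "unknown", "utterance": utterance}

if __name__ == "__main__":
    # Natural phrasing works without an exact, restricted command set.
    print(parse_utterance("Could you call Mom, please?"))
    print(parse_utterance("Take me to the nearest charging station"))
```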
Customer Experience
Ford is testing an advanced HMI system to better understand how customers prefer to control feature-rich systems such as the high-tech, multi-contour seat. The seat offers 10 adjustments plus two controls for 11 inflatable air bladders that can be used for massage functions. Ford is researching the most intuitive and effective way to control the seat, including natural language speech recognition and a smartphone- or tablet-based interface.
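As a rough illustration of how a single control layer could serve both a speech front end and a tablet app, the sketch below models the seat's 10 adjustments and 11 air bladders as one Python class. The class, method names, and value ranges are assumptions made for this example, not Ford's actual seat interface.

```python
# Hypothetical sketch: one control layer that either a speech front end or a
# smartphone/tablet app could call. Names and ranges are illustrative only.
from dataclasses import dataclass, field

NUM_ADJUSTMENTS = 10   # seat adjustments (e.g., recline, lumbar, bolsters)
NUM_BLADDERS = 11      # inflatable air bladders used for massage functions

@dataclass
class MultiContourSeat:
    adjustments: list = field(default_factory=lambda: [0] * NUM_ADJUSTMENTS)
    bladder_pressure: list = field(default_factory=lambda: [0] * NUM_BLADDERS)

    def set_adjustment(self, index: int, position: int) -> None:
        """Set one of the 10 seat adjustments (position clamped to 0-100)."""
        self.adjustments[index] = max(0, min(100, position))

    def set_bladder(self, index: int, pressure: int) -> None:
        """Set one of the 11 massage bladders (pressure clamped to 0-100)."""
        self.bladder_pressure[index] = max(0, min(100, pressure))

# Both front ends issue the same commands against the same seat state:
seat = MultiContourSeat()
seat.set_adjustment(0, 40)   # e.g., a tablet slider
seat.set_bladder(3, 75)      # e.g., "turn up the lower-back massage"
```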
Mobility
As the next phase in Ford’s Remote Repositioning mobility experiment, the Palo Alto team is now testing the ability to drive vehicles located on Georgia Institute of Technology’s campus in Atlanta from the Bay Area to prove out the new technology. An operator in the Palo Alto laboratory uses real-time video streamed over existing 4G/LTE networks to drive golf carts thousands of miles away. This could lead to more affordable and effective ways to manage car-sharing initiatives, or to park vehicles remotely as a new form of valet parking.
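The sketch below outlines, under heavy simplification, the kind of operator-side loop such a setup could use: the operator watches the streamed video while small drive-command packets are sent back over the cellular link. The local address, packet format, and send_drive_command helper are hypothetical stand-ins, not the experiment's actual protocol.

```python
# Conceptual sketch of a remote-repositioning control loop. The operator reacts
# to live video (not shown here) and sends low-rate drive commands back over
# the network. The endpoint and message format are placeholders.
import json
import socket
import time

VEHICLE_ADDR = ("127.0.0.1", 9000)  # stand-in for the vehicle's address over 4G/LTE

def send_drive_command(sock: socket.socket, steering: float, throttle: float) -> None:
    """Send a small JSON command packet; steering/throttle normalized to [-1, 1]."""
    packet = json.dumps({"steering": steering, "throttle": throttle, "ts": time.time()})
    sock.sendto(packet.encode("utf-8"), VEHICLE_ADDR)

def operator_loop() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for _ in range(10):                  # stand-in for the operator's control loop
        # In the real experiment the operator steers based on streamed video;
        # here a gentle forward command is sent as a placeholder.
        send_drive_command(sock, steering=0.0, throttle=0.2)
        time.sleep(0.1)                  # roughly a 10 Hz command rate

if __name__ == "__main__":
    operator_loop()
```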