Seems the office and factory of the future will be a whole lot more talkative.
That’s because “listening” technology, which now powers consumer services including Amazon Alexa, Google Home and Microsoft’s Cortana, is quickly making its way into commercial applications.
To be sure, the traditional keyboard-and-mouse combo isn’t going away anytime soon. But in a growing number of instances, it will be giving way to mics and speakers.
Consider a new application being offered by U.S. Bancorp, a banking giant with assets of $464 billion.
The system, tested by the bank this summer, lets customers complete banking tasks by speaking a command to an Amazon Alexa device: checking an account balance, learning a payment due date, obtaining an account transaction history, and paying off a credit card.
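To give a rough sense of how a voice-banking skill maps a spoken request to a reply, here is a minimal sketch of an intent handler. The intent names, account data, and response wording are hypothetical illustrations, not U.S. Bank's actual implementation or the Alexa Skills Kit API.

```python
# Hypothetical voice-banking intent handler.
# Intent names and account data below are invented for illustration.

ACCOUNTS = {
    "checking": {"balance": 1250.75, "due_date": None},
    "credit card": {"balance": 430.00, "due_date": "October 15"},
}

def handle_intent(intent: str, account: str) -> str:
    """Map a recognized voice intent to a spoken response string."""
    info = ACCOUNTS.get(account)
    if info is None:
        return "Sorry, I couldn't find that account."
    if intent == "CheckBalance":
        return f"Your {account} balance is ${info['balance']:.2f}."
    if intent == "PaymentDueDate":
        if info["due_date"]:
            return f"Your {account} payment is due on {info['due_date']}."
        return f"Your {account} account has no payment due."
    return "Sorry, I didn't understand that request."
```

In a real skill, the speech platform handles the voice recognition and hands the application a structured intent much like the one this function receives.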
The service is now being offered by U.S. Bank, a subsidiary of U.S. Bancorp that operates more than 3,000 banking offices in 25 U.S. states, and more than 4,800 ATMs.
Customers will be able to use the service with their choice of Amazon’s Echo, Echo Dot, Tap, or the Alexa smartphone app. “Voice technology is going to be central to the future of digital interaction,” says Gareth Gaston, the bank’s head of omnichannel banking.
Another cool promise of listening technology is the ability to create systems that can understand virtually any human language. One company working on this is Orion Labs. It recently raised $18.25 million in venture funding to expand development of voice-powered bots with real-time translation.
Known as Onyx, the systems are described by Orion as “smart walkie-talkies.” Each Onyx weighs just 1.23 oz. and measures about 2 inches across. Here’s a look:
Orion’s Onyx is a “smart walkie-talkie” weighing just 1.23 oz.
Onyx is being offered as a way for team members to stay in touch with one another, no matter the distance. The device lets the user talk, adjust the volume and switch to “silent” mode without having to look down. Instead, the user clips the device to their clothing, then taps and holds it to speak.
For commercial applications, the Onyx can tap into databases and spreadsheets. This enables the device to provide real-time answers to commonly asked questions, such as “Is item X in stock?” And, ultimately, to answer those kinds of questions in whatever human language the user speaks.
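The database-backed lookup behind a question like “Is item X in stock?” can be pictured with a short sketch. The inventory data and the naive keyword matching below are invented for illustration; Orion has not published how its bots work.

```python
# Hypothetical stock-lookup bot behind a voice query such as
# "Is item X in stock?" -- data and matching logic are illustrative.

INVENTORY = {"widget": 42, "gasket": 0}

def answer_stock_question(question: str) -> str:
    """Answer a simple stock question by matching known item names."""
    text = question.lower()
    for item, count in INVENTORY.items():
        if item in text:
            if count > 0:
                return f"Yes, {count} {item}s are in stock."
            return f"No, {item}s are out of stock."
    return "I don't recognize that item."
```

Translation would sit in front of a function like this: the spoken question is converted to text, translated to the system’s working language, answered, then translated back.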
So far, the system can understand questions in English and Spanish. Mandarin Chinese is planned next, according to a report in VentureBeat.
Devices’ ability to hear ambient sound is developing, too. In fact, market watcher ABI Research is talking about a new class of device it calls “hearables.” ABI believes more than 11 million hearables will be shipped in 2022, up from about 600,000 this year.
These devices help users tune out distracting background noises, such as traffic, factory machinery, even a coworker’s music. For example, a company called Bragi offers a wireless headphone, the Dash Pro, with two innovative features: Passive Noise Isolation, which lets the user cancel out ambient sound while listening to music, and Audio Transparency, which lets them mix ambient sounds with their music.
Another headset provider, Plantronics, recently introduced a service that promises to solve the noise and distraction problems of open-plan offices. The service, called Habitat Soundscaping, includes software that listens for loud and disruptive office sounds. When these occur, the system’s speakers play natural sounds — such as the splash of falling water — to lower the level of distraction.
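The trigger logic for that kind of sound masking could be as simple as comparing measured loudness against a threshold. The sketch below is a guess at the general approach, not Plantronics’ actual algorithm; the threshold value is arbitrary.

```python
import math

# Hypothetical sound-masking trigger: cue a masking track (e.g., the
# splash of falling water) when ambient loudness exceeds a threshold.

def rms(samples):
    """Root-mean-square loudness of a block of audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def should_mask(samples, threshold=0.3):
    """True when the ambient level is loud enough to warrant masking."""
    return rms(samples) > threshold
```

A production system would of course work on a continuous audio stream and smooth the measurement over time to avoid flickering the masking audio on and off.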
These examples are just the tip of a large iceberg. Systems that can hear us, and the world around us, are coming to the workplace. Solution providers looking for new markets should look — and listen — here.