Intel Selects Avaya for AI Builders Program

Engineers from both companies will work on performance of AI systems


Published: September 5, 2019

Rene Millman

Avaya is to partner with Intel on the latter’s AI Builders Program.

In an announcement made alongside the publication of Avaya’s third-quarter 2019 financial results, the company said it had been selected for Intel’s AI program, which will see Intel and Avaya engineers working together at a deep technical level to improve the performance and scale of Avaya’s AI solutions, such as Avaya Conversational Intelligence, when deployed on Intel hardware.

The Intel AI Builders Program was launched in May this year to provide partners implementing artificial intelligence (AI) with the resources and support they need to accelerate the adoption of their Intel-based AI platforms.

Intel AI Academy

Ahmed Helmy, Avaya

This includes access to Intel’s AI Academy, an exclusive Intel AI DevCloud for Builders, and technical support. Intel also markets members’ solutions via its channels and matches select members with its enterprise customers.

Speaking exclusively to UC Today, Ahmed Helmy, CTO for Avaya EMEA and APAC, said that many conversational AI solutions use custom GPUs that are expensive and only really work for customers with contact centres of 500 to 2,000 agents, while smaller companies that want to use AI in their contact centres need better value.

Helmy said that end-users won’t need to buy any hardware:

“Avaya bundles the products and manages everything, and we offer it as either a pure cloud or a hybrid solution.”

“Some customers have the Avaya contact centre technology on-premises. They tell us they want to start modernising their platform, they want to add more capabilities to it.”

Helmy said that the conversational AI product used in contact centres currently understands English and, from early next year, will also understand French, German and Spanish.

Different Accents and Dialects

Not only will the technology understand these languages, but it should also be able to understand different accents and dialects, as well as, for example, English spoken by a non-native speaker. The technology should be able to understand over 200 accents.

“We started with English and it was very challenging at the beginning because there are so many different accents and dialects. It’s not as straightforward as you think.”

“But as we trained it, as we improved the learning, we were able to achieve much better performance and quality.”

In some places, such as the US, where accents are relatively homogeneous, the technology can reach optimal performance after a month. In countries with a lot of diverse accents and dialects, it can take a “few months just to get the application on the acceptance level.”

He added that the Intel program will help Avaya embed more capabilities into the technology and accelerate the development of conversational AI in its products and services.
