NEC launches its own generative AI

Written by Nick Wood for Telecoms.com

Japanese vendor NEC has joined the growing list of companies aiming to capitalise on AI hype by launching three new offerings.

Perhaps the most significant of these is NEC’s very own large language model (LLM), specifically targeted at the Japanese market. The vendor says it operates with a high degree of accuracy using a relatively small number of parameters – 13 billion, to be precise. By comparison, OpenAI’s GPT-4 reportedly uses a trillion.

Parameters matter because they are the variables within the LLM that steer the AI towards a particular answer. Think of them as something like neural pathways in the human brain. As humans, we establish and strengthen a network of neural pathways by interacting with the world. Similarly, when an LLM is trained, the ‘strength’ of its pathways is determined by the numerical values assigned to its parameters. These values are automatically adjusted as the AI learns, helping to fine-tune its responses.
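
To make that idea concrete, here is a minimal, purely illustrative Python sketch – not NEC’s or anyone else’s actual training code – showing a single parameter being nudged by gradient descent, the same basic mechanism used to adjust the billions of parameters in an LLM during training. All names and numbers are invented for illustration.

```python
# Illustrative sketch: one "parameter" (weight) adjusted by gradient descent.
weight = 0.0          # a single parameter, initialised arbitrarily
learning_rate = 0.1
target = 2.0          # the output the toy model "should" produce for input 1.0

for step in range(20):
    prediction = weight * 1.0            # a toy one-parameter model
    error = prediction - target          # how far off the answer is
    gradient = 2 * error * 1.0           # derivative of squared error w.r.t. the weight
    weight -= learning_rate * gradient   # adjust the parameter to reduce the error

print(f"learned weight is roughly {weight:.3f}")  # converges towards 2.0
```

An LLM repeats this kind of adjustment across billions of parameters and vast amounts of training text, which is where the compute and energy costs come from.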

A greater number of parameters leaves more room to accommodate nuance and complexity, but it also brings a heavier burden in terms of compute and energy consumption. This is what makes NEC’s LLM interesting: because it uses less power and fewer computing resources, it can be deployed in the cloud, including in on-premises environments, rather than relying on external data centres or supercomputers.
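
For a rough sense of the scale involved, the back-of-envelope calculation below compares the memory needed just to store the weights of a 13-billion-parameter model against a trillion-parameter one. The assumption of 16-bit (2-byte) weights is ours, not a figure from NEC, and real deployments need additional memory for activations and other overheads.

```python
# Rough, illustrative arithmetic only: memory to hold model weights in 16-bit precision.
BYTES_PER_PARAM_FP16 = 2

for name, params in [("13B-parameter model", 13e9), ("1T-parameter model", 1e12)]:
    gigabytes = params * BYTES_PER_PARAM_FP16 / 1e9
    print(f"{name}: about {gigabytes:,.0f} GB of weights")

# Roughly 26 GB can sit on a single high-end server GPU or a small on-premises cluster,
# whereas roughly 2,000 GB demands a large multi-GPU data-centre deployment.
```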

Continue reading this article on Telecoms.com