The Falcon large language model (LLM) has undergone another iteration, according to the Technology Innovation Institute (TII) in the United Arab Emirates (UAE).
The new Falcon 2 includes an image-to-text model, according to a press release issued by TII on Monday, May 13.
Per the release, one of the two new versions, Falcon 2 11B VLM, can translate visual inputs into written outputs thanks to its vision-language model (VLM) capabilities.
According to the announcement, potential uses for the VLM capabilities include aiding people with visual impairments, document management, digital archiving, and context indexing.
The other new version, Falcon 2 11B, aims to be a “more efficient and accessible LLM,” according to the press statement. Trained on 5.5 trillion tokens with 11 billion parameters, it performs on par with or better than AI models in its class among pre-trained models.
As stated in the announcement, both models are multilingual and can perform tasks in English, French, Spanish, German, Portuguese, and several other languages. Both are open-source, providing unfettered access for developers worldwide.
Because they can run on a single graphics processing unit (GPU), both models can be integrated into laptops and other devices, according to the announcement.
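A rough sense of why an 11-billion-parameter model can fit on one GPU comes from estimating weight memory alone. The sketch below uses the 11B parameter count from the announcement; the bytes-per-parameter figures are standard for each numeric precision, but this is an assumption-laden estimate that ignores activation and KV-cache memory, which real inference also requires.

```python
# Back-of-the-envelope VRAM estimate for holding model weights in memory.
# Assumes 11e9 parameters (per the announcement); real-world usage also
# needs memory for activations and the KV cache, which this sketch ignores.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate memory (in GB) needed just to store the weights."""
    return n_params * bytes_per_param / 1e9

N_PARAMS = 11e9  # Falcon 2 11B

for label, nbytes in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{label:>9}: ~{weight_memory_gb(N_PARAMS, nbytes):.0f} GB")
```

At half precision the weights alone come to roughly 22 GB, within reach of a single high-end GPU, and quantized variants fit in considerably less.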
Dr. Hakim Hacid, executive director and acting chief researcher of TII’s AI Cross-Center Unit, stated in the release that “AI is continually evolving, and developers are recognizing the myriad benefits of smaller, more efficient models.” In addition to meeting sustainability criteria and requiring less computing resources, these models offer increased flexibility and integrate smoothly into edge AI infrastructure, the next big trend in emerging technologies.
A trend toward smaller, more affordable AI models is making it easier for businesses to adopt AI.
“Smaller LLMs offer users more control compared to large language models like ChatGPT or Anthropic’s Claude, making them more desirable in many instances,” Brian Peterson, co-founder and chief technology officer of Dialpad, a cloud-based, AI-powered platform, told PYMNTS in an interview posted in March. “They’re able to filter through a smaller subset of data, making them faster, more affordable, and, if you have your own data, far more customizable and even more accurate.”