We have just released NEOIT5, an interactive web demo designed to explore the lexical competence of NeoIT5-large, a 783M-parameter encoder–decoder language model based on the T5 architecture and fine-tuned specifically for Italian. The demo is grounded in research presented at ACL 2025 and provides an accessible platform to experiment with how neural language models handle lexical knowledge, definition generation, and contextual language use.

NEOIT5 allows users to interact with the model across three core lexical tasks. In Reverse Dictionary, the system generates candidate words from natural-language definitions, optionally guided by part-of-speech constraints, semantic labels, or etymological hints. In Definition Modeling, users generate multiple dictionary-style definitions for a given word through stochastic sampling strategies. In Exemplification Modeling, the model produces contextual usage examples that illustrate how a word can be employed in real linguistic settings.

The demo includes two operational modes: one focused on attested lexical items, ensuring outputs correspond to existing Italian words, and another dedicated to neologism exploration, allowing users to investigate novel lexical candidates generated by the model. Additional parameters such as semantic tags, usage contexts, and prompt constraints provide fine-grained control over the generation process, supporting both exploratory and research-oriented use cases.

NeoIT5-large builds upon the IT5-large model and has been fine-tuned on a corpus of Italian lexical tasks derived from dictionaries and Wikipedia. The model is openly available on Hugging Face (snizio/NeoIT5-large), reflecting a commitment to reproducibility and open research practices.

With NEOIT5, we aim to offer both an interactive research demonstrator and a practical tool for studying lexical reasoning, creativity, and generalization in neural language models, contributing to ongoing investigations into the linguistic competence of AI systems.

You can access NEOIT5 at the following link: NEOIT5.