
Dynamic Language Models: Adapting to New Knowledge Over Time


In 2021, DeepMind released the paper “Mind the Gap: Assessing Temporal Generalization in Neural Language Models”, along with dynamic language modelling benchmarks for WMT and arXiv, to evaluate language models with a focus on temporal dynamics. The study found that large language models struggle with temporal generalization, especially on knowledge-intensive tokens.

Today, DeepMind is releasing two papers and a new benchmark to further advance research in this area. The first paper, “StreamingQA: A Benchmark for Adaptation to New Knowledge over Time in Question Answering Models,” focuses on understanding how question-answering models adapt to new information in order to answer questions about new events. The second paper, “Internet-augmented language models through few-shot prompting for open-domain question answering,” explores the use of few-shot prompting to condition language models on information retrieved from the web using Google Search.

The StreamingQA benchmark is designed to study how semi-parametric question-answering models adapt to evolving knowledge by asking questions about 14 years of time-stamped news articles. The study shows that parametric models can be updated without full retraining, while adding new articles into the search space allows semi-parametric models to rapidly adapt to new knowledge.
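To make the semi-parametric idea concrete, here is a minimal sketch, not DeepMind's actual system: a toy time-stamped retrieval index in which newly added articles become searchable immediately while the reader model stays frozen. All class and function names below are hypothetical, and the token-overlap retriever stands in for a real retrieval component.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Article:
    published: date
    text: str


@dataclass
class TimeStampedIndex:
    """Toy retrieval index over time-stamped articles (hypothetical, for illustration)."""
    articles: list = field(default_factory=list)

    def add(self, article: Article) -> None:
        # New articles enter the search space immediately;
        # the (frozen) reader model is not retrained.
        self.articles.append(article)

    def retrieve(self, question: str, as_of: date, k: int = 3) -> list:
        # Score by simple token overlap, restricted to articles
        # published on or before the question date.
        q_tokens = set(question.lower().split())
        candidates = [a for a in self.articles if a.published <= as_of]
        scored = sorted(
            candidates,
            key=lambda a: len(q_tokens & set(a.text.lower().split())),
            reverse=True,
        )
        return scored[:k]


index = TimeStampedIndex()
index.add(Article(date(2020, 3, 11), "WHO declares COVID-19 a pandemic."))
index.add(Article(date(2021, 7, 23), "The Tokyo 2020 Olympic Games open after a one-year delay."))

# A frozen reader model would condition on these retrieved passages to answer.
evidence = index.retrieve("When did the Tokyo Olympics open?", as_of=date(2021, 12, 31))
for article in evidence:
    print(article.published, article.text)
```

The point of the sketch is the update path: adapting to a new event only requires calling `add` with a fresh article, whereas a purely parametric model would need further training on that article before it could answer questions about it.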

The few-shot prompting approach aims to overcome challenges faced by large-scale language models in accessing factual and up-to-date information by conditioning the models on information returned from the web using Google Search. The study finds that this approach improves the performance of open-domain question-answering models.
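As a rough illustration of retrieval-conditioned few-shot prompting, the sketch below assembles a handful of worked examples and retrieved snippets into a single prompt. The `search_snippets` and `generate` functions are stand-ins for a web search call and a language model API, not real library functions, and the canned snippet is invented for the example.

```python
FEW_SHOT_EXAMPLES = [
    {
        "evidence": "The Eiffel Tower was completed in 1889 for the World's Fair in Paris.",
        "question": "When was the Eiffel Tower completed?",
        "answer": "1889",
    },
]


def search_snippets(question: str, k: int = 5) -> list[str]:
    # Stand-in for a web search call (e.g. Google Search results);
    # returns a canned snippet here so the sketch runs on its own.
    return ["Mount Everest stands at 8,849 metres, per a 2020 joint survey."][:k]


def generate(prompt: str, max_tokens: int = 32) -> str:
    # Stand-in for a language model API; a real system would call an LLM here.
    return "<model output>"


def build_prompt(question: str, snippets: list[str]) -> str:
    """Format the few-shot examples, then the retrieved evidence and the new question."""
    parts = []
    for ex in FEW_SHOT_EXAMPLES:
        parts.append(f"Evidence: {ex['evidence']}\nQ: {ex['question']}\nA: {ex['answer']}\n")
    evidence = "\n".join(snippets)
    parts.append(f"Evidence: {evidence}\nQ: {question}\nA:")
    return "\n".join(parts)


def answer(question: str) -> str:
    snippets = search_snippets(question, k=5)   # fresh, up-to-date evidence from the web
    prompt = build_prompt(question, snippets)   # few-shot examples + evidence + question
    return generate(prompt, max_tokens=32)      # the model answers conditioned on the prompt


print(answer("How tall is Mount Everest?"))
```

Because the evidence is fetched at question time, the model can draw on information published after its training cut-off without any weight updates, which is the appeal of this approach for up-to-date open-domain question answering.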

Overall, these advances in evaluating language models and adapting them to new knowledge over time are important steps toward AI models that remain flexible and robust as language and world knowledge evolve.

