This article introduces a variant of the differentiable neural computer (DNC) architecture with convertible short-term and long-term memory, named CSLM-DNC.
External memory-based neural networks, such as differentiable neural computers (DNCs), have recently gained popularity for solving complex sequential learning tasks that challenge conventional neural networks; however, a trained DNC typically uses its memory inefficiently. Unlike the memory architecture of the original DNC, the new scheme of short-term and long-term memories assigns different importance to memory locations for reading and writing, and locations can be converted between the two types over time. This design is motivated by the human brain, where short-term memory stores large amounts of noisy, unimportant information and decays rapidly, while long-term memory stores important information and persists. The conversion between the two memory types is learned from the read and write frequency of each location. The authors quantitatively and qualitatively evaluate the proposed CSLM-DNC architecture on question answering, copy, and repeat-copy tasks, showing that it significantly improves memory efficiency and learning performance. (Publisher abstract provided)
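The abstract describes the conversion mechanism only at a high level, so the following is a minimal sketch of the general idea rather than the paper's method: each memory slot tracks a running read/write frequency, short-term slots decay quickly while long-term slots persist, and slots are promoted or demoted when their usage crosses a threshold. The class name `ConvertibleMemory`, the decay rates, and the thresholds are all illustrative assumptions; in CSLM-DNC the conversion is learned, not hard-coded.

```python
import numpy as np

class ConvertibleMemory:
    """Toy short-term/long-term memory with usage-based conversion.

    Illustrative sketch only: thresholds and decay rates are fixed
    assumptions here, whereas CSLM-DNC learns the conversion.
    """

    def __init__(self, num_slots, slot_width,
                 short_decay=0.5,        # short-term contents fade fast
                 long_decay=0.99,        # long-term contents persist
                 promote_threshold=3.0,  # usage above this -> long-term
                 demote_threshold=0.5):  # usage below this -> short-term
        self.memory = np.zeros((num_slots, slot_width))
        self.is_long_term = np.zeros(num_slots, dtype=bool)  # all slots start short-term
        self.usage = np.zeros(num_slots)  # moving average of read/write frequency
        self.short_decay = short_decay
        self.long_decay = long_decay
        self.promote_threshold = promote_threshold
        self.demote_threshold = demote_threshold

    def write(self, weights, vector):
        """Additive write distributed over slots by attention `weights`."""
        self.memory += np.outer(weights, vector)
        self.usage += weights

    def read(self, weights):
        """Weighted read over slots; reading also counts as usage."""
        self.usage += weights
        return weights @ self.memory

    def step(self):
        """Decay contents by memory type, then convert slots whose
        usage estimate crosses the (assumed) thresholds."""
        decay = np.where(self.is_long_term, self.long_decay, self.short_decay)
        self.memory *= decay[:, None]
        self.usage *= 0.9  # exponential moving average of access frequency
        self.is_long_term |= self.usage > self.promote_threshold   # promote busy slots
        self.is_long_term &= self.usage >= self.demote_threshold   # demote idle slots
```

In the usage example below, all attention is focused on a single slot; its usage estimate climbs past the assumed promotion threshold after a few steps, so the slot converts to long-term and stops decaying rapidly.

```python
mem = ConvertibleMemory(num_slots=8, slot_width=4)
focus = np.eye(8)[0]                 # all attention on slot 0
for _ in range(5):
    mem.write(focus, np.ones(4))
    mem.step()
print(mem.is_long_term[0])           # True: slot 0 was promoted to long-term
```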