Reducing network latency using subpages in a global memory environment

1996 ◽  
Vol 31 (9) ◽  
pp. 258-267
Author(s):  
Hervé A. Jamrozik ◽  
Michael J. Feeley ◽  
Geoffrey M. Voelker ◽  
James Evans ◽  
Anna R. Karlin ◽  
...  

Author(s):  
M. Mouchet ◽  
M. Randall ◽  
M. Segnere ◽  
I. Amigo ◽  
P. Belzarena ◽  
...  

2021 ◽  
Vol 25 (4) ◽  
pp. 1031-1045
Author(s):  
Helang Lai ◽  
Keke Wu ◽  
Lingli Li

Emotion recognition in conversations is crucial, as there is an urgent need to improve the overall experience of human-computer interactions. A promising direction in this field is to develop a model that can effectively extract adequate context for a test utterance. We introduce a novel model, termed hierarchical memory networks (HMN), to address the issue of recognizing utterance-level emotions. HMN divides the context into different aspects and employs different step lengths to represent the weights of these aspects. To model self-dependencies, HMN uses independent local memory networks for each aspect. To capture interpersonal dependencies, HMN employs global memory networks that integrate the local outputs into global storages. These storages generate contextual summaries and help find the emotionally dependent utterance most relevant to the test utterance. With an attention-based multi-hop scheme, the storages are then merged with the test utterance by an addition operation at each iteration. Experiments on the IEMOCAP dataset show that our model outperforms the compared methods in accuracy.
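The attention-based multi-hop merge described in the abstract can be sketched roughly as follows. This is a minimal illustration of the general mechanism (attend over a memory storage, merge the weighted summary into the query by addition, repeat over hops), not the authors' implementation; all names, dimensions, and the use of two storages are assumptions for the example.

```python
import numpy as np

def attention_hop(query, memory):
    """One hop: attend over memory slots, merge the summary by addition."""
    scores = memory @ query                    # similarity per slot, shape (slots,)
    weights = np.exp(scores - scores.max())    # numerically stable softmax
    weights /= weights.sum()
    summary = weights @ memory                 # contextual summary, shape (dim,)
    return query + summary                     # addition-based merge

def multi_hop(query, storages, hops=3):
    """Iteratively refine the utterance vector over the global storages."""
    for _ in range(hops):
        for memory in storages:                # e.g. one storage per context aspect
            query = attention_hop(query, memory)
    return query

rng = np.random.default_rng(0)
utterance = rng.standard_normal(8)                      # test-utterance vector
storages = [rng.standard_normal((5, 8)) for _ in range(2)]
refined = multi_hop(utterance, storages)                # shape (8,)
```

The refined vector would then feed an emotion classifier; the key design point the abstract highlights is that merging happens by addition at every hop, so the query accumulates context from each storage in turn.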


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Hamish Patel ◽  
Reza Zamani

Long-term memories are thought to be stored in neurones and synapses that undergo physical changes, such as long-term potentiation (LTP), and these changes can be maintained for long periods of time. A candidate enzyme for the maintenance of LTP is protein kinase M zeta (PKMζ), a constitutively active protein kinase C isoform that is elevated during LTP and long-term memory maintenance. This paper reviews the evidence and controversies surrounding the role of PKMζ in the maintenance of long-term memory. PKMζ maintains synaptic potentiation by preventing AMPA receptor endocytosis and promoting stabilisation of dendritic spine growth. Inhibition of PKMζ with zeta-inhibitory peptide (ZIP) can reverse LTP and impair established long-term memories, although a deficit of memory retrieval cannot be ruled out. Furthermore, ZIP, and in high enough doses the control peptide scrambled ZIP, were recently shown to be neurotoxic, which may explain some of ZIP's memory-impairing effects. PKMζ knockout mice show normal learning and memory, but this is likely due to compensation by protein kinase C iota/lambda (PKCι/λ), which is normally responsible for the induction of LTP. It is not clear how, or if, this compensatory mechanism is activated under normal conditions. Future research should utilise inducible PKMζ knockdown in adult rodents to investigate whether PKMζ maintains memory in specific parts of the brain, or whether it represents a global memory-maintenance molecule. These insights may inform future therapeutic targets for disorders of memory loss.

