Quantitative Response Time Technology for Measuring Cognitive-Processing Capacity in Clinical Studies

Author(s):  
Richard W. J. Neufeld ◽  
James T. Townsend ◽  
Jennifer Jetté
Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 636 ◽  
Author(s):  
Alhassan Mabrouk ◽  
Rebeca P. Díaz Redondo ◽  
Mohammed Kayed

Recently, it has been found that e-commerce (EC) websites provide an amount of useful information that exceeds human cognitive-processing capacity. To help customers compare alternatives when buying a product, previous research has designed opinion-summarization systems based on customer reviews. However, these systems ignore the template information provided by manufacturers, even though such descriptions cover the most useful product characteristics and, unlike reviews, are linguistically correct. Therefore, this paper proposes a methodology coined SEOpinion (Summarization and Exploration of Opinions), which combines template information with customer reviews to summarize aspects and spot the opinions regarding them in two main phases. First, the hierarchical aspect extraction (HAE) phase creates a hierarchy of aspects from the template. Subsequently, the hierarchical aspect-based opinion summarization (HAOS) phase enriches this hierarchy with customers’ opinions, to be shown to other potential buyers. To test the feasibility of using deep learning-based BERT techniques with our approach, we created a corpus by gathering information from the top five EC websites for laptops. The experimental results show that the recurrent neural network (RNN) achieved better results (77.4% and 82.6% in terms of F1-measure for the first and second phases, respectively) than the convolutional neural network (CNN) and the support vector machine (SVM).
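As a rough sketch of the two-phase idea, the snippet below builds an aspect hierarchy from template sections and then attaches review sentences to sub-aspects. The data structures, aspect names, and keyword matcher are invented for illustration; the paper's actual HAE/HAOS phases use learned BERT-based classifiers, not string matching.

```python
# Minimal sketch of SEOpinion's two phases (hypothetical stand-in):
# Phase 1 (HAE)  - build an aspect hierarchy from the product template.
# Phase 2 (HAOS) - enrich that hierarchy with customer-review opinions.

def hierarchical_aspect_extraction(template):
    # Template specs are already grouped by section, so the hierarchy
    # is simply: top-level aspect -> list of sub-aspects.
    return {section: list(specs) for section, specs in template.items()}

def hierarchical_opinion_summarization(hierarchy, reviews):
    # Attach each review sentence to every sub-aspect it mentions.
    # (A keyword match stands in for the paper's learned classifier.)
    summary = {sub: [] for subs in hierarchy.values() for sub in subs}
    for sentence in reviews:
        for sub in summary:
            if sub.lower() in sentence.lower():
                summary[sub].append(sentence)
    return summary

template = {"Display": ["screen size", "resolution"],
            "Memory": ["RAM", "storage"]}
reviews = ["The screen size is perfect for travel.",
           "Upgrading the RAM was easy.",
           "Battery life is poor."]
hierarchy = hierarchical_aspect_extraction(template)
summary = hierarchical_opinion_summarization(hierarchy, reviews)
```

Sentences that match no sub-aspect (here, the battery comment) are simply left unattached, mirroring how aspect-based summarization discards off-topic opinions.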


Author(s):  
Matthew L. Cohen ◽  
Aaron J. Boulton ◽  
Alyssa M. Lanzi ◽  
Elyse Sutherland ◽  
Rebecca Hunting Pompon

Abstract  
Purpose: Patient-reported outcome measures (PROMs) vary in their psycholinguistic complexity. This study examined whether response time to PROM items is related to psycholinguistic attributes of the item and/or the self-reported cognitive ability of the respondent.  
Methods: Baseline data from Wave 2 of the Quality of Life in Neurological Disorders (Neuro-QoL) development study were reanalyzed. The sample contained 581 adults with neurological disorders whose self-reported cognitive abilities were quantified by the Neuro-QoL v2.0 Cognitive Function Item Bank. A total of 185 Neuro-QoL items were coded for several psycholinguistic variables and design attributes: number of words and syllables, mean imageability of words, mean word frequency, mean age of word acquisition, and response format (e.g., about symptom frequency or task difficulty). Data were analyzed with linear and generalized linear mixed models.  
Results: Main-effects models revealed that slower response times were associated with respondents with lower self-reported cognitive abilities and with PROM items that contained more syllables, contained less imageable (i.e., more abstract) words, and asked about task difficulty rather than symptom frequency. Interaction effects were found between self-reported cognition and those same PROM attributes, such that people with worse self-reported cognitive abilities were disproportionately slow when responding to items that were longer (more syllables), contained less imageable words, and asked about task difficulty.  
Conclusion: Completing a PROM requires multiple cognitive skills (e.g., memory, executive functioning) and appraisal processes. Response time is a means of operationalizing the amount or difficulty of cognitive processing, and this report indicates several aspects of PROM design that relate to a measure’s cognitive burden. However, future research with better experimental control is needed.
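The reported main-effects pattern can be illustrated with a toy regression on synthetic data. The study itself fit linear and generalized linear mixed models with respondent-level grouping; the plain least-squares fit, the predictor ranges, and the coefficients below are all invented for illustration only.

```python
import numpy as np

# Illustrative stand-in for the main-effects model: regress synthetic
# item response times on syllable count and mean word imageability.
rng = np.random.default_rng(0)
n = 500
syllables = rng.integers(5, 30, n)        # item length in syllables
imageability = rng.uniform(1.0, 7.0, n)   # mean word imageability rating
# Ground-truth pattern mirroring the reported effects: more syllables
# slow responses; more imageable (concrete) wording speeds them up.
rt = 2.0 + 0.10 * syllables - 0.15 * imageability + rng.normal(0, 0.2, n)

# Ordinary least squares: columns are intercept, syllables, imageability.
X = np.column_stack([np.ones(n), syllables, imageability])
beta, *_ = np.linalg.lstsq(X, rt, rcond=None)
# beta[1] > 0 (syllables slow responses); beta[2] < 0 (imageability helps)
```

A mixed model would add a random intercept per respondent to absorb between-person speed differences, which is what lets the study separate item-level psycholinguistic effects from respondent ability.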


1982 ◽  
Vol 111 (3) ◽  
pp. 273-295 ◽  
Author(s):  
Michael E. Dawson ◽  
Anne M. Schell ◽  
James R. Beers ◽  
Andrew Kelly

Author(s):  
Hayward P. Andres

Organizations are faced with increasing costs needed to train employees in today’s high technology environment. Educators are also striving to develop new training and teaching methods that will yield optimal learning transfer and complex skill acquisition. This study suggests that trainee/learner cognitive processing capacity, information presentation format and complexity, and multimedia technology should be leveraged in order to minimize training duration and costs and maximize knowledge transfer. It presents a causal model of how multimedia and information complexity interact to influence sustained attention, mental effort, and information processing quality, all of which subsequently impact comprehension and learner confidence and satisfaction outcomes. Subjects read a text script, viewed an acetate overhead slide presentation containing text-with-graphics, or viewed a multimedia presentation depicting the greenhouse effect (low complexity) or photocopier operation (high complexity). Causal


2021 ◽  
Author(s):  
Leandro Avila Calcagnotto ◽  
Richard Huskey ◽  
Gerald M. Kosicki

Measurement noise differs by instrument and limits the validity and reliability of findings. Researchers collecting reaction time data introduce noise in the form of response time latency from hardware and software, even when collecting data on standardized computer-based experimental equipment. Reaction time is a measure with broad application for studying cognitive processing in communication research, and it is vulnerable to response-latency noise. In this study, we used an Arduino microcontroller to generate a ground-truth value of average response time latency in Asteroid Impact, an open-source, naturalistic, experimental video-game stimulus. We tested whether response time latency differed across computer operating system, software, and trial modality. Here we show that reaction time measurements collected using Asteroid Impact were susceptible to response-latency variability on par with other response-latency-measuring software. These results demonstrate that Asteroid Impact is a valid and reliable stimulus for measuring reaction time data. Moreover, we provide researchers with a low-cost, open-source tool for evaluating response time latency in their own labs. Our results highlight the importance of validating measurement tools and support the philosophy of contributing methodological improvements in communication science.
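The latency-audit logic can be sketched as follows: an external device supplies ground-truth event times, the software stimulus logs its own timestamps, and the per-trial difference is the response time latency being characterized. The timestamps below are invented for illustration; the actual study drove the audit with Arduino-generated hardware events.

```python
import statistics

# Sketch of a latency audit: compare ground-truth event times from an
# external microcontroller against the software's own logged times.
# All timestamps are illustrative (milliseconds).
ground_truth_ms = [100.0, 350.0, 620.0, 910.0, 1275.0]
software_ms     = [118.5, 366.0, 641.0, 927.5, 1293.0]

# Per-trial latency = software timestamp minus hardware ground truth.
latencies = [s - g for s, g in zip(software_ms, ground_truth_ms)]
mean_latency = statistics.mean(latencies)   # average delay added by the stack
jitter = statistics.stdev(latencies)        # trial-to-trial variability
```

The mean latency is a constant offset that can be subtracted out, but the jitter sets a floor on how precisely reaction time differences can be measured, which is why the study reports variability rather than offset alone.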


2020 ◽  
Vol 157 ◽  
pp. 107971 ◽  
Author(s):  
Chun-Hao Wang ◽  
Chih-Chun Lin ◽  
David Moreau ◽  
Cheng-Ta Yang ◽  
Wei-Kuang Liang

2019 ◽  
Vol 44 (2) ◽  
pp. 123-129 ◽  
Author(s):  
Adam R. Clarke ◽  
Robert J. Barry ◽  
Diana Karamacoska ◽  
Stuart J. Johnstone
