attention modeling
Recently Published Documents

TOTAL DOCUMENTS: 58 (Five years: 3)
H-INDEX: 12 (Five years: 0)

Author(s): Charlie Veal, Marshall Lindsay, Scott Kovaleski, Derek T. Anderson, Stanton R. Price

Author(s): Dacia M. McCoy, Chelsea Ritter, J. Meredith Murphy

Peer-mediated Pivotal Response Training (PM-PRT) is a behavioral approach that incorporates instruction and practice opportunities on pivotal and socially significant skills (e.g., communication, playing with peers) for individuals diagnosed with developmental disorders in their everyday routines and environments. This chapter provides an overview of key components to successfully utilize the PM-PRT intervention with children in a variety of settings. The intervention includes a peer interventionist who may use selected strategies with a target student such as gaining attention, modeling, turn-taking, encouraging conversation, choice in tasks, and reinforcement of attempts during a play session. For example, the intervention may be implemented during recess and can be tailored to an individual’s target behaviors. The flexibility of PM-PRT allows it to be an effective and efficient intervention that promotes generalization across peers and settings.


Author(s): Xiaoshuai Sun, Xuying Zhang, Liujuan Cao, Yongjian Wu, Feiyue Huang, ...

2020, Vol 34 (03), pp. 2585-2592
Author(s): Jiacheng Liu, Xiaofeng Hou, Feilong Tang

State-of-the-art machine teaching techniques overestimate learners' ability to grasp a complex concept. On the one hand, since a complicated concept always contains multiple fine-grained concepts, students can only grasp some of them during a practical teaching process. On the other hand, because a single teaching sample carries unequal information about the various fine-grained concepts, learners absorb them at different levels. Thus, as datasets become increasingly complicated, machine teaching frameworks need to be rethought. In this work, we propose a new machine teaching framework called Attentive Machine Teaching (AMT). Specifically, we argue that a complicated concept always consists of multiple features, which we call fine-grained concepts. We define attention to represent the level at which a learner has mastered a fine-grained concept. We then propose AMT, an adaptive teaching framework that constructs a personalized optimal teaching dataset for each learner. During each iteration, we estimate the workers' ability with a Graph Neural Network (GNN) and select the best sample using a pool-based search. To corroborate our theoretical findings, we conduct extensive experiments on both synthetic and real datasets. The experimental results verify the effectiveness of the AMT algorithms.
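The abstract does not give the exact selection objective, but the per-iteration step (score each candidate in a pool against the learner's per-concept attention, pick the best) can be sketched as follows. The scoring rule, the fixed `attention` vector standing in for the GNN ability estimate, and the `concept_info` matrix are all illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def select_best_sample(pool, attention, concept_info):
    """Greedy pool-based selection (hypothetical scoring rule).

    pool         : list of candidate sample indices
    attention    : (n_concepts,) learner's estimated mastery in [0, 1]
                   (in AMT this would come from a GNN; fixed here)
    concept_info : (n_samples, n_concepts) information each sample
                   carries about each fine-grained concept
    """
    gap = 1.0 - attention                 # concepts not yet mastered
    scores = concept_info[pool] @ gap     # weight info by mastery gap
    return pool[int(np.argmax(scores))]   # sample with largest gain

# Example: the learner has mastered concept 0 but not concept 1,
# so the sample most informative about concept 1 is chosen.
pool = [0, 1, 2]
attention = np.array([0.9, 0.1])
concept_info = np.array([[1.0, 0.0],
                         [0.0, 1.0],
                         [0.5, 0.5]])
best = select_best_sample(pool, attention, concept_info)  # -> 1
```

In a full loop, the attention vector would be re-estimated after each teaching round and the chosen sample removed from the pool.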


2020, Vol 34 (07), pp. 12894-12901
Author(s): Yicheng Zhang, Lei Li, Li Song, Rong Xie, Wenjun Zhang

Clothing transfer is a challenging computer vision task in which the goal is to transfer the clothing style of a person in an input image, conditioned on a given language description. However, existing approaches with a conventional fully convolutional generator have limited ability to perform delicate colorization and texture synthesis. To tackle this problem, we propose a novel semantic-based Fused Attention model for Clothing Transfer (FACT), which allows fine-grained synthesis, high global consistency, and plausible hallucination in images. Towards this end, we incorporate two attention modules at the spatial level: (i) soft attention that searches for the most relevant positions in sentences, and (ii) self-attention that models long-range dependencies on feature maps. Furthermore, we develop a stylized channel-wise attention module to capture correlations at the feature level. We effectively fuse these attention modules in the generator and achieve better performance than the state-of-the-art method on the DeepFashion dataset. Qualitative and quantitative comparisons against the baselines demonstrate the effectiveness of our approach.
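The self-attention component described in (ii) — every spatial position of a feature map attending to every other, so distant garment regions can influence each other — can be sketched in NumPy. The projection matrices `wq`, `wk`, `wv` and their shapes are illustrative assumptions, not FACT's actual architecture:

```python
import numpy as np

def self_attention_2d(x, wq, wk, wv):
    """Self-attention over the spatial positions of a feature map.

    x          : (C, H, W) feature map
    wq, wk, wv : (C', C) query/key/value projections (hypothetical)
    """
    C, H, W = x.shape
    flat = x.reshape(C, H * W)                   # N = H*W positions
    q, k, v = wq @ flat, wk @ flat, wv @ flat
    logits = q.T @ k / np.sqrt(q.shape[0])       # (N, N) affinities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over positions
    out = v @ attn.T                             # attended features
    return out.reshape(-1, H, W)
```

Because the (N, N) attention matrix couples all position pairs, its cost grows quadratically with the feature-map area, which is why such modules are typically applied at coarse spatial resolutions inside the generator.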

