From Human Attention to Computational Attention: A Multidisciplinary Approach, 2nd Edition, by Matei Mancas, Vincent P. Ferrera, Nicolas Riche, John G. Taylor – Ebook PDF Instant Download/Delivery. ISBN: 3031843006, 9783031843006
Full download of From Human Attention to Computational Attention: A Multidisciplinary Approach, 2nd Edition, is available after payment.
Product details:
ISBN-10 : 3031843002
ISBN-13 : 9783031843006
Authors: Matei Mancas, Vincent P. Ferrera, Nicolas Riche, John G. Taylor
The new edition of this popular book introduces the study of attention, focusing on attention modeling and addressing themes such as saliency models, signal detection, and different types of signals, including real-life applications. The first edition was written when deep neural network (DNN) techniques were only beginning to be applied to attention. Deep learning has since become a key factor in attention prediction on images and video, and attention mechanisms have in turn become key components of deep learning models. The second edition tackles the arrival of DNNs for attention computing in images and video, and also discusses the attention mechanisms within DNNs themselves (attention modules, transformers, Grad-CAM-based saliency maps, etc.). From Human Attention to Computational Attention, 2nd Edition, also explores the parallels between brain structures and DNN architectures to reveal how biomimetics can improve model designs. The book is truly multidisciplinary, collating work from psychology, neuroscience, engineering, and computer science.
From Human Attention to Computational Attention: A Multidisciplinary Approach, 2nd Edition – Table of Contents:
Part I: Attention Foundations
1. Why Do Computers Need Attention?
1.1 Why Care About Attention and Attention Models?
1.1.1 First Step in Perception of Living Beings …
1.1.2 … from Foetus to Death, Awake and During Dreams …
1.1.3 … Attention Is the Gate to Consciousness …
1.1.4 … Attention in Computers Might Be a First Step …
1.1.5 … to Real Artificial Intelligence
1.2 Who Should Read This Book and Why?
1.3 Book Structure
1.3.1 Summary
2. What Is Attention?
2.1 The Study of Attention: A Transversal Approach
2.2 A Short History of Attention
2.2.1 Conceptual Findings: Attention in Philosophy
2.2.2 Attention in Experimental Psychology
2.2.3 Attention in Cognitive Psychology
2.2.4 The Need for New Approaches: After the Late 1980s “Crisis”
2.2.5 Attention in Cognitive Neuroscience
2.2.6 Attention in Computer Science
2.3 So … What Is Attention?
2.3.1 Overt vs. Covert: The Eye
2.3.2 Serial vs. Parallel: The Cognitive Load
2.3.3 Bottom-Up vs. Top-Down: Memory and Actions
2.4 Attention vs. Attentions: A Summary
3. How to Measure Attention?
3.1 Indirect Measures of Attention
3.1.1 Eye Tracking: A Gold Standard for Overt Attention
3.1.2 Mouse Tracking: The Low-Cost Eye Tracking
3.2 Direct Measures of Attention
3.2.1 EEG: Get the Electric Activity from the Brain
3.2.2 Functional Imaging: fMRI
3.2.3 Functional Imaging: MEG
3.2.4 Functional Imaging: PET Scan
3.2.5 Complementary Techniques to Manipulate Brain Activity: TMS or tDCS
3.3 Summary
Part II: Attention in the Brain
4. Where: Fronto-Parietal Attention Networks in the Human Brain and Their Dysfunctions
4.1 Taxonomies of Human Attention
4.1.1 Spatial Selective Attention
4.1.2 Cued Detection Tasks
4.2 Networks of Human Attention
4.2.1 Sustaining Attention in Time
4.2.2 Orienting and Reorienting to Objects in Space
4.3 Attention and Visual Perception
4.3.1 Cortical Streams of Visual Processing
4.3.2 Attentional Modulations of Visual Perception
4.3.3 Target Salience
4.4 Visual Neglect
4.5 Conclusion
5. Attention and Signal Detection: A Practical Guide
5.1 Detection of Weak Signals
5.2 Effect of Stimulus Probability
5.3 Effect of Costs and Benefits for Various Outcomes
5.4 Effects of Pooling over Multiple Detectors
5.5 Signal Detection over Time
5.6 Conclusion
6. Effects of Attention in Visual Cortex: Linking Single Neuron Physiology to Visual Detection and Discrimination
6.1 Introduction
6.2 Effects of Attention on Neuronal Responses
6.3 Effects of Attention Across Multiple Neurons
Part III: Attention in Computer Science
7. Modeling Attention in Engineering
7.1 Attention in Computer Science: The Notion of Saliency Map
7.1.1 General Framework for Saliency
7.1.2 Static Bottom-up Saliency
7.1.3 Static Top-Down Saliency
7.1.4 Dynamic Overt Saliency
7.2 Saliency: Static Bottom-up Approach
7.2.1 Context: Pixel’s Surroundings
7.2.2 Context: The Whole Image or a Dataset of Images
7.2.2.1 Context: A Model of Normality
7.3 Saliency Models: Including Top-Down Information
7.3.1 Top-Down as Context: Learned Normality
7.3.2 Top-Down as a Task: Attending to Objects or Actions
7.3.2.1 Object Recognition
7.3.2.2 Object Location
7.3.2.3 Task, Context, and Learning
7.4 Dynamic Overt Saliency
7.5 Modeling Attention in Computer Science
8. Metrics for Saliency Models Validation
8.1 Literature Review of Metrics for Object Detection
8.1.1 Location-Based Metrics: Focus on Location of Salient Regions and Binary Masks
8.1.1.1 F: F-score from Precision-Recall (2009)
8.1.2 AUC: Area Under the ROC Curve (2011)
8.2 Literature Review of Metrics for Eye Tracking
8.2.1 Value-Based Metrics: Saliency Map Values at Eye Positions
8.2.2 NSS: Normalized Scan Path Saliency (2005)
8.2.3 PF: Percentage of Fixations into the Salient Region (2006)
8.2.4 P: Percentile (2008)
8.2.5 Distribution-Based Metrics: Focus on Saliency and Gaze Statistical Distributions
8.2.6 PCC: Pearson’s Correlation Coefficient (2004)
8.2.7 KLD: Kullback-Leibler Divergence (2004)
8.2.8 SCC: Spearman’s Correlation Coefficient (2011)
8.2.9 EMD: Earth Mover’s Distance (2012)
8.2.10 S: Similarity (2012)
8.2.11 IG: Information Gain (2015)
8.2.12 Location-Based Metrics: Focus on Location of Salient Regions at Gaze Positions
8.2.13 nAUC: Normalized Area Under the ROC Curve (2011)
8.2.14 pAUC: Post-processing for Area Under the ROC Curve (2011)
8.2.15 hAUC: Hit Rate for Area Under the ROC Curve (2012)
8.2.16 sAUC: Shuffled Area Under the ROC Curve (2012)
8.3 Literature Review of Dynamic Metrics
8.3.1 Distance-Based Metrics
8.3.2 Time Series and Vector Metrics
8.3.3 Recurrence Analysis Metrics
8.3.4 Density Maps Metrics
8.4 Discussions and Conclusions
8.5 Summary
9. Study of Parameters Affecting Visual Saliency Assessment
9.1 Experiment 1: Effects of Ground-Truth
9.1.1 Goal
9.1.2 Method
9.1.3 Results
9.2 Experiment 2: Effects of the Size of Salient Objects
9.2.1 Goal
9.2.2 Method
9.2.3 Results
9.3 Experiment 3: Effects of Post-processing
9.3.1 Goal
9.3.2 Method
9.3.3 Results
9.4 Experiment 4: Effects of Metrics
9.4.1 Goal
9.4.2 Method
9.4.3 Results
9.4.3.1 Analysis of Consistency of Metrics
9.4.3.2 Study of the Dimensionality
9.5 Conclusion
9.6 Summary
10. Attention, Multimodality, and Datasets for Validation
10.1 Attention and Signal Modalities
10.2 Attention on Still Images
10.2.1 Why Attention on Still Images?
10.2.2 Classic Model Examples
10.2.2.1 Classic Eye Tracking-Based Models
10.2.2.2 Classic Salient Object Detection-Based Models
10.2.3 Deep Learning Model Examples
10.2.4 Validation Datasets/Benchmarks
10.2.4.1 Eye Tracking-Based Datasets with Still Images
10.2.4.2 Salient Object Detection-Based Datasets with Still Images
10.3 Attention on Videos
10.3.1 Why Attention on Videos?
10.3.2 Classic Model Examples
10.3.3 Deep Learning Model Examples
10.3.4 Validation Datasets/Benchmarks
10.4 Attention on Audio and Multimodal Data
10.4.1 Why Attention on Audio and Multimodal Data?
10.4.2 Classical Model Examples
10.4.3 Deep Learning Model Examples
10.4.4 Validation Datasets/Benchmarks
10.5 Attention on 360° Images and Videos
10.5.1 Why Attention on 360° Stimuli?
10.5.2 Vision in an Omnidirectional (Virtual) World
10.5.3 Viewing Tendencies of 360° Stimuli
10.5.4 Predicting 360° Saliency with 2D Saliency Models
10.5.5 Deep Learning Model Examples
10.5.5.1 Saliency Prediction
10.5.5.2 Head Movement Prediction
10.5.6 Validation Datasets/Benchmarks
10.6 Attention on 3D Data
10.6.1 Why Attention on 3D Data?
10.6.2 Classical Model Examples
10.6.3 2.5D Models or Depth-Based Saliency
10.6.3.1 Depth and Disparity-Based Models
10.6.3.2 Stereoscopic-Based Models
10.6.4 3D Structure-Based Models
10.6.4.1 Mesh-Based Saliency
10.6.4.2 Point Cloud-Based Saliency
10.6.5 Deep Learning Model Examples
10.6.5.1 RGB-D Data Saliency
10.6.5.2 Point Clouds and Mesh Saliency
10.6.6 Validation Datasets/Benchmarks
10.7 Conclusion
11. Audiovisual Saliency Models: A Short Review
11.1 Introduction
11.2 Audiovisual Attention
11.3 Computational Models of Audiovisual Attention
11.3.1 Hand-Crafted Features
11.3.2 Deep Features
11.4 Summary and Perspectives
12. Attention in Machine Learning
12.1 Introduction
12.2 Introduction to Neural Networks
12.2.1 Artificial Neural Networks
12.2.2 The Components of Neural Networks
12.2.3 Activation Functions
12.2.4 Architectures
12.2.4.1 Fully Connected Feedforward Neural Networks
12.2.4.2 Convolutional Neural Networks
12.2.4.3 Recurrent Neural Networks
12.2.4.4 Other Architectures
12.2.5 Training Methodologies
12.2.5.1 Supervised Learning
12.2.5.2 Self-supervised Learning
12.2.5.3 Unsupervised Learning
12.2.5.4 Reinforcement Learning
12.3 Sequence Learning Before the Arrival of Attention
12.3.1 Encoder-Decoder Architecture
12.4 The Motivation Behind Attention in Neural Networks
12.4.1 The Bottleneck Problem
12.4.2 The Vanishing Gradient Problem
12.4.3 The Parallelizability Problem
12.4.4 Computational Efficiency: The Human Motivation
12.4.5 The Explainability Motivation
12.5 The Implementations of Attention
12.5.1 Implicit and Explicit Attention
12.5.2 Hard and Soft Attention
12.5.3 Self-attention
12.5.4 Scoring Functions
12.5.5 Attentional Interfaces
12.6 Attention in the Transformer Architecture
12.6.1 Explicit Soft Scaled Dot-Product Self-attention
12.6.2 Key-Query-Value Paradigm
12.6.3 Attention Heads and Parallelizability
12.7 Neural Network Attention Versus Human Attention
12.7.1 Reinforcement Learning and Implicit Attention
12.7.2 Salient Object Detection, Explicit Attention, and Visual Question Answering
12.7.3 Natural Language Processing and Transformer Attention
12.7.4 Self-attention to Mimic Bottom-Up Human Attention
12.8 Conclusion
Part IV: Convergence: When the Brain Informs Computer Science (and Vice Versa)
13. Is Attention All You Need?
13.1 Introduction
13.1.1 A Counterexample: Convolutional Neural Networks
13.2 The Origin of AI Attention
13.2.1 Attention as a Means of Alignment
13.2.2 Attention to Build a Differentiable Content-Addressable Memory
13.2.3 Attention as a Means of Focusing on Different Visual Locations
13.3 From Transformers Back to Neuroscience
13.3.1 Visual Transformers as Models of Vision
13.3.2 Transformers as Models of Eye Movements
13.3.3 Transformers as Models of Memory
13.4 Conclusion
Annex
The Math Behind Self-Attention
What Does That Have to Do with Brains?
14. Linking Attention with Goals: A Theory of Attentional Priority Based on Expected Information Gains
14.1 Introduction
14.2 Top-Down Control and Target Selection
14.3 The Computational Role of Attention Is Gathering Information
14.4 Information Stems from the Observer’s Uncertainty
14.5 A Neural Hypothesis for EIG-Based Attention Control
14.7 Summary
15. The Future of Attention Models: Convergence of Deep Learning with Artificial and Human Attention
15.1 From Human Attention Study to Models
15.1.1 Attention, Emotions, Memory, and Actions
15.1.2 Perspectives in Priority and Curiosity Modeling
15.1.2.1 Salience and Priority in Human Vision
15.1.2.2 Curiosity: Uncertainty Reduction Through Guided Exploration
15.2 Perspectives in Computational Attention Modeling
15.2.1 Models
15.2.1.1 From Static to Dynamic Saliency Maps: Computing Eye Scan-Paths
15.2.1.2 Multimodal Modeling of Attention
15.2.1.3 Towards More Natural, Diverse, and Less Biased Datasets for Model Training and Validation
15.2.1.4 Foundation Models of Attention and Visual Search
15.2.1.5 More Links Between Deep-Learning and Attention?
15.3 Links Between Human and Artificial Attention
15.4 Summary
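As a taste of the material covered in Section 12.6.1 (Explicit Soft Scaled Dot-Product Self-attention), here is a minimal single-head self-attention sketch in Python/NumPy. This is our own illustrative code, not taken from the book; all function and variable names are our assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention: softmax(Q K^T / sqrt(d_k)) V.

    X: (n, d_model) sequence of token embeddings.
    Wq, Wk, Wv: (d_model, d_k) projection matrices (the key-query-value paradigm).
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n, n) attention logits
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ V, weights          # weighted mix of values, plus the map

rng = np.random.default_rng(0)
n, d_model, d_k = 4, 8, 8
X = rng.standard_normal((n, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out, w = scaled_dot_product_self_attention(X, Wq, Wk, Wv)
print(out.shape, np.allclose(w.sum(axis=1), 1.0))  # prints: (4, 8) True
```

The "soft" in the section title refers to the softmax: every token attends to every other token with a continuous weight, which keeps the whole operation differentiable (in contrast to hard attention, Section 12.5.2, which makes discrete selections).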
People also search for From Human Attention to Computational Attention: A Multidisciplinary Approach, 2nd Edition:
attention and types of attention
is attention a cognitive process
attention processing techniques
human attention
human attention mechanism
Tags: Human Attention, Computational Attention, Multidisciplinary Approach, Matei Mancas, Vincent Ferrera, Nicolas Riche, John Taylor