E-book, English, 261 pages
Zhao / Lv / Yang
Human-Machine Interaction for Automated Vehicles
Driver Status Monitoring and the Takeover Process
1st edition, 2023
ISBN: 978-0-443-18998-2
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark
Dr Yifan Zhao is a Reader in Data Science in the School of Aerospace, Transport and Manufacturing at Cranfield University and the academic lead of the Through-life Engineering Services Lab. He has over 20 years of experience in solving inverse problems using computer vision, artificial intelligence (AI), signal processing, and nonlinear system identification. His research themes include asset management in construction, non-destructive testing and evaluation (NDT&E) in digital manufacturing, driver monitoring and human-machine interfaces for intelligent transport, and brain functional imaging and analysis for digital healthcare. He has produced over 150 publications, 3 books, and 3 patents.
Authors/Editors
Further information & material
Front Cover (p. 1)
Human-Machine Interaction for Automated Vehicles (p. 4)
Copyright Page (p. 5)
Contents (p. 6)
1. Introduction (p. 12)
    1.1 Automation level (p. 12)
        1.1.1 Traffic safety (p. 13)
        1.1.2 Motivation (p. 14)
    1.2 Summary of the book (p. 15)
    References (p. 16)
2. Driver behaviour recognition based on eye gaze (p. 18)
    2.1 Introduction (p. 18)
    2.2 Methodology (p. 21)
        2.2.1 Framework architecture (p. 21)
        2.2.2 Gaze estimation (p. 22)
            2.2.2.1 System framework (p. 22)
            2.2.2.2 Video acquisition (p. 23)
            2.2.2.3 Feature extraction (p. 24)
            2.2.2.4 Feature mapping (p. 25)
            2.2.2.5 Heat map visualisation (p. 28)
        2.2.3 Object recognition (p. 29)
        2.2.4 Activity classifier (p. 31)
    2.3 Results (p. 32)
        2.3.1 Gaze estimation (p. 32)
            2.3.1.1 Indoor experiment (p. 32)
            2.3.1.2 In-vehicle experiment (p. 37)
        2.3.2 Object recognition (p. 37)
        2.3.3 Nondriving-related activity identification and analysis (p. 40)
        2.3.4 Comparison with the state-of-the-art (p. 45)
    2.4 Discussion (p. 46)
    2.5 Conclusions (p. 49)
    References (p. 50)
3. Driver behaviour recognition based on hand-gesture (p. 54)
    3.1 Introduction (p. 54)
    3.2 Methodology (p. 56)
        3.2.1 System architecture (p. 56)
        3.2.2 Region of interest selection (p. 58)
        3.2.3 Optical flow estimation (p. 59)
        3.2.4 2-Stream convolutional neural network (p. 61)
        3.2.5 Experiment setup and performance validation (p. 64)
    3.3 Results (p. 65)
        3.3.1 Two streams (p. 65)
        3.3.2 Classification performance (p. 67)
        3.3.3 Conflicted cases analysis (p. 69)
    3.4 Discussion (p. 72)
    3.5 Conclusion (p. 73)
    References (p. 74)
4. Driver behaviour recognition based on head movement (p. 78)
    4.1 Introduction (p. 78)
    4.2 Methodology (p. 80)
        4.2.1 Experiments (p. 80)
        4.2.2 Time-varying correlation estimation (p. 82)
        4.2.3 Feature selection and classification (p. 84)
    4.3 Results (p. 85)
        4.3.1 Phase 1 – training data (p. 85)
        4.3.2 Phase 2 – testing data (p. 89)
    4.4 Conclusion (p. 93)
    References (p. 94)
5. Driver behaviour recognition based on the fusion of head movement and hand movement (p. 96)
    5.1 Introduction (p. 96)
    5.2 Related works (p. 98)
        5.2.1 Non-driving-related activities’ recognition (p. 98)
        5.2.2 3D convolutional neural network (p. 99)
    5.3 Methodology (p. 100)
        5.3.1 3D residual block (p. 101)
        5.3.2 Architecture of the 3D convolutional neural network model (p. 102)
        5.3.3 Prediction process for the framework (p. 103)
        5.3.4 Visual explanations of convolutional neural network model predictions (p. 103)
    5.4 Dataset and training (p. 105)
        5.4.1 Experiment design (p. 105)
        5.4.2 Camera setup (p. 106)
        5.4.3 Data pre-processing (p. 107)
        5.4.4 Training setup (p. 108)
    5.5 Results (p. 109)
    5.6 Visualisation and discussion (p. 112)
    5.7 Conclusion (p. 115)
    References (p. 116)
6. Real-time driver behaviour recognition (p. 120)
    6.1 Introduction (p. 120)
    6.2 Methodology (p. 123)
        6.2.1 Inverted Linear Bottlenecks (p. 124)
        6.2.2 Channel weighting and temporal weighting (p. 125)
        6.2.3 Model structure (p. 126)
        6.2.4 Saliency map visualisation (p. 127)
        6.2.5 Dataset and pre-processing (p. 128)
        6.2.6 Hardware (p. 128)
    6.3 Results (p. 129)
        6.3.1 Training (p. 129)
        6.3.2 Results (p. 131)
        6.3.3 Saliency map visualisation (p. 134)
    6.4 Conclusion (p. 138)
    References (p. 138)
7. The implication of non-driving tasks on the take-over process (p. 142)
    7.1 Introduction (p. 142)
    7.2 Methodology (p. 144)
        7.2.1 Take-over concept (p. 144)
        7.2.2 Experiment setup (p. 145)
            7.2.2.1 Vehicle modification (p. 145)
            7.2.2.2 Participants (p. 146)
            7.2.2.3 Non-driving-related activities (p. 146)
            7.2.2.4 Track and take-over scenarios (p. 146)
        7.2.3 Data acquisition (p. 147)
    7.3 Results (p. 149)
        7.3.1 Road-checking behaviour analysis (p. 149)
        7.3.2 Take-over performance (p. 150)
    7.4 Conclusion (p. 153)
    References (p. 155)
8. Driver workload estimation (p. 158)
    8.1 Introduction (p. 158)
    8.2 The hybrid methods (p. 160)
        8.2.1 Methodology (p. 160)
        8.2.2 Identification of important variables using error reduction ratio causality (p. 161)
        8.2.3 Model estimation using support vector machine (p. 164)
    8.3 Dataset and pre-processing (p. 166)
    8.4 Results and discussions (p. 168)
        8.4.1 Identification of important variables (p. 168)
        8.4.2 Driver workload estimation (p. 170)
    8.5 Conclusion (p. 173)
    Appendix (p. 174)
    References (p. 174)
9. Neuromuscular dynamics characterisation for human–machine interface (p. 178)
    9.1 Introduction (p. 178)
    9.2 Dynamic model of the human–machine interacting system (p. 180)
    9.3 System identification methodology (p. 181)
        9.3.1 Non-linear least squares problem formulation (p. 182)
        9.3.2 The Gauss–Newton algorithm (p. 183)
        9.3.3 Convergence analysis of the Gauss–Newton algorithm (p. 183)
    9.4 Experiment design and data collection (p. 186)
        9.4.1 Steering tasks (p. 186)
        9.4.2 Driver posture and hand positions (p. 187)
        9.4.3 Data measurement and acquisition (p. 188)
    9.5 Experiment results (p. 189)
    9.6 Result analysis and discussion (p. 192)
        9.6.1 Comparison of the passive and active steering tasks (p. 192)
        9.6.2 Comparison of the results with different hand positions (p. 194)
        9.6.3 Comparison of the results with different postures (p. 195)
        9.6.4 Correlation analysis (p. 195)
    9.7 Conclusions and future work (p. 196)
    References (p. 197)
10. Driver steering intention prediction using neuromuscular dynamics (p. 200)
    10.1 Introduction (p. 200)
    10.2 High-level architecture of the system (p. 203)
    10.3 Experiment design and data analysis (p. 206)
        10.3.1 Experiment design (p. 206)
        10.3.2 Electromyography data analysis (p. 208)
    10.4 Hybrid-learning-based time-series model (p. 210)
        10.4.1 Model construction (p. 210)
        10.4.2 Model training and implementation (p. 214)
    10.5 Experiment results (p. 215)
        10.5.1 Evaluation metrics and baselines (p. 215)
        10.5.2 Continuous steering torque prediction (p. 217)
        10.5.3 Discrete steering intention prediction (p. 218)
        10.5.4 Discussions and future works (p. 222)
    10.6 Conclusion (p. 224)
    References (p. 224)
11. Intelligent haptic interface design for human–machine interaction in automated vehicles (p. 228)
    11.1 Introduction (p. 228)
    11.2 Experimental results (p. 231)
        11.2.1 Take-over time (p. 232)
        11.2.2 Driver steering torque (p. 234)
        11.2.3 Steering wheel angle (p. 234)
        11.2.4 Yaw rate of the vehicle (p. 236)
    11.3 Discussion (p. 236)
    11.4 Experimental methods (p. 241)
        11.4.1 Experimental design (p. 241)
        11.4.2 Take-over request signalling (p. 242)
        11.4.3 Assessment of driver states and control ability (p. 243)
        11.4.4 Driver control authority and performance (p. 244)
        11.4.5 Modelling of human–machine collaboration process (p. 245)
        11.4.6 The two-phase predictive haptic steering torque controller (p. 246)
        11.4.7 Participants (p. 248)
        11.4.8 Statistical analysis (p. 249)
    References (p. 249)
Index (p. 252)
Back Cover (p. 261)