Imaginación visual: nuevo paradigma en interfaces cerebro-máquina

This doctoral thesis focuses on the use of visual imagery for the construction of non-invasive brain-computer interfaces (BCIs). Although brain-computer interfaces are already a reality, practically all of them are based on motor imagery or on event-related potentials, such as the P300 or steady-state visually evoked potentials (SSVEP). However, human beings have many other cognitive abilities that can be studied for use in brain-computer interfaces, with the aim of expanding their capabilities and creating applications with more natural interaction. One of these abilities is the capacity to visualize objects that are not being perceived through the sense of sight, which is what we call visual imagery.

The main goal of this doctoral thesis is, firstly, to demonstrate that visual imagery can be discriminated in EEG signals acquired non-invasively and, secondly, to offer an adequate processing framework. The work began with the creation of a binary brain switch based on visual imagery. It was demonstrated that the visual imagination of everyday objects, and even of basic geometric figures, can be distinguished from a state of non-imagination. It was later shown that, using deep learning techniques, more than one class of imagined content could also be classified. Finally, a study was carried out on the combined use of visual imagery and motor imagery. This thesis provides a set of methodologies that can be applied to the processing of EEG signals arising from visual imagination, extending the capabilities of brain-machine interfaces beyond those based on conventional paradigms, such as motor imagery or evoked potentials.
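The binary brain switch described above reduces, at its core, to a two-class classification of EEG epochs (imagery vs. non-imagination). As a minimal sketch of such a pipeline, the following uses synthetic data in place of recorded EEG and log-variance features with an LDA classifier; the feature choice, data shapes, and class separation are illustrative assumptions, not the thesis's actual methodology.

```python
# Hedged sketch of a binary "brain switch" classifier on EEG-like epochs.
# All data here is synthetic; a real pipeline would start from recorded,
# band-pass-filtered EEG (e.g. loaded with a toolbox such as MNE-Python).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def log_variance_features(epochs):
    """epochs: array (n_trials, n_channels, n_samples).
    Log-variance per channel is a common crude proxy for band power."""
    return np.log(epochs.var(axis=2))

# Synthetic stand-in: "imagery" trials get slightly larger amplitude,
# mimicking a power difference between the two mental states.
n_trials, n_channels, n_samples = 100, 8, 256
rest = rng.normal(0.0, 1.0, (n_trials, n_channels, n_samples))
imagery = rng.normal(0.0, 1.3, (n_trials, n_channels, n_samples))

X = log_variance_features(np.concatenate([rest, imagery]))
y = np.concatenate([np.zeros(n_trials), np.ones(n_trials)])

# 5-fold cross-validated accuracy of an LDA "brain switch".
clf = LinearDiscriminantAnalysis()
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.2f}")
```

With clearly separable synthetic classes the accuracy is near ceiling; on real EEG, performance depends heavily on preprocessing, feature extraction, and subject variability.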
Access to the contents of this thesis is subject to acceptance of the terms of use established by the following Creative Commons license: