A Data-Based Artificial Neural Network Assisted Testing Framework for the Assessment of Electrical Stimulation Patterns for Retinal Implants

DSpace Repository

Show simple item record

dc.contributor.advisor Zrenner, Eberhart (Prof. Dr.)
dc.contributor.author Speck, Achim
dc.date.accessioned 2020-04-03T12:59:56Z
dc.date.available 2020-04-03T12:59:56Z
dc.date.issued 2022-02-14
dc.identifier.uri http://hdl.handle.net/10900/99489
dc.identifier.uri http://nbn-resolving.de/urn:nbn:de:bsz:21-dspace-994892 de_DE
dc.identifier.uri http://dx.doi.org/10.15496/publikation-40870
dc.description.abstract Electrical stimulation (E-stim) of the retina with electrode arrays can be employed to evoke visual sensations in patients blinded by photoreceptor dystrophy due to retinitis pigmentosa. Although E-stim through electrical retinal implants (E-retinal-implants) can benefit affected patients in daily life, the temporal and spatial resolution of the perceived visual sensations needs to be optimized. A driving question is whether the visual sensations perceived by patients are an appropriate representation of the objects in their visual perception. To date, only patients themselves can answer this question, yet that is a very late stage of implant development at which to obtain a qualified answer. In this work, an experimental and analytical concept was developed that allows estimating the degree to which the retina has encoded the desired object in terms of recognition. Regarding the minimal requirements for object recognition by blind patients, a black-box approach is described at the meta-level, in which all recorded retinal ganglion cell (RGC) responses are considered. As a fundamental building block, an electrophysiological multielectrode array (MEA) setup was used to record both light-stimulation (L-stim) and E-stim-induced RGC responses in epiretinal configuration from healthy mouse retinal explants. L-stim patterns were applied as single bars and as double bars with different bar spacings, at different velocities, and in four directions. From this, a library of L-stim-induced RGC responses comprising 104 classes was created. E-stim patterns were presented analogously through the MEA electrodes. Since this approach involves complex stimulus and response constellations, artificial neural networks (ANNs) for pattern recognition were applied to detect E-stim-induced objects; convolutional ANNs were employed as a robust architecture.
To test and assess the approximation quality of E-stim, E-stim-induced response sequences were presented to the ANNs trained on the L-stim-induced classes. Six E-stim classes were correctly classified with approximately 96 % accuracy by networks trained exclusively on the 104-class library of light-evoked responses. For different object structures, this approach can reveal redundant components in the stimulus structure. It thereby makes it possible to estimate to what degree a given object structure is required for E-stim-induced responses to usefully approximate L-stim-induced responses. The set of analysis tools developed here supports evaluating how well E-stim-induced RGC responses approximate L-stim-induced responses. This allows the functionality of new E-stim patterns to be assessed before implementing them in E-retinal-implants and ahead of the testing phase with affected patients. The original test dataset of 24 E-stim-induced classes could be reduced to a subset of the most useful classes. With such an approach, new E-stim strategies could be narrowed down to a limited parameter space in advance, which in turn could help shorten future design procedures for retinal implants. en
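The decoding scheme summarized in the abstract — train a classifier on L-stim-induced response classes, then test how well E-stim-induced responses fall into the same classes — can be sketched in miniature. The sketch below is purely illustrative and is not the dissertation's method: it substitutes synthetic response vectors for real MEA recordings and a nearest-centroid classifier for the convolutional ANNs, and all class counts and noise levels are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the L-stim response library: each class is a
# prototype "RGC response vector" plus recording noise. (The real library
# had 104 classes of recorded responses; 8 synthetic classes suffice here.)
n_classes, n_features, n_train = 8, 32, 20
prototypes = rng.normal(size=(n_classes, n_features))
train = np.stack([p + 0.1 * rng.normal(size=(n_train, n_features))
                  for p in prototypes])            # (classes, samples, features)

# "Training" on L-stim responses: here, simply learn one centroid per class.
centroids = train.mean(axis=1)                     # (classes, features)

def classify(responses):
    """Assign each response vector to the nearest L-stim class centroid."""
    d = np.linalg.norm(responses[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# E-stim-induced responses approximating a subset of the L-stim classes:
# if E-stim evokes responses close to an L-stim class, the classifier
# trained only on L-stim data should recover that class.
estim_classes = np.array([0, 2, 5])
estim = np.concatenate([prototypes[c] + 0.2 * rng.normal(size=(10, n_features))
                        for c in estim_classes])
labels = np.repeat(estim_classes, 10)

accuracy = (classify(estim) == labels).mean()
print(f"E-stim approximation accuracy: {accuracy:.2f}")
```

The accuracy of the L-stim-trained classifier on E-stim-induced responses serves as the quality measure: high accuracy indicates that the E-stim pattern evokes responses that usefully approximate the light-evoked ones, mirroring the assessment logic described above.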
dc.language.iso en de_DE
dc.publisher Universität Tübingen de_DE
dc.rights ubt-podok de_DE
dc.rights.uri http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=de de_DE
dc.rights.uri http://tobias-lib.uni-tuebingen.de/doku/lic_mit_pod.php?la=en en
dc.subject.classification Maschinelles Lernen , Mustererkennung , Netzhaut de_DE
dc.subject.ddc 500 de_DE
dc.subject.ddc 600 de_DE
dc.subject.other Decoding Framework en
dc.subject.other Artificial Neural Network en
dc.subject.other Retinal Implant en
dc.subject.other Artificial Vision en
dc.subject.other Electrophysiology en
dc.title A Data-Based Artificial Neural Network Assisted Testing Framework for the Assessment of Electrical Stimulation Patterns for Retinal Implants en
dc.type Dissertation de_DE
dcterms.dateAccepted 2020-03-12
utue.publikation.fachbereich Biologie de_DE
utue.publikation.fakultaet 7 Mathematisch-Naturwissenschaftliche Fakultät de_DE
utue.publikation.noppn yes de_DE
