Integrating visual stimuli for enhancing neural text style transfer with EEG sensors
Document Type
Article
Source of Publication
Computers and Electrical Engineering
Publication Date
September 1, 2022
Abstract
Font designers traditionally create each character of a font by hand. We propose an approach that lets designers complete this process quickly and automatically. A neural network learns from existing fonts and generates new ones: it takes two input vectors for the Urdu language, a bigram vector and a style vector, encoded with one-hot encoding and t-distributed stochastic neighbor embedding (t-SNE), respectively. A transposed convolutional neural network decodes these inputs into rendered glyphs, so changing the style vector is directly reflected in the style of the resulting font. In addition, we use simulated annealing to compose meaningful, full-length sentences from the generated glyphs. To evaluate whether the generated fonts are aesthetically sound, we present these sentences to end-users as visual stimuli and measure their attention and meditation levels with EEG sensors; higher levels indicate better font quality and visual appeal.
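The abstract does not include implementation details, so the following is only a minimal sketch of the kind of decoder it describes: a one-hot bigram vector concatenated with a style vector and upsampled to a glyph image through transposed convolutions. The framework (PyTorch), all dimensions (512 bigrams, a 32-dimensional style vector, 64x64 glyphs), and the class name FontGlyphDecoder are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

# Hypothetical sizes; the paper's actual dimensions are not given here.
NUM_BIGRAMS = 512   # size of the one-hot Urdu bigram vocabulary (assumed)
STYLE_DIM = 32      # length of the t-SNE-derived style vector (assumed)

class FontGlyphDecoder(nn.Module):
    """Decodes a one-hot bigram vector plus a style vector into a
    64x64 grayscale glyph image via transposed convolutions."""

    def __init__(self):
        super().__init__()
        self.project = nn.Linear(NUM_BIGRAMS + STYLE_DIM, 256 * 4 * 4)
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1),  # 4 -> 8
            nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),   # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),    # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),     # 32 -> 64
            nn.Sigmoid(),  # pixel intensities in [0, 1]
        )

    def forward(self, bigram_onehot, style_vec):
        # Concatenate content (bigram) with style, then upsample to an image.
        z = torch.cat([bigram_onehot, style_vec], dim=1)
        x = self.project(z).view(-1, 256, 4, 4)
        return self.decode(x)

# Example: render one bigram in a given style.
bigram = torch.zeros(1, NUM_BIGRAMS)
bigram[0, 42] = 1.0                  # one-hot index of some bigram (assumed)
style = torch.randn(1, STYLE_DIM)    # stand-in for a t-SNE style embedding
glyph = FontGlyphDecoder()(bigram, style)
print(glyph.shape)                   # torch.Size([1, 1, 64, 64])
```

Under this reading, swapping the style vector while holding the bigram fixed regenerates the same character in a different visual style, which is the style-transfer behavior the abstract claims.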
Publisher
Elsevier BV
Volume
102
Disciplines
Computer Sciences
Keywords
Bigram, EEG sensor, Fonts, Kerning, One-hot encoding, Simulated annealing, Style vector, t-SNE, Transposed convolution network
Recommended Citation
Amin, Muhammad; Tubaishat, Abdallah; Al-Obeidat, Feras; Shah, Babar; and Ullah, Waseef, "Integrating visual stimuli for enhancing neural text style transfer with EEG sensors" (2022). All Works. 5211.
https://zuscholars.zu.ac.ae/works/5211
Indexed in Scopus
yes
Open Access
no