A Cheonjiin Layout Mental Speller: Developing a Simple and Cost-Effective EEG-Based Brain-Computer Interface System
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ahn, Ji Won | - |
| dc.contributor.author | Yu, Gi Yeon | - |
| dc.contributor.author | Kim, Seong-Wan | - |
| dc.contributor.author | Seok, Young-Seek | - |
| dc.contributor.author | Byun, Kyung-Min | - |
| dc.contributor.author | Choi, Seung Ho | - |
| dc.date.accessioned | 2026-04-28T04:59:24Z | - |
| dc.date.available | 2026-04-28T04:59:24Z | - |
| dc.date.created | 2026-04-28 | - |
| dc.date.issued | 2026-04 | - |
| dc.identifier.uri | https://ir.ymlib.yonsei.ac.kr/handle/22282913/211926 | - |
| dc.description.abstract | A brain-computer interface (BCI) enables direct communication between the brain and external devices by translating neural activity into executable control commands. Among electroencephalography (EEG)-based paradigms, the steady-state visual evoked potential (SSVEP) is widely adopted for its high signal-to-noise ratio, robustness, and minimal calibration requirements. While SSVEP-based spellers have been extensively investigated, many existing systems rely on high-channel-density EEG recordings and computationally complex processing pipelines, and are primarily designed for alphabetic input structures. In this study, we present an SSVEP-based Korean speller that integrates the Cheonjiin keyboard layout to support intuitive composition of Hangul syllables. The proposed system adopts a simple configuration, employing only five visual stimulation frequencies (6.67-12 Hz) and two occipital EEG channels (O1 and O2), with real-time frequency recognition performed using canonical correlation analysis (CCA) within a 1.5 s sliding window. EEG signals were acquired at 200 Hz using an OpenBCI Ganglion board, band-pass filtered (5-45 Hz), and processed with harmonic sinusoidal reference templates for multi-frequency classification. The proposed interface generates five control commands (up, down, left, right, and select), enabling directional cursor navigation and character confirmation on a 4 × 4 virtual Cheonjiin keyboard. Experimental validation with three healthy participants yielded an average classification accuracy of approximately 82% and an information transfer rate (ITR) of 31.2 bits/min. Frequency-domain analysis revealed clear spectral peaks at the stimulation frequencies and their harmonics, indicating reliable SSVEP responses. By combining a simple two-channel configuration with a Korean language-specific input structure, these findings demonstrate that reliable SSVEP-based communication can be achieved without computationally intensive algorithms or high-cost EEG acquisition equipment. | - |
| dc.language | English | - |
| dc.publisher | MDPI | - |
| dc.relation.isPartOf | SENSORS | - |
| dc.subject.MESH | Adult | - |
| dc.subject.MESH | Algorithms | - |
| dc.subject.MESH | Brain* / physiology | - |
| dc.subject.MESH | Brain-Computer Interfaces* | - |
| dc.subject.MESH | Cost-Benefit Analysis | - |
| dc.subject.MESH | Electroencephalography* / methods | - |
| dc.subject.MESH | Evoked Potentials, Visual / physiology | - |
| dc.subject.MESH | Female | - |
| dc.subject.MESH | Humans | - |
| dc.subject.MESH | Male | - |
| dc.subject.MESH | Photic Stimulation | - |
| dc.subject.MESH | Signal Processing, Computer-Assisted | - |
| dc.subject.MESH | Young Adult | - |
| dc.title | A Cheonjiin Layout Mental Speller: Developing a Simple and Cost-Effective EEG-Based Brain-Computer Interface System | - |
| dc.type | Article | - |
| dc.contributor.googleauthor | Ahn, Ji Won | - |
| dc.contributor.googleauthor | Yu, Gi Yeon | - |
| dc.contributor.googleauthor | Kim, Seong-Wan | - |
| dc.contributor.googleauthor | Seok, Young-Seek | - |
| dc.contributor.googleauthor | Byun, Kyung-Min | - |
| dc.contributor.googleauthor | Choi, Seung Ho | - |
| dc.identifier.doi | 10.3390/s26072265 | - |
| dc.relation.journalcode | J03219 | - |
| dc.identifier.eissn | 1424-8220 | - |
| dc.identifier.pmid | 41978050 | - |
| dc.subject.keyword | brain-computer interface (BCI) | - |
| dc.subject.keyword | electroencephalography (EEG) | - |
| dc.subject.keyword | speller system | - |
| dc.subject.keyword | Cheonjiin keyboard | - |
| dc.subject.keyword | directional input | - |
| dc.contributor.affiliatedAuthor | Choi, Seung Ho | - |
| dc.identifier.scopusid | 2-s2.0-105035570414 | - |
| dc.identifier.wosid | 001738864800001 | - |
| dc.citation.volume | 26 | - |
| dc.citation.number | 7 | - |
| dc.identifier.bibliographicCitation | SENSORS, Vol.26(7), 2026-04 | - |
| dc.identifier.rimsid | 92478 | - |
| dc.type.rims | ART | - |
| dc.description.journalClass | 1 | - |
| dc.subject.keywordAuthor | brain-computer interface (BCI) | - |
| dc.subject.keywordAuthor | electroencephalography (EEG) | - |
| dc.subject.keywordAuthor | speller system | - |
| dc.subject.keywordAuthor | Cheonjiin keyboard | - |
| dc.subject.keywordAuthor | directional input | - |
| dc.subject.keywordPlus | CANONICAL CORRELATION-ANALYSIS | - |
| dc.subject.keywordPlus | BENCHMARK DATASET | - |
| dc.subject.keywordPlus | CLASSIFICATION | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
| dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
| dc.relation.journalResearchArea | Chemistry | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalResearchArea | Instruments & Instrumentation | - |
| dc.identifier.articleno | 2265 | - |
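The abstract describes frequency recognition via canonical correlation analysis between a 1.5 s EEG window (200 Hz, two channels) and harmonic sinusoidal reference templates, one per stimulation frequency. The sketch below illustrates that standard SSVEP-CCA scheme under the stated parameters; the three intermediate stimulation frequencies are hypothetical placeholders (the abstract gives only the 6.67-12 Hz range), and this is not the authors' implementation.

```python
import numpy as np

FS = 200             # sampling rate (Hz), as stated in the abstract
WIN = int(1.5 * FS)  # 1.5 s sliding window -> 300 samples
# Endpoints from the abstract; intermediate values are illustrative only.
FREQS = [6.67, 7.5, 8.57, 10.0, 12.0]
N_HARM = 2           # fundamental plus second harmonic in the template

def make_reference(f, n_samples=WIN, n_harm=N_HARM, fs=FS):
    """Harmonic sinusoidal reference template for stimulation frequency f."""
    t = np.arange(n_samples) / fs
    comps = []
    for h in range(1, n_harm + 1):
        comps.append(np.sin(2 * np.pi * h * f * t))
        comps.append(np.cos(2 * np.pi * h * f * t))
    return np.column_stack(comps)  # shape: (samples, 2 * n_harm)

def max_canon_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(Xc)
    Qy, _ = np.linalg.qr(Yc)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def classify(window):
    """window: (samples, channels) EEG segment; returns the best-matching frequency index."""
    scores = [max_canon_corr(window, make_reference(f)) for f in FREQS]
    return int(np.argmax(scores))

# Synthetic check: a 10 Hz SSVEP-like component on two channels plus noise
rng = np.random.default_rng(0)
t = np.arange(WIN) / FS
sig = np.sin(2 * np.pi * 10.0 * t)
eeg = np.column_stack([sig + 0.5 * rng.standard_normal(WIN),
                       sig + 0.5 * rng.standard_normal(WIN)])
print(FREQS[classify(eeg)])
```

Each recognized frequency would then map to one of the five control commands (up, down, left, right, select) driving the cursor on the 4 × 4 Cheonjiin keyboard.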