Saturday, January 5, 2013

Brain Computer Interface - A Reflection

Since the inception of the human race, the idea of being able to perform an action solely on the basis of thought has been an unending quest. Over the eras, scientists have worked, actively or sporadically, to explore ways to make this happen. Initial work (in the late 1920s) centered on the discovery that the electrical signals produced by the human brain can be detected and recorded from the scalp.

The advent of computers was a significant breakthrough in this area of work, as it led the way towards what we today know as the “Brain Computer Interface (BCI)”. The computational power of computers opened the door to analyzing and distinguishing the patterns of the electrical signals produced by the human brain, and thus to triggering a desired outcome for a given identified brain signal pattern. For example, when brain signal pattern A is detected by the computer, it can be associated with a tangible action, such as activating a switch to turn the lights on or off. In simple terms, a setup that detects electrical signals from the brain and processes these signal patterns to initiate a tangible action is known as a BCI.
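The detect–classify–act loop described above can be sketched in a few lines. This is a minimal illustration, not a real BCI toolkit: the template patterns, the dot-product matcher, and the action names are all assumptions made up for the example.

```python
# Minimal sketch of a BCI control loop: classify an incoming brain-signal
# pattern against known templates, then trigger the action mapped to it.

def classify(signal, templates):
    """Return the label of the template most similar to the signal,
    using a simple dot-product similarity on equal-length sequences."""
    def similarity(a, b):
        return sum(x * y for x, y in zip(a, b))
    return max(templates, key=lambda label: similarity(signal, templates[label]))

def run_bci_step(signal, templates, actions):
    """One detect-classify-act cycle: e.g. pattern A -> toggle the lights."""
    label = classify(signal, templates)
    return actions[label]()  # trigger the tangible action for this pattern

# Illustrative "recorded" patterns and their associated actions.
templates = {
    "pattern_A": [1, -1, 1, -1],   # associated with toggling the lights
    "pattern_B": [1, 1, -1, -1],   # associated with stopping a wheelchair
}
actions = {
    "pattern_A": lambda: "lights toggled",
    "pattern_B": lambda: "wheelchair stopped",
}

observed = [0.9, -1.1, 1.2, -0.8]  # a noisy observation of pattern A
print(run_bci_step(observed, templates, actions))  # -> lights toggled
```

Real systems replace the template matcher with trained signal-processing and machine-learning pipelines, but the overall loop is the same.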

BCI has always been a subject of interest in the fields of Rehabilitation and Assistive Technology. The potential of BCI to improve the lives of people with disabilities is unbounded. For example, BCI-based systems could allow an individual with a severely restricted range of movement, or none at all (e.g. due to a spinal cord injury), to drive a power wheelchair, operate household appliances, and more – or even help someone in a vegetative state communicate by speaking out the words that the individual would like to say.

There are two primary categories of BCI systems – “invasive” and “noninvasive”. Invasive systems interact with the brain directly via electrodes or sensors implanted into the brain or on its surface, while noninvasive systems interact with the brain indirectly via electrodes or sensors placed on the surface of the head that detect brain signal emissions (e.g. electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and magnetic sensor systems).

Example of a noninvasive BCI system in use
Noninvasive BCI systems usually involve wearing a head cap (aka EEG cap) with multiple holes or slots for placing electrodes at the relevant areas of the scalp to detect and record the electrical signals emitted by the brain. A conductive gel is applied to the electrodes to improve contact between the scalp and the electrode; because of this requirement, such electrodes are also known as wet electrodes. Existing systems can have anywhere from a few to more than 100 electrodes. The practicalities of using wet electrodes – e.g. the gel drying up, repeated cleaning of the electrodes and scalp to set up the EEG cap, irritation of sensitive skin due to the gel, etc. – do not make them convenient for quick setup and daily use. In recent years this has prompted the development of dry electrodes. Unlike wet electrodes, dry electrodes do not require gel and can be fitted directly into the EEG cap. Although still early in their development cycle, dry electrodes are quickly catching up with their wet counterparts in terms of detecting high-quality brain signals, and comparison studies are regularly carried out to evaluate the performance of wet vs. dry electrodes within the context of EEG-based noninvasive BCI.

Quadriplegic woman controlling a robotic arm with thought, using an invasive BCI system
Currently, invasive BCI systems are mostly implemented within controlled, research-oriented environments. Rehabilitation and Assistive Technology applications are at the forefront of the research being conducted in this area. Recent breakthroughs include enabling a paralyzed woman to steer a robotic arm with her mind and sip coffee, and controlling a prosthetic arm with thought to perform generic tasks. Both cases required a computer chip to be implanted in the user’s brain; the chip is programmed to collect the electrical signals from the brain and translate them into actions to be carried out by the robotic arm.

Noninvasive EEG-based BCI systems have also been a hot area of development. Because their setup requires no surgery, noninvasive systems are being explored for a wider range of applications. The Brainable Project is an example of noninvasive BCI systems being employed by people with severe disabilities to control home automation systems. Noninvasive BCI systems have also managed to penetrate the mainstream market, for the most part in the gaming industry. The Emotiv EPOC is a noninvasive EEG mainstream BCI interface that is sold with an optional SDK for developing customized applications. The EPOC also comes with a number of readily available applications, such as emotion/mood recognition and virtual games. Similarly, BCI technology from NeuroSky has been incorporated into a number of devices like media players, video games, and, most recently, a fluffy wearable pair of cat ears that responds to the emotional state of the user.

The future of BCI looks promising, with an immense positive impact on improving the lives of people with disabilities. Both invasive and noninvasive BCI technologies have their own practical drawbacks: invasive BCI requires a surgical procedure to implant sensors, and noninvasive BCI depends on wet electrodes to detect a good-quality brain signal, making both inconvenient for day-to-day use. The entry of noninvasive BCI into the gaming market has been an encouraging development. Although in its early stages at present, the future seems geared towards BCI being seamlessly integrated into video games, with industry giants like Sony (PlayStation) and Nintendo (Wii) looking for new ways for gamers to interact and control. Widespread usage of BCI will result in improved hardware and extensive application of the technology. Work in the field of BCI has evolved enough to continue maturing into a fully developed solution. The question is: how much longer will it take for such a breakthrough technology to be perfected?

Sources:

• Brain-Computer Interfaces: Revolutionizing Human-Computer Interaction, B. Graimann, B. Allison, G. Pfurtscheller, 2010
• Brain-Computer Interfaces: Principles and Practice, Jonathan R. Wolpaw, Elizabeth W. Wolpaw, 2012
• Brain-Computer Interfaces: An International Assessment of Research and Development Trends, Theodore W. Berger, John K. Chapin, Greg A. Gerhardt, Dennis J. McFarland, Jose C. Principe, Walid V. Soussou, Dawn M. Taylor, Patrick A. Tresco, 2008
• Simultaneous EEG Recordings with Dry and Wet Electrodes in Motor-Imagery – http://mlin.kyb.tuebingen.mpg.de/BCI2011JS.pdf
• Paralyzed Woman Steers Robotic Arms With Mind And Sips Coffee – http://www.assistivetechnologyblog.com/2012/05/paralyzed-woman-steers-robotic-arms.html
• The Brainable Project – http://www.brainable.org/en/Pages/Home.aspx
• Necomimi brain-activated cat ears hit the U.S. – http://www.gizmag.com/necomimi-brain-activated-cat-ears/23302/


Wednesday, October 10, 2012

Is Windows 8 a Step Backwards in Accessibility?


Windows 8 perhaps brings along the most radical user interface revolution Microsoft has introduced since its shift from the command-line-based MS-DOS to Windows 3.1.

With regards to accessibility, Microsoft began the practice of bundling accessibility options within the operating system as supplementary installations with the launch of Windows 95. Before that, some of these accessibility features had been developed into a bundled software package by the Trace R&D Center for compatibility with Windows 3.x machines, known as the Trace Access Pack for Windows 3.0 and 3.1. Built-in accessibility options became common practice in subsequent releases of Microsoft operating systems (i.e. Windows 98, XP, and Windows 7).

Images showing different versions of Windows UI

With improvements in third-party assistive technologies, and the growing willingness of operating system developers to allow integration of accessibility features, the widespread usage and availability of such solutions has been noticeable. Over the years, assistive technology developers have designed solutions that work well with systems like Windows XP and Windows 7. There have been challenges at times in porting such solutions to these operating systems, but this has been part of the ongoing design considerations taken on by assistive technology developers.

The launch of Windows 8, however, might bring about a new set of accessibility challenges altogether. My very first reaction to the Windows 8 interface was - "this is very different". The most noticeable change is the introduction of what Microsoft calls the “Metro User Interface.” This is essentially a screen with tiled icons for various applications (including Windows apps that can be downloaded from the new Windows Store). Although fully customizable, these icons are not necessarily of a fixed shape and size to begin with, which might present a confusing navigational experience for screen-reader users, as there can be an unequal number of icons in each row.

Another new feature of Windows 8 is a menu bar which pops out from either side of the screen. While these menus seem to be natural extensions when using a touch screen, they are quite unexpected and unnatural when using a mouse or keyboard. The only indicator of their existence is a miniature “+” sign which is easy to miss; this icon needs to be clicked for the side panels to expand. It should be noted, however, that keyboard shortcuts have been assigned to allow quick access to these menus, and the keyboard arrow keys support navigating through the menu items, which will work well for certain segments of disabled users (e.g. screen reader users, users with restricted mobility, etc.). Even so, these menus will remain a challenge for users with access issues.

The Metro layout did look promising to me in certain ways. The tile-based icons look well suited to switch-based navigation. What happens once an icon is selected, however, depends solely on the “switch accessibility” features found within the selected application. For those of us who are unfamiliar with switch access, it is a mechanism that allows the user to operate a device (in this case the computer) by pressing one or more switches (usually in the form of large round plastic buttons). When operating a computer using switch(es), a “scan pattern” is set up which scans through and highlights relevant “clickable” items (i.e. icons, files, web links, etc.) on the screen; once the desired item is reached (i.e. highlighted), the switch is pressed to select it. I strongly believe that Microsoft should at some point allow some degree of switch access to be integrated into its built-in Accessibility Options. Currently, the native On-Screen Keyboard in Windows 7 supports switch scanning by emulating a specified keystroke via the switch. I look forward to seeing the features the Windows 8 On-Screen Keyboard has to offer, and whether these features will remain or even be enhanced.
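The scan pattern described above is simple enough to sketch in code. This is an illustrative toy, not how any real switch-access software is implemented; the item names and the class are assumptions made up for the example.

```python
# Sketch of single-switch scanning: the system cycles a highlight through
# the "clickable" items on screen, and a switch press selects whichever
# item is highlighted at that moment.

class SwitchScanner:
    def __init__(self, items):
        self.items = items
        self.index = 0  # position of the current highlight

    def advance(self):
        """Move the highlight to the next item, wrapping around at the end.
        In a real system this happens automatically on a timer."""
        self.index = (self.index + 1) % len(self.items)

    def highlighted(self):
        """The item currently highlighted on screen."""
        return self.items[self.index]

    def press_switch(self):
        """Select whatever is highlighted when the user presses the switch."""
        return self.items[self.index]

# Hypothetical row of Metro-style tiles being scanned.
scanner = SwitchScanner(["Mail", "Browser", "Music", "Settings"])
scanner.advance()              # highlight moves: Mail -> Browser
scanner.advance()              # Browser -> Music
print(scanner.press_switch())  # -> Music
```

Real implementations add timing control, grouped (row/column) scanning to reduce the number of steps, and hooks into the operating system's accessibility APIs, but the core cycle-and-select loop is the same.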

There don’t seem to be many new accessibility features introduced in Windows 8. To be fair, however, considerable work has been put into enhancing existing features, particularly so that they can be used with touch interfaces as well. Some features, such as Narrator (the built-in Windows screen reader), have expanded capabilities, including support for a greater number of screen elements and languages.

Despite these gains, my first glimpse into the accessibility of Windows 8 has fallen short of expectations. With the look and feel of Windows 8 being so different, I expect AT developers to have an initial “frantic” response, concerned by the need to successfully port existing products to such a drastically different operating system. For the same reasons, adoption of Windows 8 within the disabled community will take its own time. It might be worth reflecting that at the core of most of the upcoming issues is Microsoft’s approach of designing a single interface compatible with both touch and non-touch input. Is this hybrid approach realistic for a feature-filled and graphically rich system like Windows 8? No one has attempted this before, so we will only learn over time.


Sources:

Windows 3.1x, Wikipedia - http://en.wikipedia.org/wiki/Windows_3.1x
Trace Access Pack for Windows 3.0 and 3.1, Microsoft Support Article - http://support.microsoft.com/kb/99381
Excerpts from article published in Advance for Speech and Language Pathologists (Sep 1998), Clay Nichols with Terri Nichols, MS, CCC-SLP - http://www.bungalowsoftware.com/articles/AccArticle.htm
How good are Windows 8 accessibility features for the blind?, Mardon Erbland - http://betanews.com/2012/03/02/how-good-are-windows-8-accessibility-features-for-the-blind/
Switch Access to Technology, ACE Centre - http://www.acecentre.org.uk/assets/Product%20Downloads/SwitchScanningMaster_8_472.pdf