As people grow ever more dependent on electronic devices in daily life, they want easier ways to operate them, and the ultimate simplification would be commanding a device directly with brain waves. Users want electronics to respond to thought alone, without the use of hands. Scientists and engineers are working to make this happen, and great advances have been made toward the brain-computer interface (BCI): controlling electronic devices with brain waves. Forget the keyboard, mouse, touch screen, or even voice recognition: the real dream is simply thinking about what we want our gadgets to do. Imagine a future in which we can move anything with just our minds. The idea of interfacing minds with machines has long captured the human imagination.
The brain speaks a common language of electricity, and this is what allows us to interface it with electronic devices. The brain is made up of billions of cells called neurons, which use electrical impulses to communicate with one another. Millions of neurons signaling at once produce a substantial amount of electrical activity, which sensitive medical equipment such as an electroencephalograph (EEG) can detect by measuring voltages over areas of the scalp. The overall pattern of this electrical activity is commonly called a brainwave pattern: the brain regulates its activities by means of tiny electrochemical impulses of varied frequencies, which register on an electroencephalogram. Recent advances in neuroscience and engineering are making this idea a reality, opening the door to restoring, and potentially augmenting, human physical and mental capabilities. Medical applications such as cochlear implants for the deaf and deep brain stimulation for Parkinson’s disease are becoming increasingly commonplace. Brain-computer interfaces (BCIs), also known as brain-machine interfaces (BMIs), are now being explored in applications as diverse as security, lie detection, alertness monitoring, telepresence, gaming, education, art, and human augmentation.
A brain–computer interface (BCI), sometimes called a mind-machine interface (MMI), direct neural interface (DNI), or brain–machine interface (BMI), is a direct communication pathway between the brain and an external device. A BCI is a collaboration between a brain and an electronic device that enables signals from the brain to direct some external activity, such as control of a cursor or a prosthetic limb. When neurons in the brain interact, they produce measurable electrical currents known as brain waves. The four main types of brainwave patterns are delta, theta, alpha, and beta; these can be detected, interpreted, and transmitted wirelessly to control devices. The interface thus provides a direct communication pathway between the brain and the object to be controlled. BCIs are often aimed at researching, mapping, assisting, augmenting, or repairing human cognitive or sensory-motor functions. The BCI has long been a favorite of sci-fi movies, yet some early BCI products are already for sale. These products are crude, imprecise, and sometimes frustratingly unresponsive; that is how it goes with EEG-based headsets, which pick up only the faintest electroencephalographic echoes of neural activity through the skull. But these technologies are based on real BCI principles, and when they work, they offer a fascinating glimpse of the mind–machine mergers to come. Consider the potential to manipulate computers or machinery with nothing more than a thought. It isn’t merely about convenience: for severely disabled people, the development of a brain-computer interface could be the most important technological breakthrough in decades. It may soon be possible for anyone to control technology using a wearable mind-control device based on EEG (electroencephalogram) technology.
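The four bands mentioned above are conventionally defined by frequency range. As a minimal sketch of how they might be separated, assuming a NumPy environment and a single-channel signal sampled at a hypothetical 256 Hz, band power can be estimated from the signal's spectrum:

```python
import numpy as np

# Conventional EEG frequency bands, in Hz
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=256):
    """Estimate power in each EEG band from a 1-D signal sampled at fs Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2      # simple periodogram
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# Synthetic check: a 10 Hz sine wave falls inside the alpha band
t = np.arange(0, 2, 1 / 256)
powers = band_powers(np.sin(2 * np.pi * 10 * t))
```

Real EEG pipelines use windowed spectral estimates and per-channel artifact handling; this periodogram version only illustrates the band definitions.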
How it works
The reason a BCI works at all is the way our brains function. Our brains are filled with neurons: individual nerve cells connected to one another by dendrites and axons. Every time we think, move, feel, or remember something, our neurons are at work. That work is carried out by small electric signals that zip from neuron to neuron at speeds as high as 250 mph. The signals are generated by differences in electric potential carried by ions across the membrane of each neuron. Although the paths the signals travel are insulated by a substance called myelin, some of the electric signal escapes. Scientists can detect those signals, interpret what they mean, and use them to direct a device of some kind.
With an EEG or implant in place, the subject would visualize closing his or her right hand. After many trials, the software can learn the signals associated with the thought of hand-closing. Software connected to a robotic hand is programmed to receive the “close hand” signal and interpret it to mean that the robotic hand should close.
At that point, when the subject thinks about closing the hand, the signals are sent and the robotic hand closes. Once the basic mechanism of converting thoughts to computerized or robotic action is perfected, the potential uses for the technology are almost limitless. Instead of a robotic hand, disabled users could have robotic braces attached to their own limbs, allowing them to move and directly interact with the environment. This could even be accomplished without the “robotic” part of the device. Signals could be sent to the appropriate motor control nerves in the hands, bypassing a damaged section of the spinal cord and allowing actual movement of the subject’s own hands.
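The training loop described above, learning which signal pattern means "close hand," can be sketched as a simple classifier. This is an illustrative outline under assumed features (hypothetical band-power values), not any real system's algorithm:

```python
import numpy as np

class ThoughtClassifier:
    """Toy nearest-centroid classifier over EEG feature vectors."""
    def __init__(self):
        self.centroids = {}   # label -> mean feature vector

    def train(self, trials):
        """trials: dict mapping a label ('close', 'rest') to feature vectors."""
        for label, vectors in trials.items():
            self.centroids[label] = np.mean(vectors, axis=0)

    def predict(self, features):
        # Pick the label whose training centroid is closest to this trial
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(features - self.centroids[lbl]))

# Hypothetical training data: each vector is (alpha power, beta power)
clf = ThoughtClassifier()
clf.train({"close": [np.array([0.2, 0.9]), np.array([0.3, 1.1])],
           "rest":  [np.array([1.0, 0.2]), np.array([0.8, 0.3])]})
command = clf.predict(np.array([0.25, 1.0]))   # new trial near "close"
```

A robotic hand controller would then map the predicted label to an actuator command; the "many trials" in the text correspond to accumulating more vectors per label before averaging.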
One of the oldest and most common BCI applications is the cochlear implant. In a typical ear, sound waves enter and pass through several tiny organs that eventually pass the vibrations on to the auditory nerve in the form of electric signals. If the mechanism of the ear is severely damaged, the person will be unable to hear anything. However, the auditory nerves may be functioning perfectly well; they just aren’t receiving any signals.
A cochlear implant bypasses the nonfunctioning part of the ear, processes the sound waves into electric signals and passes them via electrodes right to the auditory nerves. The processing of visual information by the brain is much more complex than that of audio information, so artificial eye development isn’t as advanced. Still, the principle is the same.
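The processing step above, turning sound waves into per-electrode electric signals, can be caricatured as a filterbank: each electrode is driven by the energy in one slice of the audio spectrum. The channel edges below are illustrative assumptions; real implants use many more channels and far more sophisticated coding strategies:

```python
import numpy as np

# Hypothetical electrode channels, each covering a frequency band (Hz)
CHANNELS = [(100, 400), (400, 1000), (1000, 2500), (2500, 6000)]

def channel_levels(audio, fs=16000):
    """Map one audio frame to a stimulation level per electrode channel."""
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(audio))
    # Each electrode receives the spectral energy in its frequency band
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum()
            for lo, hi in CHANNELS]

# A 1500 Hz tone should drive mainly the third channel (1000-2500 Hz)
t = np.arange(0, 0.05, 1 / 16000)
levels = channel_levels(np.sin(2 * np.pi * 1500 * t))
```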
Electrodes are implanted in or near the visual cortex, the area of the brain that processes visual information from the retinas. A pair of glasses holding small cameras is connected to a computer and, in turn, to the implants. After a training period similar to the one used for remote thought-controlled movement, the subject can see.
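The camera-to-implant pipeline above can be sketched as downsampling each camera frame to a coarse grid of stimulation intensities, one per electrode. This is a purely illustrative reduction; real visual prostheses involve much more than block averaging:

```python
import numpy as np

def to_phosphene_grid(image, grid=(8, 8)):
    """Downsample a grayscale image to a coarse grid of electrode intensities."""
    h, w = image.shape
    gh, gw = grid
    # Average each block of pixels into one "phosphene" brightness value
    blocks = image[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
    return blocks.mean(axis=(1, 3))

# A bright square in one corner of a dark frame lights the matching electrode
img = np.zeros((64, 64))
img[:8, :8] = 1.0
grid = to_phosphene_grid(img)
```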
Every technology is said to have its pros and cons, and brain-wave interfaces to electronic devices will be no exception. However, user interest is the most important factor influencing the development and market growth of any technology.
The recent history of human technology shows an increasing number of products and services that can be controlled remotely and automatically by computer algorithms. But what are the prospects of controlling devices using our brain waves? A few paralyzed patients could soon be using a wireless brain-computer interface able to stream their thought commands as quickly as a home Internet connection.
By reading signals from an array of neurons and using computer chips and programs to translate those signals into action, a BCI can enable a person suffering from paralysis to write a book or control a motorized wheelchair or prosthetic limb through thought alone. Current brain-interface devices require deliberate conscious thought, but future applications, such as prosthetic control, are likely to work effortlessly.
One of the biggest challenges in developing BCI technology has been creating electrode devices and surgical methods that are minimally invasive. In the traditional BCI model, the brain accepts an implanted mechanical device and controls it as a natural part of its representation of the body. Much current research focuses on the potential of non-invasive BCIs.
One of the biggest challenges facing brain-computer interface researchers today is the basic mechanics of the interface itself. The easiest and least invasive method is a set of electrodes — a device known as an electroencephalograph (EEG) — attached to the scalp. The electrodes can read brain signals. However, the skull blocks a lot of the electrical signal, and it distorts what does get through. To get a higher-resolution signal, scientists can implant electrodes directly into the gray matter of the brain itself, or on the surface of the brain, beneath the skull. This allows for much more direct reception of electric signals and allows electrode placement in the specific area of the brain where the appropriate signals are generated.
This approach has many problems, however. It requires invasive surgery to implant the electrodes, and devices left in the brain long term tend to cause scar tissue to form in the gray matter. And although we already understand the basic principles behind BCIs, they don’t work perfectly, for several reasons:
- The brain is incredibly complex. To say that all thoughts or actions are the result of simple electric signals in the brain is a gross understatement. There are about 100 billion neurons in a human brain. Each neuron is constantly sending and receiving signals through a complex web of connections. There are chemical processes involved as well, which EEGs can’t pick up on.
- The signal is weak and prone to interference. EEGs measure tiny voltage potentials. Something as simple as the blinking eyelids of the subject can generate much stronger signals. Refinements in EEGs and implants will probably overcome this problem to some extent in the future, but for now, reading brain signals is like listening to a bad phone connection. There’s lots of static.
- The equipment is less than portable. It’s far better than it used to be — early systems were hardwired to massive mainframe computers. But some BCIs still require a wired connection to the equipment, and those that are wireless require the subject to carry a computer.
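The interference problem above is commonly handled by rejecting contaminated segments of the recording. A minimal sketch follows; the 100 µV cutoff is an illustrative assumption, chosen only because blink artifacts dwarf cortical EEG amplitudes:

```python
import numpy as np

BLINK_THRESHOLD_UV = 100.0   # illustrative cutoff, in microvolts

def clean_epochs(epochs):
    """Drop epochs whose peak amplitude suggests a blink or movement artifact."""
    return [ep for ep in epochs if np.max(np.abs(ep)) < BLINK_THRESHOLD_UV]

# Two quiet epochs and one containing a large blink-like spike (microvolts)
quiet = np.random.default_rng(0).normal(0, 10, 256)
blink = quiet.copy()
blink[100] = 300.0
kept = clean_epochs([quiet, blink, quiet])
```

Production systems use more refined methods (regression against an eye channel, independent component analysis), but simple thresholding conveys the idea: the artifact is hundreds of times stronger than the signal of interest.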
Video games have started to use EEG technology, equipping gamers with sleek headsets that claim to read the gamer’s mind and translate their thoughts into machine-readable instructions. Gamers can use their minds to drive a virtual car or create musically inspired brain-wave art. One firm has developed what it considers the next level in gaming: a headset that lets us control on-screen and physical objects using just our minds. In its demonstration, brain waves moved a car the size of a shoebox around a track, with two players wired up to headsets for each race.
Neural prosthetic devices also use the shared language of electricity to control robotic limbs, but through a somewhat more sophisticated interface with the brain. These devices use neural implants, arrays of electrodes implanted in the brain, to monitor a small set of neurons and detect an individual’s intention to maneuver an object such as a prosthetic limb. Mathematical algorithms then decode these brain signals and turn them into instructions that drive the prosthetic device.
Today it’s a headband or a helmet that reads brain waves from external EEG sensors, but to capture the subtleties a true user interface would require, we would need to put sensors inside the head or add more components, such as the visual implants described above. Relying on the brain in this way demands better electronics that can be implanted in a person’s body, which in turn requires new coatings and new chip research.
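The decoding step mentioned above, turning recorded neural activity into movement commands, is often modeled as a linear mapping from firing rates to velocity. A toy least-squares sketch with entirely synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic session: 200 time bins, 10 recorded neurons, 2-D cursor velocity
true_W = rng.normal(size=(10, 2))            # hidden rates-to-velocity mapping
rates = rng.poisson(5.0, size=(200, 10))     # spike counts per time bin
velocity = rates @ true_W + rng.normal(0, 0.1, size=(200, 2))

# Fit a linear decoder by least squares: velocity ≈ rates @ W
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new observation of firing rates into a velocity command
decoded = rates[0] @ W
```

Real decoders add temporal filtering (e.g. Kalman filtering) and periodic recalibration as the recorded neuron population drifts, but the core idea is this regression from spike counts to intended movement.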
The “mind control” headband unveiled by the startup BrainCo reads brain signals for a range of possible applications, from helping to improve attention spans to detecting disease, controlling smart-home appliances, or even operating a prosthetic device.
The helmet is built around a NeuroSky headset, an EEG device that senses the activity of neurons in the brain and indicates whether a person’s mental state is more meditative or attentive.
The US Department of Defense is pushing for the development of cheap, wearable systems that can detect people’s brain waves and display the data on smartphones or tablets.
Acknowledgement: The information in this article was drawn from various Internet references and sources, whose use is gratefully acknowledged.