- Apple has announced plans to support Switch Control input from brain-computer interfaces
- The feature would make devices such as iPhones and the Vision Pro accessible to people with conditions such as ALS
- Combined with Apple's AI-powered Personal Voice feature, brain-computer interfaces could let people think of words and hear them spoken in a synthetic version of their own voice
Our smartphones and other devices are essential for countless personal and professional tasks throughout the day. But using these devices can be difficult or downright impossible for people with ALS and other conditions. Apple thinks it has a possible solution: thought. More specifically, a brain-computer interface (BCI) built with the Australian neurotech startup Synchron, which could provide hands-free, thought-controlled versions of the operating systems for iPhones, iPads, and the Vision Pro headset.
A brain implant to control your phone may seem extreme, but it could be the key for people with severe spinal cord injuries or related conditions to engage with the world. Apple will support Switch Control for those who have the implant embedded near the brain's motor cortex. The implant picks up electrical signals from the brain when a person thinks about moving. It translates that electrical activity and feeds it to Apple's Switch Control software, turning it into digital actions such as selecting icons on a screen or navigating a virtual environment.
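To make that translation step concrete, here is a minimal Swift sketch. Everything in it is hypothetical: neither Apple nor Synchron has published a BCI API, so the type and function names below only illustrate the idea of mapping decoded motor intents onto Switch Control-style actions.

```swift
import Foundation

// Hypothetical motor intents a BCI decoder might emit after classifying
// electrical activity from the motor cortex. These names are illustrative,
// not part of any Apple or Synchron API.
enum MotorIntent {
    case select        // e.g., the user imagines a grasping motion
    case moveNext      // advance the scanning cursor
    case movePrevious  // move the scanning cursor back
}

// Hypothetical Switch Control-style actions on the device side.
enum SwitchAction {
    case activateFocusedItem
    case advanceFocus
    case retreatFocus
}

// The translation stage: map a classified intent to a device action.
// In a real system this would sit between the implant's signal-processing
// pipeline and the operating system's accessibility layer.
func switchAction(for intent: MotorIntent) -> SwitchAction {
    switch intent {
    case .select:       return .activateFocusedItem
    case .moveNext:     return .advanceFocus
    case .movePrevious: return .retreatFocus
    }
}

// Example: a stream of decoded intents becomes a stream of UI actions.
let decodedIntents: [MotorIntent] = [.moveNext, .moveNext, .select]
for intent in decodedIntents {
    print(switchAction(for: intent))
}
```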
Brain implants, AI voices
Of course, the system is still in its early days. It can be slow compared to tapping, and it will take time for developers to build better BCI tools. But speed isn't the point right now. The point is that people could use a brain implant and an iPhone to engage with a world they were otherwise locked out of.
The possibilities grow even larger when you consider how this could pair with AI-generated personal voice clones. Apple's Personal Voice feature lets users record a sample of their own speech so that, if they lose the ability to speak, they can generate synthetic speech that still sounds like them. It isn't quite indistinguishable from the real thing, but it's close, and far more human than the familiar robotic voices of old films and television shows.
Currently, these voices are triggered by touch, eye tracking, or other assistive technologies. But with BCI integration, these same people could "think" their voice into existence. They could simply intend to speak, and the system would do the rest. Imagine someone with ALS not only navigating their iPhone with their thoughts, but also speaking through the same device by "thinking" instructions to their synthetic voice clone.
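On the speech side, Apple does expose a public API for Personal Voice, which shipped with iOS 17. The sketch below, assuming that iOS 17 AVSpeechSynthesizer API, shows how a decoded phrase could be spoken in the user's own Personal Voice; the phrase itself is just a stand-in for whatever text a BCI pipeline might produce.

```swift
import AVFoundation

// Keep a reference to the synthesizer so speech isn't cut off
// when it would otherwise be deallocated.
let synthesizer = AVSpeechSynthesizer()

// Speak a phrase using the user's Personal Voice, if one exists.
func speakWithPersonalVoice(_ text: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Find a voice the user created with the Personal Voice feature.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice  // falls back to the default voice if nil
        synthesizer.speak(utterance)
    }
}

// Example: the string here stands in for BCI-decoded text.
speakWithPersonalVoice("Good morning")
```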
It is already incredible that a brain implant can let someone control a computer with their mind, but AI could take it to another level. It wouldn't just help people use technology; it would help them be themselves in a digital world.