To blur the lines between man and machine is an ambition lifted straight from the pages of science fiction. Take, for instance, Imp, a disembodied and progressively unhinged human brain sent into space to control humanity’s interplanetary monitoring platform in Joseph McElroy’s 1977 postmodern novel Plus.
Sci-fi anticipates the invention of technologies we hope one day to utilise. And while Imp’s invention remains unlikely – the concept almost certainly violates Isaac Asimov’s Laws of Robotics, given the harm inflicted on a human being – researchers still hope to deliver brain-computer interfaces that put our minds directly in control. Using brain signals as a control point could, for example, enable medical applications that empower the patient, perhaps even detecting illnesses automatically.
Returning to sci-fi for inspiration, we could instead cite The Terminal Man, published in 1972, whose protagonist relies on a neuromodulator to temper the frequency of his violent epileptic seizures. Twenty-five years later, a vagus nerve stimulator received US regulatory approval to manage epilepsies resistant to medication and surgery.
More recently, a study led by Theodoros Zanos, an assistant professor at the Feinstein Institute for Medical Research’s Neural Decoding and Data Analytics Laboratory, has demonstrated in mice how an implant could flag potential health complications by interpreting nerve signals passing between body and brain. The work was supported by GE Global Research, the R&D division of industrial technology manufacturer General Electric.
Other brain-computer interfaces could make it easier for the disabled to engage with the world around them. Social network operator Facebook’s new research unit, Building 8, is working on one such technology, a neurological imaging implant that could enable us to type, using only our brains, at a rate of 100 words per minute. Elsewhere, US-based Synchron has closed a $10m series A round ahead of clinical trials of a miniaturised neurological device that could one day help paralysed patients control communication aids or robotic limbs.
Nearer-term developments in interface technologies may seem more mundane by comparison. Human-machine interfaces (HMIs) broadly include any system through which people interact with their computers, and thus we must also consider the successors to the humble mouse and keyboard. Consumers are now accustomed to tapping and swiping their smartphone touchscreens, dictating memos to voice-activated assistants or even immersing themselves in virtual or augmented reality. Healthcare CVCs, among others, are increasingly looking at similar functionality for their systems.
The number of deals in the four health subsectors most likely to generate HMIs rose to 98 in 2017 from 84 the previous year, according to GCV Analytics, though this was lower than the 105 recorded in 2015. Those four subsectors are health IT and administration, internet-of-things products, medical devices and diagnostics, and virtual or augmented reality technologies. Together, they registered a cumulative deal value of $2.5bn last year, up from $2.3bn in 2016 and $1.3bn in 2015.
Roel Bulthuis, senior vice-president and managing director of M Ventures, the strategic venture fund of Germany-based pharmaceutical firm Merck, said near-term interface developments, rather than electronics that plug directly into the body, were piquing his company’s interest.
He said: “What we are very interested in is that we have so many interfaces with technology in our day-to-day lives. Our interest is how we can use those interfaces when I use my phone, when I connect to wifi spots, to beacons. I have phones and I have electronics in my car and everywhere.
“So, there are so many things that I do as an individual or as a patient. Just as an example, if I have Parkinson’s disease at a stage where I am still using my cell phone effectively, there is a gyroscope in my cell phone and I can record my voice on my cell phone. Can I use those measurements to inform my physician whether he needs to titrate my medication up or down?
“I think that these kinds of approaches are very interesting. It is really difficult to introduce a medical device that patients need to use to record something unless it has a huge impact. It is really easy to say: ‘I am generating data, why do I not use that to help us manage disease in a better way?’ ”
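The smartphone-sensor idea Bulthuis describes can be illustrated with a minimal sketch. Parkinsonian rest tremor typically oscillates in roughly the 4–6Hz band, so a crude screen could estimate the dominant frequency of a phone’s gyroscope trace and flag readings that fall in that band. Everything below – the sample rate, the frequency band, the `estimate_frequency` and `flag_tremor` helpers – is a hypothetical illustration, not a description of anything Merck or M Ventures has built.

```python
import math

def estimate_frequency(samples, sample_rate_hz):
    """Estimate a trace's fundamental frequency (Hz) by counting zero crossings."""
    mean = sum(samples) / len(samples)
    centred = [s - mean for s in samples]
    # Each full oscillation cycle crosses zero twice.
    crossings = sum(
        1 for a, b in zip(centred, centred[1:]) if (a < 0) != (b < 0)
    )
    duration_s = len(samples) / sample_rate_hz
    return crossings / (2.0 * duration_s)

def flag_tremor(samples, sample_rate_hz, band=(4.0, 6.0)):
    """Flag a trace whose estimated frequency sits in the rest-tremor band."""
    frequency = estimate_frequency(samples, sample_rate_hz)
    return band[0] <= frequency <= band[1]

# Simulated gyroscope trace: a 5 Hz oscillation sampled at 100 Hz for 5 seconds.
samples = [math.sin(2 * math.pi * 5.0 * (i / 100.0)) for i in range(500)]
print(flag_tremor(samples, 100.0))  # True: ~5 Hz falls inside the 4-6 Hz band
```

A production system would need far more – noise filtering, activity detection, clinical validation – but the sketch shows why commodity sensors make this kind of passive monitoring plausible.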
Healthcare businesses in this space include Akili Interactive Labs, which creates interactive video games to build the cognitive abilities of people with conditions such as autism spectrum disorder and major depressive disorder. Akili, which closed a $55m series C round backed by M Ventures in May 2018, uses designs partially created by cognitive neuroscientists to target affected neural systems.
Industrial interfaces are also becoming more nuanced, with industrial CVCs searching for HMIs that could be imported from other segments to give clients a tighter command of their day-to-day operations. Tyler Durham, venture principal at oilfield services firm Schlumberger, said gains could be made from interfaces finding favour in the consumer and computing sectors, though many of these plans remain early stage.
Schlumberger regards virtual and augmented reality (VR and AR) as two of the most important emerging technologies in the interface space, according to Durham. He said: “One of the key things we are concentrating on is trying to take the cool, showy AR and VR technology that is out there and apply it back into our data. That could mean freeing up people’s hands when they are working or connecting them remotely to subject-matter experts.
“We opened up our research centre in Menlo Park recently and invited people [representing companies] from all over the world to look at the VR tech coming out of the valley, to let them try out new hardware and see demos from companies like Google, Microsoft or [industrial AR technology developer] RealWear, letting the business units see that type of technology so they can go back, look at their day-to-day operations and see how to apply that technology.”
In addition to VR, Schlumberger is keeping an eye on the possibilities of full-body interfaces such as exoskeletons, wearable suits or vests that feature embedded supports, equipment or electronics. Durham said one of the corporate’s portfolio companies would begin testing the lower half of a model exosuit later this year.
Schlumberger executives joined the Exoskeleton Technical Advisory Group initiative formed by robotics and microelectromechanical technology developer Sarcos Robotics in March 2018. The group featured industrial corporate representatives from GE, engineering and construction company Bechtel, carmaker BMW, construction machinery supplier Caterpillar, air carrier Delta Air Lines, industrial assembly parts supplier Würth Group and equipment dealer KG Industrial Product.
Durham said: “It is about seeing how [the exoskeleton] develops in one stream and with VR and AR in the other stream. I think it is going to be a progressive development.
“What we took away from the [Menlo] workshop is that some things can be done with today’s technology, so it is about understanding what we currently do and trying to make those processes that little bit easier. It is about trying to get comfortable with it, maybe get some small wins to get us more familiar before we see it changing things fundamentally.”
It is clear corporate venturing approaches to interfaces can vary depending on one’s view of the adoption timeframe for such technologies. Corporates will naturally seek easily implementable interfaces that could yield rapid gains, but it is also important to track more fundamental changes that require significant research. Both approaches are likely to result in more venturing dollars pouring into companies working with innovative interfaces in the years ahead.