Graham’s Reaction Pad (Big Life Fix)

Graham Bullivant’s Reaction Pad is a bespoke communication aid and environmental controller built in partnership with Ruby Steel and Mettle Studio as part of the BBC2 TV series The Big Life Fix. The system comprises a bespoke iPad app and two bespoke hardware interfaces – a touch-sensitive arrow pad and a joystick. The Reaction Pad allows Graham to react quickly to what is going on around him using his own voice rather than a synthesised one. He is also able to control the brightness and colour of his lights (for preference or to attract attention) and a thermostat. The hardware interfaces allow him to adapt how he controls the system to suit his capabilities and energy levels. The Reaction Pad is focused on a small set of functions that Graham felt were very important, and so is a complement to other communication aids rather than a replacement.

After a review by John Bercow and many years of campaigning by organisations like the charity Communication Matters, people like Graham (who could benefit from a digital communication aid) have a right to one through the NHS. How this right is translated into actual devices getting to the people who need them is still being defined, as service provision spreads out from the small number of centres of expertise which had previously been focused on education. I hope they can be provided in a manner as consistent with the Social Model of Disability as possible, where users like Graham have a high degree of control over the technology that is supplied to them whilst still benefiting from the knowledge and expertise of the amazing professionals in the sector.

The problem

Graham made it clear to us very early on that he felt extremely frustrated at feeling shut out of conversations, because of both the time it took him to formulate responses with the communication setup he was using and his unwillingness to use a synthesised voice. We asked him to prioritise different attributes of a communication aid and he put speed and emotion at the top of the list, with the ability to have a large vocabulary and be specific much lower down.

With this extremely strong steer from him we knew we needed to build a tool that prioritised selecting quickly from a small set of options above all else. More worrying for us, we knew that he would not accept anything with a synthesised voice.

Zoe (Graham’s wife) had complained that one of the early impacts of Graham’s stroke had been a loss of control of the house, as Graham had always taken care of things like the heating. Because of this we wanted to incorporate environmental controls into whatever we built, so we could give that control back to Graham.

On the input side we knew that pressing the iPad screen with a stylus worked for Graham, but that it demanded a lot of energy from him and he was often too tired to hold the stylus. Additionally, on some days his hand movement was reduced and he was not able to reach all the way across the screen. We knew that the iPad screen could not be the only interface to the system: we had to allow Graham to take a break and give him something he could use on days when he had low hand movement.

Initial experiments

We began our experiments with two hypotheses:

1) We might be able to get around the problems with synthesised voice by instead using carefully chosen clips from film and TV.

2) We could capture Graham’s gestures with a piece of hardware that worked similarly to the stylus he was already using but required a smaller range of movement.

To investigate hypothesis 1 I built a sound board app using App Inventor and handed it over to Ruby, who populated it with carefully chosen emotional sound clips from sources ranging from The Thick of It to Pulp Fiction.

To investigate hypothesis 2 I rigged a stylus with an accelerometer, an Arduino Pro Micro and an Adafruit Bluefruit EZ-Key. It was possible to use the stylus to press buttons on the sound board in the conventional manner, but also to leave the point in place and use it more like a joystick to scan between buttons. I wrote Arduino code so that buttons could be selected by tapping the end of the stylus on any surface.
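
The core of that Arduino code was simple. A simplified sketch along these lines captures the idea – it is not the exact code I ran, and it assumes an analog three-axis accelerometer (something like an ADXL335) on A0–A2 with the Bluefruit EZ-Key on the Pro Micro’s hardware serial; the thresholds and key codes are illustrative:

```cpp
// Simplified sketch of the tilt-and-tap stylus idea. A sustained tilt away
// from the resting orientation is treated as a direction (like a joystick),
// and a sharp spike on the Z axis (tapping the tip on a surface) is a select.
const int X_PIN = A0, Y_PIN = A1, Z_PIN = A2;
const int TILT_THRESHOLD = 60;   // ADC counts away from the resting reading
const int TAP_THRESHOLD  = 150;  // sudden change on Z = tap on a surface

int restX, restY, restZ;
int lastZ;

// Send a single key press to the EZ-Key as a raw HID report (0xFD, modifier,
// 0x00, six key codes), followed by an empty report to release the key.
void sendKey(uint8_t keycode) {
  uint8_t press[9]   = {0xFD, 0, 0, keycode, 0, 0, 0, 0, 0};
  uint8_t release[9] = {0xFD, 0, 0, 0, 0, 0, 0, 0, 0};
  Serial1.write(press, 9);
  delay(30);
  Serial1.write(release, 9);
}

void setup() {
  Serial1.begin(9600);          // the EZ-Key's default baud rate
  delay(500);
  restX = analogRead(X_PIN);    // take the power-on orientation as "centre"
  restY = analogRead(Y_PIN);
  restZ = analogRead(Z_PIN);
  lastZ = restZ;
}

void loop() {
  int x = analogRead(X_PIN);
  int y = analogRead(Y_PIN);
  int z = analogRead(Z_PIN);

  // A sharp change on Z means the stylus tip was tapped on a surface: select.
  if (abs(z - lastZ) > TAP_THRESHOLD) {
    sendKey(0x28);              // HID key code for Enter
  }
  // Otherwise treat a sustained tilt as a direction, like a joystick.
  else if (x - restX > TILT_THRESHOLD) {
    sendKey(0x4F);              // Right arrow
  } else if (x - restX < -TILT_THRESHOLD) {
    sendKey(0x50);              // Left arrow
  } else if (y - restY > TILT_THRESHOLD) {
    sendKey(0x52);              // Up arrow
  } else if (y - restY < -TILT_THRESHOLD) {
    sendKey(0x51);              // Down arrow
  }

  lastZ = z;
  delay(100);                   // crude rate limiting / debounce
}
```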

We took these prototypes to Graham to try out. Things went well at the beginning: our hunch about the clips was good and Graham really liked some of them, particularly Malcolm Tucker saying “I’ve told you to f**k off twice already and yet you’re still here” and Marsellus Wallace saying “I’m pretty f***ing far from OK”.

The stylus/joystick hybrid was less of a success. Without a force to return it to centre, Graham found it difficult to stop movement in a particular direction once it had started. This was a useful lesson – different is usually not better – and it looked like we would need to build an actual joystick. I had brought an arcade button with me; we tried it in Graham’s hand and he was easily able to grip it and press the button.

The final experiment we did on that visit was to get Graham to mark out the extent of his hand movement (on both a good day and a bad day) on a piece of paper.

We had a look at the bad day range and I drew a set of five buttons (four arrows and select) on a piece of cardboard that fitted within that envelope. I gave the cardboard to Graham and got him to simulate pressing the buttons to see if he was able to reliably select a particular one.

Half-way there

The next time we saw Graham we had made some significant advances.

Mettle Studio had started working on an iOS app that incorporated the sound board functionality from my Android app, along with the capability, through Apple’s HomeKit, to control products like smart light bulbs and thermostats. Ruby had worked out a four-level hierarchy for all of the commands in the app, based on the interviews she had done with Graham previously.

I had made a joystick that used the arcade button to select. It used a Sparkfun arcade joystick module with a casing designed in 123D Design and 3D printed on the Ultimaker 2s at Machines Room. All of the switches were wired directly to a Bluefruit EZ-Key, so it didn’t need any programming.

[Image: joystickv1]

I also made the first arrow pad, based on the dimensions of the cardboard prototype we had tested with Graham on the previous visit. We knew this needed to be light, so it could stay with Graham as much as possible (giving him control of the iPad even when it wasn’t on his lap), and ideally waterproof, so he could even take it into the shower.

[Image: arrowpad1]

I used laser-cut corrugated polypropylene as the main structure, with thin (0.5mm) acrylic as the touch surface. Over at Bare Conductive we screen printed their Electric Paint onto the reverse side of the touch surface to create capacitive sensing electrodes (effectively touch-sensitive pads). I connected the electrodes to an MPR121 on a breakout board from Sparkfun. Crosstalk can be a problem when connecting up these chips, so I routed the wires up the flutes of the corrugated polypropylene to prevent them coming into close proximity.

[Image: arrowpadwires]

Once the wires were routed through the plastic, with one end of each soldered to the MPR121, I soldered the other ends onto small pieces of copper tape. I stuck these to the corrugated plastic below where each touch-sensitive pad would sit. I put a dab of Electric Paint on the top of each of the pieces of copper tape and then stuck down the acrylic (with the touch pads printed on the bottom surface) to the corrugated polypropylene with superglue.

[Image: arrowpadwired]

I connected the MPR121 to an Arduino Pro Micro, basing the code on the Tailored Touch system Sam Jewell and I had built and published back in 2013 as part of the Enabling Technology project. As we needed to connect to an iPad (no USB port!) I connected the Arduino to another Bluefruit EZ-Key and added a LiPo battery and a USB charging module.
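
For anyone wanting to build something similar, the arrow pad firmware boils down to something like the sketch below. This is a simplified illustration rather than the code I actually used (which was adapted from Tailored Touch): it assumes Adafruit’s MPR121 library and the EZ-Key on the Pro Micro’s hardware serial, and the electrode-to-key mapping is just an example.

```cpp
// Simplified sketch of the arrow pad firmware: read the five Electric Paint
// electrodes through the MPR121 and turn new touches into key presses sent
// to the Bluefruit EZ-Key over serial.
#include <Wire.h>
#include "Adafruit_MPR121.h"

Adafruit_MPR121 cap = Adafruit_MPR121();

// MPR121 electrode number -> USB HID key code (up, down, left, right, select)
const uint8_t KEYCODES[5] = {0x52, 0x51, 0x50, 0x4F, 0x28};

uint16_t lastTouched = 0;

// Send a raw HID report to the EZ-Key (0xFD, modifier, 0x00, six key codes),
// followed by an empty report to release the key.
void sendKey(uint8_t keycode) {
  uint8_t press[9]   = {0xFD, 0, 0, keycode, 0, 0, 0, 0, 0};
  uint8_t release[9] = {0xFD, 0, 0, 0, 0, 0, 0, 0, 0};
  Serial1.write(press, 9);
  delay(30);
  Serial1.write(release, 9);
}

void setup() {
  Serial1.begin(9600);          // the EZ-Key's default baud rate
  if (!cap.begin(0x5A)) {       // default I2C address of the MPR121 breakout
    while (1);                  // nothing to do if the chip isn't responding
  }
}

void loop() {
  uint16_t touched = cap.touched();       // one bit per electrode

  for (uint8_t i = 0; i < 5; i++) {
    bool isTouched  = touched     & (1 << i);
    bool wasTouched = lastTouched & (1 << i);
    if (isTouched && !wasTouched) {       // act on the rising edge only
      sendKey(KEYCODES[i]);
    }
  }

  lastTouched = touched;
  delay(20);
}
```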

With a partially completed app and a working joystick and arrow pad we went back to see Graham. He liked the app and was happy with the overall structure of the commands. Unfortunately there were many things he wanted to say that we just didn’t have an appropriate audio clip for.

He also liked the arrow pad, and it worked to control the app, but I had made the buttons too large and too far apart, so it required too much of a stretch to use on a bad day. Graham also requested that the pad be mountable to the button on his fly so it could stay with him even when he was moved.

The joystick also had a major issue. Moving it down, left and right was OK, but moving it up required Graham to pull it towards him at an angle he couldn’t really manage. We would need a joystick with a totally different grip style. Additionally, Graham needed a way of bracing the joystick against himself: he was only able to use it when it was sitting on top of his iPad.

Ruby and I left the hospital with me knowing I would need to scrap both prototypes and start again, and with both of us feeling that a sound board app with emotional clips from films (whilst it might solve Graham’s problem of getting attention and reacting) was a long way from ‘giving him his voice back’.

The VHS

After the hospital we went back to Graham and Zoe’s house to look at how we could set up the smart bulbs and thermostat. Zoe had mentioned that, in the process of making the changes to the house required for Graham’s return, the builders had uncovered a box of VHS tapes she had thought had been lost.

We got a VHS player set up and put in the first tape. There was Graham speaking. This was a fantastic opportunity. One of the benefits of being in the middle of making a TV program was that we had access to a team perfectly set up for rapidly ingesting, cataloguing and editing video footage.

The reaction pad

The tapes went away for ingestion and Ruby got back a file with all of the clear audio of Graham speaking. She spent many long hours trimming out the appropriate bits and combining audio from multiple clips to create new phrases.

[Image: arrowpadinside]

I built a new, smaller arrow pad, with a fly mount laser-cut from neoprene. I also added induction charging (to replace the USB socket) so that it would be water resistant.

I built a new joystick with a ball-type head and a bespoke button. I added holes in the bottom so it could be bolted to a roughly A4-sized piece of 3mm acrylic to brace against Graham’s body when he was using it.

[Image: joystick1]

We also brought with us a powerful battery-powered Bluetooth speaker to dramatically increase the volume of the sound coming from the iPad.

Unfortunately, due to delays with Graham’s house, we needed to wrap up making the TV program before Graham would be able to go home. This meant that we could not show any of the environmental controls in the ‘big reveal’ or see how Graham used the reaction pad in the busy situations it was designed for.

We handed everything over to Graham and Zoe at the hospital nonetheless, and it went well. The clips of Graham speaking had the desired effect, completely transforming the tone of the app in a way that both he and Zoe really appreciated. The joystick worked really well, with the arrow pad proving a bit temperamental until I eventually adjusted the touch sensitivity in the Arduino code. The only major setback was realising that, because of its dependence on capacitive touch (basically sensing the proximity of wet bodies), the arrow pad would not work in the shower.
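
That sensitivity fix amounted to tweaking the MPR121’s touch and release thresholds in the arrow pad firmware. A minimal illustration is below – the exact numbers took some trial and error, and the values shown are only an example:

```cpp
// Minimal example of adjusting the MPR121's sensitivity with Adafruit's
// library: higher threshold values need a firmer touch before a press
// registers, lower values trigger more easily (the library defaults to 12/6).
#include <Wire.h>
#include "Adafruit_MPR121.h"

Adafruit_MPR121 cap = Adafruit_MPR121();

void setup() {
  cap.begin(0x5A);              // default I2C address of the breakout
  // Note: the double-h spelling is how the Adafruit library names this method.
  cap.setThreshholds(20, 10);   // touch threshold, release threshold
}

void loop() {}
```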

It was fantastic to see Graham use the tool in the way he had planned: to interact with Zoe, but also to quickly and vocally make his wishes known to the nurses and other people working around him in the hospital. We went out in an ambulance and, using the joystick, he was able to interact with the iPad even though he wasn’t holding it and we were bumping around on the roads. I really knew we had built Graham what he wanted when he used it to interrupt Simon.

We’ve been back to see Graham twice since the reveal. With all the work on his house done, he is now back at home and able to use the pad to control the lights, including changing the colour, and, once it is fitted, the thermostat. I’m looking forward to seeing how he uses the reaction pad in the social situations it was designed for.

Communication aids

The reaction pad we made for Graham is a relatively simple embodiment of a digital communication aid. These devices have been around for about 25 years and have been steadily improving. The talented and dedicated people who manufacture, prescribe and maintain devices like these call the sector Augmentative and Alternative Communication (AAC).

The reaction pad is not a replacement for one of these systems – it is merely a complement, focused specifically on a small subset of things Graham wanted to do quickly and with a minimum of energy expenditure. A great benefit of being in the privileged position of being asked to make a one-off, for a specific individual, is being able to apply this laser-like focus to the design. No one making a mass-produced product has this luxury.

On the flip side, devices and apps that have years of development behind them and are used by thousands of people are much more capable. Hardware like Tobii’s eye gaze products allows people with far less movement than Graham to fully express themselves, whilst software like Smartbox’s Grid products allows a speech and language therapist to take a child all the way from learning simple cause and effect to speaking, writing texts, emails, social media posts and essays.

If you know someone who needs a communication aid, or would like to donate money so that more people like Graham are able to get their voices back, get in touch with Communication Matters.