Lauran Doak explores how children with learning disabilities engage with an iPad storymaking app.

An iPad storymaking app allows users to create their own stories by combining photos, videos, text and audio recordings of their own voice. Virtual pages can be ‘turned’ by swiping the screen, or sometimes by using switch access (some storymaking apps are designed for disabled users). Storymaking apps can engage students, including those who do not independently read and write. Stories can help celebrate, evoke memories, connect families, reassure, teach, prepare and entertain. Interaction with the app can occur at multiple levels, from direct editorial control over story assemblage to indirect ‘co-authoring’, in which a child’s embodied experience and expressed preferences are represented in story format.

In my research I worked with Gavin, whose stories covered his favourite band One Direction, a family holiday, and costumes he has worn over the years. His mother Rachael scaffolded the storymaking process by collating the photos and videos, encouraging Gavin to add more content, and assisting Gavin with typing words by ‘sounding out’ the spelling. Voice recording motivated Gavin to develop his engaging ‘story voice’: one home video showed him punching the air and shouting ‘YES!’ after a successful recording. The app used in this study also allows verbal dictation of intended text, letting the user bypass typing completely. Gavin did not use this feature during the study, but Rachael reflected that in future it might encourage him to write (dictate) longer and more detailed stories. The app additionally has a ‘word prediction’ feature which anticipates the word you are attempting to type. Rachael was initially hesitant about this, but noted, ‘if it makes his life easier and motivates him to want to do more, then the predictive text thing isn’t such a bad idea’.

The second young person to directly engage with storymaking was George, aged six, who has a diagnosis of autism. George enjoyed producing two stories independently. The first consisted of a single page with a photo of George and his mother and the typed label ‘Mummy and George’. In the second story, George discovered the word prediction feature and used it to generate longer sentences. His mother Emily described what happened: “He was … just pressing the word prediction button, and he was giggling his head off. Then he was pressing it and pressing it and then listening to it back and repeating it and copying it.” Even though the final text does not entirely cohere, the value of the process for George should not be underestimated. Predictive text allowed him to experience the relationship between written and spoken language in a playful way, and he could play back his compositions using the synthesised voices, which he found great fun.

Eve is 11 years old and has profound and multiple learning disabilities (PMLD). She was supported by her mother, Anna, who initially experimented with stories based on real events using family photos and videos, but found Eve’s engagement only moderate compared to her enjoyment of traditional children’s storybooks. Home videos suggested a degree of impatience on Eve’s part, with frequent screen swiping to turn the page before narration had finished. So Anna decided to use the app to create three digital adaptations of Eve’s favourite storybooks. She did this by photographing the books page by page, uploading the images and recording narration as she would normally read it. Eve showed much more engagement with these digital stories, consistently smiling at her favourite bits. Her screen swiping was better spaced, and she paused to listen to the full recording. Anna concluded that while digitally converted stories would not replace in-person story sharing, they did carry different advantages: “I think she was quite proud that she was telling herself the story … ‘Look what I’m doing, I can do it myself’”. It also makes the family story sharing experience exportable to other contexts: ‘if she’s ever … staying elsewhere, in respite or whatever, if she’s got our voices with the familiar stories … it’s perfect’.

Matthew is nine years old and also has PMLD. He was supported by his mother, Laura. As with Eve, some trial-and-error was required. Laura soon discovered that the built-in synthesised voices did not engage Matthew as well as her own recorded voice, and that unfamiliar content, such as videos of distant family members, was not well received. With practice, however, Laura was able to create stories which reflected Matthew’s lived experience of the world as a wheelchair user and a ‘Sensory Being’. For instance, one of Matthew’s most engaging stories was A Visit to a Castle, which foregrounded Matthew’s sensory experiences of geese honking, the shade of the trees overhead, the taste of ice-cream, and the jiggling feeling of having your wheelchair pushed over cobblestones. Laura incorporated fun sound effects: for example, she read the line ‘I don’t know if I like it!’ in a shaky voice to evoke the jiggling feeling of the cobblestones. Matthew seemed highly engaged by this story, giggling at the geese, maintaining eye contact with the screen and reaching to touch the iPad. Laura reflected that this story could be used as a celebration of happy memories, or as a digital object of reference to prepare for a future visit. Matthew also responded well to stories containing sound effects or music: a story about his dad’s birthday with an audio recording of the family singing Happy Birthday proved engaging. The final story Laura created was My Haircut, which used images and recorded sounds to evoke the stages of the process: running water, electric hair clippers, the snipping of scissors. As Matthew finds haircuts stressful, Laura hopes to use this story as a digital object of reference before each haircut and, if this is successful, to write further digital stories to prepare for other stressful events such as hospital visits.

Story topic choices

I have argued that it is helpful to view young people like Matthew and Eve as ‘co-authors’ of their own stories, rather than mere recipients. For example, Eve’s agency in expressing a preference for certain books, for rhyming and for engaging narration with anticipatory build-up deeply influenced the format and content of her digital stories. Meanwhile, Matthew’s stories were shaped by his preference for music, sound effects and representation of his own sensory experience. Matthew and Eve clearly expressed preferences through their embodied communication such as eye gaze, facial expression, vocalisation, posture and gesture—whether during a previous event such as the castle visit, or during prior storysharing—and this self-expression contributed in a meaningful way to the digital stories eventually created.

My research challenges some preconceived notions of who can benefit from storymaking apps. Many families of young people with severe or profound and multiple learning disabilities expressed doubts about the relevance of a ‘literacy’ app, or about the contribution their child could make. Some had been advised to use simpler cause-and-effect style apps which allow instant triggering of sound or visual effects by touching the screen. However, as Eve’s mother Anna noted, a storymaking app gives ‘the opportunity for progression, because you can simplify and complicate things as much as you want’. Thinking of such users as co-authors who contribute through their embodied experience, rather than as mere recipients of stories, reminds us that literacy and literacy apps are for everyone.

Dr. Lauran Doak, Senior Lecturer in Special & Inclusive Education, Nottingham Institute of Education, Nottingham Trent University.

Website: bit.ly/3KOWBSl
Twitter: @LauranDoak
