Smartphone Self-Paced Learning With a Chatbot in Malaysian Older Adults

May 14th, 2025

Today, almost everyone has a smartphone. Smartphone ownership among older adults has increased over the years [1], and during the COVID-19 pandemic, many older adults turned to technology to maintain social interactions with loved ones [2]. Unlike young adults and adolescents, older adult smartphone users hold differing views on perceived usefulness and ease of use, offering us a window into their smartphone usage and perceptions [3]. In a meta-analysis, Ma et al. [3] showed that older adults perceive smartphone technology to be beneficial if they find it easy to use. This finding is not surprising, because ease of use fosters positive perceptions of technology. The more interesting question is whether older adults continue to perceive the technology as useful when it is difficult to use. For example, some phones have physical buttons, while other models rely on touch screens. In addition, applications on a device can have overlapping functions, such as when a user has several social media applications installed. The different ways devices and applications operate, between feature phones and smartphones, and between iOS and Android, can further confuse older adults.

Research has shown that cognitive training done on computers results in modest learning gains with the potential to minimize cognitive decline in older adults.

Thus far, only a handful of studies have attempted to design such interventions. One study showed older adults successfully learning to use a specially designed health-related application on their smartphones [6], while two other studies showed an increase in knowledge of how to use a smartphone [7], [8]. Research in this area remains limited, with the three studies mentioned earlier conducted in Taiwan, China, and The Netherlands, indicating that we still have much to discover about getting older adults to use their smartphones as learning tools. In this project, we designed a three-week training for Malaysian older adults. We introduced a new topic each week: a chatbot in Week 1, a Quick Response (QR) scanner in Week 2, and Google Drive in Week 3. These topics were selected from a preliminary study in which participants rated each topic’s usefulness and complexity for learning. In our module, we created a series of daily tasks (labeled as homework for our participants) to measure learning outcomes. The tasks were scaffolded across the days because we wanted participants to demonstrate growth and not skip to a preferred part and/or avoid a difficult section.

Methods and materials

We recruited 15 older adults for this study, but three withdrew at Week 1, leaving 12 participants’ data for analysis. Participants were recruited from community centers dedicated to senior citizens and through word-of-mouth referrals. The 12 participants’ mean age was 64.75 years (SD = 4.35; range = 61–74 years); two were male and ten were female. Eight were using iOS, and four were using Android OS. In terms of familiarity with mobile phones, average usage was 8.22 years (SD = 3.52; range = 4–15 years). None of our participants reported any physical or mental health difficulties throughout this study. We received ethics approval from the Sunway University Research Ethics Committee and obtained informed consent from participants before they took part in this study.

When designing the training materials, we selected three topics (chatbot, QR, and Google Drive) out of 20 topics. In our pilot work, 25 Malaysian older adults (separate from this study) rated the difficulty level (easy to difficult) of each module and indicated whether they had external help (family members, friends, or browsing for answers) to complete the topic’s homework. Results from the pilot study showed that the top two topics participants wanted to learn were QR scanners and Google Drive. Specifically, pilot participants identified Google Drive’s collaboration and file-management functions as difficult. For the QR scanner and Google Drive functions, participants requested more information (52%–68%) and sought help from others (32%–44%). Malaysian smartphone users are primarily Android users, with a 70.26% share of the mobile operating system market [9], which might explain why participants explicitly mentioned Google Drive rather than other file-sharing applications. We also asked our current study participants to identify difficult mobile applications that they wanted to use; six identified file-sharing applications (e.g., Google Drive or Dropbox). Other difficulties identified were e-wallet payments using QR codes, checking and tracking deliveries, and using their phones generally (note that we were unable to discern the actual difficulty in this last comment).

QR is widely used in Malaysia, which has embraced cashless payments: one report showed a 96% adoption rate in Malaysia, second in the region only to Singapore at 97% [10], [11]. The Malaysian government has encouraged cashless alternatives and, for example, disburses financial aid to low-income residents through e-wallets instead of cash [12]. Beyond financial services, QR codes are also used in retail (making payments, item delivery, and automated machines), marketing (e.g., scanning a QR code to get a second drink for a nominal amount), education (obtaining interactive material from textbooks), information (checking item details to prevent counterfeiting), and access to full websites [13].

Because we had a chatbot in mind for the training application, we asked our current participants whether they were familiar with chatbots and to describe their experience. Only two had used a chatbot before, meaning that most feedback on the chatbot would come from an inexperienced perspective. Research has shown that chatbots are being used to enhance customer service [14], to improve language learning [15], and in nursing education on obstetric vaccines [16]. Students reported that they liked that the chatbot was available 24/7, that it had a broad range of skills/abilities that other student trainers may not have, and that it could provide extra information repeatedly [15]. Chatbots are increasingly being used alongside regular customer service, and giving older adults experience with one could build their familiarity with technology-based services. Informal conversations with older adults showed that they wanted to learn how to use their smartphones more proficiently, as there have been increased reports of online and smartphone fraud targeting older adults, and they were concerned about being potential victims [17]. When queried about how they obtained help, the older adults complained that family members and friends were not always available or did not necessarily have the knowledge to help. This observation gave us an opportunity to offer the chatbot as an additional potential support for users to obtain information at their convenience, as well as to explore participants’ acceptance of the chatbot.

Learning materials

For each topic, there was a set of instructional slides prepared in Microsoft PowerPoint and an accompanying instructional video. In the instructional slides, we included an overview of the topics, such as the use of the specific mobile applications, the benefits of using the mobile applications, and ways to conduct certain activities with the mobile applications (see Figure 1). Participants were given more visual cues (see red circles highlighted in the slides) and a link to the instructional video. The PowerPoint slides were made available in Google Drive, and links to access the slides were also included on the website. All participants were directed to the website “Hands-On Smartphone Training for Seniors,” which is hosted by Google (https://sites.google.com/view/hands-on-smartphone-training-f/home; see Figure 2). We also used a Google email as the main point of contact for all participants.

Figure 1. Instructional slides with detailed text instructions, more visual aids, and a link to a recorded video guide.

Figure 2. Website with links to the three modules’ slides and icons to access Nerdybot 2.0.

 

We recorded instructional videos because one study showed that older adults found simultaneously watching a video and following the instructions to be difficult, and this resulted in increased unintended tapping errors [18]. Another study showed that older adults liked having a manual for reference, which they then could refer to at their leisure [6]. In our study, we made PowerPoint slides, which served as a manual for our participants.

The steps to perform certain tasks in a mobile application were screen-recorded using an iPhone 13. We added visual cues, such as prompt boxes and arrows, to the instructional videos using an application called CapCut (Bytedance Pte. Ltd., China, version 5.8) to demonstrate the steps (e.g., performing certain tasks in Google Drive). The completed videos were then uploaded to YouTube for access by the study participants, and each YouTube link was included in the corresponding week’s module slides (see Figure 3). View counts ranged from 2 to 53; the “Week 1 Introduction” video had the highest count (n = 53), as it served as an introduction to the training together with demonstrations of the chatbot. The other instructional videos demonstrated the steps to execute certain tasks with the QR scanner or Google Drive (see Figure 4).

 

Figure 3. Sample of a recorded video guide on YouTube.

Figure 4. List of instructional video guides and their view counts.

Chatbot

We developed a chatbot “Nerdybot 2.0” using Dialogflow from Google [19]. Details on how we created Nerdybot are available in another paper [20]. Nerdybot 2.0 was embedded in the smartphone training website.
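Dialogflow agents work by matching a user’s utterance against “intents,” each defined by a set of training phrases, with a fallback intent catching input the agent cannot match confidently. As a rough illustration of that idea only, here is a minimal keyword-overlap sketch; the intent names and phrases are hypothetical and are not taken from the actual Nerdybot 2.0 agent described in [20].

```python
import re

# Illustrative intents only -- NOT the real Nerdybot 2.0 configuration.
# Each intent maps to a list of training phrases, as in a Dialogflow agent.
INTENTS = {
    "qr.what_is": ["what is a qr scanner", "what is a qr code"],
    "drive.share_folder": ["how do i share a folder",
                           "share folder in google drive"],
}

def match_intent(user_text: str, threshold: int = 2) -> str:
    """Return the intent whose training phrase shares the most words with
    the input; fall back when no phrase clears the confidence threshold."""
    words = set(re.findall(r"[a-z]+", user_text.lower()))
    best_intent, best_score = "fallback", threshold - 1
    for intent, phrases in INTENTS.items():
        for phrase in phrases:
            score = len(words & set(phrase.split()))
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent
```

Off-topic questions such as “What did I have for lunch today?” share too few words with any training phrase and land in the fallback intent, which mirrors why Nerdybot 2.0, with a limited set of prepared intents, often could not answer participants’ questions.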

Learning outcomes “homework”

We created a learning outcome for each topic for each day of the week (see Table 1). As the training was designed for self-paced learning, we developed two types of questions: screenshots and short answer questions (SAQs). Participants could view all five days’ questions in the PowerPoint slides, and the questions were also made available on Google Forms.

Table 1 Questions for the homework.

As the training was done remotely (not in a laboratory, with no face-to-face contact), we asked each participant to take screenshots of each step on their smartphones to demonstrate their learning. For example, we asked participants to upload a screenshot of their chat with Nerdybot 2.0. The SAQs were intended to verify whether participants understood the task, and the level of difficulty varied each week. The SAQs ranged from multiple-choice questions to short and long free-text responses, some requiring images.

In terms of accuracy, a participant received five points for a given day if they uploaded the correct screenshot and completed all instructional steps. Participants who provided incomplete answers, e.g., partially correct instructions, received three points for that day. Incorrect answers received zero points.
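The rubric above can be expressed as a small scoring helper. This is our own sketch for clarity (the study’s grading was done by the researchers, not by code), with the grade labels chosen for illustration:

```python
def score_day(grade: str) -> int:
    """Points for one day's homework under the study rubric:
    fully correct (right screenshot, all steps) = 5,
    partially correct = 3, incorrect = 0."""
    return {"correct": 5, "partial": 3, "incorrect": 0}[grade]

def week_accuracy(daily_grades: list[str]) -> float:
    """Accuracy (%) over a week of daily tasks, out of 5 points per day."""
    earned = sum(score_day(g) for g in daily_grades)
    return 100 * earned / (5 * len(daily_grades))
```

For example, a week graded correct, correct, partial, correct, incorrect earns 18 of 25 points, i.e., 72% accuracy.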

Qualitative interviews

Five participants (five females; age range: 62–70 years; three Android and two iOS) were invited to participate in qualitative interviews for us to obtain more feedback regarding the chatbot, instructional slides, and videos. The interviews were conducted over WhatsApp video call and lasted for approximately 30 min per interview.

Results

Learning outcomes accuracy

Our 12 participants performed well in Week 2 (QR scanner) and Week 3 (Google Drive), with both weeks above 90% accuracy. Specifically, the mean for Week 2 was 94.70% (range: 84%–100%), and the mean for Week 3 was 92.30% (range: 68%–100%). However, the mean accuracy for Week 1 was 86.3% (range: 65%–100%). Inferential statistics showed that participants had lower accuracy for the chatbot than for the QR scanner, t(11) = 8.63, p < 0.001, Cohen’s d = 2.49, and Google Drive, t(11) = 6.13, p < 0.001, Cohen’s d = 1.77, with no difference between Weeks 2 and 3 (p = 0.53). Figure 5 shows sample screenshots of a good and a poor homework response.
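The paired comparisons reported above follow the standard formulas: t is the mean of the per-participant differences divided by its standard error, and Cohen’s d for paired samples is the mean difference divided by the standard deviation of the differences. A minimal worked sketch, using hypothetical accuracy scores (not the study’s raw data):

```python
import statistics as st
from math import sqrt

def paired_t_and_d(x, y):
    """Paired-samples t statistic and Cohen's d for paired data,
    where d = mean(differences) / SD(differences)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean_diff = st.mean(diffs)
    sd_diff = st.stdev(diffs)  # sample SD (n - 1 denominator)
    t = mean_diff / (sd_diff / sqrt(n))
    d = mean_diff / sd_diff
    return t, d

# Hypothetical per-participant accuracy (%) -- NOT the study's raw data.
qr_scores = [95, 90, 100, 88, 92]
chatbot_scores = [80, 85, 90, 70, 82]
t, d = paired_t_and_d(qr_scores, chatbot_scores)  # t ≈ 5.16, d ≈ 2.31
```

With n = 12 participants, the degrees of freedom in the reported tests are n − 1 = 11, hence t(11).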

Figure 5. Homework screenshot submissions with remarks by participants.

We reviewed participants’ chatbot usage over the three-week period using Google’s analytics data. Figure 6 shows that Week 1 had the highest number of chatbot interactions compared with Weeks 2 and 3, likely driven by the daily homework tasks tied to that module. Week 3 showed a slight increase in chatbot interactions for the Google Drive module, which the pilot group had also identified as “difficult.”

Figure 6. Number of chatbot interactions over the three weeks of hands-on training.

We also looked at the questions asked by the participants. Some were relevant, e.g., “What is a QR scanner?” However, there were also nonrelevant questions such as “How is the weather today?” and “What did I have for lunch today?”

Qualitative feedback

We asked our five participants questions about the instructional slides, the videos, and the chatbot experience. Regarding the slides, the general consensus was that they were useful, but two participants found the detailed Week 3 slides messy and confusing.

The week 3 module was too complicated; like in Google Drive, there was “share” and “manage and share.” I prefer a direct one like the previous modules; there are too many. Also, there were too many links in the slides. Clicking the links to homework and links to the video here was so confusing.

Unlike the slides, all five users reported that the video was useful, but one preferred the slides over the videos. That participant said, “I just look at the slides and I get my answers from that.”

Regarding the chatbot, all participants used it in Week 1, basically because it was required. The general consensus was that the chatbot was not intuitive and the participants simply referred to the slides or videos for help rather than going to the chatbot for answers. They mentioned that the chatbot was unable to answer their questions, and this created more overall dissatisfaction.

Other than feedback on the chatbot, some participants commented that they had difficulty doing the tasks because of differences between the iOS and Android platforms. Others asked us to include more “homework” given the difficulty of the Week 3 tasks. Across all three weeks, participants generally had high accuracy, suggesting that our homework was likely too easy, yet some still reported in their post-training feedback that the Week 3 tasks were quite difficult to manage.

Discussion

In our study, our objectives were to explore older adults’ experience with the chatbot and to evaluate their learning progress in the three-week training program on their smartphones. We found low engagement with the chatbot overall, as demonstrated in the usage data and qualitative feedback. Participants admitted that they engaged with the chatbot when they had to and lost interest once it was no longer needed, partly because they did not find it useful. Arguably, Nerdybot 2.0 was still in its infancy: it did not have a large number of prepared intents (the preset questions a chatbot is trained to recognize), which limited its performance. For instance, the intents might have been categorized by a single function in an application, such as the way to share a folder in Google Drive.

Indifferent attitudes toward the chatbot on the part of participants suggest that developers need to adopt a different approach to designing a more interactive and human-like experience, as older adults may have unique preferences and requirements compared to more tech-savvy users. For example, developers could enhance the chatbot’s natural language processing capabilities so that it responds to queries in a more human-like way and/or incorporate voice recognition and voice-based interactions to make it more accessible and user-friendly for older adults.

In terms of learning the other two modules, participants had high levels of accuracy, suggesting either that the homework was too easy or that it was challenging but participants were engaged enough to put in the effort. It was likely too easy for Week 2, but not for Week 3. Some participants complained that the numerous links and constant switching between apps made studying confusing, although they appreciated learning how to use Google Drive. Unlike the chatbot, the QR scanner and Google Drive were more familiar to participants because of prior exposure: Malaysia has implemented many QR payment services in shops and as part of customer service (e.g., to obtain a menu in a restaurant or a trip itinerary), so participants had opportunities to use the QR scanner in their daily activities. Likewise, many of our participants told us that family members and friends stored photographs of loved ones in Google Drive and that they often faced limited storage on their own phones, which necessitated the use of online cloud storage.

Our participants, in general, found our slides and videos useful for learning, as they were able to do the homework on their own without seeking external help. That said, most participants lived with at least one other person, so it is possible that the other person contributed to their learning motivation or indirectly helped with their learning. Furthermore, we examined homework completion rates: one person did the homework in a single session, but the others followed the suggested daily schedule, suggesting that the self-paced learning approach worked well for the older adults in this study.

In sum, our small study showed positive outcomes. First, participants were able to learn how to use “difficult” applications. Second, self-paced learning on a smartphone was found suitable for learning. We acknowledge our small sample size and that most participants were existing smartphone users, which limits the results to showing benefits for existing users. On our side, we encountered technical difficulties with participants’ older Android and iOS versions, which limited some functionality. This is expected, as older adults are less likely than young adults to upgrade to a newer smartphone model.

ACKNOWLEDGMENTS

We would like to thank Zhi Shan Lim for her time and effort in creating and collecting data for this project. This project was part of her BSc (Hons.) in information technology (computer networking and security) at Sunway University and was supervised by Yunli Lee. This work was supported by the U.K. Department for Business, Energy and Industrial Strategy and Malaysian Industry-Government Group for High Technology (MIGHT) and delivered by the British Council. The work of Min Hooi Yong was supported in part by the Long Term Research Grant Scheme, Ministry of Higher Education Malaysia under Grant LRGS/1/2019/SYUC/02/1; and in part by the Newton Fund Institutional Links through the Newton-Ungku Omar Fund Partnership under Grant 331745333. For further information, please visit https://www.newtonfund.ac.uk

 

Author Information

Min Hooi Yong is an associate professor in the Department of Psychology at the University of Bradford, BD7 1DP Bradford, U.K. Her recent research projects focus on Theory-of-Mind in older adults as well as using technological tools for assessing cognition and well-being measures. She is an experimental cognitive psychology researcher with a focus on cognition in older adults. Yong has a PhD in psychology from the University of Otago, Dunedin, New Zealand.
Yunli Lee is an associate professor with the Department of Smart Computing and Cyber Resilience, School of Engineering and Technology, Sunway University, Bandar Sunway 47500, Malaysia. She is also a researcher with the Research Centre for Human-Machine Collaboration (HUMAC). She is a professional technologist of MBOT and the Malaysia director of the International Association for Convergence Science and Technology (IACST). Her current research interests include ultrasound imaging, the time series of FOREX data, technology modules for seniors, and augmented reality technology. Lee has a BIT (Hons.) in software engineering from Multimedia University, Cyberjaya, Malaysia, a master’s in software from Dongseo University, Busan, South Korea, and a PhD in engineering (digital media) from Soongsil University, Seoul, South Korea. She is a Senior Member of IEEE.