points. Higher SUS scores indicate greater usability. Based on empirical evaluations of the SUS, a score below 50 indicates usability difficulties, while scores in the 70 and 80 ranges are considered promising (Bangor et al., 2008). The SUS items and one final question (Adjective Rating Scale) about the overall user-friendliness (rated on a scale from 1 "Worst imaginable" to 7 "Best imaginable") are included in Appendix 2.

The recorded interviews lasted between 38 and 67 min and included a series of open-ended questions related to general opinions about the course content, perceived ease of use, and usability concerns. The experts were asked what type of device and internet browser they had used. The interview guide is shown in Appendix 3.

The learning management system recorded participants' online activity in terms of page views and time logged in. Approximately 4.5 h in total were available for reading through the course online during the three sessions in the test apartment. After the first round, changes were made to the course content based on the results.

(B) Round 2

The second round followed the same procedure as the first one. However, the task was slightly modified. No checklist was provided, and the instruction was to take notes and pay attention to: (i) readability (e.g., difficult words or phrases or incomprehensible text), (ii) whether the course was easy to use (e.g., instructions for the online assignments), and (iii) whether they would be able to do the home assignments. Furthermore, questions related to the perceived usefulness of the course were added to the interview.

The entrance door to the test apartment was left open to the laboratory space where the researcher monitored the activities. The participants could call on the researcher when they faced navigation problems on the digital platform or had trouble understanding course instructions. During the test session, the researcher made regular checks on the participants in the test apartment to ask whether anything had worked particularly well or poorly and whether there was anything they wondered about. The reason for this more active interaction was to continuously solicit participants' feedback and reduce the cognitive load of note-taking. The interviews lasted between 32 and 50 min. Participants received a gift card (550 SEK) as reimbursement by post a few weeks later. The course content was further refined after the second round.

2.2.4. Data and statistical analysis

(A) Round 1

A theory-driven approach based on ten heuristic design principles developed by Nielsen & Molich (1990) was used for the analysis of the qualitative data to identify potential user problems. One example of the principles is "#2 Match between system and the real world" (i.e., use words familiar to the user).

Audio-recorded interviews were summarized, and answers to the questions in the interview guide were transcribed. Provisional coding was applied; that is, the analysis begins with a start list of researcher-generated codes based on what might appear in the data before they are collected and analyzed (Miles, 2014). Based on the content of transcripts and field notes taken during the direct observations, the material was sorted as previously described.

SUS scores for each participant were transformed into a usability score from 0 to 100 points. A score of 68 was considered "OK," scores between 68 and 80.3 "Good," and scores above 80.3 "Excellent" (Usability.gov, n.d.).
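For clarity, the conventional SUS transformation (assuming the standard ten 5-point items, with odd-numbered items positively worded and even-numbered items negatively worded) maps the item responses $x_i \in \{1, \dots, 5\}$ onto the 0 to 100 scale as:

\[ \mathrm{SUS} = 2.5 \left[ \sum_{i \in \{1,3,5,7,9\}} (x_i - 1) + \sum_{i \in \{2,4,6,8,10\}} (5 - x_i) \right] \]

For example, a participant who answers 4 on every odd-numbered item and 2 on every even-numbered item obtains $2.5 \times (15 + 15) = 75$, which falls in the "Good" range defined above.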
(B) Round 2

A theory-driven approach based on the technology acceptance model was used for the analysis of the qualitative data to identify participants' general opinions, perceived ease of use, and concerns (Davis, 1989). Data from transcripts and field notes taken during the direct observations in round 2 were sorted into two pre-determined main categories: "perceived ease of use" and "perceived usefulness." A third pre-determined main category was "Potential future improvements." Emergent findings were compared with field notes taken in connection with course enrolment and during the direct observations and interactions with participants in the apartment.

Additional data collected in the second round were analyzed thematically. Written responses to the first online assignment of the course, "Your expectations," were assigned content-based codes to capture the participants' motivation for course participation. The purpose was to check whether the invitation for course participation should be modified to attract the intended users. The online assignment included the following questions:
(i) Why would you like to make changes to your home and routines related to light, physical activity, and sleep?
(ii) What are you aiming for? Please write a few positive effects that you hope for.

3. Findings

3.1. Round 1 with experts

3.1.1. Usability testing interviews

The three experts gave positive feedback regarding the course ("fun and interesting", "nice and inviting") and considered

