An analytical review of the literature on
Reading Comprehension in Technology-Based Learning
INTRODUCTION
The purpose of this paper is to present an analysis of the theme of reading comprehension in technology-based learning. Reading comprehension is an active cognitive process that involves interaction between the reader and text to construct meaning (Ertem, 2010). Reading comprehension is essential to the development of reading skills, and it affects learners’ overall ability to obtain an education (Ertem, 2010). There is a growing concern for students’ ability to read, comprehend and learn from text (McNamara, O’Reilly, Best & Ozuru, 2006). Teachers are being encouraged to set goals to spend more time teaching specific reading comprehension strategies and offering support to students (Ponce, Lopez & Mayer, 2012). The use of technology is a promising method for intervention in the areas of language and literacy (Rodriguez, Filler & Higgins, 2012). There is a pressing need to investigate whether technology-based learning enhances or hinders reading comprehension and learning (Grimshaw, Dungworth, McKnight & Morris, 2007).
Organization of this paper
This paper begins with a methods section that describes how the analysis was conducted. The findings section reports the results of the analysis of the 15 studies and identifies the themes that emerged. The discussion section provides a comprehensive analysis of the results by interpreting the findings of the 15 studies. The conclusion highlights the main findings about reading comprehension in technology-based learning and discusses the limitations of this analysis. Implications for practice and policy concerning reading comprehension in technology-based learning are also discussed.
METHODS
The 15 sources used in this analysis were selected from 12 peer-reviewed journals. All of the sources were drawn from educational technology journals; therefore, every study involved an electronic medium. The analysis included only sources with the words “reading comprehension” or “comprehension” in the title. The journal sources also had to include research participants, which means that meta-analyses, book reviews, and similar publications were excluded from the study selections. Using relevant keywords and publication dates from 2004 to present, studies were identified through database searches of the Education Resources Information Centre (ERIC) and Google Scholar. The list of journals included the following: International Journal of Instructional Media; Educational Technology Research and Development; The Turkish Online Journal of Educational Technology; Journal of Research on Technology in Education; British Journal of Educational Technology; Computers and Education; Journal of Educational Computing Research; Language Learning and Technology; Journal of Literacy and Technology; Computers in the Schools; International Journal of Technology in Teaching and Learning; and Computer Assisted Language Learning.
This analysis was a content analysis, which “seeks to analyze data within a specific context in view of the meanings someone – a group or culture – attributes to them” (Krippendorff, 1989, p. 403). Studies were selected, organized, and then analyzed to identify similarities and differences. Patterns and common themes were identified, and the results were grouped, categorized, analyzed and discussed. Conclusions, limitations and implications were then drawn, based on the results of the analysis.
The studies used in this analysis contained a variety of features. The number of participants ranged from 28 (Rodriguez et al., 2012) to 2468 (Ponce, Mayer & Lopez, 2013). The participants ranged from grade one students (Lysenko & Abrami, 2014) to university students (Connell, Bayliss & Farmer, 2012; Schugar, Schugar & Penny, 2011). Five of the studies focused on non-mainstream students (ESL, lower socio-economic backgrounds, below grade level), while the others focused on mainstream students not diagnosed with learning disabilities. The technologies used in the studies varied, including computers, hand-held computers and specific computer software. Eight studies used computer software programs to teach reading and reading comprehension skills. Three of the computer software programs were designed to teach specific reading comprehension strategies (McNamara et al., 2006; Ponce et al., 2012; Ponce et al., 2013). Five software programs used in the studies were designed to teach basic literacy skills (Murphy, 2007; Rodriguez et al., 2012; Tozcu & Coady, 2004; Cuevas et al., 2012; Lysenko & Abrami, 2014). The other seven studies compared the use of e-texts on iPads, computers, tablets, Nooks (e-readers) and CD-ROMs, both with and without narration and other unique features such as a dictionary or animation. One unique feature was the use of a hypertext with navigation overviews. Nine of the studies based the research in the area of Language Arts or English, three based their study on second language learning, and the remaining three were in social studies and science or science-related reading. Table 1 outlines the characteristics of the 15 studies analyzed.
Table 1: Descriptive characteristics of the studies

Study | Subject of Text/Course | Technology Used | Participant Demographics | # of Participants
Connell, Bayliss & Farmer (2012) | Text on human heart | Kindle 3 e-reader/iPad tablet | University students | 201
Cuevas, Russell & Irving (2012) | Language Arts | Computer module designed to teach essential components of reading | Secondary students (lower socioeconomic backgrounds) | 145
Ertem (2010) | Language Arts | Electronic books with and without animation | 4th graders (reading below grade level) | 77
Fry & Gosky (2007/2008) | Social Studies | Online text with pop-up dictionary | Middle school | 129
Grimshaw, Dungworth, McKnight & Morris (2007) | Language Arts | CD-ROM e-book/Toshiba Satellite Pro with software including online dictionary | 9-10 year olds | 132
Lysenko & Abrami (2014) | Language Arts | ABRACADABRA online interactive media tool for developing emergent literacy and e-PEARL digital portfolio to support self-regulation | Early elementary Gr. 1-2 | 351
McNamara, O’Reilly, Best & Ozuru (2006) | Science | iSTART (Interactive Strategy Trainer for Active Reading and Thinking) | Adolescent students | 39
Murphy (2007) | ESL | Computer-assisted language learning (CALL) with different forms of feedback | First-year English majors in Japan | 231
Ponce, Lopez & Mayer (2012) | Language Arts | e-PELS software application that teaches learning strategies using graphic organizers | Grades 4, 6, 8 | 1041
Ponce, Mayer & Lopez (2013) | Language Arts | Computer software instruction program for teaching reading comprehension and writing | Grades 4, 6, 8 | 2468
Roberts & Barber (2013) | Language Arts | Electronic text on laptop computers | Grade 2 | 30
Rodriguez, Filler & Higgins (2012) | ESL | Lexia Learning computer-based literacy program with instructions in Spanish/English | ESL Gr. 1 (Spanish first language) | 28
Salmeron & Garcia (2012) | Language Arts | Hypertext with navigation overviews | Gr. 6 native Spanish speakers | 66
Schugar, Schugar & Penny (2011) | English | Nook e-reader | University, first-year general composition courses | 30
Tozcu & Coady (2004) | ESL | CALL New Lexis | Post-secondary (full-time English) | 65
FINDINGS
The content analysis of the 15 articles revealed four categories related to reading comprehension in technology-based learning: 1) electronic texts vs. print texts; 2) effectiveness of software design; 3) effects on motivation, interest, and engagement; and 4) training, support, and resources.
Electronic Texts vs Print Texts
The analysis of the literature revealed distinct differences in the purposes of the studies. Seven of the studies compared the effects of using e-texts vs. print texts on reading comprehension. Connell et al. (2012) and Schugar et al. (2011) found no differences in reading comprehension, regardless of the device used to read. The other researchers noted significant gains in reading comprehension for students who used the e-texts. Some researchers used e-texts with additional features beyond the basic text. Fry and Gosky (2007/2008) used an online text with a pop-up dictionary and found significant differences in reading comprehension for participants who used this medium, as opposed to an online text without a dictionary or a print text. Ertem (2010) used an e-text with animation, and his findings revealed significant differences in reading comprehension compared to a control group who read from a print text. In the Grimshaw et al. (2007) study, findings revealed that only the e-text with narration resulted in significant gains in reading comprehension. Roberts and Barber (2013) noted that advanced readers showed significant gains in reading comprehension regardless of the format used; however, proficient readers in their study made greater gains only when using a print text. Interestingly, Salmeron and Garcia (2012) found that students with low sustained-attention abilities made significant gains on integration comprehension questions when they used hypertext with navigation overviews.
Effectiveness of Software Design
The remaining eight studies investigated whether software designed to promote reading and/or reading comprehension strategies was effective when used as an intervention to increase reading comprehension. A variety of software programs were used in the studies, including four designed by the researchers themselves (Cuevas, Russell & Irving, 2012; Ponce et al., 2012; Ponce et al., 2013; Murphy, 2007). In the Cuevas et al. (2012) study, students who performed Independent Silent Reading (ISR) from a textbook or a computer reading module made significant gains in reading comprehension, but only the computer module group performed better on text-specific assignments. Ponce et al. (2012) confirmed the effectiveness of the e-PELS intervention in promoting significant gains in reading comprehension; these gains were particularly significant for low-achieving students. In the Ponce et al. (2013) study, only the computer-based instruction group made significant gains in reading comprehension, although the gains could not be directly attributed to the software. Murphy (2007) found that students who received elaborative feedback and worked with a partner made significant gains in reading comprehension.
The remaining four software programs were available for purchase or free online (Lysenko & Abrami, 2014; McNamara et al., 2006; Rodriguez et al., 2012; Tozcu & Coady, 2004). In the Lysenko and Abrami (2014) study, findings revealed that when the two computer software programs were used consistently, students demonstrated significant gains in reading comprehension and vocabulary. Like Lysenko and Abrami (2014), Tozcu and Coady (2004) found that students who used the CALL program showed significant increases in reading comprehension and vocabulary. McNamara et al. (2006) found that prior knowledge of reading strategies combined with the use of the iSTART program contributed to the quality of self-explanations and reading comprehension. Their findings revealed significant gains on text-based questions for students with less knowledge of reading strategies, and on inference questions for those with greater prior knowledge of reading strategies. In the Rodriguez et al. (2012) study, findings revealed that students given instructions in Spanish made statistically significant gains in reading comprehension.
The results of the studies predominantly indicated that certain types of computer modules designed to increase reading comprehension are effective. Only Cuevas et al. (2012) noted no significant differences in reading comprehension, although they did find that the group that used the computer for ISR performed better on text-specific assignments. The three studies involving ESL participants demonstrated that students using the computer modules made significant gains in reading comprehension (Murphy, 2007; Rodriguez et al., 2012; Tozcu & Coady, 2004). In several of the studies, significant gains in reading comprehension were found only when other conditions were present (Murphy, 2007; Lysenko & Abrami, 2014; McNamara et al., 2006). In one study (Murphy, 2007), gains in reading comprehension were made only when students worked with a partner. Lysenko and Abrami (2014) found that using both software programs in their study in combination resulted in significant gains in reading comprehension. McNamara et al. (2006) found that only the combination of prior knowledge of reading strategies and the use of the software contributed to significant reading comprehension gains.
Effects on Motivation, Interest, and Engagement
Within the group of studies that compared e-texts and print texts, the findings concerning engagement and motivation were inconsistent. This finding is important because motivation is a key issue in reading comprehension and, by extension, overall learning (Cuevas et al., 2012). Roberts and Barber (2013) found that only their proficient group, as opposed to their advanced group, demonstrated consistent reading enjoyment with the e-text. They pointed out that educators should be cognizant of the role that interest plays in reading comprehension. Grimshaw et al. (2007) noted that enjoyment was enhanced by narration and that use of the dictionary was greater with the e-text than with the print text. Using informal observation, Ertem (2010) found that students were usually more enthusiastic about reading the e-text than the print text. By contrast, the participants in the Connell et al. (2012) study were university students who were reluctant to use the e-books for academic purposes.
One of the studies that investigated the effectiveness of computer software (Cuevas et al., 2012) specifically set out to investigate the motivational effects of using the software. The researchers found that the participants demonstrated more motivation to read when using the computer module than when using the print text. The participants in the Rodriguez et al. (2012) study also indicated that they preferred the use of their primary language in the CALL program. Results from a survey in the Tozcu and Coady (2004) study indicated that all of the students were interested in and enjoyed using the program. Murphy (2007) noted that Elaborative Feedback (EF) was more effective than Knowledge of Correct Response feedback in engaging students with each other and with the text; the use of EF resulted in higher-quality interaction between student pairs.
Training, Support, and Resources
A final category that emerged in the studies was the need for training, support and resources when implementing technology-based learning activities in the classroom. Two studies comparing the text formats (e-text and print text) suggested that students required additional training and knowledge of the features on the devices (Connell et al., 2012; Schugar et al., 2011). Five of the researchers indicated that students needed to be given instruction on how to use the technology (Connell et al., 2012; Grimshaw et al., 2007; McNamara et al., 2006; Schugar et al., 2011; Ponce et al., 2013). Only one researcher suggested that students are already technologically literate (Ertem, 2010). One researcher discussed the issue of teachers’ lack of experience and comfort with using computers (Ponce et al., 2013).
Cuevas et al. (2012) and Fry and Gosky (2007/2008) indicated that there were issues with access to computer labs during the course of their research. Three of the studies also indicated that participants took longer to read from the electronic text than from the print text (Connell et al., 2012; Ertem, 2010; Grimshaw et al., 2007). This increased reading time would further strain the limited time available in school computer labs. Ponce et al. (2012) and Ponce et al. (2013) pointed out the lack of technical and administrative support for teachers who implemented technology during their study.
Lysenko and Abrami (2014) suggested that more practical means of promoting self-regulated reading comprehension skills were needed. Fry and Gosky (2007/2008) suggested the use of hand-held devices rather than the computer lab medium used in their study. Grimshaw et al. (2007) highlighted the need to choose appropriate e-texts, as many qualities and varieties are available. Only two of the studies indicated that the technology used was web-based and available free online (Lysenko & Abrami, 2014; McNamara et al., 2006).
DISCUSSION
The studies that compared the use of either print or e-text investigated a variety of features in the e-text formats. Most studies found significant gains in reading comprehension. The use of electronic text with a pop-up dictionary, animation, or narration all resulted in significant gains in reading comprehension (Fry & Gosky, 2007/2008; Ertem, 2010; Grimshaw et al., 2007). However, Roberts and Barber (2013) found significant gains in reading comprehension for advanced readers using both the electronic and print text, while the proficient readers in their study made significant gains only when reading from a print book. This finding indicated that reading comprehension gains from using e-texts may be greater for students with more advanced reading skills. By contrast, Connell et al. (2012) found that reading comprehension results were not affected by text format, though their participants were university students, who were likely proficient readers.
The studies that compared the use of computer software designed to promote literacy skills or reading comprehension to traditional instruction overwhelmingly found that the technology-based learning resulted in better reading comprehension. Only Cuevas et al. (2012) noted no significant differences in reading comprehension although even this study identified differences in performance on text-specific assignments. The findings of several studies revealed that when used consistently and in conjunction with classroom instruction, using technology-based learning activities effectively promoted gains in reading comprehension (Rodriguez et al., 2012; Ponce et al., 2013).
There was evidence from the analysis that the motivational, engagement, and interest effects of technology-based learning should be considered in investigations concerning reading comprehension. Ertem (2010) and Roberts and Barber (2013) suggested that future studies should investigate how technology-based instruction affects motivation. Several researchers used surveys or observation to investigate the motivational effects of using technology in instruction (Cuevas et al., 2012; Grimshaw et al., 2007; Roberts & Barber, 2013). Grimshaw et al. (2007) noted that the participants used the dictionary in the electronic text more often than they did when reading a print text with a dictionary. Interestingly, Connell et al. (2012) noted that participants were hesitant to use the e-text for academic purposes.
Some of the studies suggested that participants should be more familiar with the technology medium used in the study (Connell et al., 2012; McNamara et al., 2006; Schugar et al., 2011). Connell et al. (2012) suggested that if participants were more familiar with the features available on the device used, their results might have been different. Other studies indicated that a longitudinal study with more interventions or more time spent using the technology would give a clearer indication of whether technology-based learning inhibits or enhances reading comprehension (Cuevas et al., 2012; Fry & Gosky, 2007/2008; Rodriguez et al., 2012; Tozcu & Coady, 2004). Lysenko and Abrami (2014) conducted a second study in which the effects of using the two computer software programs to promote reading comprehension more than doubled. This finding indicated that further experience with technology-based learning, and thus familiarization with these new technologies, may increase learning effects on reading comprehension (Lysenko & Abrami, 2014).
CONCLUSIONS
Conclusions were drawn in the areas of the four categories that emerged in the content analysis. The findings of the studies suggested that electronic books and print books yield similar effects on reading comprehension. However, if the electronic books contain features such as narration (Grimshaw et al., 2007), animation (Ertem, 2010), a pop-up dictionary (Fry & Gosky, 2007/2008), or hypertext with navigation overviews (Salmeron & Garcia, 2012), they more effectively promote gains in comprehension. Based on the results of the Roberts and Barber (2013) study, further research is needed to determine whether electronic texts or print texts are more beneficial for students who are not advanced readers. Most studies that tested the effects of using computer software to promote reading comprehension confirmed the effectiveness of the programs/modules tested (Lysenko & Abrami, 2014; McNamara et al., 2006; Murphy, 2007; Ponce et al., 2012; Rodriguez et al., 2012; Tozcu & Coady, 2004). These studies used the modules/programs to complement existing classroom instruction. The motivational effects of technology-based learning are an area that several researchers tested using surveys or informal observation (Cuevas et al., 2012; Grimshaw et al., 2007; Roberts & Barber, 2013; Tozcu & Coady, 2004). Most studies, with the exceptions of Connell et al. (2012) and Roberts and Barber (2013), concluded that technology-based learning consistently increases motivation. The need for teacher and student training in technology was highlighted in many of the studies (Connell et al., 2012; Grimshaw et al., 2007; McNamara et al., 2006; Schugar et al., 2011; Ponce et al., 2013). Access to computer labs was an issue in several studies (Tozcu & Coady, 2004; Ponce et al., 2012; Ponce et al., 2013), and the absence of administrative and technical support for teachers was noted (Ponce et al., 2012; Ponce et al., 2013).
LIMITATIONS
This analysis reported on a variety of electronic texts and computer software. Some of the software was web-based and available free online, while other programs were purchased or developed by the researchers (Cuevas, Russell & Irving, 2012; Ponce, Lopez & Mayer, 2012; Ponce, Mayer & Lopez, 2013; Murphy, 2007). Including only web-based or only purchased computer applications/software might have changed the results. The analysis included studies with participants ranging from grade one to university students, with differing ability levels and differing levels of computer literacy. Focusing on studies that controlled for age, general ability level or level of computer ability could have yielded different results. The sample size of the studies ranged from 28 to 2468. Rodriguez et al. (2012) and Schugar et al. (2011) suggested that future studies should include more participants. Narrowing the range of sample sizes to include only larger numbers may have changed the results of the analysis. Several studies suggested the need for a longitudinal study (Cuevas et al., 2012; Grimshaw et al., 2007; McNamara et al., 2006; Rodriguez et al., 2012; Tozcu & Coady, 2004), which may have resulted in differing outcomes. This analysis also included only 15 studies, and they were divided in their overall purpose: to test the effectiveness of computer software on reading comprehension, or to compare the effects of reading from a print text vs. an e-text on reading comprehension. Focusing on one of these major categories would have made the findings more comprehensive.
IMPLICATIONS
The analysis showed that text format (print/e-text) does not affect reading comprehension unless the electronic format also has extra features that promote reading comprehension. The addition of animation (Ertem, 2010), narration (Grimshaw et al., 2007), a pop-up dictionary (Fry & Gosky, 2007/2008), or hypertext with navigation overviews (Salmeron & Garcia, 2012) significantly promoted gains in reading comprehension. One exception is proficient readers, who increased reading comprehension more with print text (Roberts & Barber, 2013). This evidence should inform the decisions of electronic text and software designers, and teachers’ decisions in choosing text formats for proficient and advanced readers.
The analysis showed that the use of technology-based learning for language development, and specifically for reading comprehension, can be effective. Only Cuevas et al. (2012) noted no significant differences in reading comprehension, though they did find that the group who used the computer for ISR performed better on text-specific assignments. Several of the studies found significant gains when the program was paired with other variables, such as partner groups (Murphy, 2007) or students with prior knowledge of reading strategies (McNamara et al., 2006). Finally, one study found that the combination of two software programs resulted in significant gains in reading comprehension (Lysenko & Abrami, 2014). Thus, the technology-enhanced learning environments resulted in significant gains in reading comprehension, but only under certain conditions.
REFERENCES
Connell, C., Bayliss, L. & Farmer, W. (2012). Effects of e-book readers and tablet computers on reading comprehension. International Journal of Instructional Media, 39(2), 131-140.
Cuevas, J., Russell, R. & Irving, M. (2012). An examination of the effect of customized reading modules on diverse secondary students’ reading comprehension and motivation. Educational Technology Research and Development, 60, 445-467.
Ertem, I. (2010). The effect of electronic storybooks on struggling fourth graders’ reading comprehension. The Turkish Online Journal of Educational Technology, 9(4), 140-155.
Fry, S. & Gosky, R. (2007/2008). Supporting social studies reading comprehension with an electronic pop-up dictionary. Journal of Research on Technology in Education, 40(2), 127-139.
Grimshaw, S., Dungworth, N., McKnight, C. & Morris, A. (2007). Electronic books: Children’s reading comprehension. British Journal of Educational Technology, 38(4), 583-599.
Krippendorff, K. (1989). Content analysis. ScholarlyCommons. Retrieved from http://repository.upenn.edu/asc_papers/226
Lysenko, L. & Abrami, P. (2014). Promoting reading comprehension with the use of technology. Computers and Education, 75, 162-172.
McNamara, D., O’Reilly, T., Best, R. & Ozuru, Y. (2006). Improving adolescent students’ reading comprehension with iSTART. Journal of Educational Computing Research, 34(2), 147-171.
Murphy, P. (2007). Reading comprehension exercises online: The effects of feedback, proficiency and interaction. Language Learning & Technology, 11(3), 107-129.
Ponce, H., Lopez, M. & Mayer, R. (2012). Instructional effectiveness of a computer-supported program for teaching reading comprehension strategies. Computers and Education, 59, 1170-1183.
Ponce, H., Mayer, R. & Lopez, M. (2013). A computer-based spatial learning strategy approach that improves reading comprehension and writing. Educational Technology Research and Development, 61, 819-840.
Roberts, M. & Barber, C. (2013). Effects of reading formats on the comprehension of new independent readers. Journal of Literacy and Technology, 14(2), 24-55.
Rodriguez, C., Filler, J. & Higgins, K. (2012). Using primary language support via computer to improve reading comprehension skills of first grade English language learners. Computers in the Schools, 29, 253-267.
Salmeron, L. & Garcia, V. (2012). Children’s reading of printed text and hypertext with navigation overviews: The role of comprehension, sustained attention, and visuo-spatial abilities. Journal of Educational Computing Research, 47(1), 35-50.
Schugar, J., Schugar, H. & Penny, C. (2011). A nook or a book: Comparing college students’ reading comprehension level, critical reading, and study skills. International Journal of Technology in Teaching and Learning, 7(2), 174-192.
Tozcu, A. & Coady, J. (2004). Successful learning of frequent vocabulary through CALL also benefits reading comprehension and speed. Computer Assisted Language Learning, 17(5), 473-495.