If you’ve had any contact with youngsters over the last few years, you’ll have noticed that they seem to spend a significant amount of their time glued to a range of digital devices. Partly because of this, English language teachers are often told that they should be using digital technologies to enhance their teaching and to increase their students’ motivation.
But the essential question – Do digital technologies actually help students learn English? – is not always asked. Let’s ask that question.
Note: This blog post is based on a 30-minute talk that I gave at the recent IATEFL (International Association of Teachers of English as a Foreign Language) conference held in Glasgow in March 2017. The research referenced below is from my recent book Focus on Learning Technologies (Oxford University Press, 2016). The talk included four statements, but in the interests of brevity we’ll start with just two. The next post will consider two more, and the one after that will look at some of the wider issues.
Consider these statements.
1 Children & teens are naturally good at using technology
Despite the myth of the digital native having been well and truly debunked (I’ve written about this in my blog, as have many others), this statement is still regularly trotted out. However, although youngsters may be comfortable with digital devices, they are far from being savvy users of technology. Research tends to bear this out.
A large-scale comparative study into the state of digital literacies around the world was carried out in 2013 by the International Association for the Evaluation of Educational Achievement (IEA), an independent consortium of national research agencies. Some 60,000 13- to 14-year-old students (grade 8) in 3,300 schools across 21 education systems/countries were surveyed, with additional data collected from 25,000 teachers, school principals, and school ICT (Information and Communications Technology) coordinators working in these schools. The study evaluated students’ computer and information literacy, defined as ‘an individual’s ability to use computers to investigate, create, and communicate, in order to participate effectively at home, at school, in the workplace, and in society’ (Fraillon et al., 2013, p. 17). It also focused on the impact of student characteristics, and of home and school contexts, on levels of computer literacy, both within and between countries.
A computer-based assessment and questionnaire were delivered to students via USB drives attached to school computers. The assessment required students to carry out a number of practical tasks drawing on a range of digital skills, leading up to a larger task, such as creating a webpage with information about a school band competition, or collecting and managing information to create a presentation about ‘breathing’ to present to 9-year-olds. There were four of these larger tasks in total, and each student completed two, randomly assigned.
Results were mapped to a proficiency scale, from level 4 (the highest) to level 1 (the lowest). 81% of the students surveyed achieved scores that placed them within levels 1-3, with the majority at level 2. In addition, factors such as students’ expected educational attainment, parents’ educational level and profession, the number of books in the home, and access to ICT resources at home were all found to positively affect individual test scores across most education systems, although low socio-economic status cancelled out the positive impact of having access to ICT resources at home. In all but two countries, females scored higher than males on the proficiency scale. Having received ICT instruction in schools also positively affected test scores in eight countries/education systems. (Focus on Learning Technologies, page 36)
2 Blogs can help (teenage) students improve their writing skills
There are many reasons why one might intuitively think that this statement is true. For a start, blogs provide students with a way to write for a real audience (e.g. other classmates, parents, or even the general public). It has been argued (e.g. Raith, 2009) that blogs enable new genres of writing and the development of new contexts for communication, and that this requires students to develop new literacies. Several researchers (e.g. Ducate & Lomicka, 2005; Hendron, 2003; Hourigan & Murray, 2010) have argued that blogs can motivate students to write more and to write more accurately, and that they are therefore good tools for English language teachers to adopt. But do blogs actually get students to write more and to write better? Let’s see.
Raith (2009) examined the use of blogs with twenty-nine grade 9 EFL students in Germany who had a fairly low level of English proficiency (A2 on the CEFR). The aim of this six-week qualitative study was to investigate the effect of an online audience on the students’ writing process. The students created written journals about their reading of a set book in English, and they were allowed to choose which medium to write in: nineteen students chose to use traditional paper and pen journals and wrote for an imagined/abstract audience, while ten students chose to use blogs and wrote for a real online audience. Data were collected by means of pre- and post-treatment questionnaires and post-treatment focused interviews with the students, as well as the content of the paper journals and the blogs. The researcher found that both groups of students were acutely aware of audience, but the blog writers showed more focus on meaning in their writing, and were keen to interact with their audience about their writing.
Raith’s study adds to the positive press that blogging for English language students receives, both in secondary and adult contexts. However, there are voices of caution.
A study carried out with a group of twenty-seven Belgian 17-year-old EFL students examined the extent to which blogs motivated the students to write more and to write better, as well as whether the quality of their writing, and their understanding of the content under discussion, actually improved (Sercu, 2013). Over a period of six weeks, students were given a weekly prompt on a range of topics (e.g. an international political issue, a recent local health campaign, their post-high school plans), and were asked to write a blog post with their reactions to the prompt. The data analysis involved both qualitative and quantitative approaches: questionnaires were administered to the students, and two software packages were used to analyze the linguistic complexity of their blog posts. Sercu found that the majority of students were motivated by writing for a real audience, and by being able to interact and discuss issues via the comments section on the blog posts; the students also wrote more than usual. However, the researcher did not find conclusive data to demonstrate that the students’ writing had improved over the course of the project, possibly because of the short duration of the study and the limited amount of written data produced. In addition, Sercu found that the lower-proficiency students found the project less motivating, wrote less, and preferred to read their classmates’ blog posts rather than produce their own. Sercu concluded that ‘blogs work for some students, but not for others’ (2013, p. 4364). Nevertheless, the post-treatment questionnaires revealed that the majority of the students felt that their writing had improved, and that they had become more aware of the areas in their writing that needed work. The detailed analysis carried out in this study provides a useful counterbalance to claims that blogs are always effective in supporting and developing all students’ writing skills. (Focus on Learning Technologies, pages 113-114)
It often seems that for every study showing a positive outcome for technology x, there is a study showing the opposite. The research into blogs is a case in point. Sometimes they seem to support the development of students’ writing skills, sometimes they don’t. Clearly there are plenty of issues involved in research in our field (as in many other fields) – small sample sizes, lack of replicability, dodgy research design, contextual factors, research bias, publication bias, and so on… I’ll be exploring these in more detail in a later post.
References
Ducate, L. & Lomicka, L. (2005). Exploring the blogosphere: Uses of weblogs in the foreign language classroom. Foreign Language Annals, 38, 3, 410-21.
Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2013). Preparing for life in a digital age. The IEA International Computer and Information Literacy Study International Report. Springer Open: Springer International Publishing AG Switzerland.
Hendron, J. G. (2003). Educators as content publishers. The VSTE Journal, 17, 3, 2-6.
Hourigan, T., & Murray, L. (2010). Using blogs to help language students to develop reflective strategies: Towards a pedagogical framework. Australasian Journal of Educational Technology, 26, 2, 209-225.
Raith, T. (2009). The use of weblogs in education. In Thomas, M. (Ed.). Handbook of research on web 2.0 and second language learning (pp. 274-91). Hershey, PA: IGI Global.
Sercu, L. (2013). Weblogs in foreign language education: Real and promised benefits. Proceedings of INTED2013, 7th International Technology, Education & Development Conference, Spain, pp. 4355-66.
Additional resources (e.g. book sample, discussion questions) for Focus on Learning Technologies can be found on the OUP companion website for the book.
Nicky Hockly
May 2017
The Raith (2009) study seems to me rather problematic. If the students are allowed or asked to choose between pen and paper writing or writing a blog, there is surely no way the two groups can be meaningfully compared with each other afterwards, because there is already a bias due to the reasons behind their choice of medium (IT affinity and experience, for example).
Thanks for your comment, Dia. I think the aim of the study was precisely to compare whether there was any difference in student writing between these two mediums (paper vs blog), and students carried out the same writing tasks to minimise differences. An interesting follow-up study might have a group of students all use paper for a series of writing tasks, and then use blogs, to see what differences that produced. But as with any study with technology, it’s extremely difficult, if not impossible, to isolate just the ‘technology’ as having a positive (or negative or negligible) effect, because of the wide range of contextual factors involved. This is one of the issues with research into technology and its effects on language learning, as you rightly point out. Thanks for dropping by and contributing!
Hi Nicky, it’s very useful to have an accessible summary of some of the research into the effectiveness of technology for learning. In a way I am glad that the result is inconclusive, because I think that the way technology is used by the teacher is key, and actually not easy to control for in research. Imagine being asked as a student to write regularly on a blog by different types of teachers:
1. The technophobe – I am required to do this but don’t believe it will help you.
2. The technophile – We should be using ICT as much as possible.
3. The one lacking in ICT skills – I am sure this is going to go wrong, and then I won’t know how to solve it.
4. The pedagogically and tech confident teacher – Experience tells me that this works and is what you would benefit from right now.
I am sure that the outcomes would be very different in each case.
Also, audience is a highly variable factor, motivating to very different degrees depending on the connection the audience has with the students: random strangers may be less motivating than a partner school in another country with which a good relationship has been built over a period of time.
I wonder if it will ever be possible to isolate all these factors to a satisfactory degree. If nothing else though, I think that what this research tells us is that tech can be helpful but is never a magic bullet and should be used thoughtfully.
Thanks for your thoughtful comments, Anne. You make a point CALL researchers have been struggling with for decades: the many factors that intervene in language learning make it very difficult to isolate the technology alone as the defining factor that helps (or hinders) learning. Meta-studies that compare the results of many individual studies are even more fraught with potential issues. I’ll be looking more at these points in the next blog post (part 3), so thanks for bringing them up!
There are many studies about the impact of Web 2.0 tools on improving writing or reading skills in the L2, such as Mimi Li’s study on wikis, the studies by Selami Aydin and by Aydin & Ozdemir, by Vásquez, and also the study by Lomicka and Anderson, and all of these studies suggest that wikis and blogs help students improve in the L2.
Thanks for your comment khabbab – could you provide the full references for the studies you mention? Readers may be interested in following those up (I certainly would!). Thanks.
Yes, of course, Nicky – I want to learn more from you. The study by Vásquez & Wang is a review article about ‘Web 2.0 and second language learning’. The review reveals that blogs and wikis have been the most studied Web 2.0 tools, while others, such as social networking applications and virtual worlds, have been less frequently explored. In addition, the most commonly investigated languages have been English, Spanish, German, and French. Considerably less research has been conducted on applying Web 2.0 technologies to less commonly taught languages, such as Arabic, Chinese, or Russian. Additionally, the language learning environments afforded by Web 2.0 technologies have greatly broadened the scope of topics explored in computer-assisted language learning (CALL): from earlier research which tended to concentrate on the traditional four broad language skills, to more recent topics, such as learners’ identities, online collaboration, and learning communities. The most frequently reported benefit associated with Web 2.0 technologies is the favorable language learning environments they help to foster.
Thanks, khabbab – I see you’re quoting the authors’ abstract for their article directly; it can be found here: https://eric.ed.gov/?id=EJ968795 (for anyone interested in following this up). When I read the abstract, I realised that I also refer to it in my book (Focus on Learning Technologies, OUP 2016), as follows:
“In addition, not all research studies are firmly grounded in theory. For example, in a review of 43 empirical research studies into Web 2.0 tools in a number of learning technology journals and books, Wang and Vásquez (2012) found that over half of the studies (56%) did not have an identifiable theoretical foundation. They also found that several of the studies suffered from common methodological weaknesses, such as convenience-sampling of participants, rather than random-sampling or purposeful-sampling. They pointed out that in K-12 contexts this was often due to logistical issues such as controlled access to minors, or the need to work with already formed groups of students in individual classrooms. Wang and Vásquez also identified a general shift in these research studies from primarily quantitative approaches to data collection, to an emphasis on qualitative data. However, they pointed out that some research studies failed to carry out in-depth analyses of qualitative data, such as exploring students’ perspectives to help explain observed phenomena. Another weakness they identified stemmed from the techno-centric view taken by some researchers, in which the technology alone was held to account for learning, without the wider pedagogical context (such as teaching strategies or instructional materials), being considered” (pp. 79-80).
The study points to the kinds of issues I mention at the end of this blog post.
Thanks for your comments!
Good job, Nicky! First, I am working on my doctorate about the impact of wikis as a Web 2.0 tool on the acquisition of an L2, and I am really interested in your publications. I want to follow this page to learn more – and do you have any other studies on this theme, please?