TOEFL iBT Integrated and Independent Writing Tasks: Iranian Test Takers' Performance in Focus
Thematic areas: Journal of Language and Translation
Farah Shooraki 1, Hossein Barati 2, Ahmad Moinzadeh 3
1 - Department of English, Maybod Branch, Islamic Azad University, Maybod, Iran
2 - Department of English Language, University of Isfahan, Isfahan, Iran
3 - Department of English Language, University of Isfahan, Isfahan, Iran
Keywords: Coh-Metrix, independent writing task, integrated writing task, TOEFL iBT, writing assessment
Abstract:
This study compared the performance of international and Iranian test takers on the TOEFL iBT integrated and independent writing tasks. The international test takers' data were provided by the test's organizer, Educational Testing Service (ETS), and comprised a total of 4916 samples covering both the integrated and the independent task. A paired-sample t-test showed that the international test takers performed significantly better on the independent task. Further, a sample of 100 integrated and independent writing tasks produced by international test takers was selected through systematic sampling and compared with 96 counterparts written by Iranian TOEFL iBT test takers. Although the difference between the Iranian and international test takers was not significant, a paired-sample t-test on the Iranian samples alone showed that they, too, performed better on the independent writing task.
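For readers less familiar with the statistic used above: a paired-sample t-test compares two sets of scores produced by the same test takers, here their integrated-task and independent-task scores. A minimal sketch with hypothetical rubric scores (not the study's data), using only the Python standard library:

```python
import math
import statistics

def paired_t(sample_a, sample_b):
    """Paired-sample t statistic: mean of the per-person score
    differences divided by the standard error of that mean."""
    diffs = [a - b for a, b in zip(sample_a, sample_b)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)        # sample standard deviation of differences
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1                       # t statistic and degrees of freedom

# Hypothetical integrated vs. independent task scores on a 0-5 rubric scale
integrated  = [3.0, 2.5, 3.5, 2.0, 3.0, 2.5, 4.0, 3.0]
independent = [3.5, 3.0, 3.5, 2.5, 3.5, 3.0, 4.0, 3.5]

t_stat, df = paired_t(integrated, independent)
print(round(t_stat, 3), df)  # → -4.583 7
```

A negative t here indicates the independent-task scores were higher on average; the t statistic and degrees of freedom would then be checked against a t distribution for significance.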
Aminzadeh, R., & Booheh, Z. S. (2015). The comparative effect of reading-to-write and writing-only tasks on the improvement of EFL learners' writing ability. The Journal of Applied Linguistics, 6(12).
Amiryousefi, M., & Tavakoli, M. (2014). An empirical study into the effects of Iranian test takers' personal attributes on their TOEFL scores: Reading, writing, and listening in focus. International Journal of Research Studies in Language Learning, 3(6), 35-45.
Barkaoui, K. (2015). Test Takers’ Writing Activities During the TOEFL iBT® Writing Tasks: A Stimulated Recall Study. ETS Research Report No. RR–15-04.
Biber, D., & Gray, B. (2013). Discourse characteristics of writing and speaking task types on the TOEFL-iBT test: A lexico-grammatical analysis. TOEFL-iBT research report. Retrieved from http://www.ets.org/Media/Research/pdf/RR-13-04.pdf
Camp, R. (1993). Changing the model for the direct writing assessment. In M. M. Williamson & B. A. Huot (Eds.), Validating holistic scoring for writing assessment: theoretical and empirical foundations (pp. 45-78). Cresskill, NJ: Hampton Press, Inc.
Campbell, C. (1998). Teaching second-language writing: Interacting with text. Boston: Heinle & Heinle.
Cho, Y., Rijmen, F., & Novak, J. (2013). Investigating the effects of prompt characteristics on the comparability of TOEFL iBT integrated writing tasks. Language Testing, 30(4), 513-534.
Crossley, S. A., Kyle, K., & Dascalu, M. (2018). The Tool for the Automatic Analysis of Cohesion 2.0: Integrating semantic similarity and text overlap. Behavior Research Methods, 51(1), 14–27.
Cumming, A. (1998). Theoretical perspectives on writing. Annual Review of Applied Linguistics, 18, 61-78.
Cumming, A. (2013). Assessing integrated writing tasks for academic purposes: Promises and perils. Language Assessment Quarterly, 10(1), 1-8.
Cumming, A., Kantor, R., Baba, K., Erdosy, U., Eouanzoui, K., & James, M. (2006). Analysis of discourse features and verification of scoring levels for independent and integrated tasks for the new TOEFL (TOEFL Monograph No. MS-30). Princeton, NJ: ETS.
Cumming, A., Kantor, R., Baba, K., Erdosy, U., Eouanzoui, K., & James, M. (2005). Difference in written discourse in independent and integrated prototype tasks for next generation TOEFL. Assessing Writing, 10, 5‒43.
Cumming, A., Kantor, R., Powers, D., Santos, T., & Taylor, C. (2000). TOEFL 2000 writing framework: A working paper (TOEFL Monograph Series No. 18). Princeton, NJ: Educational Testing Service.
Cumming, A., Grant, L., Mulcahy-Ernt, P., & Powers, D. (2004). A teacher verification study of speaking and writing prototype tasks for a new TOEFL. Language Testing, 21(2), 159‒197.
Esmaeili, H. (2002). Integrated reading and writing tasks and ESL students’ reading and writing performance in an English language test. The Canadian Modern Language Review, 58, 599–620.
ETS. (2022). TOEFL iBT® Test Content. Retrieved from https://www.ets.org/toefl/ibt/about/content/.
Gebril, A. (2009). Score generalizability of academic writing tasks: Does one test method fit it all? Language Testing, 26(4), 507-531.
Gebril, A., & Plakans, L. (2009). Investigating source use, discourse features, and process in integrated writing tests. In Spaan fellow working papers in second/foreign language assessment (Vol. 7, pp. 47–84). Ann Arbor: The University of Michigan.
Gebril, A. (2021). Learning-oriented language assessment: Putting theory into practice. New York: Routledge.
Gholami, J., & Alinasab, M. (2017). Source-based tasks in writing independent and integrated essays. International Journal of Instruction, 10(3), 127-142. DOI: 10.12973/iji.2017.1039a.
Graesser, A. C., McNamara, D., Cai, Z., Conley, M., Li, H., & Pennebaker, J. (2014). Coh-Metrix measures text characteristics at multiple levels of language and discourse. Elementary School Journal, 115(2), 211-229.
Grant, L., & Ginther, A. (2000). Using computer-tagged linguistic features to describe L2 writing differences. Journal of Second Language Writing, 9, 123–145.
Guo, L., Crossley, S., & McNamara, D. S. (2013). Predicting human judgments of essay quality in both integrated and independent second language writing samples: A comparison study. Assessing Writing, 18, 218–238.
Kim, E. J. (2017). The TOEFL iBT writing: Korean students’ perceptions of the TOEFL iBT writing test. Assessing Writing, 33, 1-11.
Liao, L. (2020). A Comparability Study of Text Difficulty and Task Characteristics of Parallel Academic IELTS Reading Tests. English Language Teaching, 13(1).
Oxford, R. (2006). Task-based language teaching and learning: An overview. The Asian EFL Journal Quarterly, 8(3), 94-121.
Plakans, L., Gebril, A., & Bilki, Z. (2019). Shaping a score: The impact of fluency, accuracy, and complexity on integrated skills performances. Language Testing. https://doi.org/10.1177/0265532216669537.
Plakans, L., & Gebril, A. (2012). A Close Investigation into Source Use in Integrated Second Language Writing Tasks. Assessing Writing, 17(1), 18-34.
Plakans, L., & Gebril, A. (2017). Exploring the relationship of organization and connection with scores in integrated writing assessment. Assessing Writing, 31, 18-34. http://dx.doi.org/10.1016/j.asw.2016.08.005.
Plakans, L., & Gebril, A. (2013). Using multiple texts in an integrated writing assessment: Source text use as a predictor of score. Journal of Second Language Writing, 22, 217–230.
Riazi, A. M. (2016). Comparing writing performance in TOEFL-iBT and academic assignments: An exploration of textual features. Assessing Writing, 28, 15-27. DOI: 10.1016/j.asw.2016.02.001.
Shi, L. (2004). Textual Borrowing in Second-Language Writing. Written Communication, 21, 171-200.
Soleimani, H., & Mahdavipour, M. (2014). The effect of variations in integrated writing tasks and proficiency level on features of written discourse generated by Iranian EFL learners. Journal of Teaching Language Skills (JTLS), 6(2), 131-159.
Stricker, L., & Attali, Y. (2010). Test takers’ attitudes about the TOEFL iBT (TOEFL iBT™ Report No. iBT-13). Princeton, NJ: ETS.
Yu, G. (2008). Reading to summarize in English and Chinese: A tale of two languages? Language Testing, 25(4), 521–551.
Weir, C. J., Huizhong, Y., & Yan, J. (2000). An empirical investigation of the componentiality of L2 reading in English for academic purposes. Cambridge: UCLES/Cambridge University Press.
Weir, C. J. (2005). Language testing and validation: An evidence-based approach. Basingstoke: Palgrave Macmillan.
Weir, C., & Shaw, S. (2006). Defining the constructs underpinning main suite writing tests: a socio-cognitive perspective. Research Notes, 26(3), 9-14.
Weir, C. J. (1993). Understanding and developing language tests. New York: Prentice Hall.