  • 1 Hunan Normal University, China
  • 2 Durham University, UK

Abstract

This study explores the interaction effect between source text (ST) complexity and machine translation (MT) quality on the task difficulty of neural machine translation (NMT) post-editing from English to Chinese. Existing studies of the human effort exerted in post-editing have seldom taken both ST complexity and MT quality into account, and have mainly focused on MT systems that predate NMT. Drawing on process and product data from 60 trainee translators, this study adopted a multi-method approach to measuring post-editing task difficulty, combining eye-tracking, keystroke logging, quality evaluation, subjective rating, and retrospective written protocols. The results show that: 1) ST complexity and MT quality have a significant interaction effect on the task difficulty of NMT post-editing; 2) ST complexity has a positive impact on the difficulty of post-editing low-quality NMT output (i.e., the task becomes less difficult as ST complexity decreases), whereas for high-quality NMT output it affects only the participants' subjective ratings; and 3) NMT quality has a negative impact on post-editing task difficulty (i.e., the task becomes less difficult as MT quality increases), and this impact grows stronger as ST complexity increases. The paper concludes that both ST complexity and MT quality should be considered when testing post-editing difficulty, designing tasks for post-editor training, and setting fair post-editing pricing schemes.
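The interaction pattern described above — the MT-quality effect on difficulty growing stronger as ST complexity increases — can be illustrated with a minimal sketch. The data, cell means, and the simple cell-means contrast below are hypothetical illustrations only (the study itself analysed eye-tracking, keystroke, and rating data, using linear mixed-effects models rather than this contrast):

```python
import random
from statistics import mean

def interaction_contrast(n_per_cell=20, seed=0):
    """Simulate editing times (s) for a 2x2 post-editing design and
    return the ST complexity x MT quality interaction contrast."""
    rng = random.Random(seed)
    # Hypothetical cell means: difficulty rises with ST complexity and
    # falls with MT quality, and the quality effect is larger for
    # complex STs -- the interaction pattern reported in the abstract.
    cell_means = {
        ("low_complexity",  "high_quality"): 10.0,
        ("low_complexity",  "low_quality"):  14.0,
        ("high_complexity", "high_quality"): 16.0,
        ("high_complexity", "low_quality"):  28.0,
    }
    data = {cell: [rng.gauss(mu, 1.0) for _ in range(n_per_cell)]
            for cell, mu in cell_means.items()}
    # MT-quality effect (low- minus high-quality) at each complexity level
    effect_high_c = (mean(data[("high_complexity", "low_quality")])
                     - mean(data[("high_complexity", "high_quality")]))
    effect_low_c = (mean(data[("low_complexity", "low_quality")])
                    - mean(data[("low_complexity", "high_quality")]))
    # Interaction: how much stronger the quality effect is for complex STs
    return effect_high_c - effect_low_c
```

With these assumed means the quality effect is about 12 s for complex STs but only 4 s for simple ones, so the contrast comes out near 8: a positive value signals exactly the kind of interaction the study tests, here with a cell-means contrast rather than the mixed-effects models used in the actual analysis.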


 


 

Editor-in-Chief: Kinga KLAUDY (Eötvös Loránd University, Hungary)

Consulting Editor: Pál HELTAI (Kodolányi János University, Hungary)

Managing Editor: Krisztina KÁROLY (Eötvös Loránd University, Hungary)

EDITORIAL BOARD

  • Andrew CHESTERMAN (University of Helsinki, Finland)
  • Kirsten MALMKJÆR (University of Leicester, UK)
  • Christiane NORD (University of Free State, Bloemfontein, South Africa)
  • Anthony PYM (Universitat Rovira i Virgili, Tarragona, Spain, University of Melbourne, Australia)
  • Mary SNELL-HORNBY (University of Vienna, Austria)
  • Sonja TIRKKONEN-CONDIT (University of Eastern Finland, Joensuu, Finland)

ADVISORY BOARD

  • Mona BAKER (Shanghai International Studies University, China, University of Oslo, Norway)
  • Łucja BIEL (University of Warsaw, Poland)
  • Gloria CORPAS PASTOR (University of Malaga, Spain; University of Wolverhampton, UK)
  • Rodica DIMITRIU (Universitatea „Alexandru Ioan Cuza” Iasi, Romania)
  • Birgitta Englund DIMITROVA (Stockholm University, Sweden)
  • Sylvia KALINA (Cologne Technical University, Germany)
  • Haidee KOTZE (Utrecht University, The Netherlands)
  • Sara LAVIOSA (Università degli Studi di Bari Aldo Moro, Italy)
  • Brian MOSSOP (York University, Toronto, Canada)
  • Pilar ORERO (Universidad Autónoma de Barcelona, Spain)
  • Gábor PRÓSZÉKY (Hungarian Research Institute for Linguistics, Hungary)
  • Alessandra RICCARDI (University of Trieste, Italy)
  • Edina ROBIN (Eötvös Loránd University, Hungary)
  • Myriam SALAMA-CARR (University of Manchester, UK)
  • Mohammad Saleh SANATIFAR (independent researcher, Iran)
  • Sanjun SUN (Beijing Foreign Studies University, China)
  • Anikó SOHÁR (Pázmány Péter Catholic University, Hungary)
  • Sonia VANDEPITTE (Ghent University, Belgium)
  • Albert VERMES (Eszterházy Károly University, Hungary)
  • Yifan ZHU (Shanghai Jiao Tong University, China)

Prof. Kinga Klaudy
Eötvös Loránd University, Department of Translation and Interpreting
Múzeum krt. 4. Bldg. F, I/9-11, H-1088 Budapest, Hungary
Phone: (+36 1) 411 6500/5894
Fax: (+36 1) 485 5217
E-mail: 

  • WoS Arts & Humanities Citation Index
  • WoS Social Sciences Citation Index
  • WoS Essential Science Indicators
  • Scopus
  • Linguistics Abstracts
  • Linguistics and Language Behaviour Abstracts
  • Translation Studies Abstracts

2021

Web of Science
  • Total Cites (WoS): 214
  • Journal Impact Factor: 1.292
  • Rank by Impact Factor: Linguistics 98/194
  • Impact Factor without Journal Self Cites: 1.208
  • 5 Year Impact Factor: 1.210
  • Journal Citation Indicator: 0.85
  • Rank by Journal Citation Indicator: Language & Linguistics 108/370; Linguistics 122/274

Scimago
  • H-index: 19
  • Journal Rank: 0.994
  • Quartile Score: Linguistics and Language 67/1103 (Q1)

Scopus
  • CiteScore: 2.5
  • CiteScore Rank: Language and Linguistics 121/968 (Q1, D2); Linguistics and Language 128/1032 (Q1, D2)
  • SNIP: 1.576

2020

Web of Science
  • Total Cites (WoS): 169
  • Journal Impact Factor: 1.160
  • Rank by Impact Factor: Linguistics 99/193 (Q3); Languages & Linguistics 57/205 (Q2)
  • Impact Factor without Journal Self Cites: 1.040
  • 5 Year Impact Factor: 1.095
  • Journal Citation Indicator: 1.01
  • Rank by Journal Citation Indicator: Linguistics 107/259 (Q2); Language & Linguistics 94/356 (Q2)
  • Citable Items: 12
  • Total Articles: 12
  • Total Reviews: 0

Scimago
  • H-index: 14
  • Journal Rank: 1.257
  • Quartile Score: Language and Linguistics Q1; Linguistics and Language Q1

Scopus
  • CiteScore: 1.9 (93 cites / 50 documents)
  • CiteScore Rank: Language and Linguistics 130/879 (Q1); Linguistics and Language 147/935 (Q1)
  • SNIP: 1.670

2019

Web of Science
  • Total Cites (WoS): 91
  • Impact Factor: 0.360
  • Impact Factor without Journal Self Cites: 0.320
  • 5 Year Impact Factor: 0.500
  • Immediacy Index: 0.083
  • Citable Items: 12
  • Total Articles: 12
  • Total Reviews: 0
  • Cited Half-Life: n/a
  • Citing Half-Life: 12.7
  • Eigenfactor Score: 0.00018
  • Article Influence Score: 0.234
  • % Articles in Citable Items: 100.00
  • Normalized Eigenfactor: 0.02306
  • Average IF Percentile: 20.053 (Q1)

Scimago
  • H-index: 13
  • Journal Rank: 0.648

Scopus
  • CiteScore: 1.8 (94 cites / 51 documents)
  • CiteScore Rank: Language and Linguistics 120/830 (Q1); Linguistics and Language 135/884 (Q1)
  • SNIP: 1.357

Across Languages and Cultures
  • Publication Model: Hybrid
  • Submission Fee: none
  • Article Processing Charge: 900 EUR/article
  • Printed Color Illustrations: 40 EUR (or 10 000 HUF) + VAT per piece
  • Regional discounts based on the country of the funding agency: World Bank Lower-middle-income economies 50%; World Bank Low-income economies 100%
  • Further Discounts: Editorial Board / Advisory Board members 50%; corresponding authors affiliated with an EISZ member institution subscribing to the journal package of Akadémiai Kiadó 100%
  • Subscription fee 2022: Online subscription 310 EUR / 384 USD; Print + online subscription 362 EUR / 452 USD
  • Subscription fee 2023: Online subscription 318 EUR / 384 USD; Print + online subscription 372 EUR / 452 USD
  • Subscription Information: Online subscribers are entitled to access all back issues published by Akadémiai Kiadó for each title for the duration of the subscription, as well as Online First content for the subscribed content.
  • Purchase per Title: Individual articles are sold at the displayed price.

Across Languages and Cultures
  • Language: English
  • Size: B5
  • Year of Foundation: 1999
  • Volumes per Year: 1
  • Issues per Year: 2
  • Founder: Akadémiai Kiadó
  • Founder's Address: H-1117 Budapest, Hungary 1516 Budapest, PO Box 245.
  • Publisher: Akadémiai Kiadó
  • Publisher's Address: H-1117 Budapest, Hungary 1516 Budapest, PO Box 245.
  • Responsible Publisher: Chief Executive Officer, Akadémiai Kiadó
  • ISSN 1585-1923 (Print)
  • ISSN 1588-2519 (Online)
