Authors:
Chao Han (Xiamen University, Xiamen, China), https://orcid.org/0000-0002-6712-0555
Bei Hu (National University of Singapore, Singapore)
Qin Fan (Southwest University, Chongqing, China)
Jing Duan (Southwest University of Political Science and Law, Chongqing, China)
Xi Li (University of Chinese Academy of Sciences, Beijing, China)

Abstract

Translation assessment represents a productive line of research in Translation Studies. An array of methods has been trialled to assess translation quality, ranging from intuitive assessment to error analysis and from rubric scoring to item-based assessment. In this article, we introduce a lesser-known approach to translation assessment called comparative judgement. Rooted in psychophysical analysis, comparative judgement rests on the assumption that humans tend to be more accurate when making relative judgements than when making absolute judgements. We conducted an experiment, as both a methodological exploration and a feasibility investigation, in which novice and experienced judges were recruited to assess English-Chinese translation using a computerised comparative judgement platform. The collected data were analysed to shed light on the validity and reliability of the assessment results and on the judges’ perceptions. Our analysis shows that (1) overall, comparative judgement produced valid measures and facilitated judgement reliability, although these results seemed to be affected by translation directionality and the judges’ experience, and (2) the judges were generally confident about their decisions, despite some emergent factors undermining the validity of their decision making. Finally, we discuss the use of comparative judgement as a possible method in translation assessment and its implications for future practice and research.
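To make the underlying logic of comparative judgement concrete, the sketch below shows one common way such data are scaled: judges repeatedly pick the better of two translations, and the resulting wins are fitted with a Bradley–Terry model to place all translations on a single quality scale. This is a minimal illustration of the general technique, not the authors' platform or analysis code; the function name, toy judgements, and use of Python are assumptions made for the example.

```python
# Illustrative sketch only: a minimal Bradley–Terry fit for comparative
# judgement data, using the MM updates described by Hunter (2004).
# The item labels and toy judgements below are hypothetical.

import math
from collections import defaultdict

def bradley_terry(pairs, n_iter=200):
    """Estimate item strengths from a list of (winner, loser) judgements."""
    items = {x for pair in pairs for x in pair}
    wins = defaultdict(int)       # number of wins per item
    n_comp = defaultdict(int)     # number of comparisons per unordered pair
    for winner, loser in pairs:
        wins[winner] += 1
        n_comp[frozenset((winner, loser))] += 1

    strength = {i: 1.0 for i in items}
    for _ in range(n_iter):
        new = {}
        for i in items:
            denom = sum(
                n_comp[frozenset((i, j))] / (strength[i] + strength[j])
                for j in items
                if j != i and frozenset((i, j)) in n_comp
            )
            new[i] = wins[i] / denom if denom > 0 else strength[i]
        total = sum(new.values())
        strength = {i: v * len(items) / total for i, v in new.items()}

    # Report on a log scale so differences between translations are easier to read.
    return {i: round(math.log(v), 3) for i, v in strength.items()}

# Toy usage: five pairwise decisions over three hypothetical translations.
judgements = [("T1", "T2"), ("T1", "T3"), ("T2", "T3"), ("T1", "T2"), ("T3", "T2")]
print(bradley_terry(judgements))  # e.g. {'T1': ..., 'T2': ..., 'T3': ...}
```

In practice, dedicated comparative-judgement platforms perform this estimation (often with Rasch-based extensions) and additionally report reliability statistics; the sketch above shows only the core scaling step.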


 


 

Editor-in-Chief: Kinga KLAUDY (Eötvös Loránd University, Hungary)

Consulting Editor: Pál HELTAI (Kodolányi János University, Hungary)

Managing Editor: Krisztina KÁROLY (Eötvös Loránd University, Hungary)

EDITORIAL BOARD

  • Andrew CHESTERMAN (University of Helsinki, Finland)
  • Kirsten MALMKJÆR (University of Leicester, UK)
  • Christiane NORD (University of Free State, Bloemfontein, South Africa)
  • Anthony PYM (Universitat Rovira i Virgili, Tarragona, Spain, University of Melbourne, Australia)
  • Mary SNELL-HORNBY (University of Vienna, Austria)
  • Sonja TIRKKONEN-CONDIT (University of Eastern Finland, Joensuu, Finland)

ADVISORY BOARD

  • Mona BAKER (Shanghai International Studies University, China, University of Oslo, Norway)
  • Łucja BIEL (University of Warsaw, Poland)
  • Gloria CORPAS PASTOR (University of Malaga, Spain; University of Wolverhampton, UK)
  • Rodica DIMITRIU (Universitatea „Alexandru Ioan Cuza” Iasi, Romania)
  • Birgitta Englund DIMITROVA (Stockholm University, Sweden)
  • Sylvia KALINA (Cologne Technical University, Germany)
  • Haidee KOTZE (Utrecht University, The Netherlands)
  • Sara LAVIOSA (Università degli Studi di Bari Aldo Moro, Italy)
  • Brian MOSSOP (York University, Toronto, Canada)
  • Pilar ORERO (Universitat Autònoma de Barcelona, Spain)
  • Gábor PRÓSZÉKY (Hungarian Research Institute for Linguistics, Hungary)
  • Alessandra RICCARDI (University of Trieste, Italy)
  • Edina ROBIN (Eötvös Loránd University, Hungary)
  • Myriam SALAMA-CARR (University of Manchester, UK)
  • Mohammad Saleh SANATIFAR (independent researcher, Iran)
  • Sanjun SUN (Beijing Foreign Studies University, China)
  • Anikó SOHÁR (Pázmány Péter Catholic University, Hungary)
  • Sonia VANDEPITTE (Ghent University, Belgium)
  • Albert VERMES (Eszterházy Károly University, Hungary)
  • Yifan ZHU (Shanghai Jiao Tong University, China)

Prof. Kinga Klaudy
Eötvös Loránd University, Department of Translation and Interpreting
Múzeum krt. 4. Bldg. F, I/9-11, H-1088 Budapest, Hungary
Phone: (+36 1) 411 6500/5894
Fax: (+36 1) 485 5217
E-mail: 

  • WoS Arts & Humanities Citation Index
  • WoS Social Sciences Citation Index
  • WoS Essential Science Indicators
  • Scopus
  • Linguistics Abstracts
  • Linguistics and Language Behaviour Abstracts
  • Translation Studies Abstracts
  • CABELLS Journalytics

2021
Web of Science
  Total Cites (WoS): 214
  Journal Impact Factor: 1,292
  Rank by Impact Factor: Linguistics 98/194
  Impact Factor without Journal Self Cites: 1,208
  5-Year Impact Factor: 1,210
  Journal Citation Indicator: 0,85
  Rank by Journal Citation Indicator: Language & Linguistics 108/370; Linguistics 122/274
Scimago
  H-index: 19
  Journal Rank (SJR): 0,994
  Quartile Score: Linguistics and Language 67/1103 (Q1)
Scopus
  CiteScore: 2,5
  CiteScore Rank: Language and Linguistics 121/968 (Q1, D2); Linguistics and Language 128/1032 (Q1, D2)
  SNIP: 1,576

2020
Web of Science
  Total Cites (WoS): 169
  Journal Impact Factor: 1,160
  Rank by Impact Factor: Linguistics 99/193 (Q3); Languages & Linguistics 57/205 (Q2)
  Impact Factor without Journal Self Cites: 1,040
  5-Year Impact Factor: 1,095
  Journal Citation Indicator: 1,01
  Rank by Journal Citation Indicator: Linguistics 107/259 (Q2); Language & Linguistics 94/356 (Q2)
  Citable Items: 12
  Total Articles: 12
  Total Reviews: 0
Scimago
  H-index: 14
  Journal Rank (SJR): 1,257
  Quartile Score: Language and Linguistics Q1; Linguistics and Language Q1
Scopus
  CiteScore: 1,9 (93/50)
  CiteScore Rank: Language and Linguistics 130/879 (Q1); Linguistics and Language 147/935 (Q1)
  SNIP: 1,670

2019
Web of Science
  Total Cites (WoS): 91
  Impact Factor: 0,360
  Impact Factor without Journal Self Cites: 0,320
  5-Year Impact Factor: 0,500
  Immediacy Index: 0,083
  Citable Items: 12
  Total Articles: 12
  Total Reviews: 0
  Cited Half-Life: n/a
  Citing Half-Life: 12,7
  Eigenfactor Score: 0,00018
  Article Influence Score: 0,234
  % Articles in Citable Items: 100,00
  Normalized Eigenfactor: 0,02306
  Average IF Percentile: 20,053 (Q1)
Scimago
  H-index: 13
  Journal Rank (SJR): 0,648
Scopus
  CiteScore: 1,8 (94/51)
  CiteScore Rank: Language and Linguistics 120/830 (Q1); Linguistics and Language 135/884 (Q1)
  SNIP: 1,357

Across Languages and Cultures
Publication Model: Hybrid
Submission Fee: none
Article Processing Charge: 900 EUR/article
Printed Color Illustrations: 40 EUR (or 10 000 HUF) + VAT per piece
Regional discounts based on the country of the funding agency:
  World Bank Lower-middle-income economies: 50%
  World Bank Low-income economies: 100%
Further Discounts:
  Editorial Board / Advisory Board members: 50%
  Corresponding authors affiliated to an EISZ member institution subscribing to the journal package of Akadémiai Kiadó: 100%
Subscription fee 2023:
  Online subscription: 318 EUR / 384 USD
  Print + online subscription: 372 EUR / 452 USD
Subscription Information: Online subscribers are entitled to access all back issues published by Akadémiai Kiadó for each title for the duration of the subscription, as well as Online First content for the subscribed content.
Purchase per Title: Individual articles are sold at the displayed price.

Across Languages and Cultures
Language: English
Size: B5
Year of Foundation: 1999
Volumes per Year: 1
Issues per Year: 2
Founder: Akadémiai Kiadó
Founder's Address: H-1117 Budapest, Hungary; 1516 Budapest, PO Box 245
Publisher: Akadémiai Kiadó
Publisher's Address: H-1117 Budapest, Hungary; 1516 Budapest, PO Box 245
Responsible Publisher: Chief Executive Officer, Akadémiai Kiadó
ISSN 1585-1923 (Print)
ISSN 1588-2519 (Online)
