Authors:
Charlotte Eben, Department of Experimental Psychology, Ghent University, Ghent, Belgium (https://orcid.org/0000-0001-9423-1261)
Beáta Bőthe, Département de Psychologie, Université de Montréal, Montréal, Canada (https://orcid.org/0000-0003-2718-4703)
Damien Brevers, Louvain for Experimental Psychopathology Research Group (LEP), Psychological Sciences Research Institute, UCLouvain, Louvain-la-Neuve, Belgium (https://orcid.org/0000-0003-4503-0898)
Luke Clark, Centre for Gambling Research at UBC, Department of Psychology, University of British Columbia, Vancouver, B.C., Canada; Djavad Mowafaghian Centre for Brain Health, University of British Columbia, Vancouver, B.C., Canada (https://orcid.org/0000-0003-1103-2422)
Joshua B. Grubbs, Department of Psychology, University of New Mexico, Albuquerque, NM, USA; Center for Alcohol, Substance Use, And Addiction (CASAA), University of New Mexico, Albuquerque, NM, USA (https://orcid.org/0000-0002-2642-1351)
Robert Heirene, School of Psychology, University of Plymouth, Plymouth, UK (https://orcid.org/0000-0002-5508-7102)
Anja Kräplin, Faculty of Psychology, Technische Universität Dresden, Dresden, Germany (https://orcid.org/0000-0002-1612-3932)
Karol Lewczuk, Institute of Psychology, Cardinal Stefan Wyszynski University in Warsaw, Warsaw, Poland (https://orcid.org/0000-0003-2437-2450)
Lucas Palmer, Centre for Gambling Research at UBC, Department of Psychology, University of British Columbia, Vancouver, B.C., Canada (https://orcid.org/0000-0003-0961-0350)
José C. Perales, Department of Experimental Psychology; Mind, Brain and Behavior Research Center (CIMCYC), University of Granada, Granada, Spain (https://orcid.org/0000-0001-5163-8811)
Jan Peters, Department of Psychology, Biological Psychology, University of Cologne, Cologne, Germany (https://orcid.org/0000-0002-0195-5357)
Ruth J. van Holst, Department of Psychiatry, Amsterdam UMC - University of Amsterdam, Amsterdam, The Netherlands; Center for Urban Mental Health, University of Amsterdam, Amsterdam, The Netherlands
Joël Billieux, Institute of Psychology, University of Lausanne, Lausanne, Switzerland; Center for Excessive Gambling, Addiction Medicine, Lausanne University Hospital (CHUV), Lausanne, Switzerland (https://orcid.org/0000-0002-7388-6194)

Open access

Abstract

Open science refers to a set of practices that aim to make scientific research more transparent, accessible, and reproducible, including pre-registration of study protocols, sharing of data and materials, the use of transparent research methods, and open access publishing. In this commentary, we describe and evaluate the current state of open science practices in behavioral addiction research. We highlight the specific value of open science practices for the field; discuss recent field-specific meta-scientific reviews that show the adoption of such practices remains in its infancy; address the challenges to engaging with open science; and make recommendations for how researchers, journals, and scientific institutions can work to overcome these challenges and promote high-quality, transparently reported behavioral addiction research. By collaboratively promoting open science practices, the field can create a more sustainable and productive research environment that benefits both the scientific community and society as a whole.

Although behavioral addiction research emerged at the end of the last century (Holden, 2001; Marks, 1990), the nosological status of a wide range of behavioral addictions (with the exception of Gambling and Gaming Disorders) remains debated (Billieux, Schimmenti, Khazaal, Maurage, & Heeren, 2015; Mihordin, 2012; Starcevic, Billieux, & Schimmenti, 2018). Globally, the field is still often considered an "emerging" or "new" one. We wrote this commentary to describe and evaluate open science practices in the field of behavioral addictions, to promote awareness, and to encourage the field to adopt these practices to further improve research quality. Our objective was to specify what we mean by open science and to identify the issues pertaining to the (perceived) status quo in the field of behavioral addictions regarding open science. It is worth acknowledging that we are not the first to call for more open and transparent research in this field; the current paper is therefore oriented towards avenues and solutions to further integrate and promote open science.

What is ‘open science’?

Open science (also referred to as open research and open scholarship) has been defined as: "An umbrella term reflecting the idea that scientific knowledge of all kinds, where appropriate, should be openly accessible, transparent, rigorous, reproducible, replicable, accumulative, and inclusive, all which are considered fundamental features of the scientific endeavor. Open science consists of principles and behaviors that promote transparent, credible, reproducible, and accessible science. Open science has six major aspects: open data, open methodology, open source, open access, open peer review, and open educational resources" (Elsherif, Flack, Kalandadze, Pennington, & Xiao, 2021). Thus, open science encompasses different practices across the life of a research project, including pre-registration of a study's design, hypotheses, and data-analytic plan on an online platform or as a Registered Report, which creates time-stamped documentation before the project starts (Nosek et al., 2015; for further information see https://forrt.org/glossary/preregistration/).

These practices are essential to prevent well-established problems such as irreplicable and irreproducible research (i.e., failure to find the same result in a different sample using the same or similar methods [replicability] and being unable to find the same results using the same data [reproducibility]), the file drawer issue (i.e., statistically non-significant but important research that is not published and therefore remains hidden from the scientific community), and so-called ‘questionable research practices’ (i.e., problematic research practices that do not constitute ‘misconduct’, but are inconsistent with the principles of scientific integrity; for representative examples, see the next section; for comprehensive overviews see Korbmacher et al., 2023; Pennington, 2023a, 2023b).

Fostering the reliability and transparency of research results should reflect the core values of any research field, including that of behavioral addictions. Indeed, although behavioral addictions research has had a growing impact on international public health policies in recent years (e.g., the recognition of Gaming Disorder by the WHO, which contributed to the regulation of loot boxes and other gaming-related design features amplifying uncontrolled and potentially addictive use; see Drummond, Sauer, Hall, Zendle, & Loudon, 2020; Flayelle et al., 2023), it has been criticized for largely not endorsing current best practices in open science research (e.g., Grubbs, Floyd, Griffin, Jennings, & Kraus, 2022; van Rooij et al., 2018). Video gaming, gambling, online sexual activities, shopping, social networking, and on-demand TV streaming are among the most popular non-substance-related leisure activities worldwide (Flayelle et al., 2023), and research on these behaviors thus has the potential for widespread impact on modern society. It seems imperative that experts and policy makers base impactful decisions on transparently reported, reliable, and reproducible research.

In the following sections, we elaborate on the problems that arise from not practicing open science, the current status and challenges that the field of behavioral addictions is facing, and the opportunities and possible solutions that we foresee. Ultimately, open science practices such as sharing data or preregistering confirmatory, exploratory, and qualitative research can improve the quality of evidence that is used for important purposes, from policy making and education to prevention and treatment. Further, endorsing open science can also help to detect questionable research practices in this field.

The problem

‘Questionable research practices’ include (but are not limited to) inappropriate sampling, questionable inferences, underpowered studies, p-hacking, hypothesizing after results are known (HARKing), unclear reporting of methods (e.g., missing information about the time of data collection, ambiguous data analysis plans), or salami slicing of data (i.e., not being sufficiently transparent about multiple uses of the same data, or splitting a single dataset across multiple papers; for examples see John, Loewenstein, & Prelec, 2012). These practices are quite common across scientific disciplines (depending on the definition and assessment method, prevalence rates range from 15% to 51.2%; Gopalakrishna et al., 2022; Xie, Wang, & Kong, 2021), and are largely attributable to current incentives in scientific publication that focus on quantitative indices of research impact (e.g., impact factors of peer-reviewed journals, number of citations; for a discussion of a different incentive system see Schönbrodt et al., 2022), which do not correlate with the quality of research (Anderson, Ronning, De Vries, & Martinson, 2007; Dougherty & Horne, 2022; Higginson & Munafò, 2016). This incentive system does not sufficiently acknowledge the relevance of open science practices, in comparison to statistically significant and/or "novel" results. Consequently, many researchers lack the knowledge and/or motivation to apply open science practices (Nosek, Spies, & Motyl, 2012, 2015). This dynamic can have many negative consequences for the behavioral addictions field. For instance, a lack of data sharing hinders cumulative science, in which researchers combine and compare datasets (that are often hard to collect) across studies (see, for example, Pennington, 2023a, 2023b). This applies to direct and conceptual replication studies, but also to review projects and meta-analyses. Furthermore, shared data allow us to re-use and re-analyze already acquired datasets to generate or test new hypotheses.
A lack of transparency could also mean that several studies reporting findings obtained from the same group of participants (i.e., salami slicing) are counted as independent samples in meta-analyses (see Hilgard, Sala, Boot, & Simons, 2019, for a critical account of this issue in relation to the positive effect of videogames on cognitive abilities). As a consequence of these questionable research practices, the reproducibility and validity of findings are jeopardized, the overall quality of the literature and knowledge about behavioral addictions suffers, and we, as researchers, may see our scientific integrity compromised. Open science practices, by contrast, can open up new research perspectives, such as data-driven commentary enabled through data sharing (for an example in the context of a Registered Report, see Amendola, 2023; Billieux & Fournier, 2023).

The current status

In the field of clinical psychology, Tackett and Miller (2019) catalyzed earlier efforts to promote open science practices through their special section in the Journal of Abnormal Psychology. As the field of behavioral addiction is a subfield of clinical psychology, we first summarize this line of work before focusing on the work done specifically in the field of behavioral addictions. In their initial work, Tackett et al. (2017) started the conversation about the replication crisis specifically in clinical psychology, which was extended in later work (Tackett, Brandes, King, & Markon, 2019). They discuss many open science practices and concerns within the clinical field regarding their use, as well as barriers and possible steps toward implementing open science practices in their field. In another paper, Tackett, Brandes, and Reardon (2019) give specific advice on how to use the Open Science Framework in the workflow of clinical psychological research, and describe the related advantages.

Building on this work, recent efforts have been made in the behavioral addictions field to identify why open science practices are not more commonly used and ways to tackle this issue. This is particularly the case in problem gambling research, the longest established domain of research related to behavioral addiction (Yau & Potenza, 2015). Research on problem gambling is thus not an "emerging" field, even if Gambling Disorder has only been recognized as an addictive disorder since 2013, when the DSM-5 (American Psychiatric Association, 2013) was released (it was previously conceptualized as an impulse-control disorder and diagnosed as such in the DSM-IV-TR and previous versions; American Psychiatric Association, 2000). Researchers in this field are increasingly aware that the adoption of open science practices is needed to improve research quality. In 2019, Wohl and colleagues argued that the field of gambling research was lagging behind other scientific fields in acknowledging the replication crisis. The authors called for more openness in the field, including study preregistration and reporting of power analyses, as well as replication studies.
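The call for routinely reported power analyses can be acted on before any data are collected. As a minimal sketch (our illustration, not part of the cited work), the required per-group sample size for a two-sided, two-group comparison can be approximated from standard normal quantiles; the effect size of d = 0.3 below is a hypothetical planning input:

```python
# A priori power calculation using the normal approximation to the
# independent-samples t-test; Python stdlib only. Inputs are hypothetical.
from math import ceil
from statistics import NormalDist

def n_per_group(d: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate required n per group, two-sided test at the given alpha."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_power = z.inv_cdf(power)          # quantile for the desired power
    return ceil(2 * ((z_alpha + z_power) / d) ** 2)

print(n_per_group(0.3))  # about 175 participants per group
```

Dedicated tools (e.g., G*Power, or power modules in statistical packages) use the exact t-distribution and give slightly larger numbers; the approximation above is enough to show why studies of small effects need samples in the hundreds per group.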

In the wake of that initial article, other researchers proposed new directions for bridging the gap toward the implementation of open science practices in the field of gambling. For instance, Louderback, Wohl, and LaPlante (2021) suggested integrating open science practices with current guidelines for industry funded research in the gambling field. These authors argued that this dynamic could help to foster transparency and thus ensure independent industry funded research. Even though their paper specifically focuses on gambling research, these guidelines could easily be transposed to any industry-funded and non-industry-funded research (see Shi, Potenza, & Turner, 2020 for a similar approach in the field of gaming disorder).

Important new insights come from studies that have investigated knowledge of open science and its presence in gambling and video gaming research. LaPlante, Louderback, and Abarbanel (2021) found that, unsurprisingly, only a minority of gambling researchers (attending a 2019 conference) used open science practices, with many still having concerns and doubts about how to implement them. Among the most common concerns were privacy issues when sharing code, materials, and data, and the fear that others would use the data, code, or materials without appropriate acknowledgement (LaPlante et al., 2021). Another study by Louderback et al. (2022) investigated the use of open science practices in a random sample of 500 gambling research articles. They found that open access publishing was the most used practice (35.2% of articles), while the use of other practices was very low (0–15%). Interestingly, these authors also observed that studies which adopted at least one open science practice received more citations than papers that did not adopt any (for similar observations in other fields, see Colavizza, Hrynaszkiewicz, Staden, Whitaker, & McGillivray, 2020; McKiernan et al., 2016; Piwowar & Vision, 2013; Wang, Liu, Mao, & Fang, 2015). Despite limited adoption of open science practices in the field to date, there is clear evidence of a change in perspective. As well as the above-mentioned articles by Wohl, Tabri, and Zelenski (2019) and Louderback et al. (2021), behavioral addiction researchers have called for preregistrations and Registered Reports of qualitative research in video gaming (Karhulahti, 2022), and more replication studies in the gambling field (e.g., Heirene, 2021; LaPlante, 2019).

Whilst the many calls for improved, open research are positive, the limited uptake of open science practices to date in the behavioral addiction field is concerning. As mentioned, research findings have the potential to directly influence the way policies are established, how education and prevention are conducted, and ultimately how patients are treated. Accordingly, as is the case for clinical psychology (Grubbs, 2022; Tackett, Brandes, & Reardon, 2019; Tackett, Brandes, King, et al., 2019) and mental health research more broadly, the risks of having irreplicable, inaccurate, or unclear research appear high for behavioral addiction research. There is an urgent need for behavioral addiction researchers to engage in open science practices to allow full and proper evaluations of their conclusions and to facilitate replications.

The current challenges

The above-mentioned studies highlight that, although open-access publishing is widely adopted, our field is still lacking a satisfactory level of open and transparent research practices. One primary reason for this appears to be that researchers are not sufficiently educated about open science practices. Additional perceived barriers include the perception that using open science practices may take more time than the ‘traditional’ approach to conducting research, that shared datasets and materials may be used without acknowledgement, and that openly sharing parts of the research process (e.g., analysis code) may lead to criticism (Gownaris et al., 2022). In the following section we address some of these issues and propose possible ways to tackle them. After all, many of the barriers impeding open science practices can be addressed by appropriate transfer of knowledge about open science.

First of all, we would like to elaborate on the finding that open access publishing seems to be the most used open science practice (Gownaris et al., 2022; Louderback et al., 2022). On the one hand, we agree that open access publishing is crucial for distributing research results widely and fairly. Knowledge can only be used if there is access to it. Access may be more difficult for researchers at unaffiliated faculties or non-western universities, where subscriptions to expensive journals are limited. For example, some authors have argued that resource constraints (e.g., financial) and infrastructural deficiencies in non-western or less economically developed societies hinder the generation and dissemination of knowledge in and from these societies (Au, 2007; Ross-Hellauer, 2022; Westwood & Jack, 2007). Open access publishing partly solves this problem and can easily be achieved: if a researcher or institution does not have the resources to choose a journal's open access option, there is always the option of green open access, that is, publishing a preprint on any preprint server as a citable, time-stamped publication. To make traditionally published papers available, the researcher can also post a post-acceptance version of the paper on an institutional or personal website. Information about open access options and the associated embargo periods can easily be found on the Sherpa Romeo website (https://v2.sherpa.ac.uk/romeo/). On the other hand, in various countries researchers pay to make their papers open access purely because they are required to: many funders and countries require individual researchers to pay open access fees, on pain of becoming ineligible for future funding. But not every research institution can pay open access publishing fees, which can create a further gap between wealthy and less wealthy researchers and countries.
Lastly, many so-called ‘Open Access’ publishers do not necessarily value good and open research practices. Many ‘predatory’ journals publish papers without a sufficiently rigorous peer-review process in order to profit from the open access fees. The concerning quality of such papers may in some cases contribute to negative attitudes towards open access publishing in general (Shen & Björk, 2015). Thus, we would question whether the adoption of open access publishing alone qualifies as adhering to open science principles. Perhaps more importantly, open science should refer to the process of science and not just the product of science. That is, openly sharing results via open access publications is a commendable final step in the scientific process, but open science can and should start long before the publication of results.

Second, we would like to address researchers' concerns that practicing open science takes more time, ultimately leading to less research output, which in turn could especially jeopardize the careers of early career researchers (ECRs; for further information on open science among ECRs, see Allen & Mehler, 2019). While we agree that initially engaging with open science practices is demanding, this does not necessarily equate to less research output. More to the point, given that all results produced during the process of research can and should be considered outputs (i.e., published protocols, citable datasets, published papers, replicable analytical code), practicing open science might lead to more output, because each aspect of the research process demonstrates a researcher's productivity and also provides an output that future researchers may use and acknowledge via citation. Moreover, preregistration, and especially Registered Reports, can facilitate the publication of null results. This can be a boon to burgeoning researchers who, in times past, may have had to relegate such results to their file drawer, resulting in nothing more than wasted time and effort.

The solutions

The above concerns and objections, though very real, are hardly new and are not exclusive to behavioral addictions research. Clinical psychological science has been grappling with issues around the implementation of open science principles for several years (Tackett, Brandes, King, et al., 2019; Tackett et al., 2017). As such, there are already important recommendations for clinical psychologists and mental health researchers that address many of the above concerns (e.g., Tackett, Brandes, & Reardon, 2019). Flowing from such recommendations, below we have outlined a number of suggestions and insights for behavioral addictions research more broadly.

Most importantly, implementing open science principles is not an ‘all-or-nothing’ process; rather, it can better be described as a ‘buffet approach’ wherein researchers choose what works best for them (for a discussion of the ‘buffet approach’ see Bergmann, 2023). When starting to implement open science in your workflow, it can be overwhelming to keep track of which practices are considered useful and how to implement them. As such, if you are a researcher who wants to address the above-mentioned issues, we would advise starting with practices for which you can see a clear benefit for you and the field, and which are easiest to implement in your current workflow. For example, perhaps your data are already fully anonymized and your standard consent form covers participants' consent to share them. In such a case, sharing your data alongside your publication might be an ideal first step. Another possibility is to start with writing a detailed preregistration. In that case, the time spent writing up the hypotheses and methods before collecting data is time saved when completing an ethics application and when writing the manuscript after data collection. Thus, study preregistration does not necessarily take more time; it simply shifts when in the project you spend this time writing (for a similar discussion see Heirene et al., 2021). Importantly, preregistration does not prevent or prohibit you from performing further (transparent) exploratory analyses (Höfler, Scherbaum, Kanske, McDonald, & Miller, 2022). As these examples show, it is possible to implement open science in small steps in your workflow, and if everybody changes even one habit, we can make a difference in the field. Leading by example can therefore be our most powerful tool for change as individual researchers.
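For the data-sharing first step described above, much of the preparation is mundane: remove direct identifiers and replace internal participant codes with non-reversible pseudonyms. The sketch below is a hypothetical illustration (column names such as `email` and `gambling_score` are invented), not a complete anonymization procedure; indirect identifiers and re-identification risk still require case-by-case review:

```python
# Hypothetical sketch: strip direct identifiers and pseudonymize IDs
# before sharing a dataset. Not a full anonymization pipeline.
import hashlib

DIRECT_IDENTIFIERS = {"name", "email", "ip_address"}

def pseudonymize(pid: str, salt: str = "project-secret") -> str:
    # Salted hash: the same participant keeps one code across files,
    # but the original ID cannot be recovered from the shared data.
    return hashlib.sha256((salt + pid).encode()).hexdigest()[:10]

def anonymize_row(row: dict) -> dict:
    clean = {k: v for k, v in row.items() if k not in DIRECT_IDENTIFIERS}
    clean["participant_id"] = pseudonymize(clean["participant_id"])
    return clean

row = {"participant_id": "P001", "email": "x@example.org", "gambling_score": "12"}
print(anonymize_row(row))
```

Keeping the salt out of the shared materials is what makes the pseudonyms non-reversible for outsiders while still allowing the original team to link records across files.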

Moving beyond individual changes we can make as researchers, there are also individual changes we can make as scientists within our fields. In the peer review process, referees can directly ask and encourage authors to share data and materials, and add the "standard reviewer disclosure request" if necessary (https://osf.io/hadz3/). Furthermore, while we applaud journal policies that mandate data sharing, there is a concern that some journals do not follow up on these policies, such that data sharing is not always enforced (Gabelica, Bojčić, & Puljak, 2022). In this case, the community can urge journals to implement and enforce the adoption of open science practices, for instance by providing open science badges after verification or by making data, code, and material sharing mandatory (Thibault, Pennington, & Munafò, 2023). There is a growing number of networks and repositories, such as the Open Science Framework and national reproducibility networks, which support sharing practices. However, to prevent "openwashing" (e.g., providing supposedly open data or code that others cannot actually understand or use), it is important to at least randomly verify these materials. This process could also benefit from making the requirements for data sharing transparent. For example, journals could specify what accompanying information must be included, such as README files that detail the contents of each column of the data file. Furthermore, journals publishing behavioral addiction research should more often enable Registered Reports as an article type. To our knowledge, specifically in the field of behavioral addictions, only the journals Addiction Research and Theory and Psychology of Addictive Behaviors offer Registered Reports as an article type to date (see Karhulahti et al., 2022; Grubbs et al., 2022 for examples in these two journals).
Registered Reports help to evaluate the importance of a research question and the suitability of the design to answer this particular question before data collection begins, which can save time and effort in getting the work published after data collection (Chambers, Feredoes, Muthukumaraswamy, & Etchells, 2014; Chambers & Tzavella, 2022; Nosek & Lakens, 2014).
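The README suggestion above requires very little tooling. As a sketch, a column-by-column data dictionary can be generated to accompany a shared file; the variable names and descriptions below are invented for illustration:

```python
# Hypothetical sketch: generate a README "data dictionary" describing
# each column of a shared data file, as suggested above.
CODEBOOK = {
    "participant_id": "Pseudonymous participant code (string)",
    "pgsi_total": "Problem Gambling Severity Index, sum score 0-27",
    "weekly_hours": "Self-reported hours gambled per week (numeric)",
}

def make_readme(codebook: dict, dataset_name: str) -> str:
    lines = [f"README for {dataset_name}", "", "Columns:"]
    for column, description in codebook.items():
        lines.append(f"- {column}: {description}")
    return "\n".join(lines)

print(make_readme(CODEBOOK, "gambling_study.csv"))
```

Writing the codebook once, in machine-readable form, also lets journals or repositories verify automatically that every column in the shared file is documented.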

Another form of publishing that might address some of the concerns mentioned above is publishing (Registered Reports) via the Peer Community In (PCI; https://peercommunityin.org/pci-and-journals/), which provides a standardized review process for preprints. After peer review via PCI, preprints become valid, citable articles. These articles can remain on a preprint server without being published in a journal, having still gone through a thorough peer-review process. Alternatively, they can be published in the Peer Community Journal as is, immediately, and at no cost. A further option is to submit articles that received an in-principle acceptance from PCI Registered Reports (PCI RR) to PCI-friendly journals, which typically accept the PCI RR-accepted article without further review. Lastly, there are also PCI RR-interested journals, which may consider the in-principle-accepted version but make no commitment (for further information see https://rr.peercommunityin.org/about/pci_rr_friendly_journals and https://rr.peercommunityin.org/about/pci_rr_interested_journals). In this way, open access publishing is possible without costly open access fees, and PCI-recommended articles are increasingly recognized by scientific commissions (for more information on this publication type see Chambers & Tzavella, 2022; Pennington & Heim, 2022).

The entire behavioral addictions community can take action as well. For example, to transfer knowledge as a community, we can use conferences, teaching, and workshops to inform researchers and students about the replication crisis and the value of transparency and replication, focusing on the long-term benefits for students and the field. Louderback et al. (2022) suggest that ECRs especially can make a positive change for the field, but only if knowledge about open science practices is taught. Here, we encourage readers to have a look at the website of the Framework for Open and Reproducible Research Training (FORRT; https://forrt.org/), which provides many useful resources to support teaching open and reproducible research practices (Parsons et al., 2022; Pownall et al., 2021).

When starting a PhD program, it might be particularly helpful and impactful to reflect on the usefulness of adopting certain open science practices. For example, a replication attempt of the most impactful study for the project could be a good starting point (for a discussion see Wagge et al., 2019), as could preregistering one's studies, again in accordance with the ‘buffet approach’: adopt what works best for you. This can also reduce the perceived stress experienced by PhD students who might otherwise hunt for significant results. Here we urge the senior researchers among our readers to support the efforts of their ECRs to practice open science. Graduate programs could mandate open science practices as the default for a PhD dissertation (e.g., at least one data chapter should include a preregistered protocol, or a justification must be provided if data cannot be uploaded to a university archive). Moreover, it would be useful to pay more attention to the quality of research when evaluating CVs and achievements, rather than focusing on the number of publications and significant results. This approach is in line with the idea of ‘slow science’, which emphasizes the quality, transparency, and rigor of research projects rather than the simple output in the form of a publication (Frith, 2020). Again, in our opinion, other research outputs such as code, data, and materials should also be considered valid outputs in the evaluation process. This approach, however, requires a shift in our current (publication-oriented) incentive system by funders, publishers, governments, and institutions (Stewart et al., 2022; for such an incentive system see, e.g., the DORA declaration: https://sfdora.org/).

Lastly, as a community we can and should make larger-scale efforts to tackle the lack of high-powered replications and the broader problem of poorly powered studies. Data sharing and combined efforts can enable 'multi-lab' approaches, as they have in cognitive and social psychology (e.g., the 'Many Labs' projects, the Psychological Science Accelerator, and the Reproducibility Project: Psychology initiated by the Center for Open Science) and, recently, in the field of compulsive sexual behaviors/problematic pornography use (i.e., the International Sex Survey; Bőthe et al., 2021; see Pennington, Jones, Tzavella, Chambers, & Button, 2022, for the promotion of such efforts in addiction research in general). A key component of these approaches is that the labs and institutions involved combine their efforts to collect larger and more diverse samples using high-quality research methods. A similar approach can be used to conduct replication studies with the statistical power required to reliably support the presence or absence of an effect (Heirene, 2021). Decisions about which studies or effects to replicate can also be made as a research community, for instance through an expert consensus on key effects or through a systematic approach such as that described by Isager et al. (2021; for further discussion of which studies deserve replication attempts, see Heirene, 2021). With large-scale community efforts, we can not only begin tackling the problem of poorly powered (replication) studies but also efficiently identify the studies that most deserve a replication attempt.

To support the transition to a more open and transparent behavioral addictions field, we have summarized our recommendations in Table 1.

Table 1.

Opportunities to increase open science practices in the field, and the actors involved

What? | Who?
Start with what works best for you | Individual
Preregister the study as a preregistration or a (PCI) Registered Report | Individual
Ensure that participants agree to and sign an informed consent that allows data sharing | Individual
Share data and materials on a public repository | Individual
Publish open access or post the work on a public preprint server or institutional website | Individual
Ask authors to share data, code, and/or materials during peer review | Individual
Urge journals to implement open science practices | Individual/community
Mandate/encourage open science practices in graduate programs | Institution
Offer funding schemes for academics to use or enhance their open science toolkit | Institution/funding bodies
Adopt the idea of 'slow science' | Institution/funding bodies/community
Embed replication and open science practices in PhD programs | Institution
Enforce journal open science policies | Journals
Offer Registered Report options | Journals
Transfer knowledge | Community
Evaluate the quality of research, not the quantity (in grant reviews and reviews of individuals) | Community/funding bodies/institution
Evaluate researchers on a broad range of research contributions and outputs | Community
Undertake larger-scale replication efforts | Community

Conclusion

In conclusion, we believe that if the behavioral addictions research community sends the signal that we value more open, transparent, and reproducible research, we will be able to achieve this. It will require a step-by-step community effort: each of us must change our day-to-day research habits to align with the principles of open science, and we must educate the next generation of researchers so that conducting open, transparent research becomes the norm. In doing so, we can transition behavioral addictions research towards a more transparent, reliable, and reproducible field. Ultimately, this will allow us to obtain a more fine-grained understanding of the processes underlying behavioral addictions and to develop effective prevention programs and clinical interventions for those experiencing them.

Funding sources

This work was supported by an ERC Consolidator grant awarded to Frederick Verbruggen (PI of CE; European Union's Horizon 2020 research and innovation programme, grant agreement No 769595).

Authors' contribution

CE initiated the idea of writing this opinion piece. JB helped to make contact with all authors. CE and JB wrote the initial draft. BB, DB, LC, JBG, RH, AK, KL, LP, JCP, JP, and RJV gave input on the topics and provided critical revisions on the manuscript. All authors approved the final version.

Conflict of interest

JB is associate editor for the Journal of Behavioral Addictions. LC has served as an associate editor of JBA in 2022. LC is the Director of the Centre for Gambling Research at UBC, which is supported by funding from the Province of British Columbia and the British Columbia Lottery Corporation (BCLC), a Canadian Crown Corporation. The Province of BC government and the BCLC had no role in the preparation of this manuscript, and impose no constraints on publishing. LC has received a speaker/travel honorarium from the International Center for Responsible Gaming (US) and Scientific Affairs (Germany), and has received fees for academic services from the International Center for Responsible Gaming (US), GambleAware (UK), Gambling Research Exchange Ontario (Canada) and Gambling Research Australia. He has not received any further direct or indirect payments from the gambling industry or groups substantially funded by gambling. RH worked for the Gambling Treatment and Research Centre at the University of Sydney from 2019 to 2021 where his work was partially funded by Responsible Wagering Australia, an organization representing multiple Australian online wagering operators. He has not received any further direct or indirect payments from the gambling industry. RJV currently serves as associate editor of JBA and has not received any direct or indirect payments from the gambling industry or groups substantially funded by gambling.

References

  • Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLOS Biology, 17(5), e3000246. https://doi.org/10.1371/journal.pbio.3000246.
  • Amendola, S. (2023). Commentary on Karhulahti et al. (2022): Exploring gaming disorder from the harmful dysfunction analysis perspective. Addiction Research & Theory. Advance online publication. https://doi.org/10.1080/16066359.2023.2173743.
  • American Psychiatric Association. (2000). Diagnostic and statistical manual of mental disorders (4th ed., text rev.).
  • American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.).
  • Anderson, M. S., Ronning, E. A., De Vries, R., & Martinson, B. C. (2007). The perverse effects of competition on scientists' work and relationships. Science and Engineering Ethics, 13(4), 437–461. https://doi.org/10.1007/s11948-007-9042-5.
  • Au, K. (2007). Self-confidence does not come isolated from the environment. Asia Pacific Journal of Management, 24(4), 491–496. https://doi.org/10.1007/s10490-007-9047-2.
  • Bergmann, C. (2023, April 16). The buffet approach to open science. CogTales. https://cogtales.wordpress.com/2023/04/16/the-buffet-approach-to-open-science/.
  • Billieux, J., & Fournier, L. (2023). Commentary on Karhulahti et al. (2022): Addressing ontological diversity in gaming disorder measurement from an item-based psychometric perspective. Addiction Research & Theory, 31(3), 170–173. https://doi.org/10.1080/16066359.2022.2125508.
  • Billieux, J., Schimmenti, A., Khazaal, Y., Maurage, P., & Heeren, A. (2015). Are we overpathologizing everyday life? A tenable blueprint for behavioral addiction research. Journal of Behavioral Addictions, 4(3), 119–123. https://doi.org/10.1556/2006.4.2015.009.
  • Bőthe, B., Koós, M., Nagy, L., Kraus, S. W., Potenza, M. N., & Demetrovics, Z. (2021). International Sex Survey: Study protocol of a large, cross-cultural collaborative study in 45 countries. Journal of Behavioral Addictions, 10(3), 632–645. https://doi.org/10.1556/2006.2021.00063.
  • Chambers, C., Feredoes, E., Muthukumaraswamy, S., & Etchells, P. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1(1), 4–17. https://doi.org/10.3934/Neuroscience.2014.1.4.
  • Chambers, C., & Tzavella, L. (2022). The past, present and future of Registered Reports. Nature Human Behaviour, 6(1), Article 1. https://doi.org/10.1038/s41562-021-01193-7.
  • Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K., & McGillivray, B. (2020). The citation advantage of linking publications to research data. PLOS ONE, 15(4), e0230416. https://doi.org/10.1371/journal.pone.0230416.
  • Dougherty, M. R., & Horne, Z. (2022). Citation counts and journal impact factors do not capture some indicators of research quality in the behavioural and brain sciences. Royal Society Open Science, 9(8), 220334. https://doi.org/10.1098/rsos.220334.
  • Drummond, A., Sauer, J. D., Hall, L. C., Zendle, D., & Loudon, M. R. (2020). Why loot boxes could be regulated as gambling. Nature Human Behaviour, 4(10), Article 10. https://doi.org/10.1038/s41562-020-0900-3.
  • Elsherif, M., Flack, Z., Kalandadze, T., Pennington, C. R., & Xiao, Q. (2021). Definition of open science. FORRT – Framework for Open and Reproducible Research Training. https://forrt.org/glossary/open-science/.
  • Flayelle, M., Brevers, D., King, D. L., Maurage, P., Perales, J. C., & Billieux, J. (2023). A taxonomy of technology design features that promote potentially addictive online behaviours. Nature Reviews Psychology, 2, 136–150. https://doi.org/10.1038/s44159-023-00153-4.
  • Frith, U. (2020). Fast lane to slow science. Trends in Cognitive Sciences, 24(1), 1–2. https://doi.org/10.1016/j.tics.2019.10.007.
  • Gabelica, M., Bojčić, R., & Puljak, L. (2022). Many researchers were not compliant with their published data sharing statement: A mixed-methods study. Journal of Clinical Epidemiology, 150, 33–41. https://doi.org/10.1016/j.jclinepi.2022.05.019.
  • Gopalakrishna, G., ter Riet, G., Vink, G., Stoop, I., Wicherts, J. M., & Bouter, L. M. (2022). Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands. PLOS ONE, 17(2), e0263023. https://doi.org/10.1371/journal.pone.0263023.
  • Gownaris, N. J., Vermeir, K., Bittner, M.-I., Gunawardena, L., Kaur-Ghumaan, S., Lepenies, R., … Zakari, I. S. (2022). Barriers to full participation in the open science life cycle among early career researchers. Data Science Journal, 21(1), Article 1. https://doi.org/10.5334/dsj-2022-002.
  • Grubbs, J. B. (2022). The cost of crisis in clinical psychological science. Behavioral and Brain Sciences, 45, e18. https://doi.org/10.1017/S0140525X21000388.
  • Grubbs, J. B., Floyd, C. G., Griffin, K. R., Jennings, T. L., & Kraus, S. W. (2022). Moral incongruence and addiction: A registered report. Psychology of Addictive Behaviors, 36(7), 749. https://doi.org/10.1037/adb0000876.
  • Heirene, R. M. (2021). A call for replications of addiction research: Which studies should we replicate and what constitutes a ‘successful’ replication? Addiction Research & Theory, 29(2), 89–97. https://doi.org/10.1080/16066359.2020.1751130.
  • Heirene, R., LaPlante, D., Louderback, E. R., Keen, B., Bakker, M., Serafimovska, A., & Gainsbury, S. M. (2021). Preregistration specificity & adherence: A review of preregistered gambling studies & cross-disciplinary comparison. PsyArXiv. https://doi.org/10.31234/osf.io/nj4es.
  • Higginson, A. D., & Munafò, M. R. (2016). Current incentives for scientists lead to underpowered studies with erroneous conclusions. PLOS Biology, 14(11), e2000995. https://doi.org/10.1371/journal.pbio.2000995.
  • Hilgard, J., Sala, G., Boot, W. R., & Simons, D. J. (2019). Overestimation of action-game training effects: Publication bias and salami slicing. Collabra: Psychology, 5(1), 30. https://doi.org/10.1525/collabra.231.
  • Höfler, M., Scherbaum, S., Kanske, P., McDonald, B., & Miller, R. (2022). Means to valuable exploration: I. The blending of confirmation and exploration and how to resolve it. Meta-Psychology, 6. https://doi.org/10.15626/MP.2021.2837.
  • Holden, C. (2001). “Behavioral” addictions: Do they exist? Science, 294(5544), 980–982. https://doi.org/10.1126/science.294.5544.980.
  • Isager, P. M., van Aert, R. C. M., Bahník, Š., Brandt, M. J., DeSoto, K. A., Giner-Sorolla, R., … Lakens, D. (2021). Deciding what to replicate: A decision model for replication study selection under resource and knowledge constraints. Psychological Methods, 28(2), 438–451. https://doi.org/10.1037/met0000438.
  • John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953.
  • Karhulahti, V.-M. (2022). Registered reports for qualitative research. Nature Human Behaviour, 6(1), Article 1. https://doi.org/10.1038/s41562-021-01265-8.
  • Karhulahti, V.-M., Vahlo, J., Martončik, M., Munukka, M., Koskimaa, R., & von Bonsdorff, M. (2022). Ontological diversity in gaming disorder measurement: A nationally representative registered report. Addiction Research & Theory, 31(3), 157–167. https://doi.org/10.31234/osf.io/qytrs.
  • Korbmacher, M., Azevedo, F., Pennington, C. R., Hartmann, H., Pownall, M., Schmidt, K., … Evans, T. R. (2023). The replication crisis has led to positive structural, procedural, and community changes. MetaArXiv. https://doi.org/10.31222/osf.io/r6cvx.
  • LaPlante, D. A. (2019). Replication is fundamental, but is it common? A call for scientific self-reflection and contemporary research practices in gambling-related research. International Gambling Studies, 19(3), 362–368. https://doi.org/10.1080/14459795.2019.1672768.
  • LaPlante, D. A., Louderback, E. R., & Abarbanel, B. (2021). Gambling researchers' use and views of open science principles and practices: A brief report. International Gambling Studies, 21(3), 381–394. https://doi.org/10.1080/14459795.2021.1891272.
  • Louderback, E. R., Gainsbury, S. M., Heirene, R. M., Amichia, K., Grossman, A., Bernhard, B. J., … LaPlante, D. A. (2022). Open science practices in gambling research publications (2016–2019): A scoping review. Journal of Gambling Studies, 39, 987–1011. https://doi.org/10.1007/s10899-022-10120-y.
  • Louderback, E. R., Wohl, M. J. A., & LaPlante, D. A. (2021). Integrating open science practices into recommendations for accepting gambling industry research funding. Addiction Research & Theory, 29(1), 79–87. https://doi.org/10.1080/16066359.2020.1767774.
  • Marks, I. (1990). Behavioural (non-chemical) addictions. British Journal of Addiction, 85(11), 1389–1394. https://doi.org/10.1111/j.1360-0443.1990.tb01618.x.
  • McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., … Yarkoni, T. (2016). How open science helps researchers succeed. eLife, 5, e16800. https://doi.org/10.7554/eLife.16800.
  • Mihordin, R. (2012). Behavioral addiction—Quo Vadis? The Journal of Nervous and Mental Disease, 200(6), 489–491. https://doi.org/10.1097/NMD.0b013e318257c503.
  • Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374.
  • Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45(3), 137–141. https://doi.org/10.1027/1864-9335/a000192.
  • Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631. https://doi.org/10.1177/1745691612459058.
  • Parsons, S., Azevedo, F., Elsherif, M. M., Guay, S., Shahim, O. N., Govaart, G. H., … Aczel, B. (2022). A community-sourced glossary of open scholarship terms. Nature Human Behaviour, 6(3), Article 3. https://doi.org/10.1038/s41562-021-01269-4.
  • Pennington, C. (2023a). A student's guide to open science: Using the replication crisis to reform psychology. Open University Press. https://www.mheducation.co.uk/a-student-s-guide-to-open-science-using-the-replication-crisis-to-reform-psychology-9780335251162-emea-group.
  • Pennington, C. R. (2023b). Open data through Registered Reports can accelerate cumulative knowledge. Addiction Research & Theory, 31(3), 155–156. https://doi.org/10.1080/16066359.2023.2176848.
  • Pennington, C. R., & Heim, D. (2022). Reshaping the publication process: Addiction Research and Theory joins Peer Community in Registered Reports. Addiction Research & Theory, 30(1), 1–4. https://doi.org/10.1080/16066359.2021.1931142.
  • Pennington, C. R., Jones, A. J., Tzavella, L., Chambers, C., & Button, K. S. (2022). Beyond online participant crowdsourcing: The benefits and opportunities of big team addiction science. Experimental and Clinical Psychopharmacology, 30(4), 444–451. https://doi.org/10.1037/pha0000541.
  • Piwowar, H. A., & Vision, T. J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175. https://doi.org/10.7717/peerj.175.
  • Pownall, M., Azevedo, F., Aldoh, A., Elsherif, M., Vasilev, M., Pennington, C. R., … Parsons, S. (2021). Embedding open and reproducible science into teaching: A bank of lesson plans and resources. Scholarship of Teaching and Learning in Psychology. Advance online publication. https://doi.org/10.1037/stl0000307.
  • Ross-Hellauer, T. (2022). Open science, done wrong, will compound inequities. Nature, 603(7901), 363. https://doi.org/10.1038/d41586-022-00724-0.
  • Schönbrodt, F., Gärtner, A., Frank, M., Gollwitzer, M., Ihle, M., Mischkowski, D., … Leising, D. (2022). Responsible research assessment I: Implementing DORA for hiring and promotion in psychology. PsyArXiv. https://doi.org/10.31234/osf.io/rgh5b.
  • Shen, C., & Björk, B.-C. (2015). ‘Predatory’ open access: A longitudinal study of article volumes and market characteristics. BMC Medicine, 13(1), 230. https://doi.org/10.1186/s12916-015-0469-2.
  • Shi, J., Potenza, M. N., & Turner, N. E. (2020). Commentary on: “The future of gaming disorder research and player protection: What role should the video gaming industry and researchers play?”. International Journal of Mental Health and Addiction, 18(3), 791–799. https://doi.org/10.1007/s11469-019-00153-7.
  • Starcevic, V., Billieux, J., & Schimmenti, A. (2018). Selfitis, selfie addiction, Twitteritis: Irresistible appeal of medical terminology for problematic behaviours in the digital age. Australian & New Zealand Journal of Psychiatry, 52(5), 408–409. https://doi.org/10.1177/0004867418763532.
  • Stewart, S. L. K., Pennington, C. R., da Silva, G. R., Ballou, N., Butler, J., Dienes, Z., … Samara, A. (2022). Reforms to improve reproducibility and quality must be coordinated across the research ecosystem: The view from the UKRN Local Network Leads. BMC Research Notes, 15(1), 58. https://doi.org/10.1186/s13104-022-05949-w.
  • Tackett, J. L., Brandes, C. M., King, K. M., & Markon, K. E. (2019). Psychology's replication crisis and clinical psychological science. Annual Review of Clinical Psychology, 15(1), 579–604. https://doi.org/10.1146/annurev-clinpsy-050718-095710.
  • Tackett, J. L., Brandes, C. M., & Reardon, K. W. (2019). Leveraging the Open Science Framework in clinical psychological assessment research. Psychological Assessment, 31, 1386–1394. https://doi.org/10.1037/pas0000583.
  • Tackett, J. L., Lilienfeld, S. O., Patrick, C. J., Johnson, S. L., Krueger, R. F., Miller, J. D., … Shrout, P. E. (2017). It's time to broaden the replicability conversation: Thoughts for and from clinical psychological science. Perspectives on Psychological Science, 12(5), 742–756. https://doi.org/10.1177/1745691617690042.
  • Tackett, J. L., & Miller, J. D. (2019). Introduction to the special section on increasing replicability, transparency, and openness in clinical psychology. Journal of Abnormal Psychology, 128(6), 487. https://doi.org/10.1037/abn0000455.
  • Thibault, R. T., Pennington, C. R., & Munafò, M. R. (2023). Reflections on preregistration: Core criteria, badges, complementary workflows. Journal of Trial & Error, 2(1). https://doi.org/10.36850/mr6.
  • van Rooij, A. J., Ferguson, C. J., Colder Carras, M., Kardefelt-Winther, D., Shi, J., Aarseth, E., … Przybylski, A. K. (2018). A weak scientific basis for gaming disorder: Let us err on the side of caution. Journal of Behavioral Addictions, 7(1), 1–9. https://doi.org/10.1556/2006.7.2018.19.
  • Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing research with undergraduate students via replication work: The Collaborative Replications and Education Project. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.00247.
  • Wang, X., Liu, C., Mao, W., & Fang, Z. (2015). The open access advantage considering citation, article usage and social media attention. Scientometrics, 103(2), 555–564. https://doi.org/10.1007/s11192-015-1547-0.
  • Westwood, R. I., & Jack, G. (2007). Manifesto for a post-colonial international business and management studies: A provocation. Critical Perspectives on International Business, 3(3), 246–265. https://doi.org/10.1108/17422040710775021.
  • Wohl, M. J. A., Tabri, N., & Zelenski, J. M. (2019). The need for open science practices and well-conducted replications in the field of gambling studies. International Gambling Studies, 19(3), 369–376. https://doi.org/10.1080/14459795.2019.1672769.
  • Xie, Y., Wang, K., & Kong, Y. (2021). Prevalence of research misconduct and questionable research practices: A systematic review and meta-analysis. Science and Engineering Ethics, 27(4), 41. https://doi.org/10.1007/s11948-021-00314-9.
  • Yau, Y. H. C., & Potenza, M. N. (2015). Gambling disorder and other behavioral addictions: Recognition and treatment. Harvard Review of Psychiatry, 23(2), 134. https://doi.org/10.1097/HRP.0000000000000051.
  • Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLOS Biology, 17(5), e3000246. https://doi.org/10.1371/journal.pbio.3000246.

    • Search Google Scholar
    • Export Citation
  • Amendola, S. (2023). Commentary on Karhulahti et al. (2022): Exploring gaming disorder from the harmful dysfunction analysis perspective. Addiction Research & Theory, 0(0), 12. https://doi.org/10.1080/16066359.2023.2173743.

    • Search Google Scholar
    • Export Citation
  • American Psychiatric Association. (2000). Diagnostic and statistical manual of mental disorders (4 text revised). Publisher.

  • American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Publisher.

  • Anderson, M. S., Ronning, E. A., De Vries, R., & Martinson, B. C. (2007). The perverse effects of competition on scientists’ work and relationships. Science and Engineering Ethics, 13(4), 437461. https://doi.org/10.1007/s11948-007-9042-5.

    • Search Google Scholar
    • Export Citation
  • Au, K. (2007). Self-confidence does not come isolated from the environment. Asia Pacific Journal of Management, 24(4), 491496. https://doi.org/10.1007/s10490-007-9047-2.

    • Search Google Scholar
    • Export Citation
  • Bergmann, C. (2023, April 16). The buffet approach to open science. CogTales. https://cogtales.wordpress.com/2023/04/16/the-buffet-approach-to-open-science/.

    • Search Google Scholar
    • Export Citation
  • Billieux, J., & Fournier, L. (2023). Commentary on Karhulahti et al. (2022): Addressing ontological diversity in gaming disorder measurement from an item-based psychometric perspective. Addiction Research & Theory, 31(3), 170173. https://doi.org/10.1080/16066359.2022.2125508.

    • Search Google Scholar
    • Export Citation
  • Billieux, J., Schimmenti, A., Khazaal, Y., Maurage, P., & Heeren, A. (2015). Are we overpathologizing everyday life? A tenable blueprint for behavioral addiction research. Journal of Behavioral Addictions, 4(3), 119123. https://doi.org/10.1556/2006.4.2015.009.

    • Search Google Scholar
    • Export Citation
  • Bőthe, B., Koós, M., Nagy, L., Kraus, S. W., Potenza, M. N., & Demetrovics, Z. (2021). International Sex Survey: Study protocol of a large, cross-cultural collaborative study in 45 countries. Journal of Behavioral Addictions, 10(3), 632645. https://doi.org/10.1556/2006.2021.00063.

    • Search Google Scholar
    • Export Citation
  • Chambers, C., Feredoes, E., Muthukumaraswamy, S., & Jetchells, P. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at AIMS Neuroscience and beyond. AIMS Neuroscience, 1(1), 417. https://doi.org/10.3934/Neuroscience.2014.1.4.

    • Search Google Scholar
    • Export Citation
  • Chambers, C., & Tzavella, L. (2022). The past, present and future of Registered Reports. Nature Human Behaviour, 6(1), Article 1. https://doi.org/10.1038/s41562-021-01193-7.

    • Search Google Scholar
    • Export Citation
  • Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K., & McGillivray, B. (2020). The citation advantage of linking publications to research data. Plos One, 15(4), e0230416. https://doi.org/10.1371/journal.pone.0230416.

    • Search Google Scholar
    • Export Citation
  • Dougherty, M. R., & Horne, Z. (2022). Citation counts and journal impact factors do not capture some indicators of research quality in the behavioural and brain sciences. Royal Society Open Science, 9(8), 220334. https://doi.org/10.1098/rsos.220334.

    • Search Google Scholar
    • Export Citation
  • Drummond, A., Sauer, J. D., Hall, L. C., Zendle, D., & Loudon, M. R. (2020). Why loot boxes could be regulated as gambling. Nature Human Behaviour, 4(10), Article 10. https://doi.org/10.1038/s41562-020-0900-3.

    • Search Google Scholar
    • Export Citation
  • Elsherif, M., Flack, Z., Kalandadze, T., Pennington, C. R., & Xiao, Q. (2021). Definition of open science -FORRT - Framework for open and reproducible research training. FORRT - Framework for Open and Reproducible Research Training. https://forrt.org/glossary/open-science/.

    • Search Google Scholar
    • Export Citation
  • Flayelle, M., Brevers, D., King, D. L., Maurage, P., Perales, J. C., & Billieux, J. (2023). A taxonomy of technology design features that promote potentially addictive online behaviours. Nature Reviews Psychology, 2, 136–150. https://doi.org/10.1038/s44159-023-00153-4.

  • Frith, U. (2020). Fast lane to slow science. Trends in Cognitive Sciences, 24(1), 1–2. https://doi.org/10.1016/j.tics.2019.10.007.

  • Gabelica, M., Bojčić, R., & Puljak, L. (2022). Many researchers were not compliant with their published data sharing statement: A mixed-methods study. Journal of Clinical Epidemiology, 150, 33–41. https://doi.org/10.1016/j.jclinepi.2022.05.019.

  • Gopalakrishna, G., Riet, G. ter, Vink, G., Stoop, I., Wicherts, J. M., & Bouter, L. M. (2022). Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands. PLoS One, 17(2), e0263023. https://doi.org/10.1371/journal.pone.0263023.

  • Gownaris, N. J., Vermeir, K., Bittner, M.-I., Gunawardena, L., Kaur-Ghumaan, S., Lepenies, R., … Zakari, I. S. (2022). Barriers to full participation in the open science life cycle among early career researchers. Data Science Journal, 21(1), Article 1. https://doi.org/10.5334/dsj-2022-002.

  • Grubbs, J. B. (2022). The cost of crisis in clinical psychological science. Behavioral and Brain Sciences, 45, e18. https://doi.org/10.1017/S0140525X21000388.

  • Grubbs, J. B., Floyd, C. G., Griffin, K. R., Jennings, T. L., & Kraus, S. W. (2022). Moral incongruence and addiction: A registered report. Psychology of Addictive Behaviors, 36(7), 749. https://doi.org/10.1037/adb0000876.

  • Heirene, R. M. (2021). A call for replications of addiction research: Which studies should we replicate and what constitutes a ‘successful’ replication? Addiction Research & Theory, 29(2), 89–97. https://doi.org/10.1080/16066359.2020.1751130.

  • Heirene, R., LaPlante, D., Louderback, E. R., Keen, B., Bakker, M., Serafimovska, A., & Gainsbury, S. M. (2021). Preregistration specificity & adherence: A review of preregistered gambling studies & cross-disciplinary comparison. PsyArXiv. https://doi.org/10.31234/osf.io/nj4es.

  • Higginson, A. D., & Munafò, M. R. (2016). Current incentives for scientists lead to underpowered studies with erroneous conclusions. PLOS Biology, 14(11), e2000995. https://doi.org/10.1371/journal.pbio.2000995.

  • Hilgard, J., Sala, G., Boot, W. R., & Simons, D. J. (2019). Overestimation of action-game training effects: Publication bias and salami slicing. Collabra: Psychology, 5(1), 30. https://doi.org/10.1525/collabra.231.

  • Höfler, M., Scherbaum, S., Kanske, P., McDonald, B., & Miller, R. (2022). Means to valuable exploration: I. The blending of confirmation and exploration and how to resolve it. Meta-Psychology, 6. https://doi.org/10.15626/MP.2021.2837.

  • Holden, C. (2001). “Behavioral” addictions: Do they exist? Science, 294(5544), 980–982. https://doi.org/10.1126/science.294.5544.980.
  • Isager, P. M., van Aert, R. C. M., Bahník, Š., Brandt, M. J., DeSoto, K. A., Giner-Sorolla, R., … Lakens, D. (2021). Deciding what to replicate: A decision model for replication study selection under resource and knowledge constraints. Psychological Methods, 28(2), 438–451. https://doi.org/10.1037/met0000438.

  • John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953.

  • Karhulahti, V.-M. (2022). Registered reports for qualitative research. Nature Human Behaviour, 6(1), Article 1. https://doi.org/10.1038/s41562-021-01265-8.

  • Karhulahti, V.-M., Vahlo, J., Martončik, M., Munukka, M., Koskimaa, R., & Bonsdorff, M. von. (2022). Ontological diversity in gaming disorder measurement: A nationally representative registered report. Addiction Research & Theory, 31(3), 157–167. https://doi.org/10.31234/osf.io/qytrs.

  • Korbmacher, M., Azevedo, F., Pennington, C. R., Hartmann, H., Pownall, M., Schmidt, K., … Evans, T. R. (2023). The replication crisis has led to positive structural, procedural, and community changes. MetaArXiv. https://doi.org/10.31222/osf.io/r6cvx.

  • LaPlante, D. A. (2019). Replication is fundamental, but is it common? A call for scientific self-reflection and contemporary research practices in gambling-related research. International Gambling Studies, 19(3), 362–368. https://doi.org/10.1080/14459795.2019.1672768.

  • LaPlante, D. A., Louderback, E. R., & Abarbanel, B. (2021). Gambling researchers’ use and views of open science principles and practices: A brief report. International Gambling Studies, 21(3), 381–394. https://doi.org/10.1080/14459795.2021.1891272.

  • Louderback, E. R., Gainsbury, S. M., Heirene, R. M., Amichia, K., Grossman, A., Bernhard, B. J., … LaPlante, D. A. (2022). Open science practices in gambling research publications (2016–2019): A scoping review. Journal of Gambling Studies, 39, 987–1011. https://doi.org/10.1007/s10899-022-10120-y.

  • Louderback, E. R., Wohl, M. J. A., & LaPlante, D. A. (2021). Integrating open science practices into recommendations for accepting gambling industry research funding. Addiction Research & Theory, 29(1), 79–87. https://doi.org/10.1080/16066359.2020.1767774.

  • Marks, I. (1990). Behavioural (non-chemical) addictions. British Journal of Addiction, 85(11), 1389–1394. https://doi.org/10.1111/j.1360-0443.1990.tb01618.x.

  • McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., … Yarkoni, T. (2016). How open science helps researchers succeed. eLife, 5, e16800. https://doi.org/10.7554/eLife.16800.

  • Mihordin, R. (2012). Behavioral addiction—Quo Vadis? The Journal of Nervous and Mental Disease, 200(6), 489–491. https://doi.org/10.1097/NMD.0b013e318257c503.

  • Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374.

  • Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45(3), 137–141. https://doi.org/10.1027/1864-9335/a000192.

  • Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific Utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631. https://doi.org/10.1177/1745691612459058.

  • Parsons, S., Azevedo, F., Elsherif, M. M., Guay, S., Shahim, O. N., Govaart, G. H., … Aczel, B. (2022). A community-sourced glossary of open scholarship terms. Nature Human Behaviour, 6(3), Article 3. https://doi.org/10.1038/s41562-021-01269-4.

  • Pennington, C. (2023a). A student's guide to open science: Using the replication crisis to reform psychology. Open University Press. https://www.mheducation.co.uk/a-student-s-guide-to-open-science-using-the-replication-crisis-to-reform-psychology-9780335251162-emea-group.

  • Pennington, C. R. (2023b). Open data through Registered Reports can accelerate cumulative knowledge. Addiction Research & Theory, 31(3), 155–156. https://doi.org/10.1080/16066359.2023.2176848.
  • Pennington, C. R., & Heim, D. (2022). Reshaping the publication process: Addiction Research and Theory joins Peer Community in Registered Reports. Addiction Research & Theory, 30(1), 1–4. https://doi.org/10.1080/16066359.2021.1931142.

  • Pennington, C. R., Jones, A. J., Tzavella, L., Chambers, C., & Button, K. S. (2022). Beyond online participant crowdsourcing: The benefits and opportunities of big team addiction science. Experimental and Clinical Psychopharmacology, 30(4), 444–451. https://doi.org/10.1037/pha0000541.

  • Piwowar, H. A., & Vision, T. J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175. https://doi.org/10.7717/peerj.175.

  • Pownall, M., Azevedo, F., Aldoh, A., Elsherif, M., Vasilev, M., Pennington, C. R., … Parsons, S. (2021). Embedding open and reproducible science into teaching: A bank of lesson plans and resources. Scholarship of Teaching and Learning in Psychology. Advance online publication. https://doi.org/10.1037/stl0000307.

  • Ross-Hellauer, T. (2022). Open science, done wrong, will compound inequities. Nature, 603(7901), 363. https://doi.org/10.1038/d41586-022-00724-0.

  • Schönbrodt, F., Gärtner, A., Frank, M., Gollwitzer, M., Ihle, M., Mischkowski, D., … Leising, D. (2022). Responsible research assessment I: Implementing DORA for hiring and promotion in psychology. PsyArXiv. https://doi.org/10.31234/osf.io/rgh5b.

  • Shen, C., & Björk, B.-C. (2015). ‘Predatory’ open access: A longitudinal study of article volumes and market characteristics. BMC Medicine, 13(1), 230. https://doi.org/10.1186/s12916-015-0469-2.

  • Shi, J., Potenza, M. N., & Turner, N. E. (2020). Commentary on: “The future of gaming disorder research and player protection: What role should the video gaming industry and researchers play?”. International Journal of Mental Health and Addiction, 18(3), 791–799. https://doi.org/10.1007/s11469-019-00153-7.

  • Starcevic, V., Billieux, J., & Schimmenti, A. (2018). Selfitis, selfie addiction, Twitteritis: Irresistible appeal of medical terminology for problematic behaviours in the digital age. Australian & New Zealand Journal of Psychiatry, 52(5), 408–409. https://doi.org/10.1177/0004867418763532.

  • Stewart, S. L. K., Pennington, C. R., da Silva, G. R., Ballou, N., Butler, J., Dienes, Z., … Samara, A. (2022). Reforms to improve reproducibility and quality must be coordinated across the research ecosystem: The view from the UKRN Local Network Leads. BMC Research Notes, 15(1), 58. https://doi.org/10.1186/s13104-022-05949-w.

  • Tackett, J. L., Brandes, C. M., King, K. M., & Markon, K. E. (2019). Psychology’s replication crisis and clinical psychological science. Annual Review of Clinical Psychology, 15(1), 579–604. https://doi.org/10.1146/annurev-clinpsy-050718-095710.

  • Tackett, J. L., Brandes, C. M., & Reardon, K. W. (2019). Leveraging the Open Science Framework in clinical psychological assessment research. Psychological Assessment, 31, 1386–1394. https://doi.org/10.1037/pas0000583.

  • Tackett, J. L., Lilienfeld, S. O., Patrick, C. J., Johnson, S. L., Krueger, R. F., Miller, J. D., … Shrout, P. E. (2017). It’s time to broaden the replicability conversation: Thoughts for and from clinical psychological science. Perspectives on Psychological Science, 12(5), 742–756. https://doi.org/10.1177/1745691617690042.

  • Tackett, J. L., & Miller, J. D. (2019). Introduction to the special section on increasing replicability, transparency, and openness in clinical psychology. Journal of Abnormal Psychology, 128(6), 487. https://doi.org/10.1037/abn0000455.

  • Thibault, R. T., Pennington, C. R., & Munafò, M. R. (2023). Reflections on preregistration: Core criteria, badges, complementary workflows. Journal of Trial & Error, 2(1). https://doi.org/10.36850/mr6.

  • van Rooij, A. J., Ferguson, C. J., Colder Carras, M., Kardefelt-Winther, D., Shi, J., Aarseth, E., … Przybylski, A. K. (2018). A weak scientific basis for gaming disorder: Let us err on the side of caution. Journal of Behavioral Addictions, 7(1), 1–9. https://doi.org/10.1556/2006.7.2018.19.

  • Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing research with undergraduate students via replication work: The collaborative replications and education project. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.00247.

  • Wang, X., Liu, C., Mao, W., & Fang, Z. (2015). The open access advantage considering citation, article usage and social media attention. Scientometrics, 103(2), 555–564. https://doi.org/10.1007/s11192-015-1547-0.

  • Westwood, R. I., & Jack, G. (2007). Manifesto for a post‐colonial international business and management studies: A provocation. Critical Perspectives on International Business, 3(3), 246–265. https://doi.org/10.1108/17422040710775021.

  • Wohl, M. J. A., Tabri, N., & Zelenski, J. M. (2019). The need for open science practices and well-conducted replications in the field of gambling studies. International Gambling Studies, 19(3), 369–376. https://doi.org/10.1080/14459795.2019.1672769.

  • Xie, Y., Wang, K., & Kong, Y. (2021). Prevalence of research misconduct and questionable research practices: A systematic review and meta-analysis. Science and Engineering Ethics, 27(4), 41. https://doi.org/10.1007/s11948-021-00314-9.

  • Yau, Y. H. C., & Potenza, M. N. (2015). Gambling disorder and other behavioral addictions: Recognition and treatment. Harvard Review of Psychiatry, 23(2), 134. https://doi.org/10.1097/HRP.0000000000000051.
The author instructions are available as a PDF download.

Dr. Zsolt Demetrovics
Institute of Psychology, ELTE Eötvös Loránd University
Address: Izabella u. 46. H-1064 Budapest, Hungary
Phone: +36-1-461-2681
E-mail: jba@ppk.elte.hu

Indexing and Abstracting Services:

  • Web of Science [Science Citation Index Expanded (also known as SciSearch®)]
  • Journal Citation Reports/Science Edition
  • Social Sciences Citation Index®
  • Journal Citation Reports/Social Sciences Edition
  • Current Contents®/Social and Behavioral Sciences
  • EBSCO
  • Google Scholar
  • PsycINFO
  • PubMed Central
  • SCOPUS
  • Medline
  • CABI
  • CABELLS Journalytics

2022
Web of Science
  Total Cites (WoS): 5,713
  Journal Impact Factor: 7.8
  Rank by Impact Factor: Psychiatry (SCIE) 18/155; Psychiatry (SSCI) 13/144
  Impact Factor without Journal Self Cites: 7.2
  5-Year Impact Factor: 8.9
  Journal Citation Indicator: 1.42
  Rank by Journal Citation Indicator: Psychiatry 35/264
Scimago
  H-index: 69
  Journal Rank (SJR): 1.918
  Quartile Score: Clinical Psychology Q1; Medicine (miscellaneous) Q1; Psychiatry and Mental Health Q1
Scopus
  CiteScore: 11.1
  CiteScore Rank: Clinical Psychology 10/292 (96th percentile); Psychiatry and Mental Health 30/531 (94th percentile); Medicine (miscellaneous) 25/309 (92nd percentile)
  SNIP: 1.966
 

 
2021
Web of Science
  Total Cites (WoS): 5,223
  Journal Impact Factor: 7.772
  Rank by Impact Factor: Psychiatry (SCIE) 26/155; Psychiatry (SSCI) 19/142
  Impact Factor without Journal Self Cites: 7.130
  5-Year Impact Factor: 9.026
  Journal Citation Indicator: 1.39
  Rank by Journal Citation Indicator: Psychiatry 34/257
Scimago
  H-index: 56
  Journal Rank (SJR): 1.951
  Quartile Score: Clinical Psychology Q1; Medicine (miscellaneous) Q1; Psychiatry and Mental Health Q1
Scopus
  CiteScore: 11.5
  CiteScore Rank: Clinical Psychology 5/292 (D1); Psychiatry and Mental Health 20/529 (D1); Medicine (miscellaneous) 17/276 (D1)
  SNIP: 2.184

2020
Web of Science
  Total Cites (WoS): 4,024
  Journal Impact Factor: 6.756
  Rank by Impact Factor: Psychiatry (SSCI) 12/143 (Q1); Psychiatry (SCIE) 19/156 (Q1)
  Impact Factor without Journal Self Cites: 6.052
  5-Year Impact Factor: 8.735
  Journal Citation Indicator: 1.48
  Rank by Journal Citation Indicator: Psychiatry 24/250 (Q1)
  Citable Items: 86
  Total Articles: 74
  Total Reviews: 12
Scimago
  H-index: 47
  Journal Rank (SJR): 2.265
  Quartile Score: Clinical Psychology Q1; Psychiatry and Mental Health Q1; Medicine (miscellaneous) Q1
Scopus
  CiteScore: 3593/367 = 9.8
  CiteScore Rank: Clinical Psychology 7/283 (Q1); Psychiatry and Mental Health 22/502 (Q1)
  SNIP: 2.026
Days from submission to first decision: 38
Days from acceptance to publication: 37
Acceptance Rate: 31%

2019
Web of Science
  Total Cites (WoS): 2,184
  Impact Factor: 5.143
  Impact Factor without Journal Self Cites: 4.346
  5-Year Impact Factor: 5.758
  Immediacy Index: 0.587
  Citable Items: 75
  Total Articles: 67
  Total Reviews: 8
  Cited Half-Life: 3.3
  Citing Half-Life: 6.8
  Eigenfactor Score: 0.00597
  Article Influence Score: 1.447
  % Articles in Citable Items: 89.33
  Normalized Eigenfactor: 0.7294
  Average IF Percentile: 87.923
Scimago
  H-index: 37
  Journal Rank (SJR): 1.767
Scopus
  CiteScore: 2540/376 = 6.8
  CiteScore Rank: Clinical Psychology 16/275 (Q1); Medicine (miscellaneous) 31/219 (Q1); Psychiatry and Mental Health 47/506 (Q1)
  SNIP: 1.441
Acceptance Rate: 32%

 

Journal of Behavioral Addictions
Publication Model: Gold Open Access
Submission Fee: none
Article Processing Charge: 990 EUR/article for articles submitted after 30 April 2023 (850 EUR for articles submitted prior to this date)
Regional discounts based on the country of the funding agency: World Bank Lower-middle-income economies 50%; World Bank Low-income economies 100%
Further Discounts: Corresponding authors affiliated to an EISZ member institution subscribing to the journal package of Akadémiai Kiadó: 100%
Subscription Information: Gold Open Access

Journal of Behavioral Addictions
Language: English
Size: A4
Year of Foundation: 2011
Volumes per Year: 1
Issues per Year: 4
Founder: Eötvös Loránd Tudományegyetem
Founder's Address: H-1053 Budapest, Hungary, Egyetem tér 1-3.
Publisher: Akadémiai Kiadó
Publisher's Address: H-1117 Budapest, Hungary, 1516 Budapest, PO Box 245.
Responsible Publisher: Chief Executive Officer, Akadémiai Kiadó
ISSN 2062-5871 (Print)
ISSN 2063-5303 (Online)

Senior editors

Editor(s)-in-Chief: Zsolt DEMETROVICS

Assistant Editor(s): Csilla ÁGOSTON

Associate Editors

  • Stephanie ANTONS (University of Duisburg-Essen, Germany)
  • Joel BILLIEUX (University of Lausanne, Switzerland)
  • Beáta BŐTHE (University of Montreal, Canada)
  • Matthias BRAND (University of Duisburg-Essen, Germany)
  • Ruth J. van HOLST (Amsterdam UMC, The Netherlands)
  • Daniel KING (Flinders University, Australia)
  • Gyöngyi KÖKÖNYEI (ELTE Eötvös Loránd University, Hungary)
  • Ludwig KRAUS (IFT Institute for Therapy Research, Germany)
  • Marc N. POTENZA (Yale University, USA)
  • Hans-Jürgen RUMPF (University of Lübeck, Germany)

Editorial Board

  • Max W. ABBOTT (Auckland University of Technology, New Zealand)
  • Elias N. ABOUJAOUDE (Stanford University School of Medicine, USA)
  • Hojjat ADELI (Ohio State University, USA)
  • Alex BALDACCHINO (University of Dundee, United Kingdom)
  • Alex BLASZCZYNSKI (University of Sydney, Australia)
  • Judit BALÁZS (ELTE Eötvös Loránd University, Hungary)
  • Kenneth BLUM (University of Florida, USA)
  • Henrietta BOWDEN-JONES (Imperial College, United Kingdom)
  • Wim VAN DEN BRINK (University of Amsterdam, The Netherlands)
  • Gerhard BÜHRINGER (Technische Universität Dresden, Germany)
  • Sam-Wook CHOI (Eulji University, Republic of Korea)
  • Damiaan DENYS (University of Amsterdam, The Netherlands)
  • Jeffrey L. DEREVENSKY (McGill University, Canada)
  • Naomi FINEBERG (University of Hertfordshire, United Kingdom)
  • Marie GRALL-BRONNEC (University Hospital of Nantes, France)
  • Jon E. GRANT (University of Minnesota, USA)
  • Mark GRIFFITHS (Nottingham Trent University, United Kingdom)
  • Anneke GOUDRIAAN (University of Amsterdam, The Netherlands)
  • Heather HAUSENBLAS (Jacksonville University, USA)
  • Tobias HAYER (University of Bremen, Germany)
  • Susumu HIGUCHI (National Hospital Organization Kurihama Medical and Addiction Center, Japan)
  • David HODGINS (University of Calgary, Canada)
  • Eric HOLLANDER (Albert Einstein College of Medicine, USA)
  • Jaeseung JEONG (Korea Advanced Institute of Science and Technology, Republic of Korea)
  • Yasser KHAZAAL (Geneva University Hospital, Switzerland)
  • Orsolya KIRÁLY (Eötvös Loránd University, Hungary)
  • Emmanuel KUNTSCHE (La Trobe University, Australia)
  • Hae Kook LEE (The Catholic University of Korea, Republic of Korea)
  • Michel LEJOYEUX (Paris University, France)
  • Anikó MARÁZ (Humboldt-Universität zu Berlin, Germany)
  • Giovanni MARTINOTTI (‘Gabriele d’Annunzio’ University of Chieti-Pescara, Italy)
  • Astrid MÜLLER  (Hannover Medical School, Germany)
  • Frederick GERARD MOELLER (University of Texas, USA)
  • Daniel Thor OLASON (University of Iceland, Iceland)
  • Nancy PETRY (University of Connecticut, USA)
  • Bettina PIKÓ (University of Szeged, Hungary)
  • Afarin RAHIMI-MOVAGHAR (Tehran University of Medical Sciences, Iran)
  • József RÁCZ (Hungarian Academy of Sciences, Hungary)
  • Rory C. REID (University of California Los Angeles, USA)
  • Marcantonio M. SPADA (London South Bank University, United Kingdom)
  • Daniel SPRITZER (Study Group on Technological Addictions, Brazil)
  • Dan J. STEIN (University of Cape Town, South Africa)
  • Sherry H. STEWART (Dalhousie University, Canada)
  • Attila SZABÓ (Eötvös Loránd University, Hungary)
  • Ferenc TÚRY (Semmelweis University, Hungary)
  • Alfred UHL (Austrian Federal Health Institute, Austria)
  • Róbert URBÁN  (ELTE Eötvös Loránd University, Hungary)
  • Johan VANDERLINDEN (University Psychiatric Center K.U.Leuven, Belgium)
  • Alexander E. VOISKOUNSKY (Moscow State University, Russia)
  • Aviv M. WEINSTEIN  (Ariel University, Israel)
  • Kimberly YOUNG (Center for Internet Addiction, USA)

 

Monthly Content Usage

Month      Abstract Views   Full Text Views   PDF Downloads
Nov 2023   0                152               123
Dec 2023   0                287               84
Jan 2024   0                239               123
Feb 2024   0                177               86
Mar 2024   0                137               92
Apr 2024   0                129               44
May 2024   0                0                 0