  • 1 Doctoral School of Education, University of Szeged, Szeged, Hungary

Abstract

Data-driven decision-making (DDDM) has been playing an increasing role in contemporary teaching, since it includes the systematic collection, analysis, and application of data to improve students’ educational performance. However, little is known about the affective factors that influence this data-based practice. Thus, the purpose of this study was to systematically examine previous research on the affective factors that influence DDDM based on the following criteria: (1) the level of DDDM usage; (2) the emphasis on affective factors; and (3) the nature of the interventions and their effects on teachers. According to the findings, this literature review showed how little research has been conducted on DDDM-related affective factors, even though such knowledge could help expand the application of DDDM in the educational field. For example, although the most widely used tool is the Data-Driven Decision-Making Efficacy and Anxiety Inventory (3D-MEA), which has shown promising results in measuring efficacy and the development of data literacy, other affective components have yet to be tested due to their novelty in the field. The implication of the findings is that obtaining more information about DDDM and its affective elements can help reduce teachers’ anxiety toward this approach and ultimately enhance the overall educational process.

Introduction/background

Data-driven decision-making (DDDM) has been playing an increasing role in contemporary teaching, since it includes conscious, systematic data collection, analysis, and application to enhance students’ performance and support educational matters (Marsh, Pane, & Hamilton, 2006). For example, DDDM-related tools are used in national competency tests to carry out diagnostic, formative, and summative assessments of students’ knowledge. A similar system is the academically certified eDia (Csapó & Molnár, 2019). This Hungarian diagnostic support tool provides immediate feedback to teachers about their students’ performance, which can be used to enhance teaching, learning, and pedagogical planning. Teachers can also collect the academic results from each subject and compare them with previous ones in order to adjust their lesson plans accordingly (Bús, 2015; Means, Padilla, & Gallagher, 2010). Besides such measuring systems, examinations also have a significant effect on teaching and learning processes. However, although examinations are a natural part of the pedagogical process (Vígh, 2007), DDDM can help create a different approach to measurement data.

In order to implement this in accordance with DDDM, it is important to provide valid, high-quality data for teachers (Ronka, Geier, & Marciniak, 2010; Széll, 2015). It is also crucial that teachers have a positive attitude toward data usage and measuring programs, since any anxiety regarding these factors ultimately affects their classroom activities (Firestone et al., 2002; Tóth, 2011, 2015). However, our knowledge about the affective factors that influence DDDM is scarce. Meanwhile, the application of DDDM in classroom practices is influenced by the different characteristics of teachers. According to Dunn, Airola, Lo, and Garrison (2013), teachers’ impressions about their effectiveness and anxiety can be the main factors that strengthen or weaken their dedication toward DDDM.

Provided that teachers perform their tasks in accordance with this approach, it is assumed that they see themselves as efficient in the usage and analysis of measurement information during their work. Conversely, possible reasons behind the lack of systematic usage of measurement information can include: having anxiety when working with data and statistical analyses (Samuel, 2008); feeling inefficient at work or possessing limited training and experience in data analysis, usage, and collection in the given field (Dunn et al., 2013); and working in a nonsupportive institutional environment (Ronka et al., 2010).

Purpose

The purpose of this study was to examine previous research on the affective factors that influence DDDM. The main focus was on the affective aspects influencing DDDM (RQ1) in the literature and the measurement tools used to inspect such aspects (RQ2). It is hoped that the findings of this literature review on DDDM-related affective factors can contribute to the understanding and application of DDDM and be used to determine future research directions.

Methods

In the first phase, DDDM-related books and peer-reviewed articles were collected through electronic databases (e.g., ERIC, Science Direct, and Google Scholar) and filtered based on the condition that they approached the topic empirically. The temporal scope of the literature review was between 2010 and 2020. The following keywords were used: teacher beliefs on DDDM, affective factors of DDDM, and engagement to DDDM. In the second phase, the condition was a linkage to practical applications and affective factors. Overall, 25 publications were selected and categorized into two groups: the first, which focused on the affective elements of DDDM; and the second, which focused on empirical analyses of teachers’ data literacy and intervention-related affective elements. Moreover, the criteria in this study were as follows: (1) the level of DDDM usage (N = 8); (2) the emphasis on affective factors (N = 7); and (3) the nature of the interventions and their effects on teachers (N = 10).
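
To make the two-phase screening and categorization logic concrete, the sketch below shows one possible way to represent it programmatically. It is only an illustration under stated assumptions: the records, field names, and criterion tags are hypothetical placeholders, not the actual publications reviewed in this study.

```python
# Hypothetical sketch of the screening and categorization step; the records and
# tags are placeholders, not the publications actually reviewed in this study.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Record:
    title: str
    empirical: bool   # phase 1 condition: empirical approach to the topic
    year: int         # temporal scope: 2010-2020
    criterion: str    # "usage_level", "affective_emphasis", or "intervention"

records = [
    Record("Example study A", True, 2013, "usage_level"),
    Record("Example study B", True, 2017, "intervention"),
    Record("Example study C", False, 2016, "affective_emphasis"),  # excluded: not empirical
]

included = [r for r in records if r.empirical and 2010 <= r.year <= 2020]
# Counts per criterion, analogous to the N = 8, 7, and 10 reported above.
print(Counter(r.criterion for r in included))
```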

Results

Based on the systematic review of the DDDM-related literature, a significant aspect was the distinction between teachers’ effectiveness and their perceived level of efficacy. In this regard, the sense of efficacy refers to the self-reflective, future-oriented belief that individuals possess the required knowledge and ability to successfully perform certain tasks, whereas effectiveness, in the case of teachers, refers to their actual influence on students’ educational performance (Bandura, 1997). Examining teachers’ sense of efficacy makes it possible to obtain a clearer picture of how they evaluate their own DDDM-related competence. For example, when teachers’ perceived sense of efficacy is high, students’ results are positively affected (Dunn et al., 2013). Thus, the higher the DDDM-related sense of efficacy, the more effectively teachers use DDDM to enhance their teaching and foster their students’ knowledge. The effectiveness of DDDM can also be increased through DDDM-related interventions, as shown in previous research on their effects on teachers and teacher trainees (Green, Schmitt-Wilson, & Versland, 2016; Reeves & Chiang, 2018). In the following sections, the relevant studies are highlighted in a systematic order, presenting representative examples and the most important methods of measuring the affective factors of DDDM.

DDDM-related efficacy and anxiety

According to the results, one of the most heavily researched areas was teachers’ self-perceptions of their data usage and efficiency. In this regard, the DDDM-related sense of efficacy refers to the dedication and actions necessary for carrying out successful interventions (Airola & Dunn, 2011). Such efficacy also influences teachers’ aims, perseverance, and motivation. According to previous studies, if teachers believe that they will be successful in implementing a task, the task itself might trigger positive emotional reactions, make them more open to trying new practices (Straub, 2009), and make them more resilient when tackling other task-related problems (Bruce, Esmonde, Ross, Dookie, & Beatty, 2010). In general, the starting point of these efficacy-related studies was teachers’ self-perceptions of data usage.

It is also important to note that one of the influencing factors regarding the sense of efficacy was anxiety (Bandura, 1997). In this case, the higher the level of anxiety, the less effective the teacher feels. Meanwhile, the expectation that teachers are required to understand and apply DDDM can also raise their stress level, which, in turn, can increase their feeling of incompetence and negatively affect their application of this approach. However, there are limited resources about the affective aspects of DDDM, even though DDDM-related efficacy and anxiety have a significant influence on its application.

As for teachers’ sense of efficacy and anxiety, Dunn et al. (2013) aimed to measure such aspects by creating the well-established Data-Driven Decision-Making Efficacy and Anxiety Inventory (3D-MEA). Since its introduction, this complex measurement tool has served as the basis for confirmatory factor analyses (Walker, Reeves, & Smith, 2018) and for mapping DDDM-related opinions among teachers and trainees (Green et al., 2016; Price, 2018; Reeves & Chiang, 2017, 2018). The overall purpose of Dunn et al.’s (2013) study was to validate the 3D-MEA. To do so, a sample of elementary and high school teachers (n = 1,728) was asked to complete the questionnaire, the results of which underwent exploratory and confirmatory factor analyses. The study confirmed the validity of the questionnaire, in which four DDDM efficacy subscales and one anxiety subscale were identified. The four efficacy subscales covered data access and detection, technology use, understanding data, and applying data, while the feeling of anxiety formed a separate subscale. However, their study did not focus on subject-related differences. The questionnaire was also used in the study by Price (2018), which focused on the correlation between data-informed school leadership (DISL) and DDDM-related efficacy and anxiety among a sample of teachers (n = 300). The two main measuring tools in this study were the DISL questionnaire and the 3D-MEA. In particular, the DISL questionnaire covered data-related goal setting, the development of evidence-based decision-making among teachers, educational programs, and the establishment of a data-rich culture. According to the results, high DISL was related to a high level of DDDM-related efficacy, while there was no significant correlation with DDDM-related anxiety. Consequently, when heads of institutions consider their work to be data-driven, teachers feel more effective in the field of DDDM.
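
As a purely illustrative sketch of the kind of analysis described above, the snippet below shows how subscale scores for an instrument with this structure could be computed and correlated with a leadership (DISL) composite. The item-to-subscale assignments, item names, and function names are assumptions for demonstration; they are not the published 3D-MEA items or the authors’ analysis code.

```python
# Illustrative sketch only: hypothetical item groupings for a 3D-MEA-like
# instrument (four efficacy subscales plus anxiety) and a Pearson correlation
# of each subscale score with a DISL composite, in the spirit of Price (2018).
import pandas as pd
from scipy.stats import pearsonr

SUBSCALES = {  # hypothetical item-to-subscale mapping
    "efficacy_data_access": ["e1", "e2", "e3"],
    "efficacy_technology_use": ["e4", "e5", "e6"],
    "efficacy_understanding_data": ["e7", "e8", "e9"],
    "efficacy_applying_data": ["e10", "e11", "e12"],
    "anxiety": ["a1", "a2", "a3"],
}

def subscale_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Mean Likert-type score per respondent for each subscale."""
    return pd.DataFrame(
        {name: responses[items].mean(axis=1) for name, items in SUBSCALES.items()}
    )

def correlate_with_disl(scores: pd.DataFrame, disl: pd.Series) -> pd.DataFrame:
    """Pearson correlation of each subscale score with the DISL composite."""
    rows = []
    for name in scores.columns:
        r, p = pearsonr(scores[name], disl)
        rows.append({"subscale": name, "r": round(r, 3), "p": round(p, 3)})
    return pd.DataFrame(rows)
```

Given a response table with one row per teacher and a DISL composite score per teacher, these two helpers would reproduce the basic shape of such a correlational analysis, although the original studies naturally involved additional psychometric steps (e.g., factor analyses and reliability estimates).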

The effect of DDDM-based interventions on individuals

Based on the literature review in the present study, DDDM-based interventions proved to be a well-researched field. Teachers in the United States who had not participated in special training had limited knowledge about the elements of data literacy (e.g., statistical knowledge, different data systems and their usage) and relied less on the implementation of DDDM (Means, Chen, DeBarger, & Padilla, 2011; Reeves & Chiang, 2018; Walker et al., 2018). Moreover, there was no strong correlation between their data use and their educational decisions (Dunn et al., 2013).

In related research, DDDM interventions were carried out at different levels of the educational system, targeting, for example, elementary school teachers (Reeves & Chiang, 2018) and teacher trainees (Reeves & Honig, 2015). They also focused on teachers’ knowledge of data usage and their abilities and practices, rather than on institutions and their learning support. In the present study, it was possible to identify the common conditions of DDDM-based interventions, including their application and effects on individuals as well as their characteristics (e.g., capacity, data properties, and the effects of leadership and organizational culture). Overall, the purpose of the interventions was to develop teachers’ DDDM-related knowledge and abilities and to increase their feeling of satisfaction. In the following, DDDM-related abilities and knowledge are referred to collectively as data literacy (Gummer & Mandinach, 2015).

It is also possible to gain insight into teachers’ perceptions of DDDM and determine how such perceptions change after interventions are performed. In this regard, interventions can enable teachers to gain important knowledge from data that can be used to support certain educational decisions (e.g., differentiation). Nevertheless, the majority of the intervention-focused studies relied only on self-assessment questionnaires and post-training measurement tools (e.g., students’ performance), rather than on classroom observations or other corroborating measures (Marsh, 2012). For example, some researchers introduced a DDDM system to the participants, performed data collection, analysis, and application training, and administered a satisfaction survey.

In other studies, different types of trainings were found, including one- to two-day trainings (e.g., Green et al., 2016; Reeves & Chiang, 2017), online trainings (e.g., Reeves & Chiang, 2018), and multi-year trainings (Van Geel, Keuning, Visscher, & Fox, 2017). For instance, Green et al. (2016) organized data literacy training seminars (n = 16; 3 × 0.5 days) in order to improve teachers’ analysis skills and cooperative practices. University summer camps were also organized (n = 30) to enhance teachers’ data collection skills and statistical knowledge, after which the 3D-MEA and the 3-2-1 formative measuring tool were used. The latter is an activity that encourages participants to reflect on what they have learned, give examples, and identify areas that remain unresolved. According to the results, the participants eventually recognized the value of the data, especially from the perspective of decision-making in teaching. In the research by Reeves and Chiang (2017), 58 teacher trainees completed a one-day intervention to analyze external data and support their teaching activities based on such data. In this case, the 3D-MEA and a questionnaire regarding the effectiveness of the intervention were used. Overall, the results showed positive changes in the following 3D-MEA subscales: data access and identification, interpretation, and application.

In a related study by Reeves and Chiang (2018), an online data literacy intervention program was delivered, and changes in the application of teaching materials were monitored. The latter showed no significant added value, but a positive change was detected in the DDDM-related sense of efficacy and anxiety among the teacher trainees (n = 99) and practicing teachers (n = 25). Moreover, the teacher trainees’ attitudes toward pedagogical evaluations became more positive. The online intervention contained three modules in which the participants received the following information: the aim and characteristics of the program; data usage at the individual, group, and class levels; and data usage and grading at the class and school levels. They were then asked to identify the weak and strong points, the extent of development, and the application in education at the given level. Overall, a total of 7.5 hours was spent on the training. The measuring tools included the 3D-MEA and a questionnaire about data usage habits, attitudes toward pedagogical evaluations, and the efficiency of the intervention.
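
The typical analysis in such pre/post intervention designs is a comparison of a subscale score measured before and after the training. The sketch below illustrates this with a paired t-test; the data frame, column names, and scores are hypothetical, and the cited studies’ actual analyses may differ (e.g., multiple subscales, effect sizes, or corrections for multiple comparisons).

```python
# Hedged sketch of a pre/post comparison on a single DDDM efficacy subscale,
# in the spirit of the intervention studies above; the data are made up.
import pandas as pd
from scipy.stats import ttest_rel

def pre_post_change(df: pd.DataFrame, pre_col: str = "pre", post_col: str = "post") -> dict:
    """Mean change and paired t-test for one subscale measured before and after training."""
    t, p = ttest_rel(df[post_col], df[pre_col])
    return {"mean_change": (df[post_col] - df[pre_col]).mean(), "t": t, "p": p}

# Hypothetical subscale means (1-5 scale) for six participants:
example = pd.DataFrame({"pre": [2.8, 3.1, 2.5, 3.0, 2.9, 3.3],
                        "post": [3.4, 3.3, 3.0, 3.6, 3.1, 3.5]})
print(pre_post_change(example))
```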

In other research, Reeves and Honig (2015) conducted a six-hour intervention among a sample of teacher trainees (n = 64), after which data analysis, interpretation, classroom evaluation, and decision-making were measured. The attitude toward data literacy was also a subject of focus. The measurement tools included the Conceptions of Assessment III and the Survey of Educator Data Use, which focused on the participants’ attitudes and beliefs. Based on the results, increased data literacy and a more positive attitude toward data usage were found. In Van Geel et al.’s (2017) two-year study, a large sample of teachers (1,182 teachers from 83 schools) was examined with regard to their data literacy. In this case, the measuring tool contained general data literacy-related items and system-specific questions in order to interpret the results. Their study not only found a significant change in these two areas but also narrowed the gap between the data literacy of teachers with college and university degrees.

The scope of inspection can be extended beyond practicing teachers’ and teacher trainees’ intervention programs (Ebbeler, Poortman, Schildkamp, & Pieters, 2016) to headmasters and external consultants (Begin, 2018) by using the Data Team Procedure. This procedure is an eight-step, scientifically based process that leads team members (e.g., teachers, headmasters, data experts, etc.) to identify problems in educational development and make improvements in students’ performance. Such problems can include early school leaving, poor academic results in a subject, and the decreasing efficiency of an institution. Moreover, this procedure provides data users with the ability to move easily between the various steps of DDDM. According to previous studies, since it builds on the inner capacities of schools, it can be an efficient way to deal with recurring problems (Bolhuis, 2017; Schildkamp et al., 2018).

In sum, both short- and long-term intervention programs were primarily used in the literature to focus on DDDM-related efficacy and anxiety as well as the development of data literacy. The 3D-MEA and satisfaction surveys were also used to interpret the training-related efficacy and determine the level of data usage satisfaction.

Conclusion

The present study examined the affective aspects influencing DDDM (RQ1) in the literature and the measurement tools used to inspect such aspects (RQ2). For this purpose, the main criteria included: (1) the level of DDDM usage; (2) the emphasis on affective factors; and (3) the nature of the interventions and their effects on teachers. By highlighting the most important studies and methods, this literature review also showed how little research has been conducted on DDDM-related affective factors, even though such knowledge could help expand the application of DDDM in the educational field. Teachers’ self-perception of efficacy in their own work was revealed as one of the most influential factors. For example, the most widely used tool to track teachers’ DDDM-related sense of efficacy and data literacy was the 3D-MEA. In fact, teachers with a higher level of efficacy were more willing to use this data-based practice (Dunn et al., 2013). The second affective factor was teachers’ attitude toward data and their own data literacy, which was examined mainly in the context of interventions. The third was satisfaction, which was assessed through pre- and post-evaluations to determine participants’ intervention-related satisfaction. However, other affective components have yet to be tested due to their novelty in the field. Thus, obtaining more information about DDDM and its affective elements can help reduce teachers’ anxiety toward this approach and ultimately enhance the overall educational process. Moreover, along with measuring the sense of efficacy and anxiety, it would be advisable to conduct studies on teachers’ systematic thinking and even their motivation toward DDDM application, professional development, and information processes. Mapping these affective elements can also provide a better understanding of the conditions, possibilities, and limits of DDDM application.

References

  • Airola, D. T., & Dunn, K. E. (2011). Oregon DATA project final evaluation report. Fayetteville: Next Level Evaluation.

  • Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman and Company.

  • Begin, V. S. (2018). Instructional data teams and data literacy: Leaders supporting the work of school instructional teams. Graduate Student Theses, Dissertations, & Professional Papers.

  • Bolhuis, E. D. (2017). How teacher educators learn to use data in a data team. Dissertation, University of Twente.

  • Bruce, C. D., Esmonde, I., Ross, J., Dookie, L., & Beatty, R. (2010). The effects of sustained classroom-embedded teacher professional learning on teacher efficacy and related student achievement. Teaching and Teacher Education, 26(8), 1598–1608. https://doi.org/10.1016/j.tate.2010.06.011.

  • Bús, E. (2015). Adat, mérés, fejlesztés – nemzetközi példák az oktatási adatok felhasználására [Data, measurement, development: International examples of the use of educational data]. Educatio, 24(3), 134–136.

  • Csapó, B., & Molnár, G. (2019). Online diagnostic assessment in support of personalized teaching and learning: The eDia System. Frontiers in Psychology, 10, 1522. https://doi.org/10.3389/fpsyg.2019.01522.

  • Dunn, K. E., Airola, D. T., Lo, W. J., & Garrison, M. (2013). What teachers think about what they can do with data: Development and validation of the data driven decision-making efficacy and anxiety inventory. Contemporary Educational Psychology, 38(1), 87–98. https://doi.org/10.1016/j.cedpsych.2012.11.002.

  • Ebbeler, J., Poortman, C. L., Schildkamp, K., & Pieters, J. M. (2016). Effects of a data use intervention on educators’ use of knowledge and skills. Studies in Educational Evaluation, 48, 19–31. https://doi.org/10.1016/j.stueduc.2015.11.002.

  • Firestone, W. A., Monfils, L., Camilli, G., Schorr, R. Y., Hicks, J. E., & Mayrowetz, D. (2002). The ambiguity of test preparation: A multimethod analysis in one state. Teachers College Record, 104(7), 1485–1523.

  • Green, J. L., Schmitt-Wilson, S., Versland, T., Kelting-Gibson, L., & Nollmeyer, G. E. (2016). Teachers and data literacy: A blueprint for professional development to foster data driven decision making. Journal of Continuing Education and Professional Development, 3(1), 14–32.

  • Gummer, E., & Mandinach, E. (2015). Building a conceptual framework for data literacy. Teachers College Record, 117(4).

  • Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps. Teachers College Record, 114(11), 1–48.

  • Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. RAND Corporation.

  • Means, B., Chen, E., DeBarger, A., & Padilla, C. (2011). Teachers’ ability to use data to inform instruction: Challenges and supports. Office of Planning, Evaluation and Policy Development, US Department of Education.

  • Means, B., Padilla, C., & Gallagher, L. (2010). Use of education data at the local level: From accountability to instructional improvement. US Department of Education.

  • Price, J. J. (2018). The relationship between teachers’ perception of data-driven instructional leadership and their sense of efficacy and anxiety for data-driven decision-making. Electronic Theses & Dissertations, Georgia Southern University.

  • Reeves, T. D., & Chiang, J. (2017). Building pre-service teacher capacity to use external assessment data: An intervention study. The Teacher Educator, 52(2), 155–172. https://doi.org/10.1080/08878730.2016.1273420.

  • Reeves, T. D., & Chiang, J. (2018). Online interventions to promote teacher data-driven decision making: Optimizing design to maximize impact. Studies in Educational Evaluation, 59, 256–269. https://doi.org/10.1016/j.stueduc.2018.09.006.

  • Reeves, T. D., & Honig, S. L. (2015). A classroom data literacy intervention for pre-service teachers. Teaching and Teacher Education, 50, 90–101. https://doi.org/10.1016/j.tate.2015.05.007.

  • Ronka, D., Geier, R., & Marciniak, M. (2010). A practical framework for building a data-driven district or school: How a focus on data quality, capacity and culture supports data-driven action to improve student outcomes. PCG Education White Paper.

  • Samuel, M. (2008). Accountability to whom? For what? Teacher identity and the force field model of teacher development. Perspectives in Education, 26(2), 3–16.

  • Schildkamp, K., Handelzalts, A., Poortman, C. L., Leusink, H., Meerdink, M., Smit, M., et al. (2018). The data team procedure: A systematic approach to school improvement. Springer International Publishing.

  • Straub, E. T. (2009). Understanding technology adoption: Theory and future directions for informal learning. Review of Educational Research, 79(2), 625–649. https://doi.org/10.3102/0034654308325896.

  • Széll, K. (2015). Az adathozzáférés és felhasználás nemzetközi gyakorlatai [International practices of data access and use]. Educatio, 24(3), 62–72.

  • Tóth, E. (2011). Pedagógusok nézetei a tanulóiteljesítmény-mérésekről [Teachers’ views on student performance assessments]. Magyar Pedagógia, 111(3), 225–249.

  • Tóth, E. (2015). Az Országos kompetenciamérés hatása a tanítási munkára pedagógusinterjúk alapján [The impact of the National Assessment of Basic Competencies on teaching, based on teacher interviews]. Magyar Pedagógia, 115(2), 115–138. https://doi.org/10.17670/MPed.2015.2.115.

  • Van Geel, M., Keuning, T., Visscher, A., & Fox, J. P. (2017). Changes in educators’ data literacy during a data-based decision making intervention. Teaching and Teacher Education, 64, 187–198. https://doi.org/10.1016/j.tate.2017.02.015.

  • Vígh, T. (2007). A vizsgák tanítási-tanulási folyamatra gyakorolt hatásának elméleti és empirikus kutatása [Theoretical and empirical research on the effect of examinations on the teaching-learning process]. Magyar Pedagógia, 107(2), 141–161.

  • Walker, D. A., Reeves, T. D., & Smith, T. J. (2018). Confirmation of the data-driven decision-making efficacy and anxiety inventory’s score factor structure among teachers. Journal of Psychoeducational Assessment, 36(5), 477–491. https://doi.org/10.1177/0734282916682905.