Abstract
Increasing the number of students who complete their higher education (HE) studies is a growing priority across Europe; completion is seen to contribute to individual and national prosperity, and to improve the efficiency of the HE system. The Higher Education Drop-out and Completion in Europe project examined the issue of “study success.” The methodology utilized a literature and policy review, two surveys of an HE expert in each European country, and eight in-depth mixed-method national case studies. This paper considers how in many European countries study success is not explicitly defined, and national policies can be ambiguous or even detrimental due to the lack of clarity about, and alignment with, study success. These flaws are reinforced by the lack of indicators and tools to measure study success, which would facilitate evaluation to improve national policies and institutional actions. These issues are illuminated by a comparison between the Czech Republic and England. A widely agreed definition of study success contributes to better policy alignment and allows performance indicators to be developed, which fosters a more coherent national and institutional approach to improving study success, but this needs to be underpinned by a shared national commitment, which values and promotes study success.
Introduction
Increasing the number of students who complete their higher education (HE) studies is a growing priority across Europe; completion is seen to contribute to individual and national prosperity, and to improve the efficiency of the HE system. One of the targets of the Europe 2020 Strategy (European Commission, 2010) is for at least 40% of 30- to 34-year-olds to have completed an HE qualification by 2020. Although increasing participation rates is necessary, reducing student withdrawals and increasing completion rates are the key strategies for achieving this goal, as across Europe many students withdraw before obtaining their target qualification. The 2011 Modernisation Agenda (Eurydice, 2011) states that a joint effort of all Member States, HE institutions (HEIs), and the European Commission is required to work proactively toward the objectives of increasing participation and attainment in HE.
The Europe 2020 target of a 40% completion rate has almost been achieved, and improvements have been seen across Europe. Fifteen countries have achieved their targets, while the majority are approaching them; Ireland, Luxembourg, and Portugal are the furthest adrift, but Luxembourg and Ireland have particularly high national targets. There are however pronounced gender differences across Europe, with women exceeding the 40% target (45%) and men still being significantly below it (35%). It is also noted that “the EU’s tertiary attainment rate still lags behind the rates of some other major world economies such as Korea, Japan, Canada and the United States” (Eurostat, 2018, p. 11). However, European, national, and institutional efforts to widen access to HE – enshrined in the Bologna process (initially the Prague Communiqué, EHEA, 2001 and prioritized in the Bucharest Communiqué, EHEA, 2012) – may put pressure on improving student retention and success through increased participation rates and greater student diversity. This calls for a stronger knowledge base on what countries and HEIs can do in order to effectively achieve the objectives of reducing dropout and increasing completion, especially in relation to specific student groups. The current understanding of study success, its determining factors, and policies that can effectively reduce dropout and increase completion is limited (Quinn, 2013).
This paper focuses on what national policymakers can do to improve student study success. It draws on the Higher Education Drop-out and Completion in Europe (HEDOCE) project (Vossensteyn et al., 2015), in which the author was involved, and develops further insight into how study success can be improved at the national level through a comparative analysis of the Czech Republic and England. This is significant because much of the research on improving student outcomes points to the importance of the institution (Jones, 2008; Thomas & Tight, 2011; Yorke, 1999).
There are many challenges associated with examining study success across different national systems, including different definitions, which are discussed below (Thomas & Hovdhaugen, 2014). A review of the literature finds that variations between national HE systems contribute to differences in study success, in particular, selectivity in admissions, flexibility within the system, student tuition fees, and financial support.
Selectivity of the HE system – i.e., who has access – varies significantly across Europe and determines the prior academic attainment of the enrolling student cohort, which, in turn, has a direct effect on retention and withdrawal. A more open access system may rely on withdrawals as an alternative process of selection, whereas a more selective system may have lower rates of student withdrawal (Declercq & Verboven, 2018). Increasing student diversity through widening access policies may reduce study success, for example, as a consequence of lack of study skills or preparation for HE (Heublein, Spangenberg, & Sommer, 2003; Jones, 2008; Quinn et al., 2005). Countries also differ in the number of entry routes into HE; additional routes increase opportunities for non-traditional students to enter HE but can have a negative impact on retention and completion (Helland, 2005; Heublein et al., 2003; Thomas & Hovdhaugen, 2014).
Flexibility of the HE system, providing the opportunity to transfer between programs and institutions, including credit transfer, can influence retention and completion either positively or negatively (Houston, McCune, & Osborne, 2011; Thomas & Hovdhaugen, 2014). In several of the Scandinavian countries, credit transfers are widely accepted, which facilitates starting one degree and then switching to another (Hovdhaugen, 2009); in some Czech data, students who transfer are counted as withdrawals (Vlk, 2015). Conversely, in the UK, credit transfer is not widely accepted, so moving between degree programs accounts for a very small proportion of students withdrawing. Flexibility can also cause study delay and increase time to completion, for example, in Norway (Hovdhaugen, 2012) and Denmark (Danish Ministry of Higher Education and Science, 2013).
There is considerable variation in tuition fees and student financial support systems across Europe (OECD, 2011). However, there is no direct link between the level of tuition fees and completion rates (OECD, 2008). On one hand, students who pay for their education may be more committed to completing their HE program; on the other hand, paying tuition fees may slow completion as students need to engage in paid work, or they may leave HE because they are unable to meet the direct and indirect costs (Orr, Wespel, & Usher, 2014). There is little research suggesting that tuition fees force students to leave HE; rather, the evidence about the impact of fees and student finance on dropout/retention and completion is ambiguous (see, e.g., Thomas, 2015). However, engaging in employment has a negative impact on study success (Vossensteyn, Cremonini, Epping, Laudel, & Leisyte, 2013), although this tends to apply to students working in excess of 20–25 hours per week (Beerkens, Magi, & Lill, 2011; Hovdhaugen, 2014).
There is however little research about the impact of national policies and initiatives on reducing withdrawal and improving retention and completion. The majority of research published and reviewed focuses on the role of the HEIs. In summary, US and European evidence points to the importance of institutional commitment to improving study success, which shapes the organization of HE, learning, teaching, assessment, and student support. Therefore, this paper makes a significant contribution to knowledge about how national systems can adopt policies to improve study success and student outcomes by comparing and contrasting two national approaches.
About the “Higher Education Drop-Out and Completion in Europe” Project
The HEDOCE research study, funded by the European Commission, undertook a comparative review of study success across 35 European countries (Vossensteyn et al., 2015). More specifically, the study undertook a literature review of international, national, and institutional research and policy-oriented documents addressing study success; provided an inventory and categorization of national policies explicitly designed to improve study success; and examined national indicators and metrics associated with study success and considered their relevance for European comparisons.
The literature review included peer-reviewed literature, identified by key word searches of bibliographic databases. It was supplemented with literature from the USA, where this field of research is far more widely established; and by national language and gray literature identified and summarized by national HE experts from each of the countries in Europe. HE experts were identified in each country and were surveyed about key aspects of study success, in particular national policies and approaches. This information was verified against other sources, including the NESET report (Quinn, 2013), reports from Eurydice (European Commission/EACEA/Eurydice 2012, 2014), OECD reports, and with other experts. This resulted in a relatively complete overview of explicit study success policies for the period 2005–2014 in 35 European countries.
Eight in-depth country case studies on the Czech Republic, England, France, Germany, Italy, the Netherlands, Norway, and Poland were undertaken. The national experts were guided and supported by a case study template and a member of the core research team to maximize comparability across the case studies. The case study research involved interviews with key stakeholders at national level, including policymakers and representatives of cross-sector organizations. Two HEIs were selected to explore institutional approaches to study success, and consideration was given to whether institutions were translating national policy for local implementation, or introducing self-initiated study success policies. At the institutional case-study visits, interviews were conducted with institutional leaders, academic and support staff, and students. The case studies were written up in a common format, and each was reviewed by a member of the core research team to check for standardization of approach and to cross-reference with external sources of information.
Findings
The findings presented here focus on three issues: national definitions of study success, national policies, and national indicators and measures of success. These topics are discussed in relation to the findings from across the 35 European countries that participated in this study. These findings are used as a framework with which to analyze the national case studies from the Czech Republic and England, which generates new insights into how to develop a national approach to improving student study success.
Definitions, policies, and indicators for study success across Europe
Definitions of study success
As noted above, defining “study success” in an international comparative context is difficult (Thomas & Hovdhaugen, 2014); hence, the reliance so far in this article on the broad and vague phrase “study success.” Thus, it is not surprising (see Hagedorn, 2004) that the HEDOCE research identified a range of interpretations relating to the phrase “study success.” However, even when definitions appear explicit and widespread, interpretation, implementation, and consensus can diverge in practice between different stakeholders in the HE sector and beyond. Drawing on the literature, it is useful to clarify some key terms.
The first is “completion” or the “completion rate,” which is usually understood as the number of students who have successfully finished a study program at a HEI compared to the number who started. Although apparently a straightforward calculation, difficulties and concomitant inaccuracies can arise (see Chalmers, 2008; Thomas & Hovdhaugen, 2014). Usually, completion rates refer to a cohort and a point in time when it can be assumed that most students would have completed their study program (e.g., 1–2 years after the nominal study time). The length of time considered “normal” or acceptable varies between countries. This has led to another definition and associated indicator: time to degree – which emerged as the most popular definition across Europe in the HEDOCE project – but which refers to vastly different periods of time that are deemed acceptable and “normal.”
Next is retention, or continuation (rate). In its simplest sense, retention means the number of entrants who remain on a given course at a selected census point. This becomes a little more complicated when used to signify reenrollments over longer periods, and one potential problem in calculating retention or continuation rates is that it may include “limbo” students who are not actively studying for a degree, yet have not formally withdrawn either, and/or have reenrolled but are not gaining credits. In the US, and beyond, these are sometimes known as “stop-outs” as opposed to dropouts.
“Dropout” or “withdrawal” (rate) refers to those who leave HE before completing their target award. In this paper, the term “dropout” is avoided as it is pejorative, and withdrawal or non-continuation is used in preference. As with other definitions, the major issue here is to identify the two groups of students that need to be compared in order for the relevant indicators to produce accurate data. Withdrawal can be difficult to define and measure, as students do not always formally withdraw but rather drift away, while others may take a break from their studies and reenter, picking up at the same point as they left off. This makes it difficult to know at what point they should be deemed to have left the system for good.
Both “switching” and “transfer” rates refer to the number of students who change their program of study and/or HEI. Again, the basis for the calculation is mostly the entrance cohort. Transfer rates are calculated for different points in time, but mostly they refer to students switching after the first year of study; in some national systems, this is easier to do and easier to measure than in others.
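To make the distinctions between these rates concrete, the following minimal sketch computes completion, retention, withdrawal, and transfer rates for a single entry cohort. It is an illustrative assumption only: the record fields, the census point, and the decision to count transfers separately from withdrawals are choices that vary between national data systems, and this is not the calculation used by any particular statistics agency.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    # Illustrative fields only; real national datasets differ.
    completed: bool         # finished the target qualification
    enrolled_year_2: bool   # still enrolled at the year-2 census point
    transferred: bool       # switched program and/or institution
    withdrew: bool          # left HE without an award

def cohort_rates(cohort):
    """Compute simple study-success rates for one entry cohort."""
    n = len(cohort)
    return {
        # Completion: graduates as a share of starters, measured some
        # years after the nominal study time.
        "completion_rate": sum(s.completed for s in cohort) / n,
        # Retention/continuation: starters still enrolled at the census point.
        "retention_rate": sum(s.enrolled_year_2 for s in cohort) / n,
        # Withdrawal: starters who left HE without an award.
        "withdrawal_rate": sum(s.withdrew for s in cohort) / n,
        # Transfer/switching: starters who changed program or institution;
        # systems that count these as withdrawals inflate the withdrawal rate.
        "transfer_rate": sum(s.transferred for s in cohort) / n,
    }

# Tiny worked example: 4 starters; 2 complete, 1 transfers, 1 withdraws.
cohort = [
    StudentRecord(completed=True, enrolled_year_2=True, transferred=False, withdrew=False),
    StudentRecord(completed=True, enrolled_year_2=True, transferred=False, withdrew=False),
    StudentRecord(completed=False, enrolled_year_2=True, transferred=True, withdrew=False),
    StudentRecord(completed=False, enrolled_year_2=False, transferred=False, withdrew=True),
]
print(cohort_rates(cohort))
# completion 0.50, retention 0.75, withdrawal 0.25, transfer 0.25
```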
In this study, the generic term “study success” is used to encompass student retention and completion, in contrast to withdrawal or drop-out and non-completion: “Study success comprises all major achievements of students in the higher education system, including dropout/persistence, completion of a degree and time-to-degree” (Vossensteyn et al., 2015, p. 24). The HEDOCE study examined interpretations of study success across Europe based on three definitions:
- –Completion: to have students successfully complete their study program with a degree.
- –Time-to-degree: to have students complete their study program within a reasonable time period.
- –Retention: the aim to have students reenroll in a study program until they complete their degree and to reduce the likelihood they drop out before completing their program.
Responses from 35 European countries indicated that “time-to-degree” is the most frequently used understanding of study success underpinning national policy making in Europe (18 countries), with completion being the second most common (13 countries).
This response however provides a rather simplistic understanding of the multifaceted and contested nature of study success across and within European countries. The case-study research revealed that only in England is there an explicit definition of what constitutes study success that is widely agreed upon by all stakeholders. In Germany, stakeholders have some degree of consensus on what they regard as study success – all of them seeing it as “the successful completion of an academic degree,” but this has not resulted in a more specific definition. In all other countries, there are variations in the definition of study success among stakeholders; several understandings of the concept exist side by side, and stakeholders choose to use the type of definition that they find most suitable. In Italy, for example, several stakeholders identified “finding employment” as study success, while “completing the degree,” particularly in a “timely fashion,” was important to other stakeholders. In other countries, such as Norway and the Netherlands, timely completion is regarded as the most important understanding of study success, at least at the national policy level. The Norwegian case study however finds that HEIs do not tend to share the view that completion within a limited time period is paramount, but rather see completion at some point as more important. Indeed, Norwegian student unions are concerned that there is too much emphasis on efficiency at the expense of the quality of the students’ study experience and attainment levels.
These examples illustrate the divergence of interpretations of study success and its definitions across European HE systems – and crucially within national HE systems. This in turn then makes it hard to identify the relevant policies that seek to promote or stimulate study success and address student withdrawal.
National policies for study success
The lack of consensus relating to definitions of “study success” might be taken as an indicator of the lack of priority accorded to it by policy makers and HEIs and others in the HE sector. However, the survey of national experts from 35 European countries suggested that study success is widely regarded as an important topic across Europe. In the majority of countries (28, which is 80%), experts indicated that study success is a policy issue that is being given consideration, while 16 country experts stated that study success is high or very high on the policy agenda. Experts in only seven countries indicated that study success is not really a matter for concern within their country’s HE policy agenda.
The study collected details of national policies and approaches used to address study success from a 10-year period, i.e., 2005–2015. The information was collected in a variety of ways, including two surveys with national experts and desk research. In the majority of countries, universities and other HEIs are actively engaged in improving study success, and governmental authorities such as ministries are engaged in some countries – approximately two thirds of the countries reviewed have study success policies in place, and about half of the countries have implemented regulations to improve study success. In total, more than 170 policies that explicitly and intentionally address study success were identified.
Drawing on the literature, three broad national policy areas being used to explicitly address study success were identified: “funding and financial incentives,” “organization of HE,” and “information and support for students.”
- –Funding and financial incentives: Financial policies often include incentives to encourage (or discourage) specific activity. Such policies can directly target students, e.g., through tuition fees, grants, scholarships, or loans; financial incentives can be used to influence students’ study success behavior, for example, by linking financial reward to credits gained. They can also influence HE providers – but here such policies tend to work indirectly by “inviting” HEIs to develop their own policies.
- –Organization of HE: This policy area refers primarily to teaching and learning but can also include structural characteristics such as the duration of programs or the types of degrees offered. Policy detail can be delegated, i.e., devolved to HEIs. Organizational regulations can address the quality of teaching and learning (and its accreditation, etc.), and also things such as student–teacher ratios, number of contact hours, assessment regulations, pathways toward a degree or “soft selection mechanisms” such as applicant interviews, etc., to influence who has access to HE courses.
- –Information and support for students: These policies relate to different stages in the student “life cycle.” Prospective students (and those transferring or thinking about leaving) are provided with information to inform their decisions. Information and support, pre- and post-entry, provide course, study, and career information, and address academic development and attainment, personal well-being, and professional development.
The 170 policies aiming to improve study success were divided into these three categories, and 22 “typical” policies were identified. A typical policy is one that is used in several countries and informed by similar rationales across these countries.
Nine typical policies were identified focusing on the organization of HE – and more specifically, the study experience. This may include “access” (what kinds of students governments and HEIs want to attract to HE) but also covers the way in which degree programs are structured. Finally, it encompasses changes to the curriculum design and associated pedagogy, such as a focus on the first year transition experience, different assessment regimes, and developing students’ academic confidence and skills.
Eight typical policies were identified focusing on funding – these can be divided into those focusing on student financing issues and others addressing funding for HE providers. Financial support for students can help them focus on their studies, improving both success and completion rates. Conversely, tuition fees may encourage students to choose their courses more carefully and to study efficiently. Public funding to HEIs often includes incentives to make institutions prioritize study success – e.g., through performance-based funding rewarding credits gained or degrees completed. In approximately a third of the countries, “study success” (however it is defined) plays a role in determining the funding of HEIs; this is especially true in countries where study success is high or very high on the policy agenda. It is important to note that both student and institutional funding can work in positive or negative ways, i.e., by incentivizing or rewarding some behavior through funding, and discouraging other behavior or outcomes through punitive measures.
Five typical policies focusing on information and (non-financial) support for students were identified. These include the provision of information and support to students – both prior to and after enrollment – and the provision of additional support, such as guidance with course choices, academic development, and health and welfare services. The provision of information and guidance is intended to improve the fit between students and their programs of study, and to shape and manage students’ expectations with regard to their studies and future employment careers. This includes national information services, such as student choice portals, league tables, institutional matching instruments, and capability and interest tests. Support services include extracurricular courses to enhance the skills, competencies, and academic preparedness of students, e.g., in reading, writing, and language skills; specialist support provisions for students with a physical or learning disability; and health and mental well-being services. The main rationale underlying such policies is that a better match between students and programs, as well as a growth in competencies, are regarded as prerequisites for successful study.
Indicators and outcomes for study success
In light of the differing definitions of study success, and the varying ways in which national policies address it, it is perhaps not surprising that gauging the effectiveness of policy interventions is often difficult. This is the case both within and between countries; in this paper, the focus is on measurement within a national system.
Indicators of “study success” are difficult to operationalize and calculate, and the literature illustrates that study success is measured in very different ways (Hagedorn, 2004; Thomas & Hovdhaugen, 2014); this is reflected in the findings from this study. In this study (Vossensteyn et al., 2015), indicators were sought in relation to any of the three definitions of study success articulated previously: completion rate, retention or continuation rate (or dropout or withdrawal rate), and time-to-degree (based on the length of time it takes students to complete their undergraduate studies).
National experts indicated that 12 countries publish an indicator for completion and 23 do not, even though some of them have the data to calculate such a completion rate. Only six countries reported that some kind of retention rate is available, although for Denmark the indicator refers to withdrawal. For only a quarter of the countries, information on time-to-degree was identified. It should be noted that some countries have more than one indicator, but nearly half the European countries (17 out of 35) do not appear to have any national indicator and data about study success; this is surprising as only seven countries reported that study success is absent from the national HE policy agenda. Furthermore, there seems to be a mismatch between definitions of study success and indicators. For example, 18 countries identified time-to-degree as the national focus of study success, yet this is only an indicator in half of these countries.
A closer look at national approaches to improving study success through country case studies
The discussion in the preceding section suggests that countries do not always have a clear definition of what study success is, even when it is high on the national agenda. Furthermore, many countries do not have an explicit national indicator of study success, suggesting that it is not explicit what is to be achieved and/or how it is to be measured. This problem is compounded by the fact that there are multiple ways in which national policymakers can seek to improve study success: through information and support for students, changing the organization and delivery of HE, and using funding mechanisms aimed at students, institutions, or both. In this part of the paper, two countries are examined in greater detail to consider the relationship between an explicit and shared vision about what study success is and how it should be achieved; alignment or dissonance between national policies to improve study success; and measuring progress in improving study success.
The Czech Republic has been selected as an example of a Visegrad country and thus relevant to readers of this journal, and England has been chosen as an example of a country with a comprehensive, coherent, and effective national approach to improving study success. Each case study provides details of the national context and approach, drawing on the information provided by national experts for the HEDOCE project, supplemented with additional information.
Czech Republic
The Czech Republic has been selected as an example of a Central European country, which is broadly representative of the Visegrad group of countries: Czech Republic, Hungary, Poland, and Slovakia (Stiburek, Vlk, & Svec, 2017), which have much shared history and are working together toward greater integration within Europe. In the Czech Republic, study success is on the HE policy agenda, although it is not seen as a high priority (Vossensteyn et al., 2015), and some of the concern placing it on the agenda is externally driven by the European Commission (Stiburek et al., 2017; Vlk, 2015). Data from the Czech Ministry of Education, Youth and Sports indicate that withdrawal rates are increasing, e.g., from 38% in 2003 to 48% in 2010, and this trend is continuing with less than 50% of students who enrolled in a bachelor’s degree in 2009 being awarded the target qualification (Vlk, 2015).
In the Czech Republic, successful study is broadly understood to mean completing the target program of study, and withdrawal includes students who have transferred to other programs. Retention refers to students progressing from the first to second year of undergraduate study. However, the extent to which there is concern about study success is largely in terms of the efficiency of the HE system. This shifts the policy agenda toward consideration of time to completion, and completing a degree within a specific time period (Stiburek et al., 2017), although there does not appear to be a formal definition specifying a time period (c.f. Vlk, Stiburek, & Svec, 2016). Rather than achieving a degree, or completing within a specific time period, students view study success in terms of employment outcomes, and some academic staff also share this perspective (Vlk, 2015). Thus, within the Czech Republic, there are some definitions and measures of study success, but these are not widely accepted and shared, and there are competing alternatives based on time to completion and employment outcomes. There is, arguably, a need for “a thorough debate on dropout calculation and comparison with other countries” (Vlk et al., 2016, p. 651).
This lack of engagement in defining study success may be because improving student retention and completion (or conversely reducing student withdrawal) is not perceived as a priority by the majority of HE stakeholders; rather, a high rate of non-completion is viewed as an essential aspect of the Czech HE system (Stiburek et al., 2017; Vlk, 2015). In other words, academic failure and student withdrawal operate as ex post selection mechanisms, compensating for limited restrictions on who gains access to HE. The lack of selectivity has been exacerbated by a demographic decline in 19-year-olds, resulting in the lowering of entry qualifications, and by European efforts to expand the diversity of participation in HE as part of the social dimension of the Bologna process (noted above).
A vague or non-universal understanding of study success, and a lack of priority accorded to addressing student withdrawal and non-completion, seem to contribute to an ineffective national policy approach. At the national level, there are very few policies explicitly designed to address or improve study success (Vlk, 2015); those that exist are vague, and other regulations and policies undermine efforts to improve study success, or at least render them less effective.
The HE Strategic Plan for 2000–2005 identified the need to enable students who make poor study choices to switch programs and complete their courses, and dropout was noted as an inefficiency in the system for the government – and for students. However, no specific measures of dropout were identified and very little detail was provided as to how this matter should be addressed (Ministry of Education, Youth and Sports, 2000; Vlk, 2015). Nevertheless, HEIs were asked to publish some basic data about student withdrawals, and their associated measures to reduce the rates of withdrawal, in their annual reports. Notwithstanding, the recommended structure of an annual report, set by the MEYS, is not obligatory, and no shared definitions of dropout are set for this purpose (Vlk, 2015).
In drawing up the current Strategic Plan for 2016–2020, efforts were made by the Ministry of Education, Youth and Sports to address student withdrawal more directly, but these were rejected by the HEIs. Instead, institutions are required to “identify the causes of increasing dropout rates.” To support this process, the government intends to provide funding to institutions, drawn from the European Structural and Investment Fund and national funding. However, within universities, study success policy and management are decentralized, with faculties, departments, or even courses being responsible for this issue (Vlk, 2015). There appear to be no direct consequences for institutions that do not address study success in their annual plans, other than some financial implications. Vlk (2015) notes that the government is linking retention to institutional funding, and higher rates of withdrawal may lead to a reduction in the number of funded student places; thus, the aim is to incentivize institutions to address early withdrawal. The impact of this financial measure will however depend on the proportion of income that is affected. The efficacy of this policy tool is likely to be limited, as only a decreasing element of teaching funding is per capita, and the overarching structure of funding for HE encourages institutions to prioritize research over learning and teaching (as research generates income and teaching largely does not).
The wider institutional funding model incentivizes institutions to act in the opposite direction. Since 2010, a quality funding element has been introduced; thus, institutional funding is based on a mix of per capita funding and quality-related funding, with a gradual increase in the latter (Vlk, 2015). In a national context, in which withdrawal may actually be perceived as a positive aspect of quality, this funding strategy may inadvertently be driving institutional behaviors against study success.
Therefore, the focus on quality seems to be hindering efforts to improve study success, but this is largely a consequence of the way in which quality is conceptualized and implemented. The national quality assurance regime reviews the academic qualifications of staff and their research performance, with little focus on curriculum, pedagogy, assessment, and student support, and indeed no site visits take place (Vlk, 2015). There appears to be little focus on the organization and contents of HE programs, and how this might contribute to study success.
Indeed, the problem is further compounded by the implementation of the Bologna process (moving from long-cycle degrees to the bachelor and master structure of three plus two years). These changes have been implemented rather mechanistically, splitting up existing courses rather than redesigning them (Vlk, 2015). This has resulted in some programs being unattractive at undergraduate level, as they are overly theoretical with little of the more engaging applied content, which is delivered subsequently, especially in engineering and professional fields such as teacher education. These kinds of organizational issues are likely to increase rather than decrease academic failure and student withdrawal.
There appears to be little attention paid to providing student information and support, either prior to entry or once they are in HE. The Higher Education Strategic Plan (2000–2005) identified the issue of students switching courses, due to poor course choices. According to Vlk (2015), a portal is planned to provide prospective students with better information to inform their choices about progression to HE, and funding penalties for students are encouraging completion on time. The lack of on-course support reinforces the view that withdrawal is a positive aspect of quality assurance in some HE systems, and is not discouraged, whereas in other systems, efforts are made to support students to remain on their courses by addressing academic challenges, personal crises, etc.
In summary, it might be considered that there is a strong and pressing need for a national approach to improving study success. The rates of withdrawal and non-completion are increasing, and this is within a context of declining student numbers (due to demographic factors and, in some subject areas, lower demand for courses) and increasing diversity across the system, which appears not to be recognized or catered for by the HE system. What appears to be missing in the Czech case, and potentially across the Visegrad countries, are national policy tools that aim to change or even transform the HE-learning experience itself. International research, e.g., from the US (Troxel, 2010) and Australia (Krause & Armitage, 2014), and European research (Thomas, 2012; Ulriksen, Madsen, & Holmegaard, 2010; Yorke & Longden, 2004) has identified the importance of the institution and institutional commitment to study success. In particular, research from Germany and the UK has identified the importance of learning, teaching, and assessment (e.g., Georg, 2009; Thomas, 2012) designed to improve student engagement and belonging. UK research identifies four aspects of the academic experience that contribute to a sense of belonging – and retention and success. These are engagement with peers, interaction with staff, a developed capacity to engage, and an HE curriculum that is relevant to current interests and future aspirations (Thomas, 2012; Thomas, Hill, O’Mahony, & Yorke, 2017). In an institution with a strong commitment to study success, curriculum changes and additional support for study success are underpinned by monitoring students’ engagement and performance, and intervening when behavior indicates that they are at risk of early withdrawal (Thomas et al., 2017), allowing targeted interventions (Heublein, Schmelzer, & Sommer, 2008; Thomas, 2012). In the UK context, Buglear (2009) finds that poor data often underpin institutions’ inability to intervene adequately to improve retention, and improved tracking is recommended by UK and other European research (Larsen, Sommersel, & Larsen, 2013; Thomas, 2012; Thomas et al., 2017). In the following case study, these and related issues are reviewed in the English context.
England
England was selected as a case study partly because it was identified by Vossensteyn et al. (2015) as having a comprehensive, coherent, and effective national approach to improving study success; furthermore, considerable research has been undertaken into the causes of student withdrawal and effective strategies to improve success for all students. Study success is high on the HE policy agenda across the UK, and in England in particular, and has been for approximately the past 15 years. This concern about study success has developed in response to two other policy directions: first, overt efforts to increase the diversity of the HE student population, which have evolved into consideration of not just who enters HE, but equity in student outcomes and returns as well. Second, the introduction of tuition fees raised fears about both access to HE, and outcomes, for students from lower socioeconomic groups. Steps were therefore implemented at the national level to ensure that tuition fees and greater reliance on student loans (rather than non-repayable grants) did not have an unintentionally negative impact on study success in general, and for students from lower socioeconomic groups in particular.
The English continuation rate (which refers to progression from the first year of study to the second year for full-time students) is high, currently around 94%, and only around 10% or 11% of students leave without an award or transfer to an alternative program of study at another institution (HESA, 2018). The rates have remained stable over the past decade or more, although the system has expanded in size, increased in diversity, and introduced higher tuition fees. It should be noted however that there are significant variations between institutions and between student groups. Therefore, broadly, the English system can be judged to be effective with regard to study success.
In England, there is widespread agreement about what study success is; it is almost universally understood as the completion of a degree in a prescribed time period (with up to 1 year variance from the standard time allowed, i.e., 3 or 4 years for full-time degree programs). The completion rate is defined as “the proportion of starters in a year who continue their studies until they obtain their qualification, with no more than one consecutive year out of higher education”; this complements the continuation rate, a more immediate measure, which calculates the proportion of a higher education provider’s intake that is enrolled in the year following entry (National Audit Office, 2007, p. 5). These definitions are not contested, although national bodies and institutions recognize and aspire to additional elements of study success. For example, the Higher Education Funding Council for England (HEFCE, 2013) has encouraged institutions to consider not just continuation and completion, but also attainment and progression into employment and further study (and work to improve these outcomes for all students is currently being taken forward by the newly created Office for Students). Institutions and other stakeholders in the sector recognize the value of an extended notion of success, taking account of issues such as personal goals and aspirations, and distance traveled.
In England, there is pressure to maintain and improve study success, especially in parts of the sector where it is lower than the national average, or in relation to students from less advantaged groups. A number of national policies have been employed to improve continuation and completion across the sector; these include different types of policies, which are largely aligned to improve study success. In particular, institutions’ funding is tied to the number of students enrolled and retained for the duration of the course, historically through the teaching grant paid to institutions, and more recently through student fees, which are directly related to the number of students studying. This provides a compelling financial impetus for HE providers to strive to improve study success, especially as the performance of each institution is measured and compared to its “expected” rates of continuation and completion and this information is widely published and utilized. The Higher Education Statistics Agency provides a “benchmark” for every institution, which is calculated by taking into account subject portfolio, entry qualifications, and student diversity. This benchmark figure is published alongside an institution’s actual performance with regard to the total institutional student population and in relation to subsections of the student population. National newspapers use this data to produce league tables about retention, and the information is fed into wider league tables about the “quality” of individual HE providers in England and the UK and which inform student-facing publicity such as the Good University Guide and similar, designed to inform students’ decision-making about entering HE. In addition, since 2003, additional funding has been paid to institutions to improve the continuation and completion of students from groups assessed to be at greater risk of withdrawal.
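To illustrate the logic of benchmarking in a simplified form, the sketch below computes an “expected” continuation rate as a weighted average of sector-wide rates for the entry-qualification groups an institution recruits, and compares it with the institution’s actual rate. The figures, groupings, and weighting scheme are assumptions for exposition only; the published HESA benchmarks are calculated with a more elaborate methodology that also adjusts for subject mix and other factors.

```python
def expected_continuation(intake_shares, sector_rates):
    """Simplified benchmark: sector continuation rates weighted by the
    institution's intake profile (illustrative, not the HESA method)."""
    return sum(share * sector_rates[group] for group, share in intake_shares.items())

# Hypothetical sector-wide continuation rates by entry-qualification band.
sector_rates = {"high_entry": 0.96, "medium_entry": 0.92, "low_entry": 0.85}

# Hypothetical institution recruiting mainly from lower entry-qualification bands.
intake_shares = {"high_entry": 0.2, "medium_entry": 0.3, "low_entry": 0.5}

benchmark = expected_continuation(intake_shares, sector_rates)  # 0.893
actual = 0.91  # the institution's observed continuation rate (hypothetical)

# Performance is judged against the benchmark, not the raw sector average:
# this institution sits above expectation despite a less advantaged intake.
print(f"benchmark {benchmark:.1%}, actual {actual:.1%}, difference {actual - benchmark:+.1%}")
```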
A further policy tool – the Teaching Excellence Framework (TEF) – was introduced in 2016 (BIS [Department of Business, Innovation and Skills], 2016). Its intention is to assess the quality of teaching and learning in each HE provider and to further inform students’ decision-making about where to study. It utilizes six metrics per institution, three of which are based on results from the National Student Survey, plus continuation data and two indicators of employment outcomes. The metrics are broken down in relation to student characteristics, identifying when particular target groups are scoring better, worse, or the same as the majority. Each institution (including private providers) has to meet baseline requirements to enter into the TEF and then receives an award of gold, silver, or bronze, based on an assessment of the metrics and the institution’s written submission by peers from across the HE sector. Therefore, the TEF provides an additional incentive for HE providers to maximize the continuation and completion of all student groups, as the data feed into a publicly available assessment, which may be used to inform students’ decision-making regarding institutional choice. Further comparable information about the student experience is provided to students pre-entry through Key Information Sets, which present course-level data about every program in the UK through an online platform.
While the primary policy tools employed in England are institutional funding, and the provision of information to assist students with their decision-making, there is recognition that learning and teaching are integral to study success, and indeed much of the institutional funding is used to improve the learning, teaching and assessment experience to improve student engagement, belonging, retention and success (Thomas, 2012). This commitment to learning and teaching is reinforced by national initiatives: the government has contributed to the funding of a number of national organizations to improve learning and teaching: the Institute for Learning and Teaching in Higher Education (2000), Learning and Teaching Support Network (2000), Higher Education Academy (2004), Leadership Foundation (2004), and Advance HE (2018). These organizations, in various ways, have sought to develop and champion high-quality learning and teaching in HE, including its contribution to study success, through staff training and development, recognition and accreditation, and pedagogical research (Brooks, Baird, & Shenstone, 2014).
This drive for institutional change is reinforced by the requirement for institutions charging tuition fees over approximately £6,000 (this varies depending on their TEF rating) to submit an Access Agreement, or from 2018, an Access and Participation Plan, for approval by a government body (previously the Office for Fair Access and now the Office for Students). Approval is essential for HE providers to operate in England and charge fees above the threshold level. The Access Agreement, or Access and Participation Plan, specifies how a proportion of additional fee income is to be spent to ensure the access and success of disadvantaged student groups. This must include outreach work, financial aid, and spending to improve retention and success (including progression beyond HE). Research found that the process of developing and implementing an Access Agreement had a positive impact on institutional policies, planning, and behavior (Bowes, Thomas, Peck, Moreton, & Birkin, 2013), with most HEIs achieving or exceeding their targets. The process of producing and implementing an Access Agreement has an impact on both institutions’ approach to increasing diversity and improving student success, and on the outcomes for students. The introduction of Access and Participation Plans in 2018 (with respect to students entering HE in 2019–2020) is intended to be more challenging for institutions, requiring them to focus on the issues that they find the most challenging.
The combination of policies in England addresses study success from different angles, but the policies are largely reinforcing rather than in tension with each other. It might be anticipated that the introduction of high student fees (approximately £9,000 per year) would have a negative effect on study success, but instead it is stimulating and financing HEIs to care about and improve study success. However, it should be acknowledged that England has a fairly tight admissions system (institutional autonomy over admissions has been retained and is not regulated), which contributes to higher rates of study success. Furthermore, there is a widespread and embedded expectation that completion is possible within 3 years except in exceptional circumstances. Institutions and students are not funded for more than 3 plus 1 years (except for longer courses), and students and their families do not expect to study for longer than the normal time period. This provides a good basis for retention and completion. National policy, guidance, and funding have been directed to maintaining and improving the retention of students in the context of expansion and increased diversity, improving employability, and more recently the attainment outcomes of students.
Review, Discussion, and Conclusions
Each of the case studies can be assessed against 10 variables drawn from the HEDOCE project, as discussed above; these are presented in Table 1.
Table 1. Review of national approaches to improving study success in the Czech Republic and England
Variables | Czech Republic | England |
---|---|---|
Explicit definition of study success | To some extent. There is a definition, but it is not very explicit | Yes. Two complementary definitions are defined and applied to every HEI |
Definition of study success is widely shared | No. The definition is contested, and views vary between stakeholders in the HE sector | Yes. The definitions are uncontested; additional shared definitions are being developed |
Institutional funding linked to study success | To some extent. Only a small and declining part of institutional funding is linked to study success | Yes. All student-related funding is determined by actual student numbers. Additional funding is allocated to support study success |
Student funding linked to study success | To some extent. Some penalties are in place for students who do not complete on time | Yes. Access to student funding (loans) is only provided for the duration of the course and up to one further year in exceptional circumstances |
Organization of higher education supports study success | No. Limited selection regarding access to HE reinforces withdrawal as a necessary aspect of the HE system. The quality assurance system does not consider withdrawal. Reorganization has rendered some undergraduate degrees theoretical and lacking application | Yes. Admission is selective and there is an expectation of success. Having an approved Access and Participation Plan is a requirement. There are several mechanisms (e.g., performance indicators and TEF), which reinforce the value of study success. More flexibility could be built into the system to accommodate different study patterns |
Learning and teaching support study success | No. There do not appear to be any national tools or approaches used to develop learning and teaching for study success; this is left to institutions with little funding to incentivize this. | Yes. National research, policies and regulations, and developmental organizations promote high-quality learning and teaching as integral to study success. Institutional funding supports institutions to improve L&T |
Pre-entry information and support for student decision-making | To some extent. There is recognition that poor choice can result in transfer or withdrawal, but limited action to improve decision-making | Yes. Significant comparable information is provided to students prior to entry via KIS, TEF, and the media. Students receive no standardized support to use this information |
Post-entry information and support for success | No. There does not appear to be much information and support for students to facilitate study success within HE; transfer is treated as withdrawal | Yes. Selective access and the national priority accorded to study success means institutions have a commitment to support students to be successful. Reinforced by the funding model |
Measures of study success | Yes. Data are available about the performance of the sector regarding withdrawal, but not time to degree | Yes. The data are collected and analyzed nationally and compared with benchmarks based on institutional student and course profile |
Measures of study success used to improve student outcomes | No. Efforts to encourage institutions to engage with study success data and address issues have been resisted. This is probably due to lack of alignment between these 10 variables | Yes. Institutional performance and benchmark are widely published. This data informs the assessment of teaching excellence, league tables designed to inform student choice, and Access and Participation Plans. This data underpins national and institutional efforts |
Note. HE: higher education; HEI: higher education institution; KIS: Key Information Sets; TEF: Teaching Excellence Framework.
Table 1 summarizes the key national approaches to study success in the Czech Republic and England in relation to 10 variables extracted from the HEDOCE research (Vossensteyn et al., 2015). This comparative analysis reinforces some of the key messages presented by the HEDOCE study, but also provides some further insights, which are discussed here: the contribution of national data to drive national approaches, inform institutional responses, and measure progress; linking significant institutional funding to study success to compel institutions to make changes; the role of quality mechanisms in driving changes in learning, teaching, and assessment to improve study success; and the importance of an overt national commitment to improve study success, which permeates the wider HE policy agenda.
As the literature review demonstrates, many of the interventions that improve study success need to be implemented at the institutional level, or more locally within HE providers (e.g., at course level or in the classroom). National approaches therefore work most effectively when they positively influence the behavior of HEIs, especially with regard to the curriculum, teaching, and assessment. In the English case, it can be seen that funding policies (for students and institutions) promote retention and completion: much of the core funding of HEIs is directly related to number of students recruited and the number of students retained; additional funding is provided to ease the burden on institutions recruiting a more diverse student population; but all institutions are required by the regulator to spend a significant proportion of their additional fee income on both increasing diversity and improving the completion and progression of students from underrepresented and disadvantaged groups. Student financial support is provided for a fixed period, without the opportunity to prolong the study period except for exceptional circumstances.
In England, the collection and publication of institutional continuation and completion rates in relation to their expected rates (benchmarks) put additional pressure on institutions to prioritize study success. These data are available to prospective students, and are used to inform commercial publications designed to support students’ decision-making about course choices. They now form an integral part of the quality process, the new TEF, which assesses the quality of learning and teaching at institutional level (subject-level pilots commenced in 2018), and the outcomes again are provided to students to inform their decision-making. These national study success data are essential both to drive changes in institutional behavior and to assess progress. Insufficient progress impacts on institutional finance, recruitment, and even an institution’s right to operate in the English HE market, if the outcomes for targeted groups are not deemed good enough by the regulator, the Office for Students.
Within the English context, the national approaches create an institutional appetite to improve study success, which can be interpreted as a national commitment. National research (Moore, Sanders, & Higham, 2013; Mountford-Zimdars et al., 2015; Thomas, 2012; Thomas et al., 2017) and national policy guidance (HEFCE, 2015; Office for Students [OfS], 2018; Universities UK, 2016) address the problem of how to improve study success, and indicate the importance of learning and teaching, and thus national organizations and related initiatives have further supported the institutional goal to develop the academic experience to maximize study success.
In contrast, in the Czech Republic, national approaches do not seem to be operating in alignment to improve study success. There is no shared definition of study success, and this results in a lack of data about the performance of the sector and individual providers. The majority of institutional funding is unrelated to study success (and this is also true for student financial support), so funding does not provide an incentive for change. Furthermore, the quality assurance system pays little attention to curriculum, pedagogy, assessment, and student support (i.e., the student experience), but rather considers the academic qualifications of staff and their research performance – which undermines a student-centered approach. Indeed, the Czech Republic’s national approach is underpinned, and undermined, by a lack of national commitment to study success, and a belief that too much success is a bad thing. The Czech Republic, other Visegrad countries, and other countries with aspirations to improve study success could use the list of variables presented in Table 1 to assess their national approach, and to benchmark it against the English approach. There are shortcomings in the English national approach, which are identified in Table 1 (e.g., the ad hoc support students receive prior to entry to use the plethora of information to inform their decision-making about courses and institutions, and the lack of flexibility within the system to accommodate different study patterns), but it serves as a comprehensive, coherent, and effective model, which can provide insights for other countries seeking a national approach to improve study success.
Acknowledgements
This study draws heavily on the Higher Education Drop-out and Completion in Europe project, funded by the European Commission and led by the University of Twente, Netherlands, and the Nordic Institute for Studies in Innovation, Research and Education, Norway. LT was a member of the core research team and prepared the English case study. She is also responsible for the additional conceptualization and design, and for the analysis and interpretation of the data presented in this paper. No funding is related to the analysis and preparation of this article, and there is no conflict of interest.
About the Author
LT is a professor of Higher Education at Edge Hill University in the UK. Her research focuses on student equity, experience and outcomes, and institutional change. She led the “What works? Student retention and success programme” throughout its 9 years of operation (2008–2017), researching these issues in 22 institutions, and subsequently implementing and evaluating changes in 13 institutions and 42 disciplines. Other recent studies include HE students’ experience of independent learning, the engagement of commuter students, and developing a whole-institution approach to diversity and success. Her research has influenced institutional policy and practice and national approaches in England, and is being utilized in a number of other countries to improve student equity, experience, and outcomes.
Ethics
The study procedures were carried out in accordance with the Declaration of Helsinki.
References
Beerkens, M., Magi, E., & Lill, L. (2011). University studies as a side job: Causes and consequences of massive student employment in Estonia. Higher Education, 61(6), 679–692. doi:10.1007/s10734-010-9356-0
BIS [Department of Business, Innovation and Skills]. (2016). Higher education: Success as a knowledge economy – White paper. London, UK: Department of Business, Innovation and Skills.
Bowes, L., Thomas, L., Peck, L., Moreton, R., & Birkin, G. (2013). The uses and impact of access agreements and associated spend. Bristol, UK: OFFA.
Brooks, L., Baird, H., & Shenstone, A. (2014). Independent review of the Higher Education Academy: A report to HEFCE by Capita Consulting. Bristol, UK: HEFCE.
Buglear, J. (2009). Logging in and dropping out: Exploring student non-completion in higher education using electronic footprint analysis. Journal of Further and Higher Education, 33(4), 381–393. doi:10.1080/03098770903272479
Chalmers, D. (2008). Indicators of university teaching and learning quality. Sydney, Australia: Australian Learning and Teaching Council. Retrieved from http://www.weboffice.uwa.edu.au/__data/assets/pdf_file/0007/1891663/Indicators_of_University_Teaching_and_Learning_Quality.pdf
Danish Ministry of Higher Education and Science. (2013, February). Frafald på videregående uddannelser [Dropout in higher education]. Copenhagen, Denmark: Danish Ministry of Higher Education and Science.
Declercq, K., & Verboven, F. (2018). Enrollment and degree completion in higher education without admission standards. Economics of Education Review, 66, 223–244. doi:10.1016/j.econedurev.2018.08.008
EHEA. (2001). Towards the European Higher Education Area: Communiqué of the meeting of European Ministers in charge of higher education (Prague Communiqué). Rome, Italy: EHEA. Retrieved from http://www.ehea.info/Uploads/Declarations/PRAGUE_COMMUNIQUE.pdf
EHEA. (2012). Making the most of our potential: Consolidating the European higher education area. Bucharest Communiqué. Rome, Italy: EHEA. Retrieved from www.ehea.info/media.ehea.info/file/2012_Bucharest/67/3/Bucharest_Communique_2012_610673.pdf
European Commission. (2010). Europe 2020 strategy. Brussels, Belgium: European Commission. Retrieved from https://ec.europa.eu/info/business-economy-euro/economic-and-fiscal-policy-coordination/eu-economic-governance-monitoring-prevention-correction/european-semester/framework/europe-2020-strategy_en
European Commission/EACEA/Eurydice. (2012). The European higher education area in 2012: Bologna process implementation report. Brussels, Belgium: Education, Audiovisual and Culture Executive Agency.
European Commission/EACEA/Eurydice. (2014). Modernisation of higher education in Europe: Access, retention and employability 2014. Eurydice report. Luxembourg: Publications Office of the European Union.
Eurostat. (2018). Smarter, greener, more inclusive? – Indicators to support the Europe 2020 strategy (2018 ed.). Luxembourg: European Commission.
Eurydice. (2011). Modernisation of higher education in Europe: Funding and the social dimension. Brussels, Belgium: Education, Audiovisual and Culture Executive Agency. Retrieved from http://eacea.ec.europa.eu/education/eurydice/documents/thematic_reports/131EN.pdf
Georg, W. (2009). Individual and institutional factors in the tendency to dropout of higher education: A multilevel analysis using data from the Konstanz Student Survey. Studies in Higher Education, 34(6), 647–661. doi:10.1080/03075070802592730
Hagedorn, L. S. (2004). How to define retention: A new look at an old problem. Retrieved from http://files.eric.ed.gov/fulltext/ED493674.pdf
HEFCE. (2013). Higher education and beyond. Outcomes from full-time first degree study. 2013/15. Bristol, UK: HEFCE.
HEFCE. (2015). National strategy for access and student success. Bristol, UK: HEFCE. Retrieved from http://www.hefce.ac.uk/sas/nsass/
Helland, H. (2005). Realkompetansestudenter bortvalg og studiepoengsproduksjon [Study progression and dropout among students entering on documented non-formal learning]. NIFU STEP report 6/2005. Oslo, Norway: NIFU STEP.
HESA. (2018). Non-continuation: UK performance indicators 2016/17. Retrieved from https://www.hesa.ac.uk/news/08-03-2018/non-continuation-tables
Heublein, U., Schmelzer, R., & Sommer, D. (2008). Die Entwicklung der Studienabbruchquote an den deutschen Hochschulen [The evolution of the dropout rate at German universities]. Hannover, Germany: HIS, Hochschul-Informations-System.
Heublein, U., Spangenberg, H., & Sommer, D. (2003). Ursachen des Studienabbruchs. Analyse 2002 [Causes of dropout. Analysis 2002]. Hannover, Germany: HIS, Hochschul-Informations-System.
Houston, M. , McCune, V., & Osborne, M. (2011). Flexible learning and its contribution to widening participation: A synthesis of research. York, UK: Higher Education Academy.
Hovdhaugen, E. (2009). Transfer and dropout: Different forms of student departure in Norway. Studies in Higher Education, 34(1), 1–17. doi:10.1080/03075070802457009
Hovdhaugen, E. (2012). Leaving early: Individual, institutional and system perspectives on why Norwegian students leave their higher education institution before degree completion (PhD dissertation). Sociology, Faculty of Social Science, University of Oslo, Oslo, Norway.
Hovdhaugen, E. (2014). Working while studying: The impact of term-time employment on dropout rates. Journal of Education and Work, 28(6), 1–21. doi:10.1080/13639080.2013.869311
Jones, R. (2008). Student retention and success: A synthesis of the research. York, UK: Higher Education Academy.
Krause, K. L., & Armitage, L. (2014). Australian student engagement, belonging, retention and success: A synthesis of the literature. York, UK: Higher Education Academy.
Larsen, M. R., Sommersel, H. B., & Larsen, M. S. (2013). Evidence on dropout phenomena at universities. Copenhagen, Denmark: Danish Clearinghouse for Educational Research.
Ministry of Education, Youth and Sports, Czech Republic (MYES). (2000). Strategic plan for higher education institutions 2000–2005. Prague, Czech Republic: MYES.
Moore, J., Sanders, J., & Higham, L. (2013). Literature review of research into widening participation to higher education. Bristol, UK: HEFCE.
Mountford-Zimdars, A., Sabri, D., Moore, J., Sanders, J., Jones, S., & Higham, L. (2015). Causes of differences in student outcomes. Bristol, UK: HEFCE.
National Audit Office. (2007). Staying the course: The retention of students in higher education. London, UK: The Stationery Office.
OECD. (2008). Education at a Glance 2008. Paris, France: OECD.
OECD. (2011). Education at a Glance 2011. Paris, France: OECD.
Office for Students [OfS]. (2018). Guidance for access and participation plans, regulatory notice 6. Bristol, UK: OfS. Retrieved from https://www.officeforstudents.org.uk/media/1105/ofs2018_06.pdf
Orr, D., Wespel, J., & Usher, A. (2014). Do changes in cost-sharing have an impact on the behaviour of students and higher education institutions? Evidence from nine case studies. Brussels, Belgium: European Commission.
Quinn, J. (2013, October). Drop-out and completion in higher education in Europe among students from under-represented groups. Luxembourg: European Commission.
Quinn, J., Thomas, L., Slack, K., Casey, L., Thexton, W., & Noble, J. (2005). From life crisis to lifelong learning: Rethinking working class ‘drop out’ from higher education. York, UK: Joseph Rowntree Foundation.
Stiburek, S., Vlk, A., & Svec, V. (2017). Study of the success and dropout in the higher education policy in Europe and V4 countries. Hungarian Educational Research Journal, 7(1), 43–56. doi:10.14413/herj.2017.01.04
Thomas, L. (2012). Building student engagement and belonging in higher education at a time of change: Final report from the What Works? Student retention and success programme. London, UK: Paul Hamlyn Foundation.
Thomas, L. (2015). Editorial. Widening Participation and Lifelong Learning, 17(3), 5–16.
Thomas, L., & Hovdhaugen, E. (2014). Complexities and challenges of researching student completion and non-completion of HE programmes in Europe: A comparative analysis between England and Norway. European Journal of Education, 49(4), 457–470. doi:10.1111/ejed.12093
Thomas, L., & Tight, M. (Eds.). (2011). Institutional transformation to engage a diverse student body. Bingley, UK: Emerald Books.
Thomas, L., Hill, M., O’Mahony, J., & Yorke, M. (2017). Supporting student success: Strategies for institutional change. What works? Student retention and success programme. London, UK: Paul Hamlyn Foundation.
Troxel, W. G. (2010). Student persistence and success in United States higher education: A synthesis of the literature. York, UK: Higher Education Academy.
Universities UK. (2016). Working in partnership: Enabling social mobility in higher education – The final report of the Social Mobility Advisory Group. London: Universities UK. Retrieved from https://www.universitiesuk.ac.uk/policy-and-analysis/reports/Pages/working-in-partnership-enabling-social-mobility-in-higher-education.aspx
Ulriksen, L., Madsen, L. M., & Holmegaard, H. T. (2010). What do we know about explanations for dropout/opt out among young people from STM higher education programmes? Studies in Science Education, 46(2), 209–244. doi:10.1080/03057267.2010.504549
Vossensteyn, J. J., Cremonini, L., Epping, E., Laudel, G., & Leisyte, L. (2013). International experiences with student financing: Tuition fees and student financial support in perspective. Final report prepared for the Dutch Ministry of Education, Science and Culture. Enschede, Netherlands: CHEPS.
Vossensteyn, J., Stensaker, B., Kottmann, A., Hovdhaugen, E., Jongbloed, B., Wollscheid, S., Kaiser, F., Cremonini, L., Thomas, L., & Unger, M. (2015). Drop-out and completion in higher education in Europe. A report prepared by CHEPS and NIFU. Luxembourg: European Commission.
Vlk, A. (2015). Czech Republic. Drop-out and completion in higher education in Europe: Annex 3, country case studies (Europe) and policy briefings (Australia, USA). Luxembourg: European Commission.
Vlk, A., Stiburek, S., & Svec, V. (2016). Dropout calculation and related policies in Czech higher education. Efficiency and Responsibility in Education Conference, 13, 650–657. Retrieved from https://www.researchgate.net/publication/311558878
Yorke, M. (1999). Leaving early: Undergraduate non-completion in higher education. London, UK: Falmer Press.
Yorke, M., & Longden, B. (Eds.). (2004). Retention and student success in higher education. Maidenhead, UK: Open University Press.