[Theme in Focus] Towards Evaluation Culture in Western Balkans WBC-RTI.info Newsletter February 2016

RTDI Evaluation

The complexity and heterogeneity of innovation systems in different European and Western Balkan countries require strategic intelligence to design, implement, and monitor research, technological development, and innovation (RTDI) measures at different spatial levels (local, national, regional, and European), while addressing issues of relevance, efficiency, efficacy, impact, and sustainability. For this purpose, evaluations are an essential tool for evidence-based decision-making. This is especially true in the South-East Europe region, which is characterised by the adoption and adaptation of new RTDI policies, programmes, and (support) institutions, and by a transformation of funding towards competitive schemes. At the same time, however, a lack of methodological and procedural know-how on the part of both evaluators and awarding authorities concerning the purpose, design, and use of evaluations has become evident.

Developing Evaluation Culture in South East Europe / Western Balkans

“SEE countries do not have an evaluation culture partly because of a wrong perception of the role of evaluation (they see it as criticism and punishment rather than a way to improve policies) and partly because of lack of skills.” (Lena Tsipouri / Nikos Sidiropoulos, 2014 [1])

Evaluations are considered an effective tool for ensuring transparency and accountability and can contribute to more efficient modes of new public management. At the same time, the correct application of the different types of evaluations has to be learned in policy systems of continuously increasing complexity. This complexity results from vertical and sometimes quite different spatial intervention levels (i.e. local, national, regional, European, global). It is also affected by an increasingly complex web of rules and regulations (national/European/global) and by the emergence of horizontal multi-level policy systems cutting across previously more separate policy fields and stakeholder arenas. Ex-ante, interim, terminal, and ex-post evaluations have to be properly and meaningfully tendered, and must be implemented so as to secure strategic intelligence building and evidence-based decision-making.

The innovation capacities and innovation levels in the South East Europe region are limited, and public interventions are necessary. However, under tight financial regimes, policy makers have to identify the right rationales and mechanisms for performance-based innovation funding from the start. To secure the optimum use of taxpayer money, principles of good governance have to be respected.

Based on the research conducted within the EVAL-INNO project (Fostering Evaluation Competencies in Research, Technology and Innovation in the SEE Region), the key challenges for improved RTDI evaluations in the SEE region – which also apply to the WBC – include:

  1. the lack of qualified evaluators for programme, institutional and policy evaluations in the field of RTDI, as well as methodological deficits and weaknesses;
  2. the lack of knowledge on professional tendering procedures (incl. public procurement laws) to obtain the best evaluation results;
  3. difficulties in accessing RTDI evaluation information and good practices, and a general lack of use of good practices for RTDI programme, institutional and policy evaluations in the region.

“Evaluations in the RTDI policy domain entail a number of complex management and methodological questions in a highly interdisciplinary area.” (Balázs Borsi, 2014 [2])

Besides the trainings organised (the training materials are publicly available on the EVAL-INNO website), the EVAL-INNO project also adopted and published, in several languages, a model of RTDI Evaluation Standards as well as RTDI Programme Evaluation Guidelines, which primarily target evaluation practitioners in the South-East European countries, namely:

  • Organisations thinking about commissioning an evaluation
  • Analysts in the commissioning organisations, who support the decision-making process pertaining to evaluations, and
  • Current and future evaluators, who need to conduct their work in a policy- and politics-influenced environment, which imposes certain limitations compared to pure research assignments.

The trainings, as well as the developed guidelines and standards, were also promoted and disseminated to all relevant stakeholders from policy and academia in the WB countries.

“Apart from severe brain drain or a lack of private sector R&D the countries in the region suffer from the system which encourages academics to focus on teaching rather than on research or entrepreneurship. The lack of statistical data on R&D systems and the quality of available data can still be considered problematic. Furthermore, in 2016 the lack of “evaluation culture” is still evident in the region.” (Djuro Kutlaca, 2016 [3])

In order to foster the evaluation culture in the region, the Western Balkan Regional R&D Strategy for Innovation (WBRIS) explicitly suggests generating and systematically updating R&D statistics in line with the standard practices established by the Oslo Manual and consistent with EUROSTAT data, including data related to the scientific diaspora, the Community Innovation Surveys, and other EU indicators, as well as implementing a monitoring and evaluation system that enables the assessment of public expenditures on research and innovation. The "South East Europe 2020 Strategy" (SEE 2020) likewise places a strong focus, in all the measures it suggests, on introducing efficient monitoring and evaluation mechanisms in the region’s public sectors (including R&I systems) to improve transparency and accountability. In order to improve the national R&I systems and their integration into the European Research Area (ERA), different national R&I strategies (such as the new Serbian strategy 2016-2020) also envisage a "policy mix peer review" to be implemented.

Learning “evaluative thinking” – a possible way forward towards an “evaluation culture” in the Western Balkans region?

“Probably the most regular driver that enhances new evaluation culture in Western Balkan is external financing of development projects, either through development assistance agencies or in programs financed by European Union. Another important factor is increased strength of civil society organizations in region and their demand to participate in policy making on EU, national and local level of governance.” (Bojan Radej, 2016 [4])

The increasing need to establish a critical mass of evaluators led the Slovenian Evaluation Society to propose the establishment of a Western Balkan Evaluation Network (WBEN) to the Evaluation Society in Bosnia and Herzegovina in 2012. Today, WBEN also involves founding members from Croatia, Montenegro, Macedonia, and Serbia. In October 2015, the first Western Balkan Conference on Evaluation was held in Sarajevo, organised by the Evaluation Society in Bosnia and Herzegovina (BHeval), a member of WBEN.

The conference also adopted a joint evaluation statement on the importance of strengthening national evaluation capacities in the Western Balkan region (“For enhanced evaluation of development in the Western Balkan countries”), to be disseminated to national and regional stakeholders. The signatories of the statement shall endeavour to support and promote the development of evaluation systems in the Western Balkans. The objectives are as follows:

  • Improvement of the quality of public management in the preparation of policies, plans and projects.
  • Contribution to integrated information that helps to efficiently harmonise mutual interests.
  • Contribution to the transparency and democratic management of public interests and to the involvement of civil society and the private sector in evaluation processes.
  • Contribution to the equal treatment of all stakeholders, especially vulnerable social groups, through equity-focused and gender-responsive evaluation.
  • Contribution to the protection of the environment and of natural and cultural heritage.
  • Contribution to integrated solutions to the identified challenges.

“Evaluative thinking is critical thinking applied in the context of evaluation, motivated by an attitude of inquisitiveness and a belief in the value of evidence, that involves identifying assumptions, posing thoughtful questions, pursuing deeper understanding through reflection and perspective taking, and informing decisions in preparation for action.” (Buckley et al., 2015 [5])

In their article published in the American Journal of Evaluation in 2015, Buckley et al. emphasise that “Evaluative thinking (ET) is not a born-in skill nor does it depend on any particular educational background”. Furthermore, if an organization’s leader asserts that ET is important, yet does not provide opportunities for staff to learn about and practice it, little or nothing will change.

The same applies to public institutions dealing with and implementing R&I policy in the region: if evaluation is, on the one hand, considered relevant within strategies, yet policy, institutional, programme or project evaluations addressing issues of relevance, efficiency, efficacy, impact, and sustainability and involving international evaluators are not conducted, or are conducted only rarely and on an irregular basis, little or nothing will change. Furthermore, if staff are given no opportunities to learn about and practise monitoring and internal evaluation, little or nothing will change within the institution. Buckley et al. furthermore highlight that, ideally, all members of an organisation (or institution) should have the opportunity to think evaluatively about their work, and that efforts to promote ET should not be limited to staff with evaluation responsibilities.

However, unlike people, organizations do not think or act. Therefore, an organization that “thinks evaluatively” might be better described as an organization in which people at all levels of the organization are evaluative thinkers and are engaged in the evaluation of what their organization does in some capacity. Creating and encouraging a culture of ET is not a trivial undertaking. It requires intervention and commitment at multiple levels of the system, where individuals at all levels of the system will require different evaluation knowledge, attitudes, and skills. For example, those at the “top” of the management hierarchy will have to be committed to allowing time and space for evaluation, as well as to being open to change based on evaluation results; program-level staff will have to adopt the knowledge and skills; and all levels of the hierarchy will need to build trust, support broad inclusion in decision making, and create a space that is safe for asking questions and giving and receiving honest critique without fear of shame or losing one’s job. (Buckley et al., 2015 [6])

Open Evaluation 2016 - International Scientific Conference on RTI Policy Evaluation

One of the opportunities to exchange and learn about current developments in the RTI policy evaluation sector in 2016 is offered by the Open Evaluation 2016 conference, to be held on November 24-25, 2016 in Vienna. The conference will gather academics, evaluators, research managers, authorities and RTI policy makers to debate challenging developments in RTI policy and their effects on evaluation theory and practice.

The conference addresses new actor settings, approaches and themes in RTI policy evaluation. Thus, the term OPEN EVALUATION signals openness towards new values, new stakeholders and beneficiaries and new approaches and themes in RTI policies and RTI evaluations.

This is also a good opportunity for evaluators and policy makers from the Western Balkan region to exchange with their peers from the EU Member States. Submission of extended abstracts of 3 to 5 pages, including references, is open until April 15, 2016. More information is provided here.

[1] Lena Tsipouri / Nikos Sidiropoulos: "RTDI evaluation culture in the EVAL-INNO countries". In: Ivan Zupan, Martin Felix Gajdusek, Ines Marinkovic (eds): "Developing RTDI Evaluation Culture in South East Europe". LIT 2014

[2] Balázs Borsi: "RTDI Programme evaluation guidelines". EVAL-INNO 2014

[3] Djuro Kutlaca: Interview on Feb. 20, 2016

[4] Bojan Radej. Report on the First Western Balkans conference on Evaluation / Email exchange on Feb. 29, 2016

[5] Jane Buckley, Thomas Archibald, Monica Hargraves, William M. Trochim: "Defining and Teaching Evaluative Thinking: Insights From Research on Critical Thinking." p. 378. In: American Journal of Evaluation 2015, Vol. 36(3) 375-388

[6] ibid. p. 383


You can find more articles related to this topic here.


 

Document type
  • Newsletter
Language
  • English
Publication Year
  • 2016

Geographical focus
  • Western Balkans
Scientific field / Thematic focus
  • Cross-thematic/Interdisciplinary

Entry created by Ines Marinkovic on April 8, 2016
Modified on December 30, 2016