By: Becky Hillyer
Research Analyst, OCSDNet


Summary: 

    • OCSDNet has been working with DECI since mid-2015 to improve the network's strategies and capacity for monitoring, evaluation and communications. Recently, a workshop was held in Cape Town to discuss evaluation and communication practices for large, global research networks.
    • Designing an iterative and reflexive evaluation and communications strategy for a highly diverse global research network is important for promoting and understanding research impact and for adapting to an ever-changing research climate.

      Critical Reflections on Evaluation and Communications within a Complex Open Science Network

      Two weeks ago I had the privilege of attending a workshop hosted by DECI (Developing Evaluation and Communication Capacity in Information Society Research) and IDRC in Cape Town. DECI is an international action-research project, composed of mentors around the world, that partners with IDRC-funded research networks to provide support in the development of evaluation and communication strategies. At the same time, DECI is involved in ongoing and iterative discussions with its partners to better understand how it can make its own processes more accessible and effective. OCSDNet has been a ‘mentee’ of DECI since August 2015.

      The Cape Town workshop was a unique opportunity for DECI mentors and mentees to physically meet and discuss how DECI’s approach has worked, not worked, and/or been employed differently within different contexts. Moreover, the event offered DECI partners the unique opportunity to critically reflect on their processes of strategy-development and to offer feedback around how these processes may be altered to better support learning and usage.

      Background of Monitoring and Evaluation in OCSDNet

      To put it quite bluntly, the entire concept of monitoring, evaluation and communications was minimal within OCSDNet’s initial program proposal to IDRC almost two years ago. Although we had a general inclination that we would use an “outcome harvesting” methodology for evaluation purposes, there was little foresight around who would undertake this process, what resources would be required, or what structure would be used to employ this methodology in a strategic way. Nonetheless, we proceeded to develop an array of data-collection and communication tools: monthly reports from OCSDNet project teams, ongoing publication of blog posts, newsletters, website forums, Twitter and Facebook pages, etc.

      Several months into the program, IDRC approached the network to suggest that we establish a relationship with the DECI team in order to more deeply consider an evaluation and communications strategy for the network. But why, we thought. We have all of these tools and processes! We are already doing evaluation and communications!

      After several calls with DECI, we remained unconvinced that we needed their mentorship. We didn’t understand this “UFE” and “Res-Com” jargon. We felt that it would simply add to our workload, unnecessarily. We couldn’t understand the value.

      Finally, after a meeting with the Research on Open Educational Resources for Development (ROER4D) network (another international research network funded by IDRC, which has been working with DECI since its program’s inception), we began to understand what a comprehensive evaluation and communication strategy could look like for a diverse and international research network. More specifically, we began to understand how DECI’s approach could be used to facilitate that process. We decided, at that point, to move forward in our working relationship with the DECI team. We are now happy that we did.

      Over the past few months, we developed a flexible and iterative strategy for responding to a set of key evaluation questions and acting upon a set of communications objectives. Working with DECI has helped us to critically consider why we are using the tools that we use and to what end. However, the process has not been without its challenges.

      The DECI Model & OCSDNet

      Indeed – how does one create an evaluation strategy for a large and diverse network like OCSDNet? How do we measure vague but important concepts such as “field building” in the context of open science in development? How do we understand whether the network’s research and learning is making some sort of impact at a larger scale? This is evidently a complex scenario and requires a strategy of evaluation and communication that is able to draw out patterns, themes and outputs in an iterative and emerging way. Certainly, this is not a context well suited to traditional results-based M&E tools such as logframe analysis.

      DECI proposes a model of evaluation, “Utilisation Focused Evaluation” (UFE), that positions the selection of key evaluation questions in the context of their usefulness to a set of ‘users.’ Users can be network coordinators, policy makers, IDRC staff, project teams or community members: basically anyone to whom the results of the evaluation process could be useful. Moreover, they propose a model of evaluation that is not set in stone. Instead, they encourage frequent revisions and updates to strategies as projects evolve and priorities change. To complement this evaluation strategy, DECI also recognises the value of a coherent and complementary strategy for communicating network learning and meeting proposed objectives.

      Personally, with a background in action research and participatory development, I understand DECI’s models to be closely related to processes of reflexive or ‘double loop’ learning, whereby researchers consistently reflect on what they know, how they know it, and how that knowledge causes them to interpret and react to new forms of knowledge or opinions.

      Key Learning from the Cape Town Workshop

      During the Cape Town meeting two weeks ago, a variety of interesting discussions arose around DECI’s current methodology and how it might be improved for better uptake and greater effectiveness for partners. In the case of OCSDNet, while we have benefitted from DECI’s structure and strategic ways of thinking about evaluation and communication, we felt that the often-linear ‘steps’ proposed in their process do not always sit well with the intention of their approach, nor indeed with the complex and iterative nature of OCSDNet. We discussed that a potential change to this approach could be to consider the key elements of an evaluation or communication strategy as a series of gears which, when all working together, produce the intended outputs of a research network. Moreover, through shared learning about OCSDNet’s process of developing an ‘Open Science Manifesto,’ we suggested that it could be useful for a research team to begin the development of their evaluation and communication strategy by constructing a set of core, underlying principles that could help to set the tone and contextualise the institutional culture around evaluation and communication.

      For instance, in the case of OCSDNet and similarly structured networks such as ROER4D, we recognised that our relatively horizontal structure of leadership and collective values of openness, transparency, accountability and reflexivity have quite naturally shaped the methods that we have chosen for evaluation and communication purposes. On the other hand, the methods chosen and indeed the way that another network may conceptualise DECI’s approach may be quite different, given variations in the principles, norms and values that shape their respective institutional cultures.

      Another interesting discussion to arise from the Cape Town workshop was the idea of positioning tentative “evaluative moments” within a program’s theory of change. Considering a theory of change as an incomplete road map may be a useful analogy in this regard. Prior to a project’s inception, it is a helpful practice to think strategically and predictively about where potential “junctures” may exist that could signal turning points within a project or program. While it may not be possible to predict the route that a project will take, it is nonetheless important to position where such key moments could exist within a theory of change in order to develop the tools and strategies to react efficiently when these moments arise.

      Towards the Development of a Refined Network Strategy for M&E and Communications

      The discussions from the Cape Town workshop have prompted a new way of thinking about monitoring, evaluation and communications for OCSDNet, which, while still in development, we feel is more reflective of the network’s core objectives, theory of change, intended outcomes, shared principles and adaptive approach. We understand an effective evaluation and communications strategy as a vector for connecting all of these elements and ensuring that the goals and intentions of the network are achieved. In short, we envision the system moving (albeit sometimes roughly) as follows:

      [Figure: Updated M&E and Comms layout, May 2016]

      All in all, DECI’s Cape Town workshop was both insightful and helpful from the perspective of understanding how to further develop our own network evaluation and communication strategies, as well as being an important example of how “capacity development” can (and often should) be a two-way, dialogical process of learning and sharing between both mentors and mentees.

      Key learning from the Cape Town workshop can be summarised as follows:

              • The importance of discussing, agreeing upon and positioning shared “principles” at the core of a network-based evaluation and communications strategy

              • The relevance of institutional culture and hierarchy in shaping how an approach to evaluation will be designed and whether or not it will be successful

              • The importance of an iterative and reflexive approach to M&E and communications, given the complex nature of an international research network such as OCSDNet

              • Understanding that a project’s “theory of change” is not set in stone; it must also be open to adaptation and reflection as learning ensues.