Conference Agenda

Overview and details of the sessions of this conference.

Please note that all times are shown in the time zone of the conference (CET).

 
 
Session Overview
Session: Evaluation and quality assessment of ITD research
Time: Wednesday, 06/Nov/2024, 3:00pm - 4:00pm
Location: De Expo


Presentations

A Quality Assessment Framework for Transdisciplinary Research Design, Planning, and Evaluation

Brian Murray Belcher

Royal Roads University, Canada

Appropriate definitions and measures of quality are needed to guide research design and evaluation. Traditional disciplinary research is built on well-established methodological and epistemological principles and practices. Disciplines have their own evaluation criteria and processes, in which research quality is often narrowly defined, with emphasis on scientific excellence and scientific relevance. Emerging transdisciplinary approaches, by contrast, are highly context-specific and problem-oriented; they integrate disciplines and include societal actors in the research process. Standard research assessment criteria are simply inadequate for evaluating change-oriented transdisciplinary research (TDR), and inappropriate use of standard criteria may disadvantage TDR proposals and impede the development of TDR. There is a need for a parallel evolution of principles and criteria to define and evaluate research quality in a TDR context.

In 2015, we developed a TDR quality assessment framework consisting of twenty-five criteria organized under four principles. Since that time, the literature on TDR and TDR assessment has grown, other TDR research assessment frameworks have been published and tested, and we have further tested and refined our own assessment framework. This talk will present the underlying principles and approach of the TDR Quality Assessment Framework and review lessons learned from testing the framework in evaluations of several completed research-for-development projects. It will then review two other frameworks in use: RQ+ and the CGIAR Quality of Research for Development Framework. Based on this, we have developed a revised version of the assessment framework and its scoring system. The revised principles are:

1. Relevance: the importance, significance, and usefulness of the research problem(s), objectives, processes, and findings to the problem context (6 criteria).

2. Credibility: the research findings are robust and the sources of knowledge are dependable (12 criteria).

3. Legitimacy: the research process is perceived as fair and ethical (4 criteria).

4. Positioning for Use: the research process is designed and managed to enhance sharing, uptake, and use of research outputs and stimulates actions that address the problem and contribute to solutions (7 criteria).

The main changes from the original version are: the definition and naming of the fourth principle (from “Effectiveness” to “Positioning for Use”); filling gaps, eliminating overlap, and refining definitions in individual criteria; and replacing rubric statements with guidance notes. The QAF is designed for a range of users, including research funders and research managers assessing proposals; researchers designing, planning, and monitoring a research project; and research evaluators assessing projects ex post. We present the key components of the revised framework and describe how to apply it.
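Purely as an illustration of the structure described above (four principles covering 6, 12, 4, and 7 criteria), the framework could be captured in a small data sketch like the following. The per-criterion rating scale and the averaging per principle are assumptions made for this sketch, not the framework's actual scoring system or the authors' own tooling.

```python
# Illustrative sketch only: the revised QAF's four principles and their
# criterion counts, as described in the abstract. The 0-3 rating scale and
# the per-principle averaging are assumptions for illustration.

PRINCIPLES = {
    "Relevance": 6,           # usefulness to the problem context
    "Credibility": 12,        # robust findings, dependable knowledge sources
    "Legitimacy": 4,          # fair and ethical research process
    "Positioning for Use": 7, # designed to enhance sharing, uptake, and use
}

def principle_score(ratings):
    """Average a list of per-criterion ratings (e.g. 0-3) for one principle."""
    return sum(ratings) / len(ratings)

def assess(project_ratings):
    """project_ratings maps each principle to its list of criterion ratings."""
    report = {}
    for principle, n_criteria in PRINCIPLES.items():
        ratings = project_ratings.get(principle, [])
        if len(ratings) != n_criteria:
            raise ValueError(f"{principle}: expected {n_criteria} ratings, got {len(ratings)}")
        report[principle] = round(principle_score(ratings), 2)
    return report
```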



Designing a self-assessment grid to improve the way interdisciplinarity is considered at every stage of a research project: an original support tool for project development

Flore Nonchez, Maryline Crivello

Aix-Marseille University, France

Aix-Marseille University is a multidisciplinary university that has supported a variety of interdisciplinary programs and projects in research and education since 2012. Despite a series of successes and achievements supported by the "excellence initiative" label awarded to our University, in 2020 the newly elected governance came to the realization that pushing interdisciplinarity further required a more proactive and encompassing approach to promote lasting change. Therefore, a new Mission for Inter- and Transdisciplinarity was launched in 2021 with four complementary strategic objectives. One of them focuses on providing practical support for the implementation of interdisciplinary projects by the community, by offering guidance, facilitation, and evaluation tools. Indeed, as far as interdisciplinary research projects are concerned, we found that in order to fully achieve their objectives and optimize collaboration between disciplines, while avoiding the pitfalls of interdisciplinary washing and the sprinkling of buzzwords, they needed to demonstrate greater methodological rigor than standard disciplinary research projects, insofar as they had to meet both disciplinary and interdisciplinary requirements.

Based on our "learning by doing" experience, we would like to explain in our presentation how we combined theory (i.e. a literature review on the specificities of interdisciplinary research and its evaluation) and practice (i.e. our own experience of facilitating multidisciplinary groups and interdisciplinary research programs) in an empirical approach to develop a new support tool for setting up interdisciplinary research projects. Indeed, our practice of internal calls for interdisciplinary projects (from writing the framework to critically reviewing applications) enabled us to observe three frequent types of shortcomings (which we will present), corresponding to either expressed needs or implicit expectations on the part of researchers; this tool is our attempt to provide a full answer to them. It takes the form of a checklist of essential questions to be asked at every stage of any interdisciplinary research project, from the initial thought process through to the exploitation of results, in response to the key success criteria we have identified as indispensable (sine qua non) for any interdisciplinary research project. It reminds project leaders that interdisciplinarity cannot be improvised, and that it requires time and method to reach its full potential.

Our grid has been designed as a practical self-assessment tool, given that research projects are often set up in a short space of time (especially when responding to a call for proposals), and that we wanted to reach all those involved in interdisciplinary research projects, whatever their knowledge and experience of interdisciplinarity.

In a nutshell, the vade-mecum that we will share at the conference is intended to be both a practical tool to improve the relevance, design, and successful implementation of interdisciplinary projects, whether or not in response to calls for proposals, and an educational tool (to reinforce skills in setting up interdisciplinary projects, as a support for doctoral training...). We will also present how we have disseminated this tool within our community of project leaders so far, and how we currently support and monitor its use within an action-research approach. We also plan to build on this experience in other types of projects, in order to assemble a comprehensive interdisciplinary toolkit. With a view to continuous improvement, we hope to enhance this grid with feedback from our colleagues, including during its presentation at this conference.



Navigating Impact: Real-Time Observation in Transdisciplinary Projects

Marlene Franck, Sebastian Preiss

Hans Sauer Stiftung, Germany

As an intermediary institution working in transdisciplinary project contexts, the social design lab (sdl) has experienced the demands of enabling strategic project iteration in open-ended processes and real-world experimentation settings. Small, barely perceptible changes within the project context often seemed to be the drivers of societal change and transformation. In order to substantiate this perception empirically and to prompt efficient adjustment of resources and of the research/project design, a real-time impact observation methodology was developed within the social design lab. This methodology seeks to identify impacts, potentials, and changing needs during the project, complemented by an ex-ante and an ex-post analysis. The presentation will explain this methodology.

The framework of the impact observation is set in the ex-ante impact orientation phase. In this phase, a vision and the transformation tracks are formulated. Transformation tracks are strategic corridors that the project team defines as crucial for reaching the desired societal transformation. For each transformation track, qualitative short-term objectives, so-called transformation qualities, are defined.

Within the operative work of transdisciplinary projects, the real-time impact observation is the heart of the developed methodology. It is carried out by collecting and evaluating information on (presumed) impacts or small changes, so-called impact particles, that could potentially lead to impacts. To keep this feasible, impact particles are noted by the observing team members in a questionnaire shortly after the observation. These notes are always taken in a standardized template (context, occurring change, assumptions about long-term consequences, actors, date). If possible, each piece of information about an occurring change is directly assigned to the transformation track and transformation qualities to which it presumably contributes. In frequent cycles, the collected impact particles are presented, discussed, and checked for data quality. Based on clustered impact particles and complemented by insights from discussions in the team, recommendations for action and adjustments to the strategy are developed. The recommendations for action lead to instructions and to-dos for the project team, making the impact observation a central method for project management.
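To make the standardized template concrete, here is a minimal illustrative sketch of an impact-particle record with the fields named above (context, occurring change, assumptions about long-term consequences, actors, date), plus its assignment to a transformation track and quality. The field names, types, and the example entry are assumptions for illustration, not the social design lab's actual questionnaire.

```python
# Illustrative sketch only: an impact-particle record with the fields named
# in the abstract. Names, types, and the example entry are assumptions.
from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class ImpactParticle:
    context: str                        # situation in which the change was observed
    occurring_change: str               # the small, observed change itself
    assumed_consequences: str           # assumptions about long-term consequences
    actors: List[str]                   # who was involved
    observed_on: date                   # date of observation
    transformation_track: Optional[str] = None    # strategic corridor it feeds into
    transformation_quality: Optional[str] = None  # short-term objective it supports

# Hypothetical example of a note taken shortly after an observation:
particle = ImpactParticle(
    context="neighbourhood workshop on shared courtyard use",
    occurring_change="two residents volunteered to co-host the next session",
    assumed_consequences="growing local ownership of the process",
    actors=["residents", "project team"],
    observed_on=date(2024, 5, 14),
    transformation_track="community ownership",
    transformation_quality="residents take on organising roles",
)
```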

After completing a long-term project cycle, or a whole project, an ex-post analysis is conducted. During a half-day workshop, the conclusions of each transformation quality are reviewed and impact narratives are formulated. These narratives describe impact patterns and chains that became visible throughout a longer time period, connecting different transformation qualities.

The methodology helps to observe impact in the sdl's projects, yet the approach is not without flaws. Detailed note-taking for impact observation is time-consuming, creating a trade-off between addressing real-world problems and gaining insights about impact. However, as the impact observation produces knowledge for social transformation processes and helps the sdl to better process and pass on its experiences, this effort is considered worthwhile. Regular reflection cycles help to avoid overlooking small but important refinements in the daily routine. In order to carry out impact observation efficiently on a daily basis, the sdl has developed some feasible and easily implementable approaches and techniques.



Transdisciplinary research needs transdisciplinary evaluation: Insights from a review process of transdisciplinary research proposals

Nadin Gaasch1, Audrey Podann2

1Berlin University Alliance c/o Technische Universität Berlin, Germany; 2Technische Universität Berlin, Germany

How can we ensure that transdisciplinary research is ultimately funded when reviewing transdisciplinary research proposals? We were faced with this question at the end of 2020. For us as dedicated supporters of transdisciplinary research, the answer was initially quite simple: bring experts with transdisciplinary knowledge into the review process in the form of a transdisciplinary peer review. However, this alone was not the success factor. In our presentation, we will discuss which factors we consider important for selecting projects that demonstrate a convincing transdisciplinary research design.

Although there is a great wealth of research and empirical findings on the evaluation of existing research projects, there are hardly any findings in the literature on transdisciplinary review processes for research proposals. With our contribution, we want to draw attention to this gap and share our experiences. Thus, we target conference stream 1, "Enhancing the theoretical foundations of inter- and transdisciplinarity", with a special contribution to "Harnessing experience and knowledge gained from inter- and transdisciplinary projects and programs", with a focus on "Evaluation and assessment".

With our presentation we want to answer the following five questions:

1. What are the preconditions for implementing a transdisciplinary review process?

2. What ingredients are needed for a transdisciplinary review process?

3. What positively surprised us in the overall process - from the conception to the implementation of the review process?

4. Where would we make improvements?

5. What advice can we give to those who want to set up such a transdisciplinary review process?

Our experience of setting up a call for proposals for transdisciplinary projects and their evaluation is based on two calls for proposals for transdisciplinary research projects of the Berlin University Alliance. The Berlin University Alliance is a cooperation of four Berlin partners – Freie Universität Berlin, Humboldt-Universität zu Berlin, Technische Universität Berlin, and Charité – Universitätsmedizin Berlin – that receives funding under the German government’s Excellence Strategy. The four partners aim to overcome institutional and disciplinary boundaries in order to create an integrated research environment. In this context, the Berlin University Alliance promotes transdisciplinary research projects, which we actively support as the Alliance's TD-Lab - Laboratory for Transdisciplinary Research.



 