By Urama Kevin Chika, Executive Director, ATPS

In response to the post titled “Opening up Research Funding at ATPS” by Adrian Ely, I would like to expound on the Network’s strategic approach to placing stakeholder participation at the core of the research funding process and throughout the whole value chain. This is built on the firm belief that the knowledge held by different knowledge communities (both tacit and codified) is valid and can be the basis for innovations for development.

The ATPS supports demand-led capacity building activities in the areas of Science, Technology, and Innovation (STI) knowledge generation, knowledge circulation and networking, policy making, and policy practice in Africa, for African development and global inclusion.

In doing so, the ATPS engages the quadruple helix (Science Experts, Policymakers, Private Sector Actors, and Civil Society) in the identification, prioritization, and implementation of STI policy research and policy processes at the national, regional, and global levels.

If asked to state the underlying beliefs in two phrases, I would borrow two from the recent stakeholder workshops held by the ATPS in the process of developing an African manifesto for Science, Technology and Innovation for Africans by Africans: one, “Innovation does not happen in the mainstream”, and two, “Collaboration breeds Innovation”.

The ATPS Secretariat has followed a transparent and rigorous participatory process of engagement with the quadruple helix in evaluating the proposals received under the calls for proposals for the year 2009. The process involved several stages:

(a) A drawn-out participatory process involving the ATPS national chapters and national and regional stakeholders led to the identification and prioritization of core development challenges and STI policy needs in the member countries and in the region requiring urgent attention in the medium term. These informed the strategic goals and priority implementation programs of the ATPS Phase VI Strategic Plan, 2008–2012. Some required regional responses and actions, while others required national case studies;

(b) The process in (a) above informed the call for proposals, advertised on the ATPS websites and also disseminated through the network membership in the national chapters;

(c) All proposals received by the ATPS were submitted to the National Chapter Coordinators for pre-advice and assessment on the basis of three criteria: Science Quality; Added Value / Innovation Quality; and Societal / Policy Relevance. Each National Coordinator reviewed the proposals and provided pre-advice to the proponents on how to improve each proposal’s relevance to national policy needs and its added value at the local level. The Coordinators also ranked the proposals against the same three criteria.

(d) The proposals were simultaneously sent to a selected team of three reviewers from an Independent International Expert Panel of Reviewers (IIEPR) for each strategic theme: Climate Innovations, Agricultural Innovations, and Intellectual Property Rights. The IIEPR reviewed and ranked the proposals independently using the same set of criteria as the National Coordinators, but focusing on regional development challenges and policy priorities. Each IIEPR member also provided pre-advice to the candidates to assist them in revising their proposals to improve the innovation content, science quality, and societal/policy relevance. All the pre-advice forms were forwarded to the respective candidates anonymously (i.e., without the names of the reviewers) to inform the revision of their proposals.


(e) The scores and ranks were collated by the ATPS Secretariat Management Committee (SMC), and the average of the four evaluation scores (one from the National Chapter Coordinator and one from each of the three members of the Independent Expert Panel) was computed by the ATPS Electronic Grant Management System (EGMS). This system ensures an independent and impartial sorting and shortlisting of the proposals to be invited to a Tournament, where each Lead Investigator presented the proposal to the ATPS membership and the same Panel of IIEPR, with ATPS Science Council and Board members present.
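The averaging and shortlisting step in (e) can be sketched as follows. This is a minimal illustration only: the data layout, function names, scores, and cutoff are my assumptions for the sketch, not the actual EGMS implementation.

```python
# Illustrative sketch of the EGMS averaging step (assumed data layout,
# not the actual ATPS system): each proposal carries four scores --
# one from the National Chapter Coordinator and three from the IIEPR.

def mean_evaluation_score(scores):
    """Average the four pre-Tournament evaluation scores."""
    if len(scores) != 4:
        raise ValueError("expected 4 scores: 1 coordinator + 3 IIEPR")
    return sum(scores) / len(scores)

def shortlist(proposals, cutoff):
    """Rank proposals by mean score and keep those at or above the cutoff."""
    ranked = sorted(proposals.items(),
                    key=lambda item: mean_evaluation_score(item[1]),
                    reverse=True)
    return [pid for pid, scores in ranked
            if mean_evaluation_score(scores) >= cutoff]

# Hypothetical proposals and scores, for illustration only.
proposals = {
    "P-001": [78, 82, 75, 80],   # coordinator, then three IIEPR reviewers
    "P-002": [60, 58, 65, 55],
    "P-003": [90, 88, 85, 91],
}
print(shortlist(proposals, cutoff=70.0))  # -> ['P-003', 'P-001']
```

Averaging the four independent assessments before any shortlisting is what keeps the sorting impartial: no single reviewer’s score determines a proposal’s fate.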

(f) At the Tournament, each candidate had 10 minutes to present the proposal before the Panel of International Experts, other members of the ATPS and stakeholders, and the ATPS Board. The IIEPR repeated the same system of evaluation on the basis of the same set of criteria. At the same event, the ATPS Participatory Proposal Evaluation System (PPES) was also implemented, whereby the wider membership of the ATPS and stakeholders were involved in scoring the presentation of the proposals during the Tournament. The scores from the general membership serve as a control, showing the general perception of the proposed activity among ATPS stakeholders. The average scores obtained from this general assessment process are held on the EGMS for the records and as a check on the scores obtained from the rigorous assessments by the National Chapter Coordinators and the IIEPR. If, for instance, the mean score from the general membership rates a proposal as excellent, but the Panel of International Experts rated the same proposal as very poor, there might be reason to investigate the proposal further by sending it out to another set of Independent Experts for review. All these scores and pre-advice forms will be held in the ATPS archives to inform the capacity-building needs of the research teams in future.
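The control check described in (f) amounts to flagging any proposal where the two assessments diverge sharply. A hedged sketch follows; the threshold, scale, and names are assumptions for illustration, not the actual PPES rules.

```python
# Hedged sketch of the PPES control check: a proposal is sent for further
# independent review when the general-membership score and the expert-panel
# score disagree beyond a threshold. The threshold and 0-100 scale are
# illustrative assumptions, not actual ATPS parameters.

DISCREPANCY_THRESHOLD = 25.0  # assumed gap, on a 0-100 scale

def flag_for_further_review(expert_mean, membership_mean,
                            threshold=DISCREPANCY_THRESHOLD):
    """Return True when the two assessments disagree beyond the threshold."""
    return abs(expert_mean - membership_mean) > threshold

# Membership rates a proposal as excellent, experts as very poor:
print(flag_for_further_review(expert_mean=40.0, membership_mean=85.0))  # True
# Both assessments broadly agree:
print(flag_for_further_review(expert_mean=80.0, membership_mean=78.0))  # False
```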

(g) The mean scores from the whole process (the pre-advice scores by the national chapters and the IIEPR, the Tournament scores by the IIEPR, and the general scores by the ATPS membership and stakeholders) are collated by the EGMS and made available to the ATPS Board to inform its decision on the specific activities to be implemented. It is worth noting here that the scores collated from the recent Tournament in Abuja, Nigeria, on 26 November 2009 show a strong correlation between the mean scores of the IIEPR and those of the general members and stakeholders.
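The correlation noted above can be computed as a standard Pearson coefficient over the two sets of mean scores. The sketch below uses made-up figures; the actual 2009 Tournament scores are not reproduced here.

```python
# Illustrative Pearson correlation between expert-panel means and
# general-membership means. All figures are hypothetical.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

iiepr_means = [88.5, 78.75, 59.5, 72.0]      # hypothetical expert means
membership_means = [85.0, 80.0, 62.0, 70.0]  # hypothetical general scores
r = pearson(iiepr_means, membership_means)
print(round(r, 3))  # a value near 1.0 indicates strong agreement
```

A coefficient near 1.0 would support the observation that the wider membership and the expert panel ranked the proposals similarly.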

The hallmark of this Participatory Proposal Evaluation System (PPES) is its core principles of Transparency, Objectivity, and Responsibility, which the ATPS Secretariat has adopted as the guiding principles of engagement in all activities. The Network membership is proactively engaged at all stages of decision making to ensure ownership of the process and the products of ATPS activities. This is also necessary to ensure the effective implementation and policy relevance of our activities.

All ATPS thematic research programs focus on facilitating innovation capacity development at the individual and institutional levels. Each research program is therefore expected to include the “make” or “design” perspective, i.e., the translation of research outputs into “institutional” and/or “social engineering” designs, and/or cost-effective “technical designs/technologies” needed to address the specific development and/or policy gaps identified. To enhance the process, each research team is expected to include (or proactively engage) trans-disciplinary science experts and relevant policymakers and practitioners at all stages of the project cycle, from conception to implementation and dissemination of results. The scientific quality; the added value and innovation content; and the societal and policy relevance of activities are regarded as of equal importance in all ATPS activities.