Changes to the 2013 Aid Transparency Index
Earlier this month we started data collection for the 2013 Aid Transparency Index.
As the Index is our most important advocacy tool, it needs to evolve and keep up with the changes to the wider transparency and aid effectiveness environment. So, this year we are focusing not only on the availability of aid information, but also on the quality and comparability of the information.
Why have we changed our approach?
At the Busan High-Level Forum on Aid Effectiveness, donors agreed to implement “a common, open standard for electronic publication of timely, comprehensive and forward-looking information on resources for development cooperation by the end of 2015”. These specific, time-bound commitments marked a noticeable shift in the aid transparency agenda, particularly when compared to the broad statements made previously.
Donors committed to produce implementation schedules by the end of December 2012, outlining which elements of the standard they would be publishing and by when. Since then, 44 organisations have produced implementation schedules. You can read more of our analysis of these schedules here.
The evolution in the Index methodology follows this progress made by donors in using (or committing to use) a common standard for publishing their aid information in an open and timely manner. The changes also respond to feedback received from donors, peer reviewers and CSO partners on the previous methodology, particularly regarding the sample selection for activities (something we’ve always struggled with – see the ‘challenges, limitations and lessons learned’ section in the 2012 Index for more on this) and the need to assess the quality of the data being made available. This new emphasis on accessibility and quality also allows the Index to better reflect the type of aid information needed by recipient country governments if it is to be truly useful and meaningful for their own work.
In other words, the changes in the Index methodology are reflective of the changes in the global environment, taking into account the latest set of commitments donors have made and the needs of recipient countries.
…And what are the specific changes?
Indicators: The new methodology uses 39 indicators to monitor aid transparency, most of which are drawn from the indicators used in previous years. Indicators have been grouped into categories covering commitment to aid transparency and publication of data at the organisation and activity level. Four indicators from 2012 (country budget, current activities, database and design docs) have been dropped or replaced, either because they have been refined further or because they overlap with other indicators.
But two new commitment indicators have been introduced:
- commitment to aid transparency, measured using our analysis of donors’ implementation schedules; and
- accessibility of data, an assessment of donors’ information portals/websites based on whether they allow free bulk export, provide detailed, disaggregated data and are openly licensed.
Scoring approach: A new, graduated scoring methodology is used for some of the indicators in the publication category. This new scoring takes into account the format that the information is provided in. For example, data that is published in the most open, comparable format, IATI-XML, can score up to 100% for these indicators, while information items published in less accessible formats such as Excel, CSV or PDF will be scored lower, using a graduated scale based on the accessibility and comparability of each of these formats. More information on the scoring approach for individual indicators can be found here.
Data collection: We have built a new platform for data collection this year – the Aid Transparency Tracker, which brings together three tools – an Implementation Schedules analysis tool, a Data Quality tool and a survey tool.
Data collection will follow a two-step process for donors publishing information in the IATI-XML format. First, their data will be run through the Data Quality tool, which is designed to run automated checks and tests on each donor’s data, providing both a comparative view across donors and granular details on each organisation’s data. These tests are aggregated to produce scores for the indicators to which they are relevant. Next, for those indicators for which information is not published in IATI-XML, or for donors publishing data in other formats, the data will be collected using a manual survey, following the same process as in previous years.
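To give a flavour of what such automated checks could look like, here is a minimal sketch in Python. It is purely illustrative: the element names checked and the scoring rule are our own assumptions for the example, not a description of the Tracker’s actual test suite.

```python
import xml.etree.ElementTree as ET

# Elements a check might look for in each iati-activity record
# (an illustrative list -- not the Tracker's actual test set)
REQUIRED_ELEMENTS = ["title", "description", "activity-date", "sector"]

def check_activity(activity):
    """Map each required element to whether it is present in the record."""
    return {el: activity.find(el) is not None for el in REQUIRED_ELEMENTS}

def score_publisher(xml_text):
    """Share of required elements present, averaged over all activities."""
    root = ET.fromstring(xml_text)
    activities = root.findall("iati-activity")
    if not activities:
        return 0.0
    totals = [sum(check_activity(a).values()) / len(REQUIRED_ELEMENTS)
              for a in activities]
    return sum(totals) / len(activities)

sample = """<iati-activities>
  <iati-activity>
    <title>Rural roads</title>
    <description>Road rehabilitation</description>
    <activity-date type="start-actual" iso-date="2012-01-01"/>
  </iati-activity>
</iati-activities>"""

print(score_publisher(sample))  # 3 of 4 elements present -> 0.75
```

Because checks like these run directly over the published XML, they can be re-run automatically across every donor’s data, which is what makes the comparative, cross-donor view possible.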
Most importantly, what comes next?
This change in methodology means that donors who are not publishing their aid information in accessible, comparable formats will not score well under this year’s methodology. Unfortunately, we may see several donors that have done moderately well in the past slip further down the ranking.
For example, a donor that publishes its information in PDFs – the least accessible format, making information difficult to access, compare and use – can only score a maximum of one third for 22 of the 39 indicators and a maximum of half for 13 others. At the same time, donors that have started publishing high quality, comprehensive data in more comparable, open formats such as IATI-XML or Excel could see significant improvements in their score.
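The graduated caps described above can be expressed as a simple lookup. In this sketch, the IATI-XML figure (full marks) and the PDF caps (one third or one half, depending on the indicator group) come from the post; the intermediate weights for Excel and CSV are purely illustrative assumptions on our part:

```python
# Maximum share of an indicator's points available per format.
# IATI-XML and the PDF caps are from the post; the CSV and Excel
# weights are illustrative assumptions, not the Index's real values.
def format_cap(fmt, pdf_cap=1/3):
    caps = {"iati-xml": 1.0, "csv": 2/3, "excel": 2/3, "pdf": pdf_cap}
    return caps[fmt]

def indicator_score(raw, fmt, pdf_cap=1/3):
    """Scale a raw 0-1 indicator score by the format's graduated cap."""
    return raw * format_cap(fmt, pdf_cap)

# A donor publishing perfect information, but only as PDFs, tops out
# at one third of the points for 22 of the 39 indicators...
print(indicator_score(1.0, "pdf"))
# ...at one half for 13 others...
print(indicator_score(1.0, "pdf", pdf_cap=1/2))
# ...while the same information in IATI-XML can earn full marks.
print(indicator_score(1.0, "iati-xml"))
```

The design point is that the cap depends only on the publication format, so the same underlying information is rewarded more the more open and comparable the format it is released in.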
Most importantly, the changes in methodology will allow us to complete an annual stock-take of donors’ progress in meeting their Busan commitments and help them identify how they can improve between now and the 2015 deadline.
Another implication of the changes is that a year-on-year comparison between absolute scores and rankings in 2012 and 2013 will be hard to do. However, given that a majority of the indicators used and organisations featured remain unchanged, it will be possible to compare movements in the relative positions of organisations within the overall ranking over the years.
In a few weeks’ time, we will share more information on the Aid Transparency Tracker platform and the different tools it hosts. We welcome your thoughts on this new approach. Any change provides an opportunity for dialogue on future improvements, so please send us your feedback: email@example.com