
FAQs

General

1. Why do you produce the Aid Transparency Index?

Publish What You Fund produces the Aid Transparency Index (ATI) in order to assess the state of aid transparency among the world’s major donors, track and encourage progress and hold donors to account.

2. What kind of change do you hope to effect having published the ATI?

Our best days in the office are when a donor phones asking ‘How do we become number one?’. Our goal is to motivate and support donors in increasing the amount of publicly available information on the aid money they spend.

3. What difference would it make to aid / development if all donors were fully transparent?

Without transparency, recipient governments are unable to account for all the aid money spent in their countries, making it very difficult for them to plan and deliver services effectively in coordination with their aid partners. Improving transparency can improve accountability: donors are able to follow the money, citizens know where their tax money is being spent, and recipient governments know which projects are being planned or implemented in their area. Greater transparency is also linked to reduced corruption and misallocation of resources.

4. What gives you the right to rank donors according to how much they did what you asked them to do?

The Aid Transparency Index is primarily focused on monitoring the progress donors have made towards their own commitments to improve the transparency of their aid. Many of the indicators we use in the ATI are based on project information published in the IATI standard, which was agreed as part of the ‘common, open standard’ endorsed by all donors in the 2011 Busan Agreement.

5. Why are you based in London? Aren’t you biased towards the UK?

While we are based in London, we are an international campaign and have no UK bias. Our advocacy is primarily focused on the US, the EU and the IATI signatories. The ATI is an objective measure of the amount of aid information that is available publicly.

6. Will you continue to produce the ATI each year?

As more donors commit to improving their aid transparency and work to act on their commitments, we will continue to publish an annual ATI to monitor this progress for the foreseeable future.

7. How is this year’s ATI different from last year’s?

Recognising the changes in the global environment since Busan, we focus more explicitly on monitoring the format and comparability of published data in the 2013 ATI. Six indicators have either been removed or changed (see Methodology). We also have a new data collection platform – the Aid Transparency Tracker.

8. How do you choose which donors to include in the ATI?

In 2013, donors have been selected using three criteria:

• They are a large donor (ODA spend is more than USD 1bn);

• They have a significant role and influence as a major aid agency and are engaged with the Busan agenda;

• They are an institution to which government or organisation-wide transparency commitments apply, for example G8 member countries, all UK agencies or all EU Member States.

Donors need to meet a minimum of two of these criteria to be included in the ATI. Some donors spending more than USD 1bn in ODA have not been included, for example India, Saudi Arabia and Turkey. Ideally we would like to rank all large donors, but this is not possible at the present time.

9. Aren’t south-south co-operators, DFIs and/or foundations different from traditional aid donors? Why does this apply to them?

The south-south co-operators, DFIs and the one foundation included in the ATI are all major spenders (over USD 1bn) and highly influential. We recognise that not all the indicators we use in the ATI are a direct fit with every organisation, so we have amended our scoring guidelines for certain indicators to accept equivalent documents or information. These decisions have been taken in consultation with the relevant organisations.

We also conducted a review of our methodology this year, looking at how we could better account for different business models among donors. The main conclusion was that every donor in the ATI can score on every indicator, though some indicators may be easier or more difficult for different types of donors. The scoring guidelines available on the website specify in detail the types of evidence we take into consideration for each indicator.

10. Why is the Gates Foundation the only philanthropic foundation included in the ATI? Does the Gates Foundation really meet the criteria for inclusion in the ATI?

The Gates Foundation was the only foundation that met two of the three criteria for selection – it spends over USD 1bn annually and is covered by the Busan Partnership for Effective Development Co-operation (see question 8).

11. Have you added or dropped some organisations from this year’s ATI?

There are 67 donors included in this year’s ATI, compared to 72 in 2012, using the new criteria (see question 8). We have dropped climate finance funds, some domestic DFIs and the Hewlett Foundation from the 2013 ATI and included the IMF and the German Federal Foreign Office. The ATI now assesses the donors responsible for over two thirds of reported aid spending.

12. Why have you included the IMF this year? How do they fare?

The IMF fits our new criteria, and we were additionally advised by our peer reviewers to include it, given that it is a large provider of development finance with a budget of more than USD 1bn. While the IMF as a provider of balance of payments support is different from most organisations included in the ATI, the ATI already covers a variety of organisations, ranging from private foundations to IFIs such as the IFC and Germany’s KfW, and government bodies such as the Chinese Ministry of Commerce, none of which fit the mould of traditional aid providers.

For an organisation that is new to the ATI, the IMF fares quite well. It is the highest-ranking non-IATI publisher in the ATI and scores better than many large traditional donors. We hope the IMF will engage more closely with the development cooperation effectiveness agenda and will perform even better in future iterations of the ATI.

13. Why have you included the German Federal Foreign Office this year? How do they fare?

The German Federal Foreign Office spent USD 965m in 2011 (according to DAC CRS figures), just under the threshold of our first criterion. We decided it merited inclusion because of the low proportion of German ODA covered by the agencies already in the ATI (BMZ, GIZ and KfW), and because the Foreign Office is covered by Germany-wide commitments to transparency.

14. What do you think this year’s ATI shows?

a. The top ATI performers are now publishing large amounts of accessible, timely, comparable and comprehensive information about their aid.

b. The usefulness of information being made available remains mixed – some of it is out of date or in unhelpful formats.

c. Many organisations need to increase their ambition and show political commitment in order to fulfil their international transparency obligations.

15. Isn’t this ATI just measuring IATI?

We measure aid transparency. IATI is the only agreed standard for publishing current aid information in a timely, comparable, comprehensive and accessible format. However, the 2013 ATI also takes information published in other formats into account in order to gauge what information is being made available more broadly.

16. What proportion of all aid is now published/transparent and how has this changed over the years?

It is impossible to check manually whether the required information is published for every single activity in every country that every donor operates in. And while it is possible in principle to measure the volume of aid published to IATI, this is currently difficult because publishers report activity budgets in different ways (and some may not publish activity budgets at all).

What we can say is that donors representing 69% of ODF reported to the DAC are now publishing to IATI, which is considerably more than in previous years. In future years, as the IATI standard is improved and more donors start reporting all fields correctly, it should be possible to accurately measure the volume of aid published.

17. Donors representing what percentage of ODF a) are committed to aid transparency, b) publish to IATI?

Donors representing 85% of ODF have committed to aid transparency by endorsing the Busan common standard, or producing an IATI/Common Standard implementation schedule or are covered by broader commitments such as those made by the G8. Donors representing 69% of ODF have started publishing to IATI.

18. Organisations representing what percentage of ODF are very good/good/fair/poor/very poor?

The ATI does not include all donors or all agencies within specific donor groups – so it would be unfair to comment on what percentage of all ODF falls within a specific performance category.

19. Do you think there will ever be a time when all donors are in the very good category?

We set a high but achievable standard. This year’s results clearly demonstrate that it is possible for donors to make dramatic increases in their scores, based on the amount of information they make available on their aid activities, in a relatively short period. Three of the four organisations in the very good category (MCC, GAVI and UNDP) have made it to this top category for the first time this year. Obviously, it may take longer for organisations that currently collect or publish little to no information on their aid activities, or that have poor information systems and capacity constraints, than for those that are already collecting (and publishing) some information and have greater resources at their disposal.

20. Are there reasonable limits to aid transparency?

We recognise that publishing some sensitive information may endanger the well-being of recipient populations or project staff, or reduce the commercial competitiveness of donors’ procurement or investment processes. In such cases, we do not ask that these types of information be disclosed, but the criteria for classifying information as likely to put people in danger or as commercially sensitive need to be clearly defined, and details of what information is exempt need to be publicly available.

21. In terms of the ATI generally, what does success look like? And failure?

A successful organisation in the 2013 ATI is one that publishes comparable, comprehensive, accessible and timely information on its aid activities in useful formats such as IATI XML, in keeping with the changes that have taken place in the global aid transparency environment. Failure is the inability to keep pace with these changes and is reflected in the poor performance of organisations who publish little or no information on their aid activities or publish information in less useful formats such as PDFs or hard-to-navigate webpages.

22. What are the most surprising findings from this year’s ATI?

One of the most surprising findings is how little some organisations know (or are willing to share publicly) about their own aid activities. This was evident in our engagement with some donors regarding their largest aid recipient in 2011/2012, and in various comments provided during the feedback process.

23. What are the least surprising findings from this year’s ATI?

Unfortunately, a majority of donors still publish very limited information on their aid activities.

Donor Ratings

24. How did you choose which donors to add to the ATI this year?

(See question 8).

25. Why do you select more than one agency for some donors?

Some countries give aid from a large number of institutions, so it would not be appropriate to simply rank one donor in each country. Also, some agencies are working to improve their transparency in response to government-wide transparency commitments (such as in the UK) or through a whole-of-government approach to transparency (such as the U.S. agencies), and we want to monitor the work that agencies are doing to meet these broader commitments.

26. How do you define development finance institutions?

At their core, we see development finance institutions as bodies that provide private-sector finance with a mandate focused on development. Their operational model makes the publication of some types of information either easier or more difficult compared to other donors in the ATI, which we acknowledge in the indicator guidelines by accepting different information items for certain indicators.

27. Does adding more donors / changing the methodology make year-on-year findings difficult to compare?

The improvements made to the methodology this year mean it is not possible to compare absolute scores in 2013 with absolute scores in previous years. Taking the format of publication into account gives a more accurate picture of aid transparency now, but makes comparison with scores from previous years difficult. In 2012, organisations received either 0% or 100% of the score for an indicator, regardless of format. In 2013, for 22 indicators, publishing to IATI can give between 50% and 100% of the score, while publishing to PDF can give only 16.67% of the score. So an organisation that scored 100% in 2012 may score only 16.67% in 2013, due not to a change in practice but to the change in scoring method. The new, more nuanced methodology will be used in future years, making it easier to compare absolute scores going forward.

28. You say there has been steady progress on aid transparency since last year, but in reality the overall results have dropped for all donors this year, as has the average score – how can you explain this?

The changes made to the methodology this year mean it is not possible to compare absolute or average scores in 2013 with scores from previous years. As the results demonstrate, there is a leading group of donors now publishing large amounts of timely, comparable and comprehensive information about their aid, and several organisations have started publishing more information on their aid activities over the past year and are planning further improvements to their data. The low average score is explained by the poor performance of those donors who still publish limited information on their aid activities or publish information in less useful formats.

29. The donors are all different – how do you level the playing field?

Some indicators require a different interpretation for different donors. For example, Memoranda of Understanding (MOUs) are called many different things by different donors, though all collect the substantive equivalent of an MOU. Similarly, for agencies that do not organise themselves by country, we accept thematic strategies and budgets. We also conducted a review of our methodology this year, looking at how we could better account for different business models among donors. The main conclusion was that every donor in the ATI can score on every indicator, though some indicators may be easier or more difficult for different types of donors. The scoring guidelines available on the website specify in detail the types of evidence we take into consideration for each indicator.

30. How do donors respond to their ratings? Do they care?

Many donors engage with us both during the data collection period and after the ATI is published. Some have directly asked us how they can rank #1. Some donors, like Sweden, are using the ATI as a benchmark for monitoring their own progress on implementing aid transparency. In many cases, donors working to improve their transparency are primarily concerned with having their efforts reflected in the ATI, so we look forward to seeing the reactions of some of the donors who have moved up several places since 2012.

31. Do you rate donors that fund you?

In previous years, we have included the Hewlett Foundation, our biggest funder. This year, donor selection for the ATI used criteria developed after a methodological review conducted under the guidance of our peer reviewers. The Hewlett Foundation did not meet the criteria and was therefore dropped this year.

32. How/why have the performance categories changed this year?

There are still five categories evenly divided between 0 and 100%, though some have been renamed for 2013. As in 2012, the “very poor” (0–19%) and “poor” (20–39%) categories remain the same, but the next category (40–59%) is now called “fair” instead of “moderate”. It was felt that organisations in the following category (60–79%) should be credited with performing relatively well compared with all agencies, so it has been renamed from “fair” to “good”. Organisations in the “very good” category (80–100%) really do stand apart from the others in terms of the amount of information they publish, and in more accessible formats.

33. This year nine donors are in the very good and good categories – why do you think that is? And why are so many in the poor and very poor categories?

The organisations in the very good and good categories have demonstrated real political commitment to implement the promises made in Busan to start publishing to a common open standard in 2013. These organisations have begun to publish comprehensive information on their aid activities to IATI, including in many cases, results information and project documents.

Meanwhile, a majority of the donors included in the ATI publish very limited information on their aid activities. Most are yet to start publishing to IATI and have shown a lack of ambition or commitment in fulfilling their international transparency obligations, and therefore find themselves in the poor or very poor categories.

34. What explains some agencies’ big improvements in the ranking?

The big improvers in the 2013 ATI have the following in common: they now publish more information on their aid activities than in previous years and/or they publish information in IATI or other machine-readable formats.

35. How many agencies have actually got worse transparency scores? Why?

Improvements to the methodology this year mean it is not possible to compare absolute scores in 2013 with absolute scores in previous years.

36. How can donors improve, generally?

Donors can improve either by making the information they already publish comprehensive for all their activities, or by publishing information for the first time. We have found that the best way to improve quickly is to begin publishing to IATI, as IATI publishers tend to score highly on the activity-level indicators, which is the indicator level with the lowest average score.

37. Do you advise donors individually on how they can improve?

Yes, we are always willing to engage with donors and discuss how they can increase their transparency. As well as some common issues, donors tend to face unique challenges in increasing their transparency (such as internal reporting systems, knowledge of the issue, internal politics, and buy-in from national governments), and when asked, we provide information and support.

38. How well are U.S. agencies doing?

The 2013 ATI shows that the six U.S. agencies are at very different stages of transparency. MCC is the top-ranking donor in the ATI; Treasury and USAID perform moderately, while the Department of State, Department of Defense and PEPFAR do poorly. Four of the agencies are publishing information in IATI XML format. MCC and Treasury (and USAID to some extent) must be congratulated in particular for making significant improvements in the amount of information on foreign assistance they publish. The others have a lot of catching up to do.

39. What did the WB/DFID do wrong to lose the top spot?

Although DFID and the World Bank have dropped in the ranking compared to 2012, this is a reflection of very high quality IATI publication from MCC, GAVI and UNDP and not a reduction in DFID’s or the World Bank’s transparency or a deceleration in their progress. In fact, both organisations should be congratulated for continuing to demonstrate commitment to aid transparency and for the ongoing improvements to their already fairly comprehensive IATI publication.

International Aid Transparency Initiative (IATI)

40. Why is IATI so important? Aren’t other forms of publication just as good in their own way / for specific purposes?

IATI is the agreed standard for publishing current aid information in a common, comparable format. While donors may publish extensive information on their own websites or to the DAC, that information will lack the vital qualities of being current and comparable. IATI also includes some “value-added” fields, for example results, impact appraisals, sub-national location and a budget identifier.
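To make the idea of a common, comparable format more concrete, the sketch below builds a single, simplified activity record using element names from the IATI 1.x activity standard. The identifier, title, country and date shown are purely illustrative assumptions, not taken from any donor’s actual publication, and a real IATI file contains many more fields.

```python
# A minimal sketch of a single IATI-style activity record, built with the
# Python standard library. Element names follow the IATI 1.x activity
# standard; all values are illustrative placeholders.
import xml.etree.ElementTree as ET

activities = ET.Element("iati-activities", {"version": "1.03"})
activity = ET.SubElement(activities, "iati-activity")

ET.SubElement(activity, "iati-identifier").text = "XX-EXAMPLE-12345"      # hypothetical ID
ET.SubElement(activity, "title").text = "Rural water supply programme"     # illustrative title
ET.SubElement(activity, "description").text = "Illustrative project description."
ET.SubElement(activity, "recipient-country", {"code": "KE"})               # ISO country code
ET.SubElement(activity, "activity-date", {"type": "start-actual", "iso-date": "2013-01-01"})

# Because every publisher uses the same element names, records from different
# donors can be pulled from the IATI Registry and compared side by side.
print(ET.tostring(activities, encoding="unicode"))
```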

41. Do we really need yet another reporting requirement for donors?

IATI is not a reporting requirement but an agreed standard for publishing comparable information on aid flows. There is no system other than IATI that ensures current, comparable and high-quality aid data is published. Existing reporting standards, such as the OECD DAC’s, are simply not designed for the project-by-project publishing that the IATI standard focuses on.

42. What is the difference between “publishing” and “reporting”?

In this context, publishing refers to the process of posting information about current activities, often on institutions’ own websites. The IATI approach is to publish information in the IATI format (IATI XML) and to notify the IATI Registry of that publication so that all aid information in the IATI format can be found and compared in one place. Reporting is the release of verified information in accordance with the requirements of stakeholders who wish to scrutinise and assess that information regularly. This is not normally timely information as release is delayed by the process of verification and standardisation. Publishing and reporting serve two distinct purposes: timely publishing is necessary for real-time transparency and accountability and – crucially – for effective planning of development resources, both by partner country governments and other donor organisations. Reporting is designed to provide complete statistical information (especially financial) once accounts have been reconciled and audited, and remains essential for institutional accountability (e.g. delivery of aid volume or sectoral pledges).

43. How can it be fair to have moving goalposts (i.e. IATI fields changing over time)?

As the development community improves its transparency, it is likely that new needs will be uncovered and agreed upon by donors in consultation with recipient country partners. Donors can be fairly assessed against moving goalposts as long as the goalposts are applied equally to all of them; in any case, the focus of the IATI standard is primarily to improve the impact and effectiveness of aid spending through greater transparency.

44. Why doesn’t IATI yet link to partner country budgets?

A “budget identifier”, which would allow projects to be mapped (automatically, as much as possible) onto recipient country budgets, is currently being developed by IATI. We have recommended that existing IATI publishers pilot the identifier so others can implement a tested, functional and effective identifier as soon as it is available.

45. Can a donor rate ‘good’ if they don’t publish to IATI?

The 2013 ATI methodology takes into account how accessible and comparable information is and the format in which it is published. IATI is the only international standard for publishing comprehensive, comparable, timely and accessible data, and therefore donors need to publish to IATI to rate ‘good’ or ‘very good’. However, the reverse is not true: publishing to IATI does not guarantee a spot in the top performance categories. How well an IATI publisher does in the ATI is determined by the timeliness, comprehensiveness and coverage of its data. Donors that have good project databases could potentially publish to the IATI Registry with minimal effort, making their data comparable with that of other donors, and therefore potentially far more useful.

46. Does being an IATI signatory improve the standing of individual donors?

Simply being an IATI signatory is not enough to do well in the ATI. We do score donors on their engagement with IATI, so donors that have published an ambitious IATI implementation schedule can improve their standing, though the overall impact of this on their score is small. IATI signatories who have not started publishing information to the IATI Registry and agencies who are publishing limited, aggregated or old information to IATI do not perform well in the ATI. The big leaps we are seeing in donors’ performance primarily relate to them meeting their commitments and publishing comprehensive information on their aid activities to IATI.

47. If a donor publishes to IATI, surely they should be in the ‘good’ category (at least)?

In order to be placed in the good category, donors’ IATI publication must be comprehensive, i.e. information on the organisation and activity level indicators measured by the ATI should be consistently available across all activities and recipient countries. This is currently not the case for several organisations who publish limited, aggregate or old information in their IATI data or those that currently publish only a limited number of IATI fields.

48. If IATI is so important, why do you accept other forms of publication?

The ATI assesses the amount of publicly available aid information that donors publish. It is still relatively early days for IATI, however, and too few donors publish to it for an exclusive focus on IATI to give an accurate account of the overall transparency of aid.

49. What if a donor starts publishing to IATI after you’ve collected your results?

If a donor has begun publishing to IATI, we would certainly commend them and would reflect that in future. If the information is not available during the data collection period then it cannot be taken into consideration.

50. Do you publish to IATI?

Yes, Publish What You Fund started publishing to the IATI Registry in November 2011. We updated our publication in September 2012 and shared our implementation schedule with the IATI Secretariat then as well. We now publish on a monthly basis.

51. Who actually uses IATI data?

Our challenge now is to encourage wide-ranging use of the data, and we are excited to see where this takes the aid effectiveness agenda. At Publish What You Fund, we use IATI data in the ATI to make the conversation around aid transparency relevant to a wider audience. We want donors and CSOs to use IATI data more widely too, including in their own development planning.

52. What tools can be used with IATI data?

High quality and comprehensive publication by donors is needed in order to promote the use of IATI data. Some new tools have been launched recently, such as DFID’s Development Tracker, which helps visualise IATI data from DFID and its NGO development partners. As more donors start publishing comprehensively to IATI, there will hopefully be more such tools to help users make informed decisions and conduct evidence-based research on aid.

53. How can you increase the number of people using IATI?

One of our key recommendations for organisations wishing to increase their ranking is to encourage others to make better use of their data. IATI data has the potential to be a useful resource for a number of stakeholders, from donors and recipient country Ministers of Finance, to researchers, CSOs and citizens in both donor and recipient countries.

Methodology

55. How is this year’s methodology different from 2012?

We have made two main changes to the methodology:

• We focus more explicitly on monitoring the format and comparability of published data.

• Six indicators have either been removed or changed.

See Methodology for full details.

56. Why change the methodology now?

Since the launch of the 2011 pilot Index, donors have shifted from making high-level commitments to practical implementation. As the ATI evolves, it needs to reflect the progress made by donors in using flexible formats and making their aid information more accessible in line with these commitments. Feedback from consultations emphasised that Publish What You Fund should assess organisations on their progress with implementing the common standard and that this should start in 2013 in order to properly assess progress against the target of full implementation by the end of 2015.

57. Isn’t it unfair to change the methodology (‘move the goalposts’)?

The Index is an advocacy tool – and we are pushing donors to meet their own commitments. We are confident that this Index stands as a credible reflection of donors’ current levels of aid transparency and can be used to monitor their progress over time. We will continue to review the methodology, take into account feedback received from donors and our CSO partners, and make adjustments if necessary.

58. Which four indicators were removed?

  1. “Publishes forward planning budget for country for next three years” (indicator 13 in 2012): Feedback on this indicator highlighted that it overlapped with the disaggregated budget indicator (indicator 7 in 2012) included at the organisation level.
  2. “Publishes current activities in this country” (indicator 17 in 2012): Our experience with this indicator in 2012 highlighted that unless the donor is publishing comprehensive data publicly for the specific country, it is not possible to measure this accurately.
  3. “Centralised online country database” (indicator 18 in 2012): Our experience from 2012 was that donors with good quality databases (indicator 11 in 2012) also tend to have country databases, therefore this information is captured already.
  4. “Publishes the design documents and/or log frame for the activity” (indicator 42 in 2012): Our experience with this indicator in 2012 highlighted that it overlapped with indicators 37 (impact appraisals) and 38 (objectives).

59. Which indicators were changed?

• “Engagement in IATI” (indicator 2 in 2012) to “Overall commitment to aid transparency” (indicator 2 in 2013)

In previous years, engagement with IATI was used to measure a donor’s overall commitment to aid transparency. In 2013, the indicator has been renamed so it is more explicit what it measures. The data source is donors’ schedules for implementing the IATI component of the common standard. This change has been made to reflect the commitments and timeline agreed in Busan.

• “Centralised, online database” (indicator 11 in 2012) to “Accessibility of the data” (indicator 3 in 2013)

In previous years, the existence of a centralised, online database was used to measure the accessibility of a donor’s aid information, including whether it is published in a useful format. Given that the format of the data is now taken into account for all other organisation and activity level indicators, this indicator is redundant. But we still want to measure how accessible donors’ data is overall, and whether the donor is promoting access to and re-use of the data. Donors’ individual portals and online project databases are a good way to measure this. Each portal or database has been assessed against three criteria: whether the portal allows bulk export of data; whether it provides disaggregated, detailed data; and whether it is released under an open licence.

60. What do you mean by “format” of the data?

There is a substantial difference between searchable IATI XML data, where you can access and compare any number of worldwide projects across a number of fields, and searching dozens of URLs or looking for information published in several different PDF files. This difference has been quantified by allowing organisations to score more highly on 22 indicators depending on the format of publication.

61. How did you score data formats? Why are 22 indicators scored on format and others not?

There are 39 indicators in total, of which three measure commitment to aid transparency and 36 measure the publication of information.

The scoring methodology for the publication level indicators takes into account the comparability and accessibility of information. For 22 indicators, data published in PDF format scores lower than data published in Excel, CSV or IATI XML formats. Data published in the most open, comparable format, IATI XML, can score up to 100% for certain indicators, depending on quality and coverage. We score format because there is a substantial difference between searchable IATI XML data, where you can access and compare any number of worldwide projects across a number of fields, and searching dozens of URLs or looking for information published in several different PDF files.

For 13 other indicators, the scoring approach recognises that format is not so important – an annual report published in PDF is much the same as an annual report published on a webpage. However, the inclusion of links to such PDF documents in an organisation’s IATI data is more valuable – especially at the activity level – as it makes them easier to locate and identify than documents available just on the organisation’s website. Therefore documents made available via links through IATI are scored higher than documents available through other sources.
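As a rough illustration of how this tiered approach works for one of the 22 format-sensitive indicators, the sketch below awards a fraction of the indicator score based on the format in which the information was found. The PDF figure (16.67%) and the IATI range (50–100%, scaled by coverage) reflect the numbers quoted earlier; the intermediate weights shown for websites and spreadsheets are illustrative assumptions only, and the scoring guidelines on the website remain the authoritative source.

```python
# A rough sketch of format-based scoring for one format-sensitive indicator.
# PDF (16.67%) and IATI (50-100%) reflect the methodology described above;
# the other weights are illustrative assumptions, not the official guidelines.

def indicator_score(fmt: str, iati_coverage: float = 0.0) -> float:
    """Return the fraction of the indicator score awarded for a given format.

    fmt           -- format in which the information was found
    iati_coverage -- for IATI XML, the share of activities (0.0-1.0) with
                     correctly coded information for this indicator
    """
    if fmt == "iati-xml":
        # IATI publication earns between 50% and 100%, scaled by coverage.
        return 0.5 + 0.5 * iati_coverage
    base_scores = {
        "pdf": 1 / 6,        # roughly 16.67%, as quoted in the methodology
        "website": 2 / 6,    # assumed intermediate weight (illustrative)
        "excel-csv": 3 / 6,  # assumed intermediate weight (illustrative)
    }
    return base_scores.get(fmt, 0.0)

# The same information scores very differently depending on format:
print(round(indicator_score("pdf"), 4))                          # 0.1667
print(round(indicator_score("iati-xml", iati_coverage=0.8), 4))  # 0.9
```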

62. How do you measure the quality of IATI data? / What are the tests you run on IATI data?

The quality of data published in IATI XML is assessed by running a series of tests on all activity and organisation data packages being published. These tests have been designed to assess the availability, comprehensiveness and comparability of aid information and to determine whether an organisation’s IATI data conforms appropriately to the IATI standard. Most of the tests have been derived directly from the IATI schemas, which define the formats for reporting data on various information fields in IATI XML. The tests return the percentage of activities within an organisation’s data packages that contain correctly coded information on the specific indicator being tested. For example: what percentage of activities reported contain a title? Or what percentage of completed activities contain information on results? The full list of tests can be accessed on the Aid Transparency Tracker site.
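As a minimal sketch of what one such test looks like, the code below checks what percentage of activities in a downloaded IATI activity file contain a non-empty title. The file name is hypothetical and the real Tracker runs a much larger battery of tests, but the basic approach of iterating over activities and counting correctly coded fields is the same.

```python
# A minimal sketch of one IATI data quality test: what percentage of
# activities in a published activity file contain a non-empty <title>?
# Element names follow the IATI 1.x standard; the file name is hypothetical.
import xml.etree.ElementTree as ET

def percent_with_title(path: str) -> float:
    tree = ET.parse(path)
    activities = tree.getroot().findall("iati-activity")
    if not activities:
        return 0.0
    with_title = sum(
        1
        for activity in activities
        if activity.findtext("title", default="").strip()
    )
    return 100.0 * with_title / len(activities)

# Example usage against a hypothetical downloaded data package:
# print(percent_with_title("example-donor-activities.xml"))
```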

This year we incorporated feedback received on the tests from donors and others over the course of the data collection process, and we will continue to tighten the tests in future years.

63. What is the Aid Transparency Tracker?

The Aid Transparency Tracker is an online data collection platform that provides the main, underlying dataset for the ATI. The Tracker includes three separate data collection tools:

• An automated data quality assessment tool (for indicators where comparable and timely data is available via IATI)

• A survey (for indicators where comparable and timely data is not currently available)

• An implementation schedule analysis tool.

64. How did you gather information on each donor and what was the timeline?

Most information is gathered from what is published online by each organisation – either on their website, on the IATI Registry or on national platforms such as the U.S. Foreign Assistance Dashboard. Two indicators used different data sources: one assessing the quality of Freedom of Information legislation and one assessing donors’ implementation schedules.

The ATI process as a whole lasts six months. A month is spent preparing the data collection tool and confirming donor and CSO contacts; three months is spent on data collection and verification; and two months are spent analysing the data and writing the report. We have a set data collection period (1st May–31st July) to ensure that all donors are compared fairly at a certain period in time.

65. Once the data collection has started, did you advise donors individually on how they could improve?

Yes, we are always willing to engage with donors to help them increase their transparency. As well as some common issues, donors tend to face unique challenges in increasing their transparency (such as internal reporting or knowledge management systems). We are happy to provide information and support to donors. All donors were given an opportunity to review the data we collected for them and to provide us with clarifications and corrections as needed.

66. How did you select independent reviewers/CSO partners for the ATI?

We usually work with national NGO platforms for aid effectiveness and development. For most of the EU member states, we approach the AidWatch/CONCORD platform, which then recommends members to us. If the platform members are unable to conduct the review, we ask them to recommend other organisations. Where there is no national NGO platform, we work with CSOs we have partnered with in the past on the Index or in other advocacy efforts. For multilateral organisations or IFIs where there is no direct match with an NGO platform or CSO, we ask our peer reviewers to recommend whom we can approach for the independent review. The independent review process is voluntary and unpaid. For some organisations we are unable to find independent reviewers; in these cases, Publish What You Fund undertakes the assessment. In 2013, data for 44 of the 67 donor organisations was independently reviewed.

67. Who are the peer reviewers?

Laurence Chandy, Brookings Institution

Julia Clark, Center for Global Development

Lars Engberg-Pedersen, Danish Institute for International Studies

Jörg Faust, German Development Institute

Brian Hammond, consultant, IATI Secretariat

Nathaniel Heller, Global Integrity

Marie Lintzer, Revenue Watch Institute

Richard Manning, independent consultant

Afshin Mehrpouya, HEC Paris

Larry Nowels, independent consultant

Rita Perakis, Center for Global Development

Paolo de Renzio, International Budget Partnership, Center on Budget and Policy Priorities

Owen Scott, Development Gateway

Kiyotaka Takahashi, Keisen University & Japan International Volunteer Center

Rob Tew, Development Initiatives
