July 7, 2014
The recent special audit report by the European Court of Auditors (ECA) on the establishment of the European External Action Service (EEAS) is revealing reading. The report lists several shortcomings in the preparation, planning, organization and staffing of the EEAS. It’s worth reading for two reasons.
First, it’s a good example of an external performance audit by the ECA, itself a relatively new EU institution. It was established in 1977, when it replaced a so-called Audit Board, which had limited staff and independence and which didn’t even publish its reports.
Since then, of course, much has changed. The ECA plays an important role in the institutional balance of the European Union, in particular by providing assurance to the European Parliament when it grants discharge to the Commission for the annual implementation of the EU budget.
But the report is above all a good example of the shift in recent years from regularity auditing to performance auditing. The traditional types of auditing, financial and compliance auditing, are simply not enough to ensure accountability and transparency in the public sector and to improve its performance.
We need information on whether public agencies achieve their objectives and whether public funding is being used not only legally but also effectively. That’s why public auditing against criteria such as economy, efficiency, and effectiveness (the three E’s) is a must for every modern supreme audit institution.
The court has in recent years increased the number of performance audits in response to a changing environment and demand for performance data. In 2006 ECA published a performance audit manual.
The audit of the EEAS demonstrates clearly that performance measurement and performance auditing are two different activities. The existence of performance data at the audited body is not a precondition for an audit. It’s up to the ECA and other external audit institutions to audit public agencies by collecting new performance information themselves.
The audit of the EEAS is therefore also a good example from a methodological point of view, since it applied a variety of data-collection methods.
In this audit the ECA collected information through documentation reviews and interviews, which of course is standard, but it also examined a sample of 30 briefing requests and another sample of 30 recruitments. In addition, the ECA collected information through surveys of 35 EU delegations and 15 Member States.
The audit took place in parallel with a mid-term review carried out by the EEAS and published in July 2013. A more innovative approach would have been to carry out the audit in real time, right after the establishment of the EEAS in 2011, so that its teething troubles could have been discovered in time and avoided.
The auditors admit that their conclusions and recommendations converge in many points with those of the mid-term review. The added value of the audit is of course that it’s much more credible, since it was carried out by an independent institution. This should facilitate the acceptance of the report.
The majority of the recommendations have indeed been accepted by the EEAS, judging by its reply attached to the report. Where there are disagreements, one would assume that the stakeholders, above all the Council, will take an extra look at the issues.
This brings us to the second reason why this report is worth reading. From the very beginning, the EEAS and its head, the High Representative for Foreign Affairs and Security Policy (HR), have been criticized for inadequate or late responses to the challenges in the EU’s external affairs.
Think, for example, of the Arab Spring and the Ukraine crisis. Some of the criticism may have been unjust considering that, as the audit report states, the Member States did little preparatory work prior to setting up the EEAS, and it took some time for the EEAS to recruit its staff, in particular the one third coming from the Member States.
The EEAS was also required to respect a so-called budget-neutrality condition and didn’t receive resources for support functions. Still, it’s not certain that this budget condition was fully respected.
The report mentions duplication of functions between the EEAS and other EU institutions, and cumbersome procedures at the EU delegations. The EEAS added an extra management layer and has twice the number of senior management staff as its predecessors. It also kept the EU special representatives without integrating them sufficiently into its work.
It’s possible that the total costs of the EEAS, compared with the previous structures, including those in the Member States, have been reduced somewhat. But if the intention was to exploit synergies with the Member States through co-location of embassies or consular representation in third countries, the report states that much still remains to be done.
Some of the shortcomings in the EEAS are of its own making. A top-down approach was put in place. The report mentions the cumbersome validation process for briefing requests, and possibly also press releases, which results in unnecessary delays. Despite her busy schedule, the HR interviewed all short-listed candidates for head of delegation posts.
Compared with, for example, the Commission, the EEAS neither established a competency framework for managers nor used assessment centers for management positions. Significant gender and geographical imbalances remained, or may even have increased, in the EEAS.
Political reporting isn’t shared within the EEAS because not all staff, not even all heads of delegation, have security clearance.
To sum up: the scope of the audit was limited to the establishment of the EEAS and related internal issues, where it identified insufficient preparatory work and a number of shortcomings or weaknesses (not all of them mentioned in this article). While the audit didn’t examine the decision-making process in any specific crisis situation, let alone the EEAS’s capability to foresee events in the EU’s neighborhood, one can assume that there is a relation between the audit findings and the EEAS’s overall performance so far.