luigireggi.eu

31/10/11 Open Policy

The map of EU Structural Funds Transparency at regional level – October 2011

According to the current 2007-13 regulation, all regional and national agencies responsible for managing one of the 434 Operational Programmes funded by the 2007-13 Structural Funds must publish on the web a list of businesses or public authorities that have received public funding and the amount of funding received. But the way they do this varies greatly across Europe.

The second output of our evaluation activity on the availability and quality of open data on European Structural Funds is now being published. It’s a benchmarking report (in Italian only, at least for now) that I prepared with the help of my colleague Chiara Ricci for the DG Regional Policy of the Italian Ministry of Economic Development.
I had the chance to present it at the Annual Meeting between the European Commission and the Italian Managing Authorities of the European Regional Development Fund (ERDF), held on October 27-28, 2011 in Rome. It was an extraordinary opportunity to talk about the benefits of open government data in front of a number of high-level representatives of all regional and central institutions involved in the implementation of Regional Policy in Italy (here is my presentation).

The report features brand new data on detail, accessibility, formats, and other characteristics of the datasets on the recipients and the projects funded by European Regional Policy (“lists of beneficiaries”). It’s a new wave of data collected in October 2011, exactly one year after the first web-based survey. A total of 32 characteristics are taken into account in the evaluation process, including the presence of search masks and visualization systems.

The map of Structural Funds transparency reported below shows a core component of this research, that is, the format of the data published by each region. The map shows the average score of all the regional and multi-regional programmes that have an impact on that specific territory. A very low score is attributed to PDFs and to HTML reports that split the data into multiple tables or pages (regions in red or orange). Higher scores are assigned to the XLS format, which is machine-processable (in yellow). The highest scores are attributed to the few regions in Europe that publish data in an open format such as CSV (in green), since no data is currently published in XML, JSON or RDF (see the report for the details of the construction of the index). You can also find the link to the datasets by clicking on each region. The link to the Regional Programme is displayed where there is more than one dataset available.
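The scoring logic described above can be sketched in a few lines of Python. The numeric weights below are illustrative placeholders, not the actual values used in the report:

```python
# Sketch of the format-scoring step described above.
# The weights are hypothetical; see the report for the real index construction.
FORMAT_SCORES = {
    "pdf": 1,   # lowest: not machine-processable
    "html": 2,  # low: data often split across multiple tables or pages
    "xls": 3,   # machine-processable, but proprietary
    "csv": 5,   # open, machine-readable format
}

def region_score(programme_formats):
    """Average the format scores of all programmes covering a region."""
    scores = [FORMAT_SCORES.get(fmt.lower(), 0) for fmt in programme_formats]
    return sum(scores) / len(scores) if scores else 0.0

# A region covered by one CSV-publishing and one PDF-publishing programme:
print(region_score(["csv", "pdf"]))  # 3.0
```

The per-region average is what gets mapped onto the colour scale: a region served only by PDF-publishing programmes stays red, while mixed coverage lands in between.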

One year after the first survey, the level of openness has not improved. About two-thirds of EU Operational Programmes still publish their data in PDF, while only 2% use open formats. A radical change is necessary to meet the requirements of the new 2014-2020 regulations proposed a few weeks ago by the European Commission, which include the use of CSV or XML formats.

10/10/11 Innovation Policy

Ex-ante conditionalities for Regional Innovation Policies

As I reported last week, the European Commission has presented the proposals for the new 2014-2020 EU Regional Policy regulations, and the EU regions are currently discussing the future of the policy at the Open Days 2011. Although these drafts need to be definitively adopted by the Council and the European Parliament by the end of 2012, this step represents a milestone in a long process. On the one hand, it is the final product of a two-year-long discussion started in the high-level group reflecting on future Cohesion Policy, a support working group composed of representatives of the Commission and the Member States. On the other hand, it marks the beginning of the 2014-2020 programming phase.

A new principle is introduced: Regional Policy will not finance a Member State programme until a set of ex-ante conditionalities – specific conditions and pre-requisites at the national level – is fulfilled. Ex-ante conditionalities were first envisioned by the European Commission in the framework of the debate on the reinforcement of economic governance [pdf]. They were then reaffirmed in the Fifth Cohesion Report and further developed in a high level group document [pdf] proposed by the Commission in December 2010.

A list of the conditionalities proposed can be found in the general regulation. As for research and innovation (R&I), the conditionality is as follows: “the existence of a national and/or regional innovation strategy for smart specialisation in line with the National Reform Program, to leverage private R&I expenditure, which complies with the features of well-performing national or regional research and innovation systems”.

The focus of the conditionality is in fact only on the strategic level, that is, on the existence of policy documents that, based on tools such as technological forecasting, select a limited number of policy priorities and industry sectors to support. In particular, the proposal of the EU Commission states that Member States must:

  • define an innovation strategy for smart specialisation that is based on a SWOT analysis to concentrate resources on a limited set of R&I priorities, outlines measures to stimulate private RTD investment and contains a monitoring and review system;
  • adopt a framework outlining available budgetary resources for R&D;
  • adopt a multi-annual plan for budgeting and prioritization of investments linked to EU priorities (ESFRI).

The need for a strategic view of regional policies in general and of innovation policies in particular (see for example the much-needed smart specialization approach) is evident. But is such a strategy-focused approach enough to ensure a better policy and a better use of public money?

First, given the essential nature of R&I policy, the answer should be “no”. As stated in one of the underlying papers of the Barca Report on the future of Cohesion Policy, “Putting money into a strategic R&D plan may not always lead to a new technology, or even less to the adoption of that technology”. In other words, identifying regional strengths and comparative advantages is correct, but may not lead to the desired results.

Secondly, a strategic document that is compliant with the Commission’s requests is easy for a regional government to obtain. For example, the regional strategy could be written by a consultancy and adopted without any active involvement of regional and local agencies or, even worse, of stakeholders. I don’t mean that this is the usual way to define a regional strategy, but it happens.
Thirdly, a strategy defines policy objectives, priorities, targets and indicators, but does not describe exhaustively how these objectives are going to be reached. And we know that the devil is in the details.
This is why other kinds of ex-ante conditionalities should be taken into account, specifically those related to policy implementation. For example, one conditionality could focus on the specific content and characteristics of the project selection criteria included in the Operational Programmes. Since Operational Programmes are generally set up after the definition of the “contract” between the Member State and the European Union, this solution may be more difficult to implement – because Member States would first have to approve the Operational Programmes and the selection criteria and then wait for the EU money – but it would ensure a tremendous improvement in the quality of the selected interventions.

For example, a conditionality could be applied to the selection criteria by making sure they incorporate the results of ex-post evaluations of past interventions and the lessons learned from policy failures. The conditionality could also be leveraged to systematically introduce a multi-stage approach to funding projects. In this case, a project-level conditionality is applied both to final and to intermediate results through in itinere and ex-post evaluations.

06/10/11 Open Policy

Structural Funds 2014-2020 open up to open data

This is very good news for the open data movement. Earlier today Commissioners Hahn and Andor presented the European Commission’s proposals for the new Structural Funds regulations for the period 2014-2020. The new general regulation includes an article that forces EU countries and regions to open up their data on the projects and beneficiaries of Regional Policy. EU Regional Policy (or Cohesion Policy) is worth €376 billion, more than a third of the entire budget of the Union.
In the last few months, technical and policy recommendations on how to improve the transparency rules of this policy were provided by two studies (one commissioned by the European Parliament and the other by the DG Regional Policy [pdf]) and one independent web-based survey. Plus, organizations such as Transparency International have advocated better rules and practices, such as the creation of a centralized website that lists all EU funds beneficiaries and publishes the data according to the 8 principles of Open Government Data.

The good news is that most of these recommendations have been incorporated in the drafts of the new regulations. In particular, Art. 105 (Chapter II, Information and Communication) states that EU countries “shall in order to ensure transparency in the support of the Funds maintain a list of operations by operational programme and by Fund in CSV or XML format which shall be accessible through the single website or the single website portal providing a list and summary of all operational programmes in that Member State”. It has been demonstrated that the presence of a single website covering all data from local institutions is likely to improve the performance of the country in terms of transparency.

The minimum set of information to be provided – currently limited to three items – has been extended to cover interesting new data such as the postcodes of beneficiaries. The data fields that must be included are listed in Annex V:

  • Beneficiary name (only legal entities; no natural persons shall be named);
  • Operation name;
  • Operation summary;
  • Operation start date;
  • Operation end date (expected date for physical completion or full implementation of the operation);
  • Total eligible expenditure allocated to the operation;
  • EU co-financing rate (as per priority axis);
  • Operation postcode;
  • Country;
  • Name of category of intervention for the operation;
  • Date of last update of the list of operations.
  • The headings of the data fields and the names of the operations shall be also provided in at least one other official language of the European Union.
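As a sketch, the Annex V fields listed above could be serialized into the required CSV like this. The snake_case column names and sample values are my own assumptions for illustration; the regulation prescribes the fields, not a file layout:

```python
import csv

# Header row paraphrasing the Annex V data fields listed above
# (column names are hypothetical, not an official template).
FIELDS = [
    "beneficiary_name", "operation_name", "operation_summary",
    "start_date", "end_date", "total_eligible_expenditure",
    "eu_cofinancing_rate", "postcode", "country",
    "intervention_category", "last_update",
]

rows = [{
    "beneficiary_name": "Example Ltd",   # legal entities only, no natural persons
    "operation_name": "Broadband roll-out",
    "operation_summary": "Rural broadband infrastructure",
    "start_date": "2014-03-01",
    "end_date": "2015-12-31",            # expected date of physical completion
    "total_eligible_expenditure": "1200000",
    "eu_cofinancing_rate": "0.50",       # as per priority axis
    "postcode": "00100",
    "country": "IT",
    "intervention_category": "ICT infrastructure",
    "last_update": "2014-06-30",
}]

with open("list_of_operations.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
```

A second header row (or a parallel file) would be needed to satisfy the requirement that headings and operation names also appear in at least one other official EU language.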

In my opinion, this proposal is probably a good compromise between the need to introduce new, more transparent ways to publish data and the current level of technical and administrative capacity of EU regions. However, a few important features characterizing real open data are still missing. For example, the data should also be released in a linked-data format such as RDF. Plus, a clear indication of the license under which the data are released should be provided. Introducing these features now – even though RDF may still seem rather advanced for many administrations – is particularly important, since it is rather difficult to modify a multi-annual regulation once it is approved.

Obviously, it will be crucial to monitor the actual implementation of these rules across the European Union. Luckily, official regulations have proven to be a powerful tool, far more persuasive than voluntary initiatives such as the European Transparency Initiative undertaken by the Commission. As already demonstrated, the level of compliance with regulations among EU agencies is extremely high, given that the Commission has the power to stop the flow of money from the EU to the regions if the rules are broken.

02/10/11 Digital Government , Research

A holistic view for Public e-Services diffusion and impact: Introducing project T.A.I.P.S.

One of my first posts on the Regional Innovation Policies blog was about “traditional” public e-services – as opposed to new Government 2.0 applications – and their still slow diffusion in many countries in Europe and around the world. My point there was that the low take-up of public e-services, which some consider the main reason for the failure of digital government, was probably simply due to a shortage of… public e-services.

While most critics of EU e-government policy point only to the lack of interest of households and enterprises in expensive and unsustainable digital public services, I think we should also consider that today a significant number of public agencies – especially in the lagging regions of the world – fail to deliver their most useful basic public services online. As for e-government services, most of them were pushed by national governments in the first years of the new millennium and are already available on the web with an acceptable level of sophistication (see, for example, the list of twenty basic public services in CapGemini’s latest benchmarking report). The situation is very different at the local level, though, where small agencies struggle to provide services with less money and face complex coordination issues with scarce skills.

Moreover, if we zoom out and consider advanced services from other recently developed domains of digital government such as e-health, e-procurement, e-education, infomobility, “smart” cities, etc., the supply-related issues are manifest.

In other words, measuring the progress of digital government requires a holistic view to include the wide spectrum of public e-services in different policy domains (health, transportation, education, etc.) and the different aspects of service provision (not just e-readiness or web interactivity, but also multi-channel availability and take-up).

Providing this view is the main goal of TAIPS (Technology Adoption and Innovation In Public Services), a research project carried out by the Department of Economics, Society and Politics (DESP), University of Urbino (Italy) and funded by the European Investment Bank (EIB), which aims at exploring the determinants and impact of public e-services diffusion from the point of view of the Economics of Innovation. The project is led by Professor Antonello Zanfei, an industrial economist whose interests range from innovation diffusion to industrial dynamics and the economics of multinational enterprises.

A few weeks ago the first outputs were released. One paper, entitled What do we know from the literature on public e-services?, provides quantitative evidence that ICT research, as happens in policy making, still treats the various policy domains as separate silos. The next step of TAIPS will be to unify those views: a pilot exercise benchmarking the progress of Italian regions with a joint e-services methodology is under way, and will eventually be extended to selected EU countries.

Plus, the TAIPS staff is organizing an international conference in Urbino, Italy on April 19-20, 2012. Here you can download the outline. The deadline for abstract submission is pretty soon (Wednesday, October 5), but will probably be extended a little. The conference promises to be interesting, since many invited speakers – leading scholars in the field of Economics of Innovation and Information Technology – have already confirmed their participation. I will report again on this in the next few weeks, so please stay tuned!

26/09/11 Open Policy

Open Budget and Open Data on Public Policies

Open budget and open data on public funding are two fundamental aspects of transparency and accountability. Here two indexes are compared: the Open Budget Index by the International Budget Partnership, and the index based on the 8 principles of Open Government Data that measures the transparency of the lists of beneficiaries of European Regional Policy.

Transparency of public budgets and public policies is a key element of an effective and accountable government. Access to information on the use of public money is crucial to ensure effective participation, to generate trust and credibility in public choices – even in hard times – and to improve the effectiveness of the interventions.
It’s interesting to compare two composite indicators on openness and transparency of public funding in Europe:

  • the Open Budget Index (OBI), released by the International Budget Partnership (IBP) every year, analyzes budget transparency in 94 countries around the world (here is the full 2010 report). The index is composed of two pillars (“Availability of Budget Documents” and “Executive’s Budget Proposal”) and 92 qualitative variables, aggregated using a simple mean. The data are collected through a questionnaire by a network of independent organizations.
  • the index of transparency of EU Regional Policy (Structural Funds) that I put forward in this paper, published in the last issue of the European Journal of ePractice. It measures the openness and transparency of the data on the beneficiaries of the European funds that all regions and member states acting as Managing Authorities of the policy must publish on the web. The evaluation is based on the Eight principles of Open Government Data.

While the Structural Funds transparency index is calculated for the whole of Europe, the OBI is available for only 14 European countries, including almost all the main member states.

The first thing to note is that there is no correlation at all between the two indicators. The best-performing countries in one index are the worst-performing in the other. France is perhaps an exception, with very good results in open budget and a quite good score in Structural Funds transparency (mainly due to a centralized platform that provides information about all beneficiaries of regional programmes across the country).
This lack of correlation can be explained by considering the different phenomena the two indicators aim to describe: the OBI methodology mainly focuses on the quantity and detail of the information disclosed, while the index on the transparency of EU policy mainly considers the quality and format of the data.
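A comparison like this boils down to computing a correlation coefficient over the matched country scores. A minimal sketch, with made-up placeholder scores rather than the actual index values:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical country scores: Open Budget Index vs. the Structural Funds
# transparency index (illustrative numbers only).
obi = [87, 83, 75, 62, 55, 40]
sf_transparency = [30, 45, 25, 70, 35, 80]
print(round(pearson(obi, sf_transparency), 2))
```

A value near zero (or negative, as in the pattern described above, where the best performers on one index are the worst on the other) confirms that the two indices capture different phenomena.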

Secondly, at least two groups of countries seem to emerge. A first group (in green), located at the top left of the graph, includes the UK, France and Sweden. All the other countries (in red) show lower values of the OBI and quite similar values of the Structural Funds indicator, with the exception of the Czech Republic and Slovakia, which achieved very high scores.
While the green group has a pretty long tradition of openness and accountability, the very good performance of the Eastern European newcomers is probably due to the positive role the European Commission is playing in that region to promote the transparency of the programmes funded by EU policies.

19/09/11 Civic Technology

Open Data to the next level: WHY and HOW to involve the private sector

The attention of civil society and policy makers is now turning to uncharted lands: open data from the private sector can be mashed up with governmental data to create new apps and services. The open data portal created by Enel – a leading Italian power company – is a step in the right direction.

While the open data movement is spreading within the public sector – with very interesting initiatives at both local and international level – the attention of civil society and policy makers has turned to an uncharted land: open data from the private sector. The need to involve businesses in the open data movement emerged quite clearly at the first European Digital Agenda Assembly, held in Brussels on 16-17 June 2011. In particular, the European Commission aims to stimulate more private participation in open data initiatives, and is considering specific actions to promote the re-use of big datasets held by large private sector organizations.

Professor Nigel Shadbolt, a member of the UK Government’s Public Sector Transparency Board, outlines the benefits of an open data strategy in an article published in Think Quarterly. “Open data offers the prospect of instant connectivity between partners, as in open supply chains, where businesses source from places they might never have considered or even suspected could be a source. Open data can reduce integration costs, improve transparency and harness the innovation of others. If you release your data then others will develop applications that make best use of it – providing new services that benefit you directly, like all of those free travel apps that the travel companies didn’t have to write, but which nevertheless drive people onto the transportation network”.

Following the example of other companies such as SimpleGeo in the US, Enel – Italy’s largest power company and a key player in the European market – is now opening up a first set of datasets. The company, which originally launched an open data portal on 23 August under a Creative Commons BY-NC-ND license disallowing commercial re-use, earlier today changed the license to CC BY, which merely requires re-users to credit Enel as the source of the data. Datasets include economic and financial information about the company and “sustainability data”, which comprise data on the generation, distribution and sale of electricity and gas.

Raffaele Cirullo, head of the New Media unit at Enel, reported on Enel’s strategy to the Spaghetti Open Data (SOD) mailing list. As a first step, an initiative entitled Enel Sharing was launched in 2008 to harness the power of social media to promote the brand amongst stakeholders and disseminate the company’s cultural initiatives. The unit then focused on emerging innovations in the field of new media as a way to introduce a new culture of sharing within the group. Open data is of course one of the most interesting paradigm shifts, with a major marketing impact in the private sector. These are the main goals of the Enel Open Data initiative:

  1. improve the market by fostering competition;
  2. increase transparency by increasing participation;
  3. favor technological innovation by encouraging the development and spread of new applications, mash-ups and data visualization systems.

Personally, I very much share the opinion of Lorenzo Benussi – researcher at the NEXA Center for Internet & Society of Politecnico di Torino – who jumped into the discussion with a message to the SOD mailing list on the possible advantages of the diffusion of the open data model in the private sector. First, open financial data on corporate accounting may lead to more effective control of global markets. Secondly, information on business assets, processes and activities is of great interest to the public and can be mashed up with governmental data on the matter. Some examples: information about natural resources provided by the oil industry, power and communication grids, shipping logistics, etc.

14/09/11 Civic Technology

Open Data up for adoption

“Linea Amica” – the integrated contact center of Italian public administration – is opening up and crowdsourcing a set of data underlying its information services to local communities. Everyone can adopt a record of the dataset and help the government solve major data quality issues.

An interesting initiative, with an unusual marketing approach, was launched last week by FormezPA, an agency of the Italian Government: Linea Amica – the official integrated contact center of the Italian Public Administration – is giving its data up for adoption. This is the message displayed on the webpages of RubricaPA, a service that allows users to find and locate a public agency by searching among thousands of national, regional and local authorities. The service now lets users modify the underlying data by submitting more accurate or updated information on an agency’s location, telephone number or certified email.

The process is simple: you modify a set of data through a form, your suggestion is evaluated by the staff, and, if accepted… you have adopted that specific record. This means that the staff at the ministry considers you somehow responsible for that data and its changes over time. Something that may (or may not) create a sort of personal bond with the data itself. Or even an act of love, to quote Alberto Cottica’s definition of a social network.

RubricaPA started publishing open data on public agencies’ addresses, fiscal codes and certified emails in October 2010 under the Italian Open Data License v1.0 (which builds on Open Data Commons and Creative Commons BY-SA) – a step forward by the national government towards open data. But the dataset, created by matching data from different sources (official statistics, central registers, older similar projects), is flawed by data quality issues and missing values. Some information is outdated or inaccurate, sometimes conflicting. That is why a little help from the crowd may become crucial. In fact, this is the first time that a central, official service sponsored by the Ministry of Public Administration has resorted to crowdsourcing techniques to address major data quality issues.

The question is: who should be interested in helping “Linea Amica” improve its information services? The promoters hope to actively involve local public servants and citizens who care about their local community and want a major state-wide service such as the Linea Amica help line to use the correct information. “This is my data, I should care”.

We will see if this kind of love is enough to get the right level of participation.

07/09/11 Research

A strategic balance for open government data publication

A rather long debate on how to publish open government data is still dividing stakeholders and researchers. Should governments develop their own tools for data visualization and analysis in order to include non-technically oriented citizens?


The debate on how to publish open government data is dividing public servants, open government advocates and researchers into – at least – two main groups.
A first group – civic hacker organizations and, not surprisingly, academic literature – focuses on the “invisible hand” of private sector or civil society organizations, which is able to reuse PSI and to mash up this information with other sources to create new, innovative services. In this view, the government should only publish high-quality data in an open, machine-readable format and let others do all the rest.
Others point to the risks of the so-called “data divide” or, from a public value perspective, think that government should consider different users’ needs and adopt a more pro-active approach, e.g. by elaborating on its data directly on governmental websites:

  • Interesting points on the “data divide” or, more generally, on “open data inclusion” are raised, for example, on Michael Gurstein’s blog. Moreover, in the comments of this World Bank blog post, Tim Davies highlights the importance of spreading the skills to access, work with and interpret data widely amongst policy makers and local communities.
  • The public value perspective is introduced in this paper from the Center for Technology in Government (CTG), Albany, NY. Basically, this approach suggests that government should consider different users’ needs and the impact of a set of value generators on different groups of users.

So, what should public agencies do to ensure data inclusion and public value generation?

I recently presented a paper at EGOV 2011 conference entitled “Information strategies for Open Government in Europe: EU Regions opening up the data on Structural Funds”. In the paper I identified three groups of European Public Agencies publishing the data on the beneficiaries of EU Regional Policy:

  1. Agencies that publish the data in PDF, with little information and detail on projects and financial data;
  2. Agencies that focus on data quality, detail, accessibility and machine-readable formats;
  3. Agencies that focus on data visualization, maps, graphs and interactive search – but only a few of them let the user download the underlying raw data.

The second group seems to be following a good strategy from an “invisible hand” point of view, but lacks actions to include non-technically oriented citizens. The third, even if it can be argued that it is not pursuing a truly “open” data approach, shows some interest in data inclusion, since it presents the data in an “easier” way (maps, etc.) and/or in aggregated form, which is useful for non-technically oriented citizens.

One conclusion that can be drawn is that both approaches are necessary. But is it really necessary that every agency develop its own data visualization tools? How many tools are needed for the same kind of data (e.g. beneficiaries of EU funding) across EU regions? What is the minimum set of information (metadata, notes from the public administration suggesting a correct interpretation, etc.) required for this kind of data?
For example, in the case of the European Common Agricultural Policy: should each State develop its own geo-referencing tools and maps, or let Farmsubsidy.org do all the work?

25/07/11 Research

US and EU in search of an Open Government R&D agenda: 44 topics in 4 clusters

Open Government is not only changing politics and policies but is also redefining the notion of established research areas such as e-government and e-democracy. The world of research – with the active participation of practitioners – needs to define an Open Government R&D agenda for the years to come.

In this post – just a note to myself – I list some interesting research topics classified into 4 main areas.

Barack Obama’s presidential campaign and the US Open Government Directive of December 2009 profoundly changed the way governments around the world conceive the role of ICT in the public sector. Obama’s Directive, which directly (and almost immediately) influenced policy making in most OECD countries and also contributed to the growth of bottom-up initiatives, is now impacting the world of research.

Key questions, such as the actual impact of open government data on citizens and enterprises, remain largely unanswered. It is not just a matter of democratic principles and political messages, or of transparency only. The diffusion of web 2.0 technologies and user-driven innovations in the public sector – along with the new business opportunities coming from the re-use of government data by the private sector – is changing the perspective of interdisciplinary but in practice quite separate research fields such as e-government (focused on the use of ICT in internal processes and public service provision) and e-democracy (focused on citizen engagement through technologies such as online polling and voting, deliberation and consultation). Teresa M. Harrison and her colleagues from the Center for Technology in Government (CTG) at the University at Albany, SUNY made this clear in a paper published a few days ago: “Although e-democracy in political and e-government in administrative realms have historically been largely separated, it now appears Open Government brings these two spheres of activity together”.
On the one hand, the provision of e-government services not only requires technical expertise but also, inevitably, implies political choices. On the other hand, e-government implementation should take advantage of the “power of the crowd” and the opportunities that come from involving the citizen and the private sector in new forms of public-private collaboration.

As boundaries between research domains are blurring, time has come to define an Open Government holistic framework and a global Open Government R&D agenda.

In Europe, the CROSSROAD project, a Support Action funded by the European Commission, has produced a Research Roadmap for “ICT for governance and policy modeling”, as defined by Objective 7.3 of the EU Seventh Framework Programme (FP7) 2009-2010. A white paper published in December 2010 and edited by Fenareti Lampathaki, Sotiris Koussouris, Yannis Charalabidis and Dimitris Askounis (National Technical University of Athens) identifies five main research themes and a three-level taxonomy. Because its point of view is the broader concept of ICT for governance and policy making, the roadmap includes useful tools and research domains that are not usually considered in the current debate on Open Government. One example is public opinion-mining tools, which could be used to find out, say, which types of citizens care about which types of government information. Other examples are the technologies the EU classifies under “Future Internet” studies, some of which (e.g. the Internet of Services) depend on the availability of government linked data.

In the US, the President’s Council of Advisors on Science and Technology (PCAST) highlighted the importance of establishing an R&D agenda for open government in a report issued in December 2010. The Open Government Research & Development Summit was hosted on March 21-22, 2011 by the Networking and Information Technology Research and Development (NITRD) Program. The summit, which brought together government leaders and researchers to explore the needs of the community, was organized by the office of the U.S. Chief Technology Officer Aneesh Chopra, while Beth Noveck, a law professor at New York Law School, was one of the prime movers behind the meeting.
Building on this first event, a workshop organized by the Center for Technology in Government (CTG) in Albany, New York on April 27-28 gathered a number of academics and practitioners and, above all, hundreds of still-unanswered research questions. These questions were clustered into homogeneous groups such as “the value / ecosystem of Open Government”, “What do citizens want?”, “Government capabilities”, etc. As a second step, the research questions were examined through four lenses: 1) law and policy, 2) management, 3) technology and 4) cross-cutting. Professor Ines Mergel reported on this in her blog: day one and day two. A full list of the questions is now available in a CTG report prepared by Meghan Cook and M. Alexander Jurkat, which also includes an interesting list of the biggest challenges in Open Government as perceived by the participants.

The EU CROSSROAD project and the US CTG workshop came up with quite similar research themes and questions, with the CTG themes largely contained in the first section of the CROSSROAD taxonomy, “Open government Information and Intelligence for transparency”. Other CROSSROAD areas partially shared with the US approach are, for example, “Social computing, citizen engagement and inclusion” and “Identity management and trust in governance”.

In the following table I try to combine some of the most interesting aspects of the CROSSROAD and CTG exercises: a robust identification of research clusters and the use of “lenses” corresponding to different disciplines.
Questions and themes are grouped on the basis of data and information flows from government to citizens and back from citizens and businesses to government. With reference to the figure:

  1. Open / linked data “supply side”: how to foster meaningful and useful government data publication? What implications and impact within government agencies?
  2. Open / linked data “demand side”: how to meet citizen and business needs? How to support data use and re-use?
  3. Social computing: how to involve the citizen in collaborative projects and activities?
  4. Citizen engagement: how to involve the citizen in democracy?

For each combination of cluster / research theme and lens / research discipline, I list some examples of questions and topics particularly interesting to me.

18/03/11 Research

Towards EU Benchmarking 2.0 – Transparency and Open Data on Structural Funds in Europe

The first output of a web-based survey shows that the European Cohesion Policy is only halfway to accomplishing a paradigm shift to open data, with differences in performance both between and, in some cases, within European countries.
Low scores are attributed to the formats the authorities choose when publishing their data on the web, while other indicators, such as the level of granularity, are positively influenced by the requirements of the current regulations.

The availability of open data on the projects and beneficiaries of the European Cohesion Policy (or Regional Policy) – the second-biggest EU policy after agriculture, with a budget of EUR 347 billion for the period 2007-13 – can surely help foster transparency in the use of public money in Europe.

The European Union lacks common initiatives, such as the US Recovery.gov or USAspending.gov, to track government spending and improve the transparency of public policies. In the case of the Structural Funds in particular, there is no single point of access to the data, since each EU region and national agency acting as a Managing Authority of the Funds is responsible for publishing the data on its beneficiaries and the amounts of public funding received. This means that hundreds of Managing Authorities are following different paths and implementing different information strategies when opening up their data. Many databases (often simple PDF lists) are now uploaded to regional or national institutional websites, showing huge variation not only in the way they can be accessed (formats, search masks, data visualization, etc.) but also in the content and quality of the data provided (level of detail, granularity, description, etc.).

Last summer, after a first analysis of the prevailing formats, I started to design an independent web-based survey on the overall quality of the data published by each Managing Authority responsible for the 434 Operational Programmes approved in July 2009. The data was collected in October 2010 by me and Chiara Assunta Ricci, a brilliant PhD student in Economics at La Sapienza University of Rome. We were inspired by what the people behind the fantastic Farmsubsidy.org project had been able to do with the data from the other big EU policy, the Common Agricultural Policy. While their greatest achievement is having gathered the PDF documents every Member State has to publish online into one real database, they also evaluate the lists of projects they have used through a transparency composite indicator. The same exercise could be applied to the Structural Funds.

The first output of the survey was published a few days ago in the European Journal of ePractice. Here you can download the paper, and here the full issue, “The Openness of Government”. The paper builds on David Osimo’s seminal proposal for a “Benchmarking 2.0” and represents a pilot of a measurement framework for comparing governments’ efforts to make data available.

This exercise could be a first step towards improving the current ‘traditional’ EU e-Government benchmarking. Indeed, the latest edition, “Digitizing Public Services in Europe: Putting ambition into action – 9th Benchmark Measurement” (pages 19 and 137), confirms the importance of updating and expanding the scope of the analysis by including new metrics on “Transparent and Open Government”.
The evaluation scheme is based on the Eight Principles of Open Government Data, which are considered a key reference and a worldwide de facto standard. The scheme is meant to be flexible and could be applied to other kinds of government data. Gianfranco Andriola, one of the promoters of the Italian Open Data Licence, helped me define the methodological approach for the “format” and “licence” principles. I must also thank the “man behind the curtain” of the Spaghetti Open Data initiative, Matteo Brunati aka Dagoneye, for his suggestions about open formats, and Sergio Scicchitano for his advice and support.
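To give a feel for how a composite indicator of this kind works, here is a minimal sketch in Python. The principle names follow the Eight Principles of Open Government Data, but the scores, weights, and format values below are my own illustrative assumptions, not the ones used in the report:

```python
# Hypothetical sketch of a composite transparency indicator.
# Per-principle scores and weights are illustrative assumptions,
# not the values used in the actual survey.

FORMAT_SCORES = {"pdf": 0.0, "html": 0.25, "xls": 0.5, "csv": 1.0}

def dataset_score(characteristics: dict, weights: dict) -> float:
    """Weighted average of per-principle scores, on a 0-1 scale."""
    total_weight = sum(weights.values())
    return sum(weights[p] * characteristics.get(p, 0.0)
               for p in weights) / total_weight

# Example: a regional dataset published as XLS, complete and timely,
# but without an open format or an explicit open licence.
example = {
    "complete": 1.0,
    "primary": 1.0,
    "timely": 1.0,
    "accessible": 0.5,
    "machine_processable": FORMAT_SCORES["xls"],
    "non_discriminatory": 1.0,
    "non_proprietary": 0.0,
    "licence_free": 0.0,
}
weights = {p: 1.0 for p in example}  # equal weights, for simplicity

print(dataset_score(example, weights))  # 0.625
```

With equal weights this is simply the mean of the per-principle scores; a real index could weight, say, machine-processability more heavily.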

Results can be summarized as follows:

  1. The European Cohesion Policy is only halfway to accomplishing a paradigm shift to open data, with differences in performance both between and, in some cases, within European countries. The best-performing countries, the Czech Republic and Finland, score 71%, while the worst-performing Member State is Latvia with 25%. Countries from Eastern Europe often appear in the upper half of the ranking.
  2. Very low scores are attributed to the formats the authorities choose when publishing their data on the web, while other indicators, such as the level of granularity, are positively influenced by the requirements of the current regulations.
  3. A considerable difference in performance emerges when comparing datasets that are shared and centralized at the national level with those managed by a single regional authority. This variation is statistically significant across all the indicators examined, probably because in a centrally managed programme information flows are easier to manage and local actions easier to coordinate.
  4. The use of open, machine-processable and linked-data formats has unexpected advantages in terms of transparency and re-use of the data by the public and private sectors. Applying these technical principles requires no extra budget or major changes in government organization and information management; nor does it require updating existing software and infrastructure. What is needed today is to promote a culture of transparency among national and local authorities and to raise awareness of the benefits that could derive from opening up existing data and information in a re-usable way.
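To make the re-use argument concrete: once a list of beneficiaries is published as CSV rather than as a PDF table, aggregating it takes only a few lines of standard tooling. The column names and figures below are invented for illustration:

```python
import csv
import io
from collections import defaultdict

# A tiny beneficiaries list in CSV form (column names and
# amounts are invented, purely for illustration).
raw = """beneficiary,operation,amount_eur
ACME Srl,Broadband rollout,120000
Comune di Esempio,Urban regeneration,300000
ACME Srl,Training scheme,80000
"""

# Total public funding per beneficiary.
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["beneficiary"]] += float(row["amount_eur"])

for name, total in sorted(totals.items()):
    print(f"{name}: {total:,.0f} EUR")
# ACME Srl: 200,000 EUR
# Comune di Esempio: 300,000 EUR
```

The same aggregation against a PDF list would first require error-prone scraping or manual re-typing, which is exactly the barrier the open-format principles are meant to remove.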

