10/10/15 Research

Pfeffer, Power and the Open Source Community: writing software as a political process

Managing with Power by Jeffrey Pfeffer presents a detailed and comprehensive analysis of power in all its forms. The author painstakingly reviews numerous sources of power, as well as strategies and tools to employ it effectively. The central message of his “clinical diagnosis of power” (p. 300) is that power is a necessary condition for action: even the most brilliant ideas require power to be developed, diffused, and executed.

The book, although deeply rooted in the literature, has been accused of cynicism. Many of the “heroes” presented as examples – prominent individuals, smart enough to get power and keep it, at least for a while – seem to perceive power as a zero-sum game. In this view, every means of getting power can always be justified by the ultimate goal of “getting things done,” a mantra repeated many times throughout the book.

While this is probably true of the Machiavellian examples that Pfeffer includes in the book, his relentless categorization of the forms of power can also be read as a practical manual for the brilliant, honest member of an organization who just needs enough power to have his voice heard and make a change.

Indeed, the wide spectrum of categories of power represents a valid toolbox that can be applied to different situations and contexts. While the chapters of the book can be seen as different components or features of power, to be activated or not depending on the specific case, the interaction between these elements is crucial for analyzing real-life examples.

Here I would like to apply some of the most relevant elements of power to the case of the Open Source community, with the aim of showing how Pfeffer’s points hold even in a context that is sometimes perceived as unconditionally egalitarian, collaborative and open.


The Open Source community

Community is the key component of Open Source software (OSS) development. OSS is in fact based on free access to and redistribution of the underlying code, which each developer shares with the community of developers as part of a global collaborative effort towards the production of what is seen as a commons [1].

Software developed as Open Source is widely adopted and its production model is so successful that some of the leading software companies in the world – including Microsoft – heavily invest in it. The model is based on a bottom-up structure, a non-coercive organization and a largely decentralized production [2]. OSS developers share a common culture (“hacker culture”) and ideology that, especially during the early days, has been called “Microsoft-phobia” [3]. This “alternative” view of software also derives from purely technical considerations, as commercial software is perceived as not totally reliable, and is consistent with the principles of self-production and user-driven innovation [4].


Entering the Community: Power Diagnosis and Allies

According to Pfeffer, the first step to getting power is to diagnose “the relative power of the various participants and comprehend the patterns of interdependence” (p. 49). It’s important to define the relevant political sub-units, the existing social ties, and the reputational and representational indicators of each individual. Bergquist and Ljungberg [5] studied the case of a newbie (an inexperienced newcomer to OSS development) who wishes to enter the OSS community. They highlight the importance of studying the existing social interrelations and networks, as well as community norms and values (p. 312).

In particular, to establish a sense of loyalty to the community, a newbie should understand the power relations in order to make allies. As Pfeffer points out, making allies is a crucial strategy for getting power. Allies can be acquired thanks to obligations and favors. In the OSS community, the whole game is based on the “mutual interchange where one gift is given to another,” and a sort of interdependence is created between the giver and the receiver [5]. A complex network of givers and receivers then forms, in which the relations can be one-to-one but also one-to-many. The search for allies takes place on the Internet through shared online communication tools. There, conversations are both public and private: alliances are gathered mainly through private conversations, followed by an alignment of the arguments in the public forum [6].


Reputation, Performance and Formal Authority

In “Managing with Power,” Pfeffer points out that formal authority, reputation and performance represent key, interrelated sources of power.

Reputation, in terms of peer recognition and prestige, is one of the main driving factors that motivate OSS developers to work and share their products with the community [7]. In the first place, reputation derives essentially from performance. The quantity and quality of the code produced are easily recognizable thanks to shared online production tools (such as GitHub) and act as a signal of the individual programmer’s quality [2]. As Pfeffer highlights, this is a rather rare case in which performance can be measured quantitatively.

Once a developer obtains enough reputation, he also gets some sort of formal authority within a specific project through direct invitation. Sack et al. [8] studied the existing hierarchies among Python developers. Members’ levels span from the newbies at the bottom of the pyramid to Guido van Rossum, the developer who founded the Python project.

Figure 1 – Sociotechnical stratification of roles in the Python project


Source: Sack et al. (2006)


Pfeffer maintains that formal authority “confers control over certain resources and the ability to take certain implied or specified actions.” In fact, high-level members of the OSS community have the power to control crucial resources.
First, they control the source code. In the case of the Python project, although the source code is stored in CVS files that can be read by any member, write privileges are given only to a subset of developers, with evident asymmetries in power relations [5]. Second, high-level members of the community can monitor and sanction members’ behavior. For example, they can “ban” or “mute” a member because of “flaming” or “trolling” [9]. Third, even though the discussion on software development is open to anyone, Sack et al. [8] found that some members – especially those with formal authority – can influence the discussions on specific topics and thus have an impact on product development decisions. The sequence of public messages in the Python forum, and the links between them, suggests that influential people can steer the course of a discussion toward specific arguments and lines of work.


Location in the Communication Networks

“Managing with Power” includes a chapter on the role of communication networks and the individual position in the communication structure. According to Pfeffer, “People who are well placed in the communication network also tend to be central players in terms of power and influence.” Information and knowledge are therefore crucial sources of power, deriving, among other things, from social relations and connections.

In particular, network centrality can measure the degree of influence that an individual can exert over the structured tasks of a project. In the case of Python, Ducheneaut [10] looked at the evolution of the network position of the developer “Fred” from January to October 2002.


Figure 2 – Developer “Fred” position in the communication network of a Python project


Source: Ducheneaut (2005)


Fred is presented as a case of “successful socialization,” since he managed to gain a central position in a specific project within the Python community in less than a year. At the beginning, although he has a strong background in Python development deriving from professional experience in a software company, Fred does not know how the community works and starts by asking questions. Then, “By making connections with some of the project’s participants, Fred is trying to make the structure of this network more visible to himself […] discovering in the process which parts of the network relate to his work” [10]. Once he gains a reputation as a good “bug fixer,” his role in the communication network becomes more and more crucial as other members join the project. By October 2002, he has definitively acquired enough power to influence group decisions: his proposal to introduce a new software module is approved by the community.
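The notion of network centrality used here can be illustrated with a toy version of such a reply network. A minimal sketch of degree centrality (the fraction of other participants a developer communicates with directly); the snapshots and participant names below are invented for illustration, not taken from the actual Python archives:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Fraction of the other nodes that each node communicates with directly."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

# Two hypothetical snapshots of a project's reply network (names invented).
january = [("fred", "anna"), ("anna", "bob"), ("bob", "carol"), ("anna", "carol")]
october = [("fred", "anna"), ("fred", "bob"), ("fred", "carol"),
           ("fred", "dave"), ("anna", "bob"), ("carol", "dave")]

print(degree_centrality(january)["fred"])  # peripheral: talks to 1 of 3 others
print(degree_centrality(october)["fred"])  # central: talks to all 4 others
```

A rise of this kind, from a peripheral to a maximally central position, is the pattern Ducheneaut observed over Fred's first year.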


In conclusion, the case of the OSS community shows that power dynamics matter even in a context that is sometimes seen as the epitome of egalitarian distributed collaboration. According to Bergquist and Ljungberg [5], “One easily gets the impression that the sharing of gifts in online communities creates a very friendly and altruistic atmosphere. And indeed it does, to some extent. But it does not mean that social stratification and struggles over power cease to exist.” In fact, developing software is inherently a political process.


  1. Benkler, Y., Coase’s Penguin, or, Linux and “The Nature of the Firm”. The Yale Law Journal, 2002. 112(3): p. 369-446.
  2. Bonaccorsi, A. and C. Rossi, Why open source software can succeed. Research policy, 2003. 32(7): p. 1243-1258.
  3. Dalle, J.-M. and N. Jullien, Windows vs. Linux: some explorations into the economics of Free Software. Advances in Complex Systems, 2000. 3(01n04): p. 399-416.
  4. Von Hippel, E.A., Democratizing innovation. 2005: MIT Press.
  5. Bergquist, M. and J. Ljungberg, The power of gifts: organizing social relationships in open source communities. Information Systems Journal, 2001. 11(4): p. 305-320.
  6. Divitini, M., et al., Open source processes: no place for politics. In Proceedings of the ICSE 2003 Workshop on Open Source. 2003.
  7. Greiner, M.E., Leadership behavior in virtual communities. In Proceedings of the 7th Annual Conference of the Southern Association for Information Systems. 2004. Citeseer.
  8. Sack, W., et al., A methodological framework for socio-cognitive analyses of collaborative design of open source software. Computer Supported Cooperative Work (CSCW), 2006. 15(2-3): p. 229-250.
  9. Markus, M.L., B. Manville, and C.E. Agres, What makes a virtual organization work: Lessons from the open-source world. Image, 2014.
  10. Ducheneaut, N., Socialization in an open source software community: A socio-technical analysis. Computer Supported Cooperative Work (CSCW), 2005. 14(4): p. 323-368.



Photo by Igal Koshevoy

10/10/14 Digital Government, Research

Regional Governments and ICT policy coordination

Many regional governments in Italy have tried to solve the problem of coordinating different levels of government and different local ICT policies through the creation of ad-hoc public companies. These companies are owned by the regional government itself or by a consortium of local actors, and aim to ensure more flexibility and specific capacity in providing advanced services to provinces and municipalities. This raises questions about their actual efficiency and effectiveness.

A paper with Chiara Assunta Ricci provides a brief overview of recent e-government policies of the Italian regions with a focus on the coordination models between local actors. In particular, the role of regional information technology (IT) public companies is explored through a cluster analysis based on evidence from an ad hoc survey. Advantages and disadvantages of the different coordination models are discussed. In particular, two composite indices are employed: (i) an index of the intensity of coordination at the regional level; (ii) an index of the effectiveness of IT policies, measured as the level of advancement of municipalities in the use of ICTs. The two indices are then compared with the coordination models adopted. Preliminary results show a positive correlation between the two indices, while the presence of an IT public company does not appear to significantly affect either IT performance or the level of coordination.
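The two comparisons described above can be sketched in a few lines. The regional index values and the IT-company flags below are invented placeholders for illustration, not the paper's actual data:

```python
import numpy as np

# Invented index values for six hypothetical regions (placeholders only).
coordination  = np.array([0.2, 0.4, 0.5, 0.6, 0.7, 0.9])    # coordination intensity index
effectiveness = np.array([0.3, 0.35, 0.5, 0.55, 0.8, 0.85]) # ICT policy effectiveness index
has_it_company = np.array([0, 1, 0, 1, 1, 0], dtype=bool)   # region owns an IT public company

# Correlation between the two indices across regions.
r = np.corrcoef(coordination, effectiveness)[0, 1]
print(f"correlation between the indices: {r:.2f}")

# Compare mean effectiveness with and without a regional IT company.
print("with company:", round(effectiveness[has_it_company].mean(), 2),
      "without:", round(effectiveness[~has_it_company].mean(), 2))
```

With data of this shape, a clearly positive r combined with similar group means would mirror the paper's preliminary finding: coordination and effectiveness move together, while owning an IT company makes little difference.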

Here is an earlier (full) version of the paper (in Italian), presented at the XXXIII Annual Scientific Conference of the Italian Regional Science Association (AISRe), Rome, Italy.
Here is the final version published in Economia e Politica Industriale – Journal of Industrial and Business Economics.


IT companies owned by regional governments in Italy (x = dimension; y = no. of activities / topics covered)


Italian Regions (x = ICT policy coordination index; y = effectiveness of ICT policy index; “No IH” = Regions not owning any IT company; clusters = see previous graph)




09/02/14 Research

Do EU regional digital strategies need more balance?

Here is the abstract of my paper “Are EU regional digital strategies evidence-based? An analysis of the allocation of 2007–13 Structural Funds” with Sergio Scicchitano, which was published yesterday in Telecommunications Policy.

The ambitious goals of the European “Digital Agenda” need active involvement by regional innovation systems. Effective regional “digital strategies” should be both consistent with the European framework and based on available evidence on the needs and opportunities of local contexts. Such evidence should be used to balance the different components of the Information Society development (e.g. eServices vs. infrastructures; ICT supply and demand), so as to ensure that they can all unleash their full potential. Therefore, EU regions should spend more money to overcome regional weaknesses than to improve existing assets. In this paper we explore the different strategies of the EU's lagging regions through the analysis of the allocation of 2007–13 Structural Funds. Then, we verify whether such strategies respond to territorial conditions by comparing strategic choices made with the actual characteristics of local contexts. Results show that EU regions tend to invest more resources in those aspects in which they already demonstrate good relative performances. Possible causes of this unbalanced strategic approach are discussed, including the lack of sound analysis of the regional context and the path dependence of policy choices.

You can download an earlier (full) version of the paper, presented at the Regional Innovation and Competitiveness Policy Workshop, UK-Innovation Research Centre – University of Cambridge in 2012.

04/04/13 Research

Open Data strategies are finally converging – EU regions and the data on cohesion policy

EU Regions and national agencies managing EU Structural Funds are required by a common Regulation to publish at least a minimum set of information on the projects and recipients that are funded with public money. This data is crucial to fight corruption and, more importantly, to understand how the money is being used and what kind of results the policy has achieved.


While some Regions haven’t released much more information than the name of the beneficiary and the total value of the project, more and more public authorities in Europe are taking current regulations as an opportunity to manage EU funds more transparently.
Two years ago I blogged about three different open data strategies that public authorities were pursuing back in 2010.

  1. The first implied the release of high-quality data in machine-readable format
  2. The second was focused on data visualization and interactive search in order to include non-technically oriented citizens in open data re-use and understanding
  3. The third was about NOT being open. Little detail, little quality, lots of PDFs.

New data collected in October 2012 on the availability and quality of open data on EU Cohesion Policy tell quite a different story. From October 2010 to October 2012 the strategies have evolved, leaving room for more speculation about what kind of supply of policy data we can expect for the future. More precisely, data suggests that the two proactive strategies have become one.

According to a nonlinear multivariate analysis of 8 indicators on the openness and transparency of 434 Operational Programmes in Europe, it is not easy to clearly distinguish a strategy based on re-usable formats and detailed information from a strategy focused on letting users browse through data and diagrams.

For example, in 2010 a machine-readable format was associated with highly detailed financial data on project implementation or with proper metadata and project descriptions, while the presence of a map or of advanced search capabilities was more likely where data were presented directly in an HTML page. Now the two features are highly correlated. This implies that some national or regional portals – just like Italy’s national portal OpenCoesione – now let users both download the data in bulk and surf through the data right on the website.
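The strength of that association can be checked with a phi coefficient, i.e. the Pearson correlation of two 0/1 indicators. A minimal sketch with ten invented Operational Programmes (the real dataset covers 434, and these values are illustrative only):

```python
import numpy as np

# 1 = the Operational Programme's portal offers the feature (invented data).
machine_readable = np.array([1, 1, 1, 0, 0, 1, 0, 1, 0, 0])  # bulk download in XLS/CSV
visualization    = np.array([1, 1, 0, 0, 0, 1, 0, 1, 0, 1])  # maps / interactive search

# Phi coefficient: Pearson correlation applied to binary variables.
phi = np.corrcoef(machine_readable, visualization)[0, 1]
print(f"phi = {phi:.2f}")  # values near 1 mean the two features tend to co-occur
```

A phi near zero would correspond to the separate 2010 strategies; a high phi is the 2012 pattern where the two proactive strategies have merged.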

Obviously, this is good news for researchers, data journalists and ordinary citizens. Data providers seem to be more aware that the usefulness and stewardship principles are complementary. Most public agencies, though, keep following the same strategy of NOT being open and offer data in PDF with little information.
The variables shown in the two graphs below relate to:
• the format (PDF, XLS or CSV, HTML)
• the way the data is presented (GEO = maps & graphs; RIC = search functions)
• the detail of the content (CONT) and of the financial data in particular (FIN)
• data quality features (QUAL), such as the presence of metadata, an English version of the fields, and update frequency

26/04/12 Digital Government, Research

Why we need another composite index (on public e-Services)

The debate on composite indicators or synthetic indices in the e-government field has been ongoing since the publication of the first benchmarking exercises at the EU level back in 2002. Many analysts and researchers consider composite indicators as “black boxes” (see for example this paper by Frank Bannister, 2007). We put in intelligible indicators, and what comes out is a mysterious number and, inevitably, a mysterious rank. The feeling is that it’s a weird combination of voodoo (or too-complicated math), subjectivity, weak frameworks and unbelievable results (can you really believe that Italy has put 100% of its public services online with the highest possible level of interactivity?).


A three-day seminar at the JRC-IPSC of the European Commission opened my mind. There I found a motivated and highly skilled team, coordinated by Andrea Saltelli, which, by the way, was responsible for drafting the OECD-EC Handbook on Constructing Composite Indicators.

While it was clear to me that things like data quality, framework reliability and transparency – when it comes to showing how the results have been computed – are always crucial, I learned that the quality and robustness of composite indicators can and must be checked, and that more advanced and reliable techniques can be applied. I suspect that if we applied tools such as sensitivity analysis or uncertainty analysis to the existing “black box” indicators, we would get an idea of how much ranks can vary and, therefore, of how weak the resulting policy indications can be.
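A minimal uncertainty analysis of this kind can be sketched by perturbing the aggregation weights and recording how the ranks move. The indicator scores below are invented, and equal weights are assumed as the baseline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented normalized scores of five countries on three sub-indicators.
scores = np.array([
    [0.9, 0.4, 0.7],
    [0.6, 0.8, 0.5],
    [0.7, 0.7, 0.65],
    [0.5, 0.9, 0.8],
    [0.3, 0.5, 0.4],
])

def ranks(weights):
    composite = scores @ weights                    # weighted linear aggregation
    return composite.argsort()[::-1].argsort() + 1  # rank 1 = highest composite

baseline = ranks(np.full(3, 1 / 3))  # equal weights as the baseline

# Uncertainty analysis: draw random weights summing to 1 and recompute the ranking.
all_ranks = np.array([ranks(rng.dirichlet(np.ones(3))) for _ in range(2000)])

for i, b in enumerate(baseline):
    print(f"country {i}: baseline rank {b}, "
          f"range {all_ranks[:, i].min()}-{all_ranks[:, i].max()}")
```

A country whose rank range spans most of the table under plausible weightings is exactly the kind of fragile result the “black box” critique worries about.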

I’ve been working for quite some time on a composite indicator on eServices (eGovernment, eEducation, eTransportation, to be extended to eHealth and Smart Cities) for the TAIPS research project funded by the European Investment Bank, together with my friends and colleagues Marco Biagetti, Davide Arduini and Professor Antonello Zanfei. I presented some preliminary results at the 1st EIBURS-TAIPS Conference at Urbino University (here you can find all papers and slides from the conference), in front of a group of innovation policy gurus including Paul David, Ian Miles, Edward Steinmueller and Keith Smith.
Here is the abstract and my slides.

Abstract The study aims at providing evidence on regional differences in the diffusion of ICT in the public sector in Italy, with a focus on different types of public e-services (eGovernment, eEducation and Intelligent Transport Systems). Data are obtained by merging four different surveys carried out by Between Co. (2010-11) and Istat – Italy’s National Bureau of Statistics (2009). We pursue a three-fold objective. First, we attempt to overcome the prevailing tendency to consider the various domains of public e-service provision as separate from one another. In other words, measuring the progress of digital government requires a holistic view to capture the wide spectrum of public e-services in different domains (e.g. local and national administrative procedures, transportation, education, etc.) and the different aspects of service provision (not just e-readiness or web interactivity, but also multi-channel availability and take-up). Second, we shall tackle a major drawback of existing statistics and benchmarking studies of public e-services, which are largely based on the count of services provided online, by including more sophisticated indicators both on the quality of services offered and on back-office changes. Third, we develop a sound, open and transparent methodology for constructing a public eServices composite indicator based on the OECD/EC-JRC Handbook. This methodology, which incorporates expert opinion into a Data Envelopment Analysis, will allow us to combine data on different e-service categories and on different aspects of their development, and will enable us to define a ranking of Italian regions in terms of ICT adoption and public e-service development.
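The “benefit of the doubt” variant of DEA mentioned in the abstract can be sketched as one small linear program per region: each region picks the weights that show it in the best possible light, subject to no region scoring above 1 under those weights. The data and region set below are invented, not the actual survey indicators:

```python
import numpy as np
from scipy.optimize import linprog

# Invented normalized scores of four regions on three e-service indicators.
X = np.array([
    [0.8, 0.5, 0.9],
    [0.6, 0.9, 0.4],
    [0.5, 0.5, 0.5],
    [0.9, 0.8, 0.7],
])

def bod_scores(X):
    """Benefit-of-the-doubt DEA: max X[k]@w  s.t.  X@w <= 1, w >= 0, for each region k."""
    n, m = X.shape
    scores = []
    for k in range(n):
        res = linprog(c=-X[k],                  # linprog minimizes, so negate the objective
                      A_ub=X, b_ub=np.ones(n),  # no region may score above 1
                      bounds=[(0, None)] * m)   # non-negative weights
        scores.append(-res.fun)
    return np.array(scores)

scores = bod_scores(X)
print(scores)  # regions on the efficiency frontier score 1.0; dominated regions score less
```

Expert opinion can enter this formulation as additional constraints on the weights (so-called weight restrictions), which is presumably how the abstract's two ingredients are combined; the exact constraints used in the paper are not reproduced here.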
02/10/11 Digital Government, Research

A holistic view for Public e-Services diffusion and impact: Introducing project T.A.I.P.S.

One of my first posts on the Regional Innovation Policies blog was about “traditional” public e-services – as opposed to new Government 2.0 applications – and their still slow diffusion in many countries in Europe and in the world. My point there was that the low take-up of public e-services, which some consider the main reason for digital government’s failure, was probably simply due to a shortage of… public e-services.

While most critics of EU e-government policy point only to the lack of interest of households and enterprises in expensive and unsustainable digital public services, I think we should also consider that today a significant number of public agencies – especially in the lagging regions of the world – fail to deliver their most useful basic public services online. As for e-government services, although most of them were pushed by national governments in the first years of the new millennium and are already available on the web with an acceptable level of sophistication (see for example the list of twenty basic public services in CapGemini’s latest benchmarking report), the situation is very different at the local level, where small agencies are struggling to provide services with less money and face complex coordination issues with scarce skills.

Moreover, if we zoom out and consider advanced services from other recently developed domains of digital government such as e-health, e-procurement, e-education, infomobility, “smart” cities, etc., the supply-side issues are manifest.

In other words, measuring the progress of digital government requires a holistic view to include the wide spectrum of public e-services in different policy domains (health, transportation, education, etc.) and the different aspects of service provision (not just e-readiness or web interactivity, but also multi-channel availability and take-up).

Providing this view is the main goal of TAIPS (Technology Adoption and Innovation In Public Services), a research project carried out by the Department of Economics, Society and Politics (DESP), University of Urbino (Italy) and funded by the European Investment Bank (EIB), which aims at exploring the determinants and impact of public e-services diffusion from the point of view of the Economics of Innovation. The project is led by Professor Antonello Zanfei, an industrial economist whose interests range from innovation diffusion to industrial dynamics and the economics of multinational enterprises.

A few weeks ago the first outputs were released. One paper, entitled What do we know from the literature on public e-services?, provides quantitative evidence that ICT research, as happens in policy making, still considers the various policy domains as separate silos. The next step of TAIPS will be to unify those views: a pilot exercise benchmarking the progress of Italian regions with a joint e-services methodology is under way, and is to be eventually extended to selected EU countries.

Plus, TAIPS staff is organizing an International Conference in Urbino, Italy on April 19-20, 2012. Here you can download the outline. The deadline for abstract submission is pretty soon (on Wednesday, October 5), but will probably be extended a little bit. The conference will be interesting since many invited speakers – leading scholars in the field of Economics of Innovation and Information Technology – have already confirmed their participation. I will report again on this in the next few weeks, so please stay tuned!

07/09/11 Research

A strategic balance for open government data publication

A quite long debate on how to publish open government data is still dividing stakeholders and researchers. Should governments develop their own tools for data visualization and analysis in order to include non-technically oriented citizens?


The debate on how to publish open government data is dividing public servants, open government advocates and researchers into – at least – two main groups.
There’s a first group of civic hacker organizations and – not surprisingly – academic literature focusing on the “invisible hand” of private sector or civil society organizations, which are able to reuse PSI and mash up this information with other sources to create new innovative services. In this case the government should only publish high-quality data in an open, machine-readable format and let others do all the rest.
Others point to the risks of the so-called “data divide” or, from a public value perspective, think that government should consider different users’ needs and adopt a more proactive approach, e.g. by elaborating its data on governmental websites:

  • Interesting points on the “data divide” or, more generally, on “open data inclusion” are raised, for example, in Michael Gurstein’s blog. Moreover, in the comments of this World Bank blog post, Tim Davies highlights the importance of spreading the skills to access, work with and interpret data widely amongst policy makers and local communities.
  • The public value perspective is introduced in this paper from the Center for Technology in Government (CTG), Albany, NY. Basically, this approach suggests that government should consider different users’ needs and the impact of a set of value generators on different groups of users.

So, what should public agencies do to ensure data inclusion and public value generation?

I recently presented a paper at EGOV 2011 conference entitled “Information strategies for Open Government in Europe: EU Regions opening up the data on Structural Funds”. In the paper I identified three groups of European Public Agencies publishing the data on the beneficiaries of EU Regional Policy:

  1. Agencies that publish the data in PDF with little information and detail on projects and financial data
  2. Agencies that focus on data quality, detail, accessibility and machine-readable formats
  3. Agencies that focus on data visualization, maps, graphs and interactive search, but only a few of them let the user download the underlying raw data

It seems that the second group is following a good strategy from an “invisible hand” point of view, but lacks actions to include non-technically oriented citizens. The third, even if it can be argued that it is not pursuing a truly “open” data approach, shows some interest in data inclusion, since it presents the data in an “easier” way (maps, etc.) and/or in an aggregated form, which is useful for non-technically oriented citizens.

One conclusion that can be drawn is that both approaches are necessary. But is it really necessary that every agency develops its own data visualization tools? How many tools are necessary for the same kind of data (e.g. beneficiaries of EU funding) in EU regions? What is the minimum set of information (metadata, notes from the public administration to suggest a correct interpretation, etc.) required for this kind of data?
For example, in the case of the European Common Agricultural Policy: should each State develop its own geo-referencing tools and maps, or leave that work to others?

25/07/11 Research

US and EU in search of an Open Government R&D agenda: 44 topics in 4 clusters

Open Government is not only changing politics and policies but is also redefining the notion of established research areas such as e-government and e-democracy. The world of research – with the active participation of practitioners – needs to define an Open Government R&D agenda for the years to come.

In this post – just a note to myself – I list some interesting research topics classified into 4 main areas.

Barack Obama’s presidential campaign and the US Open Government Directive of December 2009 profoundly changed the way governments around the world conceive the role of ICT in the public sector. Obama’s Directive, which directly (and almost immediately) influenced policy making in most OECD countries and also contributed to the growth of bottom-up initiatives, is now impacting the world of research.

Key questions such as the actual impact of open government data on citizens and enterprises remain largely unanswered. It is not just a matter of democratic principles and political messages, or of transparency only. The diffusion of web 2.0 technologies and user-driven innovations in the public sector – along with the creation of new business opportunities coming from the re-use of government data by the private sector – is changing the perspective of interdisciplinary but actually quite separate research fields such as e-government (focused on the use of ICT in internal processes and in public service provision) and e-democracy (focused on citizen engagement through technologies such as online polling and voting, deliberation, and consultation). Teresa M. Harrison and her colleagues from the Center for Technology in Government (CTG) at the University at Albany, SUNY made this clear in a paper published a few days ago: “Although e-democracy in political and e-government in administrative realms have historically been largely separated, it now appears Open Government brings these two spheres of activity together”.
On the one hand, the provision of e-government services not only requires technical expertise but also, inevitably, implies political choices. On the other hand, e-government implementation should take advantage of the “power of the crowd” and the opportunities that come from involving the citizen and the private sector in new forms of public-private collaboration.

As boundaries between research domains are blurring, time has come to define an Open Government holistic framework and a global Open Government R&D agenda.

In Europe, the CROSSROAD project, a Support Action funded by the European Commission, has produced a Research Roadmap for “ICT for governance and policy modeling”, as defined by objective 7.3 of the EU Seventh Framework Programme (FP7) 2009-2010. A white paper published in December 2010 and edited by Fenareti Lampathaki, Sotiris Koussouris, Yannis Charalabidis and Dimitris Askounis (National Technical University of Athens) identifies five main research themes and a three-level taxonomy. As the point of view is the broader concept of ICT for governance and policy making, a set of useful tools and research domains that are not usually considered in the current debate on Open Government is included here. This is the case of public opinion mining tools, which could be used to find out, for example, what types of citizens care about which types of government information. Other examples are the technologies that the EU classifies under “Future Internet” studies, some of which (e.g. the Internet of Services) are based on the availability of government linked data.

In the US, the President’s Council of Advisors on Science and Technology (PCAST) highlighted the importance of establishing an R&D agenda for open government in a report issued in December 2010. The Open Government Research & Development Summit was hosted on March 21-22nd, 2011 by the Networking and Information Technology Research and Development (NITRD) Program. The summit brought together government leaders and researchers to explore the needs of the community. It was organized by the office of U.S. Chief Technology Officer Aneesh Chopra, while Beth Noveck – law professor at New York Law School – was one of the prime movers in making the meeting happen.
Building on this first event, a workshop organized by the Center for Technology in Government (CTG) in Albany, New York on April 27-28th gathered a number of academics and practitioners and, above all, hundreds of still-unanswered research questions. These questions were then clustered into homogeneous groups such as “the value / ecosystem of Open Government”, “What do citizens want?”, “Government capabilities”, etc. As a second step, the research questions were examined through four lenses: 1) law and policy, 2) management, 3) technology and 4) cross-cutting. Professor Ines Mergel reported on this in her blog: day one and day two. Furthermore, a full list of all the questions is now available in a CTG report prepared by Meghan Cook and M. Alexander Jurkat, which also includes an interesting list of the biggest challenges in Open Government as perceived by the participants.

The EU CROSSROAD project and the US CTG workshop came up with quite similar research themes and questions, with the CTG themes largely falling within the first section of the CROSSROAD taxonomy, “Open government Information and Intelligence for transparency”. Other CROSSROAD areas partially shared with the US approach include, for example, “Social computing, citizen engagement and inclusion” and “Identity management and trust in governance”.

In the following table I try to combine some of the most interesting aspects of the CROSSROAD and CTG exercises: a robust identification of research clusters and the use of “lenses” corresponding to different disciplines.
Questions and themes are grouped on the basis of data and information flows from government to citizens and back from citizens and businesses to government. With reference to the figure:

  1. Open / linked data “supply side”: how to foster the publication of meaningful and useful government data? What are the implications and impact within government agencies?
  2. Open / linked data “demand side”: how to meet citizens’ and businesses’ needs? How to support data use and re-use?
  3. Social computing: how to involve citizens in collaborative projects and activities?
  4. Citizen engagement: how to involve citizens in democracy?
For each combination of cluster / research theme and lens / research discipline, I list some examples of questions and topics that are particularly interesting to me.
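To make the grouping concrete, the cluster-by-lens grid described above can be sketched as a simple lookup table – here in Python, with the cluster and lens labels taken from the text and purely illustrative placeholder questions:

```python
# Sketch of the cluster x lens grid: research questions indexed by
# (cluster, lens). Cluster and lens labels follow the text; the example
# questions are invented placeholders.

CLUSTERS = ["supply side", "demand side", "social computing", "citizen engagement"]
LENSES = ["law and policy", "management", "technology", "cross-cutting"]

matrix = {
    ("supply side", "technology"): ["How to publish meaningful linked data?"],
    ("demand side", "management"): ["How to support re-use by businesses?"],
}

def questions_for(cluster):
    """Collect all questions filed under one cluster, across every lens."""
    return [q for (c, lens), qs in matrix.items() if c == cluster for q in qs]

print(questions_for("supply side"))
```

In the actual table, each cell would hold the real questions collected at the CTG workshop, filtered through the corresponding disciplinary lens.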

18/03/11 Research

Towards EU Benchmarking 2.0 – Transparency and Open Data on Structural Funds in Europe

The first output of a web-based survey shows that the European Cohesion Policy is only halfway to accomplishing a paradigm shift to open data, with differences in performance both between and – in some cases – within European countries.
Low scores are attributed to the formats the authorities choose when publishing their data on the web, while other indicators, such as the level of granularity, are positively influenced by the requirements of current regulations.

Availability of Open Data on the projects and beneficiaries of the European Cohesion Policy (or Regional Policy) – the second-biggest EU policy after agriculture, with a budget of EUR 347 billion for the period 2007-13 – can surely help foster transparency in the use of public money in Europe.

The European Union lacks common initiatives, such as those in the US, to track government spending and improve the transparency of public policies. In particular, in the case of the Structural Funds, there is no single point of access to the data, since each EU Region and national agency acting as a Managing Authority of the Funds is responsible for publishing data on the beneficiaries and the amount of public funding received. This means that hundreds of Managing Authorities are following different paths and implementing different information strategies when opening up their data. Many databases (often simple PDF lists) are uploaded to regional or national institutional websites, showing huge variation not only in the way they can be accessed (formats, search masks, data visualization, etc.) but also in the content and quality of the data provided (level of detail, granularity, description, etc.).

Last summer, after a first analysis of the prevailing formats, I started to design an independent web-based survey on the overall quality of the data published by each Managing Authority responsible for the 434 Operational Programmes approved in July 2009. The data was collected in October 2010 by me and Chiara Assunta Ricci, a brilliant PhD student in Economics at La Sapienza University of Rome. We were inspired by what the people behind a fantastic project had been able to do with the data from the other big EU policy, the Common Agricultural Policy. While their greatest achievement is having gathered the PDF documents every Member State has to publish online into one real database, they also provide an evaluation of the lists of projects they have used through a transparency composite indicator. The same exercise could be applied to the Structural Funds.

The first output of the survey was published a few days ago in the European Journal of ePractice. Here you can download the paper, and here the full issue, “The Openness of Government”. The paper builds on David Osimo’s seminal proposal for a “Benchmarking 2.0” and represents a pilot of a measurement framework for comparing governments’ efforts to make data available.

This exercise could represent a first step towards improving the current ‘traditional’ EU e-Government benchmarking. In fact, the new edition, “Digitizing Public Services in Europe: Putting ambition into action – 9th Benchmark Measurement” (pages 19 and 137), confirms the importance of updating and expanding the scope of the analysis by including new metrics on “Transparent and Open Government”.
The evaluation scheme is based on the Eight Principles of Open Government Data, which are considered a key reference and a worldwide de facto standard. The scheme is meant to be flexible and could be applied to other kinds of government data. Gianfranco Andriola, one of the promoters of the Italian Open Data Licence, helped me define the methodological approach for the “format” and “licence” principles. I must also thank the “man behind the curtain” of the Spaghetti Open Data initiative, Matteo Brunati aka Dagoneye, for his suggestions about open formats, and Sergio Scicchitano for his advice and support.
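As a rough illustration of how an evaluation scheme of this kind works, here is a minimal Python sketch of an equal-weight composite indicator over the Eight Principles. The per-principle scores for the hypothetical Managing Authority below are invented for the example, and the actual survey methodology is more nuanced than a plain average:

```python
# Sketch of an equal-weight composite transparency indicator based on the
# Eight Principles of Open Government Data. Each dataset receives a 0-1
# score per principle; the composite is the average, expressed in percent.

PRINCIPLES = [
    "complete", "primary", "timely", "accessible",
    "machine_processable", "non_discriminatory",
    "non_proprietary", "license_free",
]

def composite_score(scores):
    """Average the per-principle scores (0-1) into a 0-100 composite."""
    return 100 * sum(scores.get(p, 0.0) for p in PRINCIPLES) / len(PRINCIPLES)

# Hypothetical Managing Authority publishing granular data, but only as PDF:
example = {
    "complete": 1.0, "primary": 1.0, "timely": 0.5, "accessible": 1.0,
    "machine_processable": 0.0,  # PDF lists are not machine-processable
    "non_discriminatory": 1.0, "non_proprietary": 0.0, "license_free": 0.5,
}
print(composite_score(example))  # → 62.5
```

A scheme like this makes the country-level percentages comparable: a high granularity score cannot mask the penalty for publishing in a closed format.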

Results can be summarized as follows:

  1. The European Cohesion Policy is only halfway to accomplishing a paradigm shift to open data, with differences in performance both between and – in some cases – within European countries. The best performing countries, such as the Czech Republic and Finland, obtain a score of 71%, while the worst performing Member State is Latvia with 25%. Countries from eastern Europe often appear in the first half of the chart.
  2. Very low scores are attributed to the formats the authorities are choosing when publishing their data on the web, while other indicators such as the level of granularity are positively influenced by the requirements of current regulations.
  3. A considerable difference in performance emerges when comparing datasets that are shared and centralized at national level with those managed by a single regional authority. This variation is also statistically significant for all the indicators examined, probably because centrally managed programmes benefit from simpler information flows and easier coordination of local actions.
  4. The use of open, machine-processable and linked-data formats has unexpected advantages in terms of transparency and re-use of the data by the public and private sectors. Applying these technical principles does not require extra budget or major changes in government organization and information management; nor does it require updating existing software and infrastructure. What is needed today is to promote a culture of transparency among national and local authorities and to raise awareness of the benefits that could derive from opening up existing data and information in a re-usable way.


15/12/10 Open Policy, Research

European regions financing public e-services: the case of structural funds

As reported in one of the papers underlying the Barca Report on the future of European Cohesion Policy, “In the 2007-2013 planning period the share of Structural Funds of the European Union allocated to Research and Innovation received the largest increase, in absolute and relative terms. It is no exaggeration to claim that, for many countries, the entire Lisbon Agenda rests on Structural Funds”.
This is particularly true for the lagging regions of the “Convergence” objective, where Structural Funds are by far the main source of funding for innovation in general and for e-services in particular. A specific “category of expenditure” is in fact dedicated to public e-services such as e-health, e-government, e-learning and e-inclusion, named “services and applications for the citizen” (Regulation no. 1828/2006).

Using European Commission data on the resources programmed for the 2007-13 period, it is possible to explore the total resources dedicated to this topic by each single Operational Programme (OP).
The map above shows the resources programmed by all types of OPs (regional, but also national and interregional), with regional disaggregation (NUTS2). Regions of the Slovak Republic have planned high investments in e-services (more than 189 million euros); Campania (Italy, 147.5 million euros), Andalucía (Spain) and Attiki (Greece) also belong to the cluster of regions with the highest absolute values.

Moreover, by considering the percentage of resources dedicated not only to e-services but also to the other categories of expenditure related to the Information Society, it is possible to analyze the strategy each region implemented when allocating public funds to public e-services, broadband, ICT diffusion among enterprises or infrastructural services.
In the “Convergence” regions, a specific “public e-services strategy” emerges: regions investing in public e-services tend to exclude the other themes, concentrating the available resources on e-government or e-health, while only a very low percentage of total funding is dedicated to other categories such as broadband or infrastructural services. For example, while funds dedicated to ICT diffusion among enterprises are always accompanied by measures for broadband penetration, resources for e-services “stand alone”, showing low correlation with the other components of Information Society funding.
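This “stand-alone” pattern is essentially a statement about correlation between funding shares. As a sketch of the kind of check involved, the plain-Python snippet below computes a Pearson correlation; the regional shares used here are invented for illustration and are not the actual survey data:

```python
# Illustrative check of the "stand-alone" pattern: Pearson correlation
# between regional funding shares (% of Information Society funding) for
# two spending categories. The share values below are made up.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical shares for five regions: where e-services dominate,
# broadband gets little, and vice versa.
e_services = [60, 55, 10, 5, 70]
broadband  = [5, 10, 40, 45, 8]

print(round(pearson(e_services, broadband), 2))  # → -0.98
```

A strongly negative (or near-zero) coefficient across regions is what the text describes: e-services funding moving independently of, or against, the other Information Society components.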
This finding, if confirmed, is not really positive, since the development of e-services should go hand in hand with the diffusion of the necessary pre-conditions.
Another interesting question is: what determines this strategic choice? Is it possible to isolate context-specific factors, or is the choice based only on political criteria?

Preliminary results of this study are included in the presentation embedded below, which Sergio Scicchitano and I prepared for the first public meeting of the Technology Adoption and Innovation in Public Services (TAIPS) research project at the University of Urbino, Italy. The project is funded by EIBURS – the European Investment Bank University Research Sponsorship Programme. In the presentation you can find graphs and other figures showing the allocation of resources at national and regional level, and the details of the principal component analysis.
