Barriers and Accelerators to Crowdsourcing and Citizen Science in Federal Agencies: An Exploratory Study
Draft Summary for Discussion — September 5, 2014
The Commons Lab within the Wilson Center’s Science and Technology Innovation Program conducted an exploratory study to identify key accelerators and barriers to the use of crowdsourcing and citizen science in four federal agencies. The goals of this study were twofold:
- Identify and understand key factors in designing, launching and sustaining successful federal agency crowdsourcing and citizen science initiatives;
- Identify and understand internal and external roadblocks to federal agency crowdsourcing and citizen science initiatives, both experienced and perceived.
The results of this study will be used to inform a subsequent legal and policy analysis of the key barriers, which in turn will be used to suggest alternatives for addressing these barriers.
From July to August 2014, the Commons Lab conducted 27 interviews with personnel in four federal agencies – two regulatory agencies and two science agencies. These conversations took place on the phone and in person with four different levels of personnel: agency executive, program office, field office, and stakeholder. Interviewees were given the option to remain anonymous, to take comments off the record if needed, and to withdraw at any time. Interviewees were sent a copy of the questions in advance, as well as a copy of their transcript for vetting and approval.
The Commons Lab recognizes that this is only an exploratory study and in no way a representative sample, nor does it represent the experience of every person within an agency. However, conversations at each level revealed thought-provoking accelerators, barriers, and suggested solutions. The following is a brief overview of the preliminary findings.
The following is a first-pass assessment, based on the interviews, of the accelerators:
- High-level champions within agencies and at the executive level.
- The legal authority of some agencies. One agency, for example, found it helpful to pitch crowdsourcing and citizen science as an educational outreach tool.
- Personnel dedicated to helping staff navigate the bureaucracy. Agency staff noted that the Paperwork Reduction Act approval process remained very time-consuming even with this assistance, but access to an expert made it more efficient.
- Success stories / best practices case studies.
- Partnerships with universities, other federal agencies, contractors, museums, philanthropists, and schools. Often problems that federal agencies faced with the bureaucratic process could be avoided through partnerships with other organizations not subject to the same administrative requirements and regulations.
- Working groups and communities of practice within and across agencies were cited as useful to forming networks and navigating bureaucratic processes.
- Previous executive orders for open data.
Interviewees at the project level, as well as executives reporting on challenges that were relayed to them, provided anecdotal evidence of barriers they encountered in their experiences. The following list is in no way exhaustive; it includes select interviewee experiences and general trends.
- The Paperwork Reduction Act (PRA) was mentioned in nearly every interview across agencies and levels within each agency. Some agencies have infrastructure in place that makes the process clearer; but overall, in each of the four agencies in which interviews were conducted, staff lack clarity, direction, and counsel in navigating this process. Even when the requirements are met, interested parties find the approval process to be slow and arduous, leaving them unsure of start dates and sometimes resulting in missed opportunities. At one agency, this led staff to limit citizen involvement in projects or to hesitate to further develop platforms.
- Privacy, much like the PRA, is of great concern to many agency staff as there is a lack of clarity as to what is and is not allowed to be collected from members of the public. Often the fear of privacy violation stops agency staff from pursuing citizen science and crowdsourcing projects, or collecting data that could be valuable.
- Data quality troubles many agency staff. Some view citizen science and crowdsourcing as purely an educational or public relations tool, which harms the perception of citizen science/crowdsourcing as a viable methodology for collecting scientific data. Personnel in multiple agencies noted a general skepticism in the scientific community toward data collected in this comparatively uncontrolled, unfamiliar way.
- Liability for the agency and the volunteers. The Volunteer Protection Act provides some safeguards for volunteers, but may not cover the kinds of work crowdsourcing/citizen science activities involve, limiting the data agencies can collect.
- Agency culture can be perceived as highly resistant to crowdsourcing/citizen science (or innovative methodology generally), depending upon an individual’s place and experience within an organization.
- Perceived utility of the data was a concern of many agency staff interviewed. Agency scientists may be skeptical of the utility of the data, and were not convinced it was worth using, or worth the time spent, given its nontraditional methodology.
- There is a difference in perception within agencies. Some individuals defined this gap by age (older v. younger), profession (scientist v. IT), or length of time at the agency (under 5 years versus over 10 years).
- A lack of funding for citizen science and crowdsourcing specific projects was a universal concern across those interviewed at the four agencies. While some noted that funds were available, they were commonly limited to small grants for educational purposes.
- A lack of relevant examples of successful crowdsourcing or citizen science endeavors pertinent to specific agencies was cited as a problem. Many interviewees were able to identify successful projects, but not ones that were relevant to their issue or useful for pitching their idea as feasible to a higher level within the agency. There also was a lack of examples from the regulatory perspective.
- A lack of active support or a champion made project managers’ work difficult. Support in many cases was described as passive rather than active, leaving project managers to navigate the bureaucracy on their own. This, compounded with the fact that agency direction is often unclear, left many feeling alone.
- A lack of an in-house IT department for quickly designing and building infrastructure makes the process more expensive and less feasible.
The following items, listed in no particular order, were suggested by interviewees as useful resources or changes they anticipate would help future iterations of projects they embarked upon. Again, this is a snapshot and is not exhaustive.
- A clearly defined roadmap of the steps required to navigate the bureaucracy of the PRA and privacy requirements: Interviewees frequently cited the uncertainty of next steps or requirements as frustrating and time-consuming. A roadmap could be provided at the agency level or the Office of Management and Budget level so project managers would have a better idea of what was needed to go through the process. One agency project manager stated simply, “I just needed to know what I needed to do.”
- Personnel to field questions regarding the process, whether within OMB or agencies: Having an accessible point of contact who knows the approval process at the agency and/or OMB level would be tremendously helpful, staff indicated, and PRA applicants would not be “shuttled to different people to piece together the steps and best practices.”
- An Executive directive approving this kind of data collection:
- In the past, executive directives have been useful for agencies. Such an order could provide an authority that some agency officials do not believe they yet have to engage in crowdsourcing and citizen science projects.
- As IT is often swept aside in budgetary decisions or contracted out, some agency staff reported a lack of access or an inability to act nimbly. An executive order making IT a core government function, some suggested, may allow for greater control and possibilities for creative projects.
- Flexibility clauses added to the PRA:
- Combining or streamlining the application process for identical web-based and mobile applications would allow for projects to adapt their models to accommodate for changing technology. Currently, web apps and mobile apps that do the same thing require individual PRA approval.
- Greater flexibility in the PRA for volunteer crowdsourcing and citizen science projects, and/or an accelerated timeline for issues of pressing scientific concern, could help avoid missed opportunities.
- Interagency communication: Interviewees often expressed concern about duplicating efforts across the government and wasting resources. They suggested it would be useful to have a government-wide database for querying existing projects and facilitating partnerships across the government, allowing for more cost-effective, powerful data collection and citizen science/crowdsourcing projects.
Suggestions for Future Study
Further exploration of this topic would include a deeper dive into the four agencies, drawing a more representative sample and range of experiences. It also could include a wider sample of agencies. Ideally, this could be expanded to include staff from the sixteen agencies participating in the Federal Community of Practice for Crowdsourcing and Citizen Science.
Next steps will include a legal follow-on study to this project by the Wilson Center.
About the Authors:
Melissa Gedney conducted the majority of the interviews as part of her independent summer research project for the Commons Lab, under the guidance of Lea Shanley, Director, Commons Lab, Wilson Center. Melissa is a recent graduate of George Washington University with a Bachelor’s in Political Science, and has since joined Digital Promise as an Associate.