In Governance, Foresight, Technology and the Law, News and Events, Commons Lab on May 22, 2013 at 10:13 am
In our recent post on the Open Data Policy, we mentioned Project Open Data as an exciting manifestation of collaborative government concepts put into practice. To learn more, we reached out to GitHubber Ben Balter, former Presidential Innovation Fellow and previous contributor to the Commons Lab. Ben also provided input on agile development for our paper on the National Broadband Map.
How did GitHub become a part of this project?
I was working as a Presidential Innovation Fellow when the process to create the Open Data Policy began. Anyone within government is used to seeing documents circulate with no real idea of when they were edited, by whom, or whether the version in hand is the most current. This is very opaque. So while we were working on open data policy, the process itself was not very open. Open source developers among the Innovation Fellows started talking about using GitHub to create the actual document. Lowering the barrier to entry was always the idea: we want people editing this and sharing their perspectives.
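To make that contrast concrete: once a document lives in a Git repository, its full revision history, including who changed what and when, is a query away. The sketch below is illustrative rather than part of any official tooling; it assumes git is installed and that a hypothetical file named policy.md is tracked in the current repository.

```python
import subprocess

def revision_history(path="policy.md"):
    """List who edited a tracked document and when.

    A minimal sketch of the transparency Git provides. Assumes `git`
    is on the PATH and `path` is a file tracked in this repository.
    """
    log = subprocess.run(
        ["git", "log", "--follow",
         "--format=%h %ad %an: %s",  # short hash, date, author, subject
         "--date=short", "--", path],
        capture_output=True, text=True, check=True,
    )
    return log.stdout.splitlines()

if __name__ == "__main__":
    for entry in revision_history():
        print(entry)
```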
In Commons Lab, Foresight, Governance, News and Events, Technology and the Law on May 9, 2013 at 12:08 pm
FCC Visualization of Low Power FM Availability, built on open data and explained on GitHub.
Today, the Office of Management and Budget and the Office of Science and Technology Policy jointly released a new Open Data Policy directing agencies to implement specific structural reforms. In conjunction with an Executive Order prioritizing open and machine-readable government information, these adjustments are forward-looking and exciting. They speak to a general understanding that a deliberate approach to the way data are processed and released can greatly enhance their value.
We’ve seen several examples of the drive towards open data and innovation, and today’s releases should augment previous improvements. Philosophically, the Open Data Policy commands agencies to view information as an asset, building an approach that will allow people inside and outside of government to leverage its potential. The Policy’s requirements look at the life cycle of data in practical terms, envisioning a process where accessibility is the norm. This includes using machine-readable and open formats, following data standards, using open licenses and choosing common core and extensible metadata. The Policy also directs that information systems support interoperability and information accessibility, and that data security be strengthened to protect privacy and confidentiality.
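To give a sense of what those requirements buy in practice, here is a minimal sketch of consuming a catalog published in a machine-readable, open format. It assumes an agency exposes its dataset inventory as JSON at a /data.json path, as the Policy's implementation guidance describes; the host and field names below are illustrative placeholders rather than an authoritative endpoint.

```python
import json
import urllib.request

# Illustrative only: fetch an agency's machine-readable dataset catalog.
# The /data.json path follows the Policy's implementation guidance; the
# host below is a placeholder for any participating agency.
CATALOG_URL = "https://www.example.gov/data.json"

with urllib.request.urlopen(CATALOG_URL) as response:
    catalog = json.load(response)

# Early catalogs were a plain JSON list of dataset entries; later schema
# revisions wrap them in a top-level "dataset" key. Handle both.
datasets = catalog["dataset"] if isinstance(catalog, dict) else catalog

# Because the metadata is structured, simple questions become one-liners:
# which datasets are here, and under what license?
for entry in datasets[:10]:
    print(entry.get("title", "untitled"), "|",
          entry.get("license", "no license listed"))
```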
An exciting manifestation of this Policy is Project Open Data. Hosted on GitHub, the collaborative coding community, Project Open Data could be a rich resource for agencies working to implement this Policy and for citizens looking to learn more. It is designed as “an online repository of tools, best practices and schema to help agencies adopt the framework presented in this guidance.” Envisioned as a “community resource,” it will include definitions, explanations, code, checklists and case studies. This type of educational and collaborative hub has helped thriving technology communities take root, and Project Open Data could go a long way towards providing agencies the tools they need to get the job done.
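As a flavor of the schema guidance Project Open Data hosts, the snippet below sketches a single dataset entry using a representative handful of the common core metadata fields, with a trivial completeness check. The authoritative required-field list lives in the repository itself; the selection and sample values here (borrowed from the FCC low power FM example above) are assumptions for illustration.

```python
# A sketch of one dataset entry using a representative subset of the
# common core metadata fields described in Project Open Data. The
# "required" list here is illustrative, not the official schema.
REPRESENTATIVE_FIELDS = ["title", "description", "keyword",
                         "modified", "publisher", "accessLevel"]

entry = {
    "title": "Low Power FM Channel Availability",
    "description": "Channels potentially available to low power FM stations.",
    "keyword": ["FM", "spectrum", "broadcast"],
    "modified": "2013-05-09",
    "publisher": "Federal Communications Commission",
    "accessLevel": "public",
}

missing = [field for field in REPRESENTATIVE_FIELDS if field not in entry]
print("complete" if not missing else f"missing fields: {missing}")
```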
We’ll continue to track these developments as they arise. Today’s releases show an encouraging emphasis on the nuts and bolts of making open government data the status quo, not an exception.
View the Commons Lab’s case study on the National Broadband Map as government open innovation here.
In Commons Lab, Crowdsourcing, Disaster Management, Foresight, News and Events on May 6, 2013 at 2:28 pm
Editor’s note: In September 2012, the Commons Lab hosted the Connecting Grassroots to Government for Disaster Management workshop. Over two days, we spoke with a number of event participants for a series of video podcasts covering various aspects of the proceedings. The conversation below with Eric Rasmussen is the first of these podcasts. Please stay tuned: Additional installments will be posted in the coming weeks, and the workshop summary report will be published in June.
Eric Rasmussen wears many hats: He is a medical doctor, a research professor for environmental security and global medicine at San Diego State University, an affiliate associate professor of medicine at the University of Washington, and the managing director at Infinitum Humanitarian Systems, a “profit-for-purpose” company in California that focuses on reducing vulnerability for systems and populations. In addition to sitting on a number of boards, Rasmussen served in the Navy for more than 25 years and was deployed more than 15 times to Iraq, Afghanistan and other countries.
In this podcast, Rasmussen discusses the limitations software developers face when moving ideas from concept to implementation in disaster response, noting that developers often have too little access to end users and too little understanding of the constraints those users face in the field. He also discusses the need to engage agencies and other responders early on to make sure new systems are incorporated into agency response plans, as well as the role of policymakers in addressing these challenges.