Both sound software verification techniques and heuristic software flaw-finding tools benefit from software annotations that describe the behavior of software components. Function summaries, written as logical annotations, enable modular checking of software and more precise reasoning (a small example is sketched below). However, such annotations are difficult to write and are rarely produced by software developers, despite their benefits to static analysis.

The Crowdsourcing Annotations project will address this deficiency by encouraging the software community to produce annotations through crowd-sourcing. This effort will be supported by tools that generate, use, and translate annotations, and the results of annotation efforts will be shared through openly available repositories. We will also use pilot projects to demonstrate and encourage the use of annotations and static analysis. The project will leverage and interact with the Software Assurance Marketplace (SWAMP) project's collection of static analysis tools and example software. The technical challenges include developing uniform styles and languages for annotations, reliably validating crowd-sourced submissions, merging annotations with the corresponding source code, managing version control, and integrating with typical software development environments. The social challenges are equally important: designing and implementing a crowd-sourcing infrastructure that motivates participation and delivers technical and social benefits to both the community and to individual contributors.
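For concreteness, here is a minimal sketch of a function summary expressed as an ACSL contract on a C function (ACSL is the annotation language used by the Frama-C analyzer; the project does not commit to a particular annotation language, so this choice is only illustrative). The `requires`, `assigns`, and `ensures` clauses summarize the function's precondition, frame, and postcondition, which is the information a modular checker needs in order to reason about callers without re-analyzing the function body.

```c
/*@ requires n >= 0;
  @ requires \valid_read(a + (0 .. n-1));
  @ assigns \nothing;
  @ ensures \result == -1 || (0 <= \result < n && a[\result] == key);
  @ ensures \result == -1 ==>
  @   \forall integer i; 0 <= i < n ==> a[i] != key;
  @*/
int find(const int *a, int n, int key) {
    /*@ loop invariant 0 <= i <= n;
      @ loop invariant \forall integer j; 0 <= j < i ==> a[j] != key;
      @ loop assigns i;
      @ loop variant n - i;
      @*/
    for (int i = 0; i < n; i++) {
        if (a[i] == key)
            return i;   /* found: result is a valid index into a */
    }
    return -1;          /* not found: key is absent from a[0..n-1] */
}
```

With such a contract in place, a verifier can check each call site of `find` against the `requires` clause and then assume the `ensures` clause, rather than re-analyzing the loop at every call; this per-function, summary-based reasoning is the modular checking referred to above.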