Glossary of collective intelligence methods frequently used for the SDGs
Each use case illustrates how different collective intelligence methods, sometimes used in combination, are deployed to achieve a particular goal. A striking finding of this analysis is the flexibility that this toolbox offers to achieve different ends.
Our analysis found that the 15 methods summarized below are currently being used most frequently for SDG-related activity.
Citizen-generated data is a broad category that includes any information that can be collected from people either actively (e.g. videos, reports or ideas, usually via digital platforms) or passively (e.g. transaction data, call detail records, wearables).
Citizen science is any process where scientists and (usually unpaid) volunteers work together to collect or process scientific data or observations. Citizen science unlocks new resources for research, experimentation and analysis by opening the process to anyone.
Combining data sources
Combining data is a process of bringing together two or more different datasets to unlock new value or generate new insights that would not otherwise be exploited. It may involve partners entering into an agreement to exchange data for a specific social cause. These datasets may include data that is passively generated by people (e.g. call detail records), or actively contributed (e.g. citizen reporting).
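As a minimal sketch of this idea, two hypothetical datasets can be joined on a shared key to surface insights neither holds alone. All district names and figures below are invented, and the data-sharing agreement step is assumed to have already happened:

```python
# Passively generated data: average daily trips per district,
# aggregated from hypothetical call detail records.
mobility = {"district_a": 1200, "district_b": 340, "district_c": 760}

# Actively contributed data: hypothetical citizen reports of flooded roads.
flood_reports = {"district_a": 3, "district_c": 11}

# Inner join on district: keep districts present in both datasets.
combined = {
    district: {"daily_trips": trips, "flood_reports": flood_reports[district]}
    for district, trips in mobility.items()
    if district in flood_reports
}

# Districts with heavy mobility AND many flood reports may need priority response.
priority = [d for d, v in combined.items()
            if v["daily_trips"] > 500 and v["flood_reports"] > 5]
print(priority)
```

In practice the join key (here, the district) and the thresholds would come from the partners' agreement and the specific social cause being addressed.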
Computer vision
Computer vision is the ability of a computer to understand, analyze or generate images and/or videos. It is frequently used to help classify drone or satellite images, or user-generated images.
Crowd forecasting is a method that asks small or large groups to make predictions about the future. Individual predictions are statistically aggregated into a consensus crowd forecast. The method is inspired by research showing that small crowds of non-experts can often forecast political events more successfully than individual experts.
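The aggregation step can be as simple as taking the median or mean of the individual forecasts. The probabilities below are invented for illustration; real crowd-forecasting platforms use more sophisticated weighting schemes:

```python
from statistics import median, mean

# Hypothetical individual probability forecasts (0-1) for one question,
# e.g. "Will region X meet its vaccination target this year?"
forecasts = [0.6, 0.7, 0.55, 0.9, 0.4, 0.65, 0.7]

# The median is robust to individual outliers;
# the mean weights every forecaster equally.
consensus_median = median(forecasts)
consensus_mean = round(mean(forecasts), 3)

print(consensus_median, consensus_mean)
```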
Crowdmapping is a type of crowdsourcing which gathers data from different sources (including social media, text messages or geographic data) to provide real-time, interactive information about issues on the ground. Crowdmapping can create detailed, near-real-time data in a way that a top-down, centrally curated map may struggle to replicate.
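A toy sketch of the aggregation behind a crowdmap: snap geotagged reports to coarse grid cells so that nearby reports cluster into hotspots a live map can render. The coordinates and issue labels are invented:

```python
from collections import Counter

# Hypothetical crowd-submitted reports: (latitude, longitude, issue).
reports = [
    (51.507, -0.128, "flooding"),
    (51.509, -0.126, "flooding"),
    (51.510, -0.130, "blocked road"),
    (48.857, 2.352, "flooding"),
]

def grid_cell(lat, lon, size=0.05):
    """Return integer grid-cell indices for a coordinate (~5 km cells)."""
    return (int(lat // size), int(lon // size))

# Count reports per cell to build a simple density layer.
density = Counter(grid_cell(lat, lon) for lat, lon, _ in reports)
hotspot, count = density.most_common(1)[0]
print(hotspot, count)
```

Real crowdmapping platforms add verification, deduplication and moderation on top of this basic clustering step.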
Crowdsourcing is an umbrella term for a variety of approaches that source data, information, opinions or ideas from large crowds of people, often by issuing open calls for contribution. It can help bring new ideas to light that hadn't previously been considered, or gather expertise from people who have specialized knowledge or understanding of an issue.
Microsurveys are short surveys that typically take the respondent only a few minutes to complete. They are often delivered by mobile phone, text message or a digital platform. Benefits include much faster turnaround and a higher frequency of results compared to traditional surveys.
Natural Language Processing (NLP)
NLP allows computers to understand, interpret and extract key information from human language. NLP techniques can be used to carry out automated analysis of user-generated text from sources like social media, to better understand what issues matter to people, translate between languages or generate language.
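A minimal sketch of automated text analysis: count the most frequent substantive words in hypothetical user-generated posts. Real NLP pipelines use trained language models; this shows only the basic idea of surfacing what issues matter to people:

```python
import re
from collections import Counter

# Hypothetical user-generated posts (e.g. scraped from social media).
posts = [
    "The water pump in our village broke again",
    "No clean water since Monday, the pump needs repair",
    "Water quality is getting worse near the river",
]

# A tiny stopword list; real systems use much larger ones.
stopwords = {"the", "in", "our", "no", "since", "is", "near", "needs", "again"}

tokens = [w for post in posts
          for w in re.findall(r"[a-z]+", post.lower())
          if w not in stopwords]

# The top terms hint at the dominant concerns in the text.
print(Counter(tokens).most_common(2))
```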
Open data is raw data gathered by people or organizations and published in an electronic, machine-readable format. It is then shared online for re-use by others, rather than being kept private.
Open source repository
An open source repository is a digital repository where content (e.g. code, text or DIY designs) can be stored and freely downloaded with few restrictions on use. Many open source repositories aid collaboration by providing a space for uploading documentation, monitoring and version control.
Peer-to-peer exchange refers to the process of sharing information horizontally to build and maintain a community, to collect data, connect people or send alerts. Platforms for this vary, ranging from messaging platforms to online forums or collaborative platforms. Some rely on the internet but others do not (e.g. SMS or mesh networks).
Predictive analytics encompasses a variety of statistical techniques that enable a computer to analyze structured, machine-readable data, typically using algorithms from classical machine learning. It can be used to make predictions about future or otherwise unknown events.
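One of the simplest classical techniques is a least-squares trend line fitted to historical data and extrapolated forward. The enrolment figures below are invented purely to illustrate the mechanics:

```python
# Hypothetical structured data: % school enrolment per year (made up).
years = [2018, 2019, 2020, 2021, 2022]
enrolment = [72.0, 74.5, 77.0, 79.5, 82.0]

# Ordinary least-squares fit of enrolment against year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(enrolment) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, enrolment))
         / sum((x - mean_x) ** 2 for x in years))
intercept = mean_y - slope * mean_x

# Extrapolate to the next, unobserved year.
prediction = slope * 2023 + intercept
print(round(prediction, 1))
```

Production systems would use richer models and validate them on held-out data, but the principle, learning a pattern from past structured data to estimate an unknown value, is the same.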
Remote or in-situ sensing
Remote or in-situ sensing involves collecting information from satellites or from physical sensors that record actions and physical changes (e.g. traffic cameras, weather sensors, ambient sensors, wearables or drones). This data can provide cheap, real-time measurements of anything from pollution to crop yields.
Web scraping is a method for extracting unstructured data from across the web, such as company websites or social media. Where official datasets are costly to gather and infrequently updated, web scraping can provide more timely insights into social or economic trends.
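A minimal sketch of the extraction step, using only the Python standard library. The HTML snippet is invented; a real scraper would fetch pages over HTTP and should respect robots.txt and each site's terms of use:

```python
from html.parser import HTMLParser

# Invented page fragment standing in for a fetched web page.
html = """
<ul>
  <li class="price">12.50</li>
  <li class="price">13.10</li>
  <li class="note">prices in USD</li>
</ul>
"""

class PriceScraper(HTMLParser):
    """Collect the text of <li class="price"> elements as floats."""

    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        self.in_price = tag == "li" and ("class", "price") in attrs

    def handle_data(self, data):
        if self.in_price and data.strip():
            self.prices.append(float(data))
            self.in_price = False

scraper = PriceScraper()
scraper.feed(html)
print(scraper.prices)
```

Scraped at regular intervals, figures like these can feed timelier indicators of social or economic trends than infrequently updated official datasets.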