In the age of Big Data, the collective intelligence of human beings can help solve some of humanity’s major problems. It can also analyse certain kinds of data more effectively than computer algorithms. Discover the close relationship between collective intelligence and Big Data.
Collective intelligence: definition
Collective intelligence refers to a group or shared intelligence that emerges from collaboration, collective effort or competition among many individuals. It allows decisions to be made by consensus. Voting systems, social networks and other methods of quantifying mass activity can be considered forms of collective intelligence.
This type of intelligence appears as an emergent property of the synergy between the knowledge offered by data, software, hardware and experts in specific fields, enabling better decisions to be made at the right time. Simply put, collective intelligence results from the association between humans and new ways of processing information.
A widespread concept
The concept of collective intelligence has roots in sociology and computer science, but also in business. For Pierre Lévy, it is a form of universally distributed intelligence that is constantly improving, coordinated in real time and results in an effective mobilization of skills. The foundation and purpose of this form of intelligence is the mutual recognition and enrichment of individuals rather than the worship of hypostasized communities. In the eyes of Pierre Lévy and Derrick de Kerckhove, it refers to the capacity of computer networks to deepen the collective pool of social knowledge while simultaneously extending the scope of human interactions.
It contributes strongly to the transfer of knowledge and power from the individual to the collective. According to Eric S. Raymond and JC Herz, open-source intelligence will eventually produce results superior to the knowledge generated by proprietary software developed within companies. For Henry Jenkins, it is an alternative source of media power. Jenkins criticizes schools and education systems in particular for promoting autonomous problem solving and individual learning while remaining hostile to learning through this medium. Nevertheless, like Pierre Lévy, he considers collective intelligence essential for democratization, as it is linked to a culture based on knowledge and fuelled by the sharing of ideas. Indeed, it contributes to a better understanding of a diverse society.
Origin of the concept of collective intelligence
The concept of collective intelligence dates back to 1785, when the Marquis de Condorcet observed that if each member of a group has a probability greater than one half of making the correct decision, then the probability that the majority vote of the group is correct increases with the number of members. This result is known as Condorcet’s jury theorem.
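The theorem is easy to check numerically. The sketch below (Python, standard library only; the voter competence p = 0.6 is an arbitrary illustrative value, not from the source) computes the exact probability that a majority of n independent voters reaches the correct decision:

```python
from math import comb

def majority_correct(n, p):
    """Probability that a majority of n independent voters,
    each correct with probability p, makes the right decision."""
    # Sum the binomial probabilities over every winning majority size.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# With p > 0.5, larger groups are right more often, as Condorcet observed.
for n in (1, 11, 101):
    print(n, round(majority_correct(n, 0.6), 3))
```

With p = 0.6, a single voter is right 60% of the time, while a group of 101 voters is right well over 95% of the time, illustrating how aggregating many imperfect judgments can outperform any individual one.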
Another precursor to this concept was the entomologist William Morton Wheeler. According to his 1911 statement, seemingly independent individuals can cooperate to the point of becoming a single organism, a collective intelligence. The scientist perceived this collaborative process by observing ants, acting as cells of an individual entity.
In 1912, Émile Durkheim identified society as the only source of logical human thought. According to him, society is a superior intelligence, as it transcends the individual in space and time. In 1962, Douglas Engelbart established the link between collective intelligence and business efficiency. According to him, three people working together to solve a problem will be more than three times as effective as one person working alone.
Collective Intelligence in the Age of Big Data
In the age of Big Data, many companies tend to look for answers where they are easy to find, rather than where they are likely to be found. In reality, the probability of a Big Data research group discovering useful information depends on the type of data available. Structured, numerical, explicit and clean data are more easily processed by computers, while unstructured, analogue and ambiguous data make more sense to the human brain.
However, for a human as well as for a computer, the larger the data sets, the more processing power is needed. In the case of structured data, more powerful computers will do the trick. For unstructured data, on the other hand, it is essential to rely on the collective intelligence of several human brains.
If the objective is to predict the future, a purely statistical approach to Big Data is particularly flawed, because the available data are necessarily rooted in the past. While these data can predict situations similar to those of the past, for example for a mature product line in a stable market, they become useless for forecasts involving new products or disrupted markets.
Collective Intelligence: example of applications
Here are some examples of situations in which the predictions made by collective intelligence prove more useful than those produced by Big Data:
Turmoil in the markets: collective intelligence in companies
In the mid-2000s, global demand for dairy products suddenly tripled in the space of a few months. After a decade of stability, dairy producers could no longer rely on data-based predictive models. As a result, industry players turned to the collective intelligence of the farming sector, closer to consumers, to better understand and model the factors behind this new demand.
A few years ago, Lumenogic collaborated with a team of marketing researchers to run market predictions for a Fortune 100 company, focusing on new products. These predictive methods proved more relevant than data-based forecasts in more than 67% of cases, reducing the average error by about 15% and narrowing the range of errors by 40%.
Over the past 20 years, prediction markets have become famous for their ability to outperform polls in predicting election results. Last November, during the U.S. presidential election, the collective intelligence of Hypermind traders outperformed all the data-driven statistical prediction models put in place by the media giants. The explanation is simple: collective intelligence can aggregate a great deal of unstructured information about what makes each election unique, at a level inaccessible to statistical algorithms, however sophisticated they may be.
Despite the current flood of structured data, it is essential to keep in mind that the world is full of unstructured data, and only the human mind can find meaning in it. So, if you find yourself searching Big Data in vain for answers, remember that collective intelligence could remedy the problem.
Collective intelligence and Open Data to understand epidemics
Researchers at the Wellcome Trust Sanger Institute and Imperial College London have developed Microreact, a free platform for real-time visualization and monitoring of epidemics. This tool has been used to monitor outbreaks of Ebola, Zika and antibiotic-resistant microbes. The team has collaborated with the Microbiology Society to enable researchers around the world to share their latest information about outbreaks.
Until now, data and geographical information on the movement and evolution of infections and diseases have been confined to databases that are not accessible to the public. Researchers have had to rely on information published in research articles, sometimes already out of date, which contained only static visuals presenting a small portion of the epidemic threat.
Microreact System: Facilitating Data Sharing
The Microreact system is cloud-based, combining the power of open data and collective intelligence from the web to provide real-time global data visualization and sharing. Anyone can explore and examine information with unprecedented speed and accuracy. This tool can play a key role in the surveillance and control of epidemics such as Zika or Ebola.
Data and metadata are uploaded to Microreact from a web browser. They can then be viewed, shared and published via a permanent web link. A partnership with the journal Microbial Genomics allows data from prospective publications to be linked to the platform. This project promotes open availability of and access to data, while building a unique resource for healthcare professionals and scientists around the world.
The work of Dr. Kathryn Holt and Professor Gordon Dougan is an excellent example of how Microreact can democratize genomic data and the insights it provides. They recently published two articles on the global distribution of typhoid bacteria and the spread of drug-resistant epidemics, and published their data directly on Microreact to help other researchers build on their work.
By publishing this data on Microreact, the researchers ensured its longevity and allowed others to learn from their work, using the information as a basis for comparison or as a foundation for future projects. Microreact also enables individual researchers to share information globally and in real time.
Collective intelligence to solve humanity’s major problems
By 2050, humanity is likely to face many problems: rising oceans, global warming and the scarcity of resources are some of the challenges ahead. To meet them, we can and must rely on collective intelligence.
With the rise of internet forums, research groups, wiki pages, social networks and the blogosphere, new methods of problem solving have emerged. Scientists now perceive the Internet as a shared research group. This mode of learning and communication, which can be categorized as collective intelligence, makes it possible to draw on the consensus of many minds to answer complex challenges.
Managing climate change through this prism
MIT’s Center for Collective Intelligence is developing an online forum called the Climate Collaboratorium. This forum takes the form of a computer model in perpetual evolution, representing the Earth’s atmosphere and human systems and fed by online scientific discussion rooms. All variables and factors related to climate, the environment, human interactions and ecology are included in this evolving model.
Professor Thomas W. Malone, founder of the center, compares the Collaboratorium to the Manhattan Project, which developed the atomic bomb during the Second World War. The difference is that the Collaboratorium aims to solve a problem that concerns all human beings. Thanks to new technologies, starting with the Internet, it is possible to bring together many more people than during the Second World War.
At the end of 2014, the Climate CoLab had 33,000 members from more than 150 countries. NASA, the World Bank, the Union of Concerned Scientists, many universities and other government agencies are involved in the project. Its aim is to combine all of humanity’s capabilities to fight climate change; social, political, economic and engineering solutions are all reviewed.
Developing better jets with collective intelligence
Boeing uses the creative potential of collective intelligence to design jets. The 787 Dreamliner was created in collaboration with over 1,000 partners, each bringing their own ideas to create the ultimate aircraft. This jet, launched in 2011, combines elements of the 777’s design with composite materials such as carbon-fibre-reinforced plastic, which makes up approximately 50% of the main structure and replaces aluminium. The 787 has set a new standard in terms of efficiency and comfort.
In order to accelerate the design process for this innovative aircraft at a lower cost, Boeing decided to rely on its suppliers. The Global Collaborative Environment (GCE) links all members of the 787 design team. Whereas Boeing previously designed 70% of a plane itself, it let its 43 suppliers and many other subcontractors from 24 countries work together at 135 sites. As the project progressed, these partners abandoned their respective computer-aided design systems for the common language and format of Boeing’s Catia V5 system. With this standardized data communication and design program, Boeing reduced its documentation from 2,500 pages to 20.
Collective intelligence, online search groups, cloud storage and Big Data are the new drivers of creative thinking. The complex and critical problems facing humanity today require solutions to be deployed more rapidly than in the past. This is why the use of these technologies is now indispensable.
Seoul Innovation Challenge, an urban planning project based on collective intelligence
The Seoul Innovation Challenge aims to use collective intelligence to solve the urban problems Seoul encounters in the fields of safety, the environment and traffic. The challenge lasted 200 days and was open to citizens, foreigners, companies and universities; its key words were cooperation, innovation and openness. When someone suggested an innovative idea on the platform, all participants engaged in a collaborative process with 100 professional mentors over the following 7 months.
The preliminary stage took place in July 2017, when 32 projects were selected. Over the following three months, the ideas were developed, and the final stage took place at the end of November. The 32 projects will receive support for commercialization, including the registration of intellectual property, demonstrations, partnership building and the search for investors. It was possible to register for this collective intelligence project on the official website.
Can collective intelligence surpass artificial intelligence?
The French philosopher Pierre Lévy, who specializes in collective intelligence, has been developing software since 2015 to analyze data from social networks. He wants to understand the real motivations of humans. This software automatically transforms words from various languages into a symbolic algorithmic hyper-language called “Information Economy MetaLanguage”.
According to the author of “Collective Intelligence: For an Anthropology of Cyberspace”, collective intelligence is initially found in nature. However, thanks to language and technology, the form associated with humans is far superior. Why? Because it relies on the manipulation of symbols, and we have now entered the era of algorithmic symbol manipulation.
For the philosopher, this collective form of intelligence stands in contrast to artificial intelligence. His goal is not to make computers smarter, but to use computers to make humans smarter. To do so, Lévy intends to create a universal categorization system as flexible as natural language, which will be used to classify the countless data available on the web. Semantic relationships determine how the data is orchestrated, allowing new ideas to emerge. In short, IEML allows ideas to be connected through computing. The philosopher considers that the Internet already represents a form of intelligence of this type, but wishes to add a notion of reflection to it.
Aware that this system could allow large companies like Apple and Google, or government agencies like the NSA, to access an unprecedented amount of information, Pierre Lévy says his goal is nevertheless to bring the power of information to the people. Just like the Silicon Valley activists of the 1970s, who wanted to give everyone access to computers, this philosopher wants to give people the opportunity to analyse and make sense of the data available on the Internet.
Collective intelligence: a software created by Pierre Levy
To do so, Pierre Lévy intends to rely on two tools: the IEML language itself, and the software that implements it. This software is distributed openly and free of charge under version 3 of the GPL. Finally, all changes to IEML must be fully transparent.
Admittedly, not everyone can contribute to the IEML dictionary, as this requires specialized knowledge: linguistic or mathematical skills are essential. However, while the dictionary is restricted, anyone can create new labels. The philosopher considers this the most that can be done to give people access to freedom: we cannot force people to be free, but we can give them all the tools they need to emancipate themselves.
Does Big Data promote Collective Intelligence training?
Training in collective intelligence is increasingly popular. Companies want their employees to pool their cognitive resources and progress towards the same goal, which can generally be summed up as increasing the company’s profit. To this end, some organizations run team seminars whose aim is to encourage collective reflection by building group cohesion.
Big Data coupled with artificial intelligence can facilitate this training phase. Matching algorithms can find commonalities between members of a team and thus promote fruitful interactions. Good mutual understanding is not the only element of collective intelligence, however: it is also important to know that another individual is working on the same project as you. In a large company spread across different parts of the world, this is not necessarily obvious. A data lake distributed in the cloud, together with enterprise access tools, can provide this information.
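As a minimal sketch of this matching idea (all employee names and skill keywords below are hypothetical, not from the source), one simple approach scores the overlap between two profiles with cosine similarity and ranks potential collaborators:

```python
from collections import Counter
from math import sqrt

def similarity(profile_a, profile_b):
    """Cosine similarity between two keyword profiles."""
    a, b = Counter(profile_a), Counter(profile_b)
    dot = sum(a[k] * b[k] for k in a)          # shared keywords
    norm = (sqrt(sum(v * v for v in a.values()))
            * sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical employee profiles built from project and skill keywords.
employees = {
    "Alice": ["data lake", "spark", "etl", "dashboards"],
    "Bob":   ["spark", "etl", "streaming"],
    "Chloe": ["ux", "design", "branding"],
}

# Rank potential collaborators for Alice by profile overlap.
scores = {name: similarity(employees["Alice"], profile)
          for name, profile in employees.items() if name != "Alice"}
print(sorted(scores, key=scores.get, reverse=True))  # Bob ranks first
```

A real system would of course draw these profiles from the data lake mentioned above rather than hard-code them, but the ranking principle is the same.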
From a training perspective, Big Data would ideally provide a means of measuring the progress of seminar members. This would allow for the evaluation of interactions within the group, and even predict how much collaboration within the company will be improved.