Archives for category: English


I’ll be at Colorado College on Thursday, September 18.

I will talk about my current research and my view of collective intelligence, twenty years after the original French publication (1994) of my book “Collective Intelligence.”

What are the current research programs to augment collective intelligence through the use of the algorithmic medium? I will present the main technical and scientific enterprises related to this question, including knowledge management, the digital humanities, the semantic web and the singularity. I will also discuss my own research program, which aims to build a scientific mirror of human collective intelligence from collaborative data curation.

Here are the slides of my talk!

And here is a longer, more detailed version.


I will speak on September 5, 2014, in Rio de Janeiro at the event Educação 360.

Here is my presentation: Algorithmic communication


Musée de demain (Museum of Tomorrow)

First, here is my presentation at the Rio Content Market, called “The Algorithmic Medium and its Content”

Then, see the slides and the videos of the two conferences that I am giving at SENAC São Paulo:
Cyberdemocracy (slides)
Lesson in cyberdemocracy (English/Portuguese) video

Collective-Intelligence (slides)
Lesson in collective intelligence (English/Portuguese) video

And finally, the slides of my presentation at the Bienal do Livro in Brasília:
Algorithmic Textuality





Steve Jankowski’s master’s thesis (Wikipedia and Encyclopaedism: A Genre Analysis of Epistemological Values) is proof that a supervisor (me) can learn more from his student than the student from his supervisor. And I’m not speaking here about learning some interesting facts or methodological tricks. When reading the final version of the thesis, I truly learned new, relevant and important ideas pertaining to the digital humanities and post-colonial epistemology.

The main theme of Jankowski’s work is the “epistemological conservatism” of Wikipedia. This conservatism can be seen in two important features of this encyclopedia: its categorization system and its general theory of knowledge.

First, based on a rigorous scientific methodology, this groundbreaking research shows that the paratextual headings of the famous online encyclopedia are very close to those of the 19th-century Britannica. Since headings and disciplines form the metadata system of the encyclopedia, or its chief categorization apparatus, we can safely say that it is one of the places where its tacit epistemology is hiding.

Second, based on a thorough historical study of the encyclopedic genre, Jankowski shows that the theory of knowledge officially followed by Wikipedia is also the theory of knowledge stemming from the Enlightenment movement and developed by modern Europe in the 19th century. According to this general framework, there is an “objective” scientific truth that is produced by the scientific community according to its own academic rules (the primary sources), and a vulgarization of this objective truth by the writers and editors of the encyclopedia. Vulgarization is understood here as a kind of synthetic compendium of scientific knowledge for the “cultivated public” (meaning, in fact, people having at least a secondary education).

These two discoveries are important for several reasons.

Wikipedia is one of the most consulted sites on the Internet, and it is the first place where journalists, students and professors alike go to find basic information on any subject. This means that any epistemological bias in Wikipedia has more influence on the contemporary public mind than the biases exerted by the news outlets. A deeper influence, indeed, because Wikipedia is not only about facts, news or events but also about the basic structure of knowledge.

The idea that Wikipedia is epistemologically conservative may be counter-intuitive for many. Isn’t Wikipedia completely open and free? Don’t we know that anybody may write and edit this encyclopedia, and that the editing process itself is transparent? Isn’t Wikipedia a fantastic example of international collective intelligence and one of the big successes of crowdsourcing? Of course! But the big lesson of Jankowski’s work is that all this is not enough. There are still some emancipatory efforts to be made.

Wikipedia has opened new ground by using hypertextual tools, a crowdsourced editorial process and an open intellectual-property model. These are all good things, and each should be pursued to further develop collective intelligence in online environments. But Wikipedia also contains within its DNA the typographic epistemology and the categorization system of good old colonial Great Britain.

In an increasingly data-centric society, where mastery of knowledge is the main asset of cultural, economic and military power, epistemology is key to the new politics. In this regard, Jankowski implicitly asks us two strategic questions. How can we build an organic and unified compendium of knowledge allowing as many categorization systems as possible? How can we integrate the different points of view on a given subject in a perspectivist way, instead of imposing a “neutrality” or “objectivity” that inevitably reflects the dominant consensus? These sorts of questions address epistemology’s crucial role in the new politics and within personal and collective identities.




Computation, Cognition and the Information Economy.

(Translated By Phyllis Aronoff and Howard Scott)

New advances in digital media offer unprecedented memory capacities, an omnipresent channel of communication, and ever-growing computational power.
We must ask ourselves how we can exploit this medium in order to augment our own social cognitive processes for human development.
Combining a profound knowledge of the humanities and social sciences with an understanding of computer science, Pierre Lévy proposes a collaborative construction of a global hypercortex, coordinated by a computable metalanguage.
By fully recognizing the symbolic and social nature of human cognition, we could transform our current, opaque, global brain into a reflexive collective intelligence.


Written interview in English: …

Video interview in English, subtitled in Portuguese, about collective intelligence and the semantic sphere:

Review in English, by Yair Neuman: Technology Becoming a Hypercortex

Written interview in English and Spanish

More information here


1. General Introduction.

Part 1. A Philosophy of Information

2. The Nature of Information.
3. The Symbolic Cognition.
4. The Creative Conversation.
5. Toward a Mutation of Humanities and Social Sciences.
6. Information Economy.

Part 2. Cognition Modeling
7. Introduction to a Scientific Understanding of the Mind.
8. Computer Perspective: Towards a Reflexive Intelligence.
9. Overview of the Semantic Sphere IEML.
10. The Metalanguage IEML.
11. The Semantic Machine IEML.
12. The Hypercortex.
13. A Hermeneutic Memory.
14. Humanistic Perspective: Towards Explicit Knowledge.
15. Observe the Collective Intelligence.



Inter-religious dialogue is expected to increasingly take the form of online creative conversations that rely on digital data and documents. The first part of this paper is about the current symbolic obstacles on the road to cultural and religious intercomprehension in this context: mainly the incompatibility and the cultural biases of classification systems. To overcome these obstacles (and some others), I propose the use of IEML (the Information Economy MetaLanguage), a computable language specially suited for online intercultural dialogue, which I developed at the Canada Research Chair in Collective Intelligence at the University of Ottawa. The second and main part of this paper presents some examples of the application of basic IEML categories to the religious domain.

To get the paper in PDF, click on the link.