Popular Science 2.0


The smarter we get, the dumber we become

The definition of intelligence is certainly flimsy; in fact, there are different types of intelligence and, accordingly, there are activities that nurture some cognitive areas while ignoring others. After reading the opinions of two writers with diametrically opposed views on how the Internet is affecting our intelligence, I cannot help but agree with both. Although on the surface they appear to be saying opposite things, what they really illustrate is that the Internet does make us smarter, but increasingly dumb at the same time. Everything depends on the type of intelligence we want to measure.

The two authors I refer to are, on the one hand, Steven Johnson, the expert in semiotics from Brown University, and, on the other, Nicholas Carr, the science writer and 2011 Pulitzer Prize finalist. In Johnson’s view, mass culture, i.e. popular culture, particularly as transmitted through television, video games and the Internet, is getting more and more complex, which means that certain cognitive areas are being stimulated as never before. This is what Johnson calls the «Sleeper Curve». This growing complexity of popular culture is inevitable, and it is accelerating, mainly due to three interrelated factors: the brain’s natural desires (knowledge is addictive), the economics of the entertainment industry (products must be complex enough that consumers will buy them to watch again and again, often in different formats) and emerging technology platforms.

© Flickr
For Johnson, the increased complexity of mass culture is the reason our IQ scores are going up from one generation to the next.
In the picture, Steven Johnson, the expert in semiotics from Brown University, and the cover of his book Cultura basura, cerebros privilegiados (Roca Editorial, 2011).

For instance, just compare the kind of TV series broadcast before 1980 with those currently on the air. Hill Street Blues was the first series to expand its story lines and increase its structural complexity, overwhelming many viewers. Now, shows like Lost or The Sopranos make Hill Street Blues seem easily digestible and even naive.

Of course, the Internet has accelerated these processes, flooding us with information at a single click and providing more and more options that continue to deepen the divide between «digital immigrants» and «digital natives». For Johnson, this increased complexity of mass culture is even the main reason underlying what is known as the «Flynn effect», i.e., the observation that our IQ scores are going up from one generation to the next.

On the other side of the coin, we have the ideas expressed by Nicholas Carr. He argues that new information technologies, like the Internet, are undermining our critical intelligence, forcing us to become shallower and to gather information that is increasingly fragmented, less organized and more superficial. Although we read and write more, it is less and less common to read a 200-page book from start to finish with undivided attention and without distraction.

The pros and cons of popularising science on the Internet

The Internet offers both scientists and popularisers of science clear advantages. For one thing, it provides easier access to wider sources of readily available information. Millions of people can create content, and millions more can criticize, edit or modify what is written. Wikipedia, Flickr or WordPress were unimaginable before the advent of the Web, let alone specific tools for scientific research, such as Medline, the standard database of articles and academic papers in medicine, open access to scholarly journal articles in PLoS (Public Library of Science), the CERN podcast or video recordings of countless congresses, lectures and conferences. In short, the digital revolution has clearly transformed the creation, storage and transfer of scientific knowledge.

But the advantages are so plain and impressive that they can overshadow the drawbacks, which also exist and influence not only the information we consume and produce but also how we do so (remember, Johnson is right, but so is Carr).

As McLuhan suggested, the environment also shapes the thinking process. The Web, therefore, gives us more information but simultaneously weakens our powers of concentration and deep analysis. Now more than ever, we read short entries full of images, links and videos, but it is increasingly hard for us to read War and Peace. As stated in a study by the consulting firm nGenera (Tapscott, 2008) into the effects of the Internet on young people: «They do not necessarily read a page from left to right and from top to bottom. They might also skip pages, looking for relevant information».

This view is not intended to be neo-Luddite: while hyperlinks are very useful for feeding our curiosity, they break up long and complex texts. Reading hypertext is not like reading plain text. Hypertext substantially increases our cognitive load and therefore undermines our ability to understand and retain what we read, as indicated by several studies, such as those by Rouet and Levonen (1996) and Miall and Dobson (2001). Hyperlinks do not have the same effect on us as footnotes or citations. As Carr states in his book The Shallows (2010), hyperlinks invite us to multitask: they do not simply lead us to related or complementary works, but rather tempt us to click on them, encouraging us to abandon whatever text we may be immersed in rather than devoting our sustained attention to it.

The reading expert Maryanne Wolf also asks whether, faced with the images, sounds and hyperlinks that appear in on-screen text, the constructive component nestled at the heart of reading will start to change and atrophy, and whether there is enough time to process information as deductively, analytically and critically as before (Wolf, 2008).

As well as this reduction in the concentration and patience required to delve into complex issues, there is also diminished originality or heterodoxy in popularisation itself, and even in scientific research. Most popularisers using the Net keep track of one another, and this inevitably leads to certain contents or trends being mimicked. A study published in the journal Science by the University of Chicago sociologist James Evans suggests that this affects the progress of scientific research (Evans, 2008). Having assembled a database of 34 million academic articles published in scientific journals from 1945 to 2005, Evans notes that automated information filtering tools, such as search engines, tend to act as popularity boosters, quickly creating and then continually strengthening the consensus as to which information is important and which is not. That is, research before the Internet may have been much less efficient, but it had certain advantages, such as a greater likelihood of exploring intellectually less frequented paths.

A vicious (digital) circle

In short, popular science 2.0 offers advantages and disadvantages that feed on each other. On the one hand, the Internet encourages a thirst for knowledge, especially for surprising news or information. On the other hand, it devalues the quality of what we learn, or at least of how we learn it. The more we feed our hunger to learn, the greater our desire to consume ever more anecdotal and superficial data. And so it goes on, like a hamster spinning on its wheel in a dopamine cage.

To all this we must add that the time spent reading printed publications such as newspapers, magazines and books is shrinking in favour of reading in digital environments, even in academic circles. In June 2011, the PISA report by the Organization for Economic Cooperation and Development (OECD) presented the results of its assessment of fifteen-year-old students’ skills in digital reading tasks, i.e., their ability to prioritize information or to construct new knowledge from what they had read. The results point to a striking fact: moderate computer use is associated with better results than either spending a great deal of time or very little time at the computer (El País, 2011). At the same time, the «Fifth Survey on the Social Perception of Science», conducted by the Spanish Foundation for Science and Technology (Fundación Española para la Ciencia y Tecnología, FECYT), noted that the Internet is taking giant steps in disseminating science and that it is the main way people under 34 learn about science.

Therefore, seeing that this migration down the digital path has reached a point of no return, one may entertain certain reservations about the notion that everything is perfect (there is more information available and we can access it more freely than ever). Perhaps we should also begin to offer specific training on how to read the different kinds of texts to be found there, most of which are unchecked and dotted with frequent interruptions that appeal precisely to our thirst for knowledge. Likewise, it seems imperative to create a sort of artificial or emergent intelligence to act as a kind of reader’s advisor, a guide to stop us from being shipwrecked in the avalanche of inputs and to keep at bay the fears Socrates expressed in Protagoras when referring to those who think «like the papyrus scrolls, unable to answer or to ask themselves questions».

Bibliography
Carr, N., 2010. Superficiales. Taurus. Madrid.
El País, 2011. «A los alumnos españoles se les atraganta la lectura digital». El País.
Evans, J. A., 2008. «Electronic Publication and the Narrowing of Science». Science 321: 395-399.
Goldacre, B., 2011. Mala ciencia. Paidós. Barcelona.
Johnson, S., 2011. Cultura basura, cerebros privilegiados. Roca Editorial. Barcelona.
Miall, D. S. & T. Dobson, 2001. «Reading Hypertext and the Experience of Literature». Journal of Digital Information, 2, no. 1.
Rouet, J. F. & J. J. Levonen, 1996. «Studying and Learning with Hypertext: Empirical Studies and Their Implications». In Rouet, J. F., J. J. Levonen, A. Dillon & R. J. Spiro [eds.], 1996. Hypertext and Cognition. Erlbaum. Mahwah (New Jersey).
Tapscott, D., 2008. «How to Teach and Manage “Generation Net”», BusinessWeek Online.
Wolf, M., 2008. Cómo aprendemos a leer. Ediciones B. Barcelona.

Sergio Parra Castillo. Journalist and science populariser. Writer, editor and coordinator of XatakaCiencia, Barcelona.
© Mètode 2011.

 

© Heku. morgueFile
The Internet does make us smarter, but increasingly dumb at the same time. Everything depends on the type of intelligence we want to measure.

© Flickr
In the picture, Nicholas Carr, the science writer and 2011 Pulitzer Prize finalist.

Carr’s book cover, Superficiales (Taurus, 2010).