"The Master Switch" is an intriguing history of radio, telephone, cinema and television business in the USA (note: "in the USA", which is not clearly stated in the introduction). The central theme of the book is the "oscillation of information industries between open and closed", a recurring pattern that he finds across those four industries… and that he projects into the age of the Internet. The pattern looks like this: scientific innovation creates an information technology, the information technology opens a new market, an industry is created to serve that market, a monopoly eventually comes to control that market and therefore the flow of information.
Fareed Zakaria's thesis is that the USA is moving towards an excessively democratic system in which polls have a perverse influence on a system that was designed to be less about democracy and more about liberty. He doesn't quite offer a crisp definition of "liberty", but roughly it means individual freedom and protection from abuses of authority by the state. "Freedom" is a vague term that has been used throughout history in different contexts (for most nations it meant "freedom" from foreign oppression). "Liberty" is about personal freedom.
Ryle thinks that Descartes invented a myth when he provided definitions for the mental and the physical as if they were two different things; when he assumed that every human is both a body (which is in space and subject to the laws of Physics) and a mind (which is not in space and not subject to the laws of Physics); and when he assumed that a person lives two parallel lives, one as a body and one as a mind (the first a public history, the second a private history, because nobody can witness your inner thoughts).
A general rule that rarely fails is: “Be wary of books written by multiple authors”. Multiple authors tend to amplify each other’s crap instead of editing it down, and the results are often embarrassing ideological pamphlets, no matter how smart the premise. The premise here is interesting. The digital age has changed the world in which we live, not only from the point of view of the consumers but also from the point of view of the producers. We live in the age of Web platforms that were not designed top-down but sprung up bottom-up. The future may bring more of the same, and faster.
The "transactional" interpretation of Quantum Mechanics was devised by John Cramer in 1986 to provide a complete and consistent interpretation of Quantum Mechanics without introducing new elements.The story begins in 1925 when, de facto, Heisenberg made a metaphysical revolution by surrendering the concept of reality in favor of the concept of observables: we can't know what really exists, we can only know what we can observe. Kastner points out that Heisenberg's metaphysical move was essential to discovering a theory that turned out to correctly predict observation.
Ramachandran begins the book by wondering which features are truly unique to the human brain. Many features of the ape brain were hijacked by evolution to produce novel functions in the human brain (a process called “exaptation”); for example, in his view mirror neurons are what made human culture and ethics possible. Ramachandran believes that these unique traits are perfectly consistent with Darwinian evolution: millennia of gradual evolution can produce the mental equivalent of phase transitions, in which a substance suddenly reorganizes itself into a different state with different properties.
Hayles has written a complex and erudite book on the hidden premises and visible consequences of the information age. Ultimately, her thesis is summarized by a sentence in the prologue: “thought is a much broader cognitive function depending for its specificities on the embodied form enacting it”. Rewritten in plain English, it means that you cannot separate your “i” from the body that you inhabit. Her nightmare is “a culture inhabited by posthumans who regard their bodies as fashion accessories rather than the ground of being”. Her dream is a society in which we “understand ourselves as embodied creatures living within and through embodied worlds and embodied words.”
The US neurophysiologist Paul Nunez previously wrote “Electric Fields of the Brain” (1981) and “Neocortical Dynamics and Human EEG Rhythms” (1995), and in fact his credentials in the field of brain studies hark back to a paper originally written in 1972 and ambitiously titled “The Brain Wave Equation” (an equation that he eventually resurrects in this book, 40 years later). In this book Nunez summarizes his novel ideas on the way that “brains cause minds” (to use Searle’s expression).
Digital technology is instead progressing very slowly when it comes to government: the link between the citizen and the politician is often just a “feedback form” on the politician’s website. Very little effort has been made to link the citizen to the decision-making process in more effective and creative ways.
When international agencies started noticing that new technologies would soon cause a dramatic shift in the oil market, one country took notice and, well, panicked: Saudi Arabia. Its wealth and relatively new political power are entirely due to the oil that sits under its soil.
Bostrom writes that the reason A.I. scientists have failed so badly in predicting the future of their own field is that the technical difficulties have been greater than they expected. I don't think so. I think those scientists had a good understanding of what they were trying to build. The reason why "the expected arrival date [of Artificial Intelligence] has been receding at a rate of one year per year" (Nick Bostrom's estimate) is that we keep changing the definition. There never was a proper definition of what we mean by "Artificial Intelligence", and there still isn't.
The “Singularity” seems to have become a new lucrative field for the struggling publishing industry (and, i am sure, soon, for the equally struggling Hollywood movie studios). To write a bestseller, you have to begin by warning that machines more intelligent than humans are coming soon. That is enough to get everybody’s attention.
This book offers a colossal synthesis of history, biology, philosophy, psychology and neurophysiology. Surprisingly, the latter is the least plausible part of the book (we still know too little about the brain). But by mixing historical facts and evolutionary theories and using a bit of logical thinking, Pinker comes up with great insights into human nature. Pinker synthesizes the work of (literally) hundreds of thinkers and researchers and draws his own original conclusions.
Doc Searls gave a great talk on the State of the Net in which he explains how we are moving toward the personal data cloud, and how this time, as in the past with mainframes, personal computers, networks and mobile devices, the corporate world is trying to control that data. In doing so, it limits the data's usefulness. What gets built outside the walls always becomes more valuable than what is built inside them; inevitably, corporations end up preferring to interface with what is outside rather than build their own walled gardens.
Most of the world’s genetic diversity lies in viruses. The longest-living beings are bacteria. No wonder that these microscopic organisms kill more humans than any other dangerous animal. (Technicalities: viruses are not forms of life, since they cannot replicate without a host cell; bacteria are living organisms, perfectly able to replicate on their own, but they are limited to a single cell.)
Ultimately, the most structured society will be a society in which every action has to comply with some rule, i.e. its citizens will de facto be robots with no brains. Why does the brain/mind want to get rid of the brain/mind?
The moment one argues in favor of liberalizing drugs, people accuse him of being a drug addict: i have never done drugs, do not do drugs, and do not intend to do drugs. I care for my brain, just as i do not smoke because i care for my lungs and do not eat junk food because i care for my heart.
The Missing Mutation: Are We Really Smarter than our Ancestors? Several innovations that happened in the Neolithic seem to have provided no advantage, and sometimes created problems instead of solving them. About 10,000 years ago the burial ritual lasted a lifetime and the living were supposed to give the dead offerings that were not valuable. At some point the burial ritual became much simpler, but the living were supposed to give the dead offerings that were valuable (e.g., food at times of starvation). Neither attitude makes a lot of sense from a materialistic viewpoint: what is the adaptive advantage of wasting goods and food on dead people? At the same time the cult of the dead moved from underground to aboveground (temples, pyramids), again an incredible waste of resources.
Historians, scientists and poets alike have written that the human being strives for the infinite. In the old days this meant striving to become one with the god who created and rules the world. As atheism began to make strides, Schopenhauer rephrased the concept as an all-encompassing “will”, and Nietzsche as a “will to power”. Nietzsche confirmed that god is dead, and the search for the “infinite” became a mathematical and scientific program instead of a mystical one. Russell, Hilbert and others started a logical program that basically aimed at making it possible to prove and discover everything that can be proven and discovered.
The impact of technological progress on jobs has been the topic of countless books: most of them are forgotten because they were so wrong about it. Predicting the future has always been a lucrative business (the Oracle of Delphi, Nostradamus, George Orwell), but rarely a science. If all of them had been right, today we would all be unemployed and, in fact, extinct. Instead, guess what: humans are wealthier than ever in history, the world has never been so peaceful, and we buy machines by the millions. Pistono’s book is the refreshing exception: no, we are not doomed. That, per se, is a good reason to read it.
Are the World Wide Web and modern technology… replacing the human mind? Knowledge today is not a set of theories but just data, and the machine is a better place to store data than the human mind.
The democratic revolutions of the Middle East (Tunisia, Yemen, Egypt, Libya and now Syria) actually started in Iran in 2009 when supporters of opposition leader Mir Hossein Moussavi protested loudly against rigged elections “won” by incumbent president Mahmoud Ahmadinejad.
Iran’s foreign minister dismissed Israeli threats of an imminent attack against its nuclear facilities, warning that such a “stupid act” would provoke “very severe consequences.” But there are several reasons why an Israeli attack is more likely than ever.