IEET Intern Edward Miller argues that we can have a more democratic world by embracing the decentralizing and liberatory potentials of open source or P2P approaches to biotechnology and informatics.
Knowledge is power, or so the old axiom goes. There is a great deal of truth in that statement. While they are not the same thing, they are mutually dependent. Power determines the types of knowledge which can be produced and reproduced, and vice versa.
Consequently, feedback loops are created which promote certain types of knowledge and practices of power based on reproductive fitness. Just as with natural selection, dominance is achieved by those with the highest reproductive fitness. In human history, the result of this process has been a highly routinized transnational system of power relations characterized by the perpetual consolidation of wealth.
A primary reason why the transnational capitalist system has become routinized is that those with centralized power have long controlled the means of knowledge production, such as the news media, the publishing industry, and the educational system. Individuals who are assimilated into this system are conditioned to engage in actions, communications, and transactions on a day-to-day basis which recursively reinforce the structure of the system. This is what Anthony Giddens calls the duality of structure.
Oppression and warfare have overwhelmingly developed in the context of centralized power, whether that power is political, religious, corporate, or otherwise. These problems not only manifest themselves in obvious examples like global superpowers but can be seen even in the smallest of communities and families. The current situation is an especially dangerous one, since technological progress has amplified our power to destroy. Yet, reversing technological progress would be both foolish and futile. To avert crises, socio-political change is necessary.
Since the centralization of power continues to plague human civilization with numerous crises, some form of decentralization is needed. Those who engage in second-order analysis, and understand how the process of power-knowledge reproduction functions, recognize that they must play by the rules of the system. The methods of decentralization must be constructed to have high reproductive fitness and yet, by their very structure, be impossible to co-opt as centralizing forces.
II. A Brief History of Centralization
Since the dawn of civilization, certain people have amassed more wealth and power than those around them. There are all sorts of reasons for this: differences in luck, talent, inheritance, unscrupulousness, and so on. From these imbalances, unequal power relationships arise and hierarchies form.
As is usually the case, that power is used to further enhance and maintain the power of those who wield it. Throughout most of history, a popular method has been direct coercion, such as the establishment of empires and institutionalized aristocracies. In modern times, indirect actions such as investment, marketing, and lobbying are preferred, and the methods are continually being refined over time.
In just this past century, humanity has suffered through two World Wars and stood on the brink of nuclear annihilation. Since power is still highly centralized, the potential for future cataclysms has by no means disappeared.
III. What is Decentralization?
Decentralization is a process whereby the distribution of power becomes more diffuse. Since power is so fundamental, this can affect every aspect of society from the inside out. Peter Kropotkin, a Russian anarchist intellectual, defined decentralization as the true measure of progress:
True progress lies in the direction of decentralization, both territorial and functional, in the development of the spirit of local and personal initiative, and of free federation from the simple to the compound, in lieu of the present hierarchy from the centre to the periphery. (1911, Encyclopaedia Britannica)
By this, Kropotkin meant we should be working towards decentralizing all aspects of life, with the ideal being a society without hierarchy where all relationships are based on the free association of equals. This is a profound challenge, and may be a never-ending one. Moreover, our opinions may not always align on how to best achieve this goal. Yet, it is a goal worth striving for.
The virtues of decentralization have been recognized from all corners of the political compass. Since at least the time of Ancient Greece, it has been accepted that some degree of decentralization is desirable. A more formalized understanding of it has been around since at least the 1800s. Marx is widely credited for being among the first to study this, and he saw the transition from older systems like feudalism to capitalism as a long trend of decentralization. Though society is still highly centralized, there has been gradual progress toward decentralization made possible by advances in technology, especially communications technology.
Critical thinking, our internal defense mechanism against harmful ideas and the true currency of any healthy democracy, is actually a decentralized force. Accordingly, it is bound up with our access to information. Without the printing press, democracy might not currently exist. Democracy is a decentralized mode of governance that requires the distribution of a tremendous amount of information. It is understandable that the Enlightenment and modern democracy came only after its invention, and rather quickly after. The more widely available books are, the more relative power every individual has. One cannot reflexively defend one’s human rights unless one is aware that those rights are being trampled upon. Knowledge truly is power.
By the same token, without mass-printed Bibles, people had no choice but to learn religion from authority figures. Thus, it makes perfect sense that Luther and Calvin sprang up shortly after the printing press. The Protestant Reformation challenged the need for clergy to interpret the religious texts that were becoming widely available. Even more radical theological criticism soon followed, as philosophers began printing books of their own and further debating the logic behind their positions.
Democratization has even been taking place within families and sexual relationships. The ideal of equal respect between individuals, virtually an explicit goal of democracy, may never be completely attained, but there is a higher degree of equality now than ever before. Since the Women’s Rights movement and Sexual Revolution, both women and men are less confined to predetermined roles, and there is a higher expectation of communication and mutual agreement between family members.
It must be noted that in certain contexts, some centralization can be necessary given practical realities. Parenting is an obvious example, since young children are unable to care for themselves. Other justifiable contexts for centralization could include defense against threats to world peace. President Eisenhower observed in his farewell address that, as a result of involvement in the World Wars, the United States had created a highly centralized Military-Industrial Complex. Yet even though it was created out of necessity, he warned, we must remain critical toward it.
“In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist. We must never let the weight of this combination endanger our liberties or democratic processes. We should take nothing for granted.” (1961, Farewell Address to the Nation)
Thus, even in cases where some degree of centralization seems wholly justified, people should remain deeply skeptical toward it. What is just in one moment in history is not necessarily just in another.
IV. Is Capitalism a Decentralized System?
When the laissez-faire economist Friedrich von Hayek spoke of decentralization, he used it as a near-synonym for capitalism, and characterized socialism as merely centralized economic planning.
If we can agree that the economic problem of society is mainly one of rapid adaptation to changes in the particular circumstances of time and place, it would seem to follow that the ultimate decisions must be left to the people who are familiar with these circumstances, who know directly of the relevant changes and of the resources immediately available to meet them. We cannot expect that this problem will be solved by first communicating all this knowledge to a central board which, after integrating all knowledge, issues its orders. We must solve it by some form of decentralization. (1945, The Use of Knowledge in Society)
A number of contemporary economists have myopically begun to think of capitalism as the final, natural, and most decentralized state of affairs. Surely capitalism was more decentralized than the systems it evolved from; however, contrary to the claims of Hayek’s Road to Serfdom, capitalism and democracy do not always reinforce each other. Capitalism has existed, and continues to exist, under a number of political systems, including theocracies like Saudi Arabia.
Furthermore, capitalism will always encourage some degree of market intervention, because it is in the interest of those who have accumulated wealth to further bolster their power by lobbying for favorable subsidies, regulations, and so forth.
Even the most progressive policies, like environmental regulations and anti-trust laws, have been lobbied for by enormous sectors of business. The reason is that these interventions change the distribution of wealth in such a way that some will profit more than others. This sort of rent-seeking behavior is inevitable within capitalism, and it always creates some degree of corporatism: state-sponsored artificial imbalances in the market.
Therefore, all things considered, mixed economies are actually more natural than so-called free market capitalism, especially considering only the former has ever existed. In any event, just because something is natural does not mean it is correct or desirable. That is a prime example of an Appeal to Nature, an all-too-common logical fallacy.
V. Social Ecology vs Tragedy of the Commons
Capitalism, including its mixed economy form, has far-reaching consequences, and is predicated on certain conditions. This is where Social Ecology, a concept formulated by Murray Bookchin, becomes relevant.
Social Ecology argues that the root cause of our environmental problems is the same as that of our social problems: the way people treat each other (greed, the centralization of power, and so on). There is an old notion relevant here, “the Tragedy of the Commons.” Commons are any public goods, such as uninhabited lands, which are not formally owned by any one person.
Unfortunately, instead of rationing, it is perceived to be in the interest of people to use up as much of a public good as possible, as quickly as possible. This can lead to complete devastation of the common resource. As any game theorist would know, while it may be better for everyone to cooperate and not engage in wanton destruction, people have a tendency to take more than their share. This leads to a breakdown of trust and the new rule becomes all against all. While this isn’t always the case, nor must it be a permanent feature of human civilization, it is nevertheless a feature of our present reality that must be addressed.
Some have decided, based on a narrow view of this tendency, that greed is the quintessential trait of human beings. The being they envision is not Homo sapiens but Homo economicus, a fictitious species interested purely in maximizing self-interest. Humans are much more dynamic and multifaceted, and also more biased and illogical, than the mythical Homo economicus. The reason our society is so obsessed with the consolidation of wealth is the way power and knowledge are reproduced in society, not because greed is our defining characteristic.
As Immanuel Wallerstein points out, capitalism, by its structure, necessitates the “endless accumulation of capital.” From a sociobiological perspective, this feature gives it incredible reproductive fitness, which is why modern liberal capitalism is the only system ever to form an integrated world-system. Although we are in a more decentralized age, and emperors, kings, and warlords are rarer, capitalism still promotes massive centralization of power. Endless accumulation is a recipe for disaster, or rather tragedy (of the commons). Except the commons we are talking about now could be the entire Earth, considering trends like global warming, nuclear proliferation, the militarization of outer space, and new weapons of mass destruction.
We must overcome this endless centralization by creating more rational modes of production that can coexist with, and eventually supplant, the capitalist world-system. Any such new mode of production must be structured so as to be non-hostile toward capitalists, and even profitable, yet impossible to co-opt to the point where its decentralized principles are compromised. The remainder of this treatise deals with such practical solutions.
VI. Post-Scarcity
Murray Bookchin also developed the concept of Post-Scarcity Anarchism. Thus far, all economic systems, by definition, have had to deal with the problem of scarcity: there has only ever been a limited amount of resources, and they must be allocated somehow.
The gradual reduction of economic scarcity has been a great factor in society’s decentralization. It is possible to envision a society in which scarcity has been eliminated to such an extent that there is little need for economic systems like capitalism or socialism. Considering the virtually exponential advancement in technology and the seemingly limitless amount of matter, space, and energy in the universe, it seems silly to think that we will be forever confined by scarcity.
Only an infinitesimal fraction of the Sun's energy output, less than a billionth of it, hits the Earth. Yet in under an hour, enough of that energy strikes the Earth to meet the current needs of the world population (6.6 billion people) for a whole year. The reason solar power is not yet ubiquitous is that the technology is not presently efficient enough or cheap enough to be justifiable from a cost/benefit standpoint; however, progress is happening quite rapidly. Nanotechnology has already begun to boost efficiency and holds great promise for future improvement. Eventually, as progress continues, energy could become as abundant as air.
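The order-of-magnitude arithmetic behind claims like this can be checked in a few lines. All the figures below are rounded assumptions (solar constant, Earth's radius, the Sun's luminosity, and rough world primary energy use circa 2007), so the results are only good to about one significant figure:

```python
import math

# Rounded figures; all are assumptions for an order-of-magnitude check.
SOLAR_CONSTANT = 1361.0      # W/m^2 reaching the top of Earth's atmosphere
EARTH_RADIUS = 6.371e6       # m
SUN_OUTPUT = 3.85e26         # W, the Sun's total luminosity
WORLD_USE_PER_YEAR = 5.0e20  # J, rough world primary energy use circa 2007

cross_section = math.pi * EARTH_RADIUS ** 2   # disc the Earth presents to the Sun
intercepted = SOLAR_CONSTANT * cross_section  # watts actually striking Earth

fraction_of_sun = intercepted / SUN_OUTPUT
hours_for_a_year = WORLD_USE_PER_YEAR / (intercepted * 3600)

print(f"{fraction_of_sun:.1e}")   # ~4.5e-10: under a billionth of the Sun's output
print(f"{hours_for_a_year:.2f}")  # ~0.80: well under an hour matches a year's use
```

Even with generous error bars on the input figures, the conclusion holds: sunlight dwarfs present human energy consumption by several orders of magnitude.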
Decentralized manufacturing has been going on for as long as humans have walked the Earth. For ages people have been able to be somewhat self-reliant by growing their own food, sewing their own clothing, and being generally handy. The reason it cannot be used in every circumstance is that it is labor-intensive and not always practical, especially with regard to capital-intensive industries. Technology does make these things easier, with tools such as plows or sewing machines, but a good deal of labor is still required. The key to changing this is automation. The Fab@Home project from Cornell University is a great start. The designs for their “desktop manufacturing device” are completely open source and allow for the automatic construction of simple items right from home.
As nanotechnology advances, molecular manufacturing comes closer to reality. This would be an order of magnitude more efficient and versatile than something like Fab@home. Using molecular manufacturing, even production of goods that are currently capital-intensive, such as microprocessors, could be decentralized. It could theoretically reduce economic scarcity to the point where nobody will be forced to work for sustenance and people will be free to follow their creative instincts.
New developments in technology have already virtually eliminated scarcity in certain areas. Knowledge has, until now, always been extremely scarce, in the sense that most of the population has been unable to access information because of factors such as illiteracy, the cost of information, geographic and logistical barriers, and so forth.
The Internet has made information nowhere near as scarce, at least among those with access to it. It is important to note that there is a digital divide: the large majority of the world still lacks internet access. But with the ever-increasing speed of technological advancement and programs like the One Laptop Per Child initiative, extreme inequalities in access to information may one day be a distant memory.
Knowledge, and the power that comes with it, can now be transmitted instantaneously across the world and copied endlessly. This has allowed for not only a greater abundance of knowledge, but new modes of production. Wikipedia is a prime example of decentralized information sharing. The content is no longer provided by large centralized institutions, but by you and me. Sites like YouTube or Digg are decentralized in this way too, but Wikipedia is decentralized on two levels.
VII. The Copyleft Meme
Wikipedia, and its offshoots like Wiktionary, use Open Source principles. That means that the creators of the MediaWiki software have placed the code online for free for everyone to download, modify, tweak, redistribute, or repackage in any way they see fit, with one important exception. This has allowed many private websites to freely use the wiki format for their own purposes. However, the MediaWiki software isn’t merely public domain, it uses a particular open source license called the General Public License (GPL).
The GPL is the most popular and philosophically superior Open Source license. While there is a great deal of freedom provided for, there is a very important stipulation that those who modify a GPL work must also place their modifications under the General Public License. At first glance, this idea may seem mundane, but upon further inspection it is astoundingly revolutionary. This is what makes Open Source possible. It is structured in such a way that it cannot be co-opted by centralizing forces, since those who want to add value to the work must, in turn, share their additions. This idea is known as Copyleft.
If this were not the case, and the GPL were more permissive, it would be possible for a company to simply fork the software, add lots of modifications, put it all under copyright and close the source. This is one of the dangers of the Apache license.
Additionally, the nature of the GPL has allowed Open Source projects to become effective in a memetic sense. According to Richard Dawkins, memes are ideas that reproduce similarly to microorganisms, with human minds as the hosts. Memes which are most effective are ones that have high reproductive fitness.
Memetics implicitly recognizes the duality of structure and the nature of the relationship between knowledge and power. Yet, it provides a fascinating reductionist perspective since it argues that it might be more productive to think of memes as the true guiding forces of this new natural selection, and human minds as mere hosts.
Religions are often given as examples of the most efficient memes. Many of the major religions are so highly tuned that their hosts are willing to evangelize in the most remote regions of the Earth. Some religious memes have even been known to override basic survival instincts and compel the host to commit suicide for spiritual reasons.
Democracy seems to have a similar quality. Representative democracy is by far the most common form of governance in the world; according to most estimates, there are over 120 representative democracies, and much of this can surely be attributed to its reproductive fitness as a meme. Some criticize Open Source as having an almost religious quality that compels people to evangelize its virtues. This is not far from the truth, considering that openness and decentralization, coupled with Copyleft principles, have very high reproductive fitness. An important difference is that most democracies are not structured in ways that prevent abuse by centralized powers, such as regulatory capture and outright corruption, whereas copyleft is less vulnerable in this respect.
Knowing this, it would make sense to keep this in mind and domesticate our memes in order to further enhance their reproductive fitness, yet ensure their incorruptibility. By recognizing that people are constantly tweaking their ideas to become highly infectious, one is engaging in second-order cybernetics. This allows for a modicum of freedom insofar as one then possesses the capability to consciously engineer memes. Using this power, one could choose to promote long-term sustainability rather than leaving it up to evolutionary roulette, which has so far favored a narrow and destructive form of self-interest. Just as we are beginning to take our genetic fate into our own hands through genetic engineering, we need to pay equal, if not more, attention to memetic engineering.
VIII. The Many Faces of Open Source
This idea of Open Source started with two projects: GNU and Linux. These projects were attempts by hobbyists to make homebrew operating systems. They published their code online for free and without many restrictions on use. This attracted huge followings from programmers all over the world, and eventually the corporate world became interested and it grew to a multi-billion dollar industry.
It was a completely new and revolutionary way to produce a product. Never before could production involve the collaboration of thousands of people from across the globe, let alone result in free products. And thanks to the nature of the GPL, the principles of decentralization are not lost through corporate involvement.
It is promising that all of this can grow with our current socio-economic structure, and no violent revolution is necessary. Surely Wikipedia is not throwing molotovs at the headquarters of Encyclopaedia Britannica!
Currently, there are a host of open source software projects. Some of my favorites are Ubuntu, Firefox, OpenOffice, and KDE. There are many technical reasons why these products are superior to proprietary alternatives, including cross-platform portability, customizability, security, and the ability for peer review by thousands of computer programmers. Closed source is simply less flexible in all these respects. Software is much like a cooking recipe: step-by-step instructions for a computer to follow. Ever try to customize a recipe that is kept secret? It can get pretty messy.
Software is not the only thing that can be produced in a decentralized way. Art, music (Creative Commons), movies (Elephant’s Dream), architecture, hardware (OpenSPARC), and even beverages (OpenCola) can all be open sourced. This sort of thing is a boon to artists of all types who love to remix old works to create novel juxtapositions in a hassle-free manner.
The content of Wikipedia, in addition to the software, is under a copyleft license, the GNU Free Documentation License, since the terms of the website require all contributors to release their work under it. However, because of the flexible nature of Open Source, independent wikis can restrict who is allowed to post, and they can place the content those people create under copyright.
Of course it must be realized that these tools can be used by centralized authorities to make their own operations more effective. Even the US intelligence services have set up their own classified wikis, known collectively as Intellipedia, to coordinate intelligence data. Thus, although the MediaWiki software is incorruptible by centralized forces, its uses are not.
However, this is nothing surprising since all technologies give users, whoever they may be, more power. This includes those who believe in freedom of information and government transparency, such as those who provide access to leaked government documents. A website called Wikileaks has been set up for just that purpose.
IX. Peer-to-Peer Networks
A related concept is P2P, a decentralized mode of data distribution. The best examples of P2P are file sharing networks. Napster was the first big one, but a variety of others quickly arose. Gnutella is an open source and fully decentralized P2P network. More recently, the open source BitTorrent protocol has become immensely popular; in 2004 it was reported that BitTorrent alone accounted for a startling 35% of all internet traffic. Each of these networks and protocols can be reached using client software, such as the open source program FrostWire.
Piracy and intellectual property issues aside, what is great about these sorts of systems is that if independent artists, filmmakers, musicians, or programmers want to share their creations they don’t need to invest in multi-million dollar server computers, nor rent such servers, in order to make their content available to people. This is also great for those who are downloading, since they get incredibly fast download speeds.
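The trick that lets a protocol like BitTorrent pull data from untrusted strangers is straightforward: a file is split into fixed-size pieces, each piece is hashed, and a downloader checks every piece it receives against the published hash list, so it does not matter which peer supplied it. Here is a minimal sketch of that piece-verification idea (a toy illustration with an unrealistically small piece size, not the real BitTorrent wire protocol):

```python
import hashlib

PIECE_SIZE = 4  # bytes; real BitTorrent uses pieces of 256 KiB or more

def make_piece_hashes(data: bytes) -> list[str]:
    """Split data into fixed-size pieces and hash each one (like a .torrent's piece list)."""
    pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
    return [hashlib.sha1(p).hexdigest() for p in pieces]

def verify_piece(index: int, piece: bytes, hashes: list[str]) -> bool:
    """A downloader checks each received piece, from any peer, against the trusted hash."""
    return hashlib.sha1(piece).hexdigest() == hashes[index]

data = b"decentralize"
hashes = make_piece_hashes(data)

assert verify_piece(0, b"dece", hashes)      # a good piece from some peer is accepted
assert not verify_piece(1, b"XXXX", hashes)  # a corrupt or malicious piece is rejected
```

Because trust lives in the small hash list rather than in any particular server, distribution itself can be handed off to whoever happens to have the pieces.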
X. Distributed Computing
Another development in the world of decentralization is distributed computing. Grid computing, using many smaller processors in unison to create larger computers or even supercomputers, has long been standard practice for those with large computing needs, typically governments, large corporations, and institutions. Unfortunately, it is incredibly costly and cannot be afforded by just anyone.
Distributed computing is a way to make use of numerous computers over the internet to split up tasks and have them all work together to solve a single problem. Thus, if many people donate their spare processing power, especially when their computers are idle such as in screensaver mode, a project could have a functional supercomputer without the cost.
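The division of labor just described can be sketched on a single machine, with a thread pool standing in for the donated, idle computers (a toy illustration of the split-and-combine pattern, not how any real volunteer-computing platform is implemented):

```python
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    """One independent task, the kind a volunteer machine would crunch while idle."""
    return sum(n * n for n in chunk)

# Split one big problem into independent work units.
chunks = [range(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]

# Hand the units to workers standing in for donated machines.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(work_unit, chunks))

# The coordinator only has to combine the returned partial results.
total = sum(partials)
print(total)  # equals sum(n*n for n in range(1_000_000))
```

The essential property is that the work units share nothing with each other, so they can run on machines scattered across the internet and merely mail their answers back to a coordinator.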
There is an open source software program called BOINC that facilitates the majority of these types of projects. One of the most famous distributed computing projects is SETI@home, run by the University of California, Berkeley, which looks for signs of extraterrestrial intelligence. There are many other uses of this technology, such as Rosetta@home and Folding@home, which study the structure and folding mechanisms of proteins, work that could lead to new cures for diseases and anti-aging therapies.
Of course, as with any other technology, this idea has also been used for malicious purposes. Cyber-criminals have employed it to create “botnets.” By exploiting holes in software, it is possible to upload malware, including distributed computing software, to a computer undetected. Some modern botnets command the combined power of millions of PCs, enough to bring down virtually any website their operators wish to target.
While security will always be an issue, open source development models have at least proven generally more efficient at discovering and patching security vulnerabilities.
XI. Enhancing Profitability via Open Source
Most of the examples I have given so far are non-profit. However, Open Source is big business; billions of dollars are made through it. Companies like Red Hat (Fedora), Novell (OpenSuse), Google (OpenSocial), and Sun (OpenSolaris, Java, OpenSPARC) all make enormous amounts of money indirectly through open source, whether by providing technical support services or by improving brand recognition. In fact, most of the bigger open source projects, such as the Linux kernel, are funded by corporations such as these, and the programmers are often their employees.
If a company decides to open source a product, this tends to attract a community of developers. Thus, software development costs can shrink since individual coders or other organizations are free to make improvements that can be peer reviewed and merged into the software. Even just open-sourcing the specifications of hardware can have this effect with regard to driver development. Since Intel and AMD have open-sourced their graphics drivers, the free software community has embraced them.
Furthermore, the fact that Open Source software is often free in price provides huge incentive for businesses to use it to cut costs. That is why Apache is the most popular webserver software in the world, and runs many of the largest corporate websites. (Unfortunately, Apache does not use a copyleft license, so it is a corruptible form of open source).
In addition to cutting overhead costs, Open Source can also cut the cost of products. Wal-Mart realized this and is now selling $200 computers that run a version of Ubuntu Linux, and they are selling like hotcakes. Thus, this stuff doesn't merely co-exist with capitalism; the profit motive can actually become a major driving force toward decentralization.
It is also a benefit for governments, which is why so many have decided to use open source software, including Brazil, South Africa, Argentina, and Venezuela. It is even widely used in the government of the United States, including the Department of Defense.
Openness also has subtler, yet more profound, benefits than pure cost-cutting. Whenever there is any new conceivable space, virtual or real, that can be owned, there is usually a frenzied rat race to own it. People love to copyright or patent everything under the sun (see Amazon.com's 1-Click patent litigation). To maintain this ownership, the owners require strong enforcement, and governments often act as a proxy for that enforcement. This inevitably results in restrictions for users.
Companies are beginning to realize the futility of placing restrictions on users with schemes like DRM. These restrictions are always reverse engineered and bypassed. Thus, the companies that use them do not prevent piracy, and the companies that do not use them are applauded by their customers for a hassle-free experience.
The same was true when IBM created some of the first affordable personal computers. Companies such as Compaq reverse engineered the computing architecture and created IBM “clones.” Realizing the advantage this could give it over closed competitors like Apple, IBM finally woke up to the futility of keeping its designs secret. It opened up its hardware architecture to allow for IBM-compatible PCs, and only then did the personal computing revolution really get started, generating massive profits for all involved.
Thus, decentralization of industry reduces the number of areas in which any sort of economic force is necessary, be it private corporations or public institutions. Yet, this generates vast amounts of prosperity for most other industrial sectors and society as a whole.
It is actually possible, via dual licensing, to make money directly off of open source software. According to this business model, anyone is allowed to use the software for non-profit or personal uses under the terms of the GPL, but corporations must pay licensing fees. MySQL and Trolltech are some of the classic examples of the effectiveness of this model. It is precisely because there are such large numbers of users who do not pay anything that a community is formed upon which the business can grow. This provides interesting ways to make profit in which both the corporation and the users receive all the benefits of open source discussed previously, such as peer review, portability, lower development costs, etc.
Nevertheless, this model still limits the scope of the consolidation of wealth in society, and only draws revenue from other centralized entities, providing a public service to all non-profit uses. Yet, it is precisely by providing this public service that they gain an edge over their competitors and increase profitability. It is very likely that dual licensing, or similar models, could have applications beyond the software industry.
XII. The Necessity of Open Biotechnology
Many of the new opportunities for decentralization are thanks to the greatest information-sharing explosion since the printing press, the Internet. After the Internet, the only foreseeable communications revolution will be through the enhancement of human beings themselves. Two ways to go about this are biotechnology and brain-computer interfacing. It is imperative that these new technologies, which will so radically reshape the human condition, become open source.
As we speak, corporations are gobbling up patents on all sorts of biotechnologies. In fact, they are beginning to patent the entire genomes of natural and genetically modified species. Most outrageously, they have even begun patenting the human genome, which by any definition should be part of the commons.
Considering the dangers of all this, we must start open-sourcing and decentralizing biotechnology as quickly as possible. Luckily, there are some biopunks working toward this goal, such as CAMBIA, but for the most part these industries are highly centralized. It is vitally important that the business world quickly wakes up to the increased profitability of decentralized biotechnology and begins to support such efforts.
It is in the interest of every corporation that does not own biotechnology patents and does not directly profit from biotechnological knowledge to fight for open source biotechnology. Indeed, all of these corporations should band together and form organizations to promote open source biotechnology. The same process should be applied to nanotechnology and virtually any other industry in which patents hamper progress, especially emerging technologies that could have huge impacts on the human condition.
XIII. Decentralized Surveillance
Centralized powers have always had far more ability to monitor others and, simultaneously, maintain their own privacy. For most of the age of electronic surveillance technology, this has remained true. Governments and corporations had access to this technology, but nobody could monitor the monitors.
The growing amount of surveillance is inevitable, and so is the resulting reduction in privacy. The only way to prevent abuses is to create decentralized surveillance systems where everyone, not just the government or large corporations, can monitor public spaces.
Cameras have become much cheaper, smaller, and more ubiquitous. In particular, cell phone cameras have allowed some of this potential to be seen. Many criminal acts and incidents of police brutality have been captured with cell phone cameras and posted on the Internet for all the world to see. Thus, ordinary people become better at policing themselves and are less subject to the whims of those who would do them injustice.
Our privacy is decreased in virtually any realistic scenario of the future, but at least with decentralized surveillance, aka sousveillance, ordinary people have the power to monitor as well. Open source camera technology would be a logical next step toward furthering this goal of equal access.
With the development of satellite imaging systems, we see the same old story. At first, such imagery was accessible only to elites. Recently, websites and software such as Google Earth and GPS systems have emerged which allow just about anyone to access satellite imagery. However, one is still reliant upon corporations to provide this imagery, and there is a great deal of censorship with regard to places on Earth that governments do not want displayed. These problems must be kept in check.
XIV. Direct Democracy through Technology
Despite how commonly some deride the intelligence of the general public, there is actually an amazing amount of collective intelligence and creativity that has only begun to be tapped.
Technology is already being employed in not-so-radical ways in current US elections to allow for electronic voting and ballot counting. The problem is that these electronic voting machines are closed-source. This has led to allegations that the results from these machines are inaccurate. Disturbingly, there are few options to verify the results, since the designs of the machines are shrouded in secrecy by the corporations that manufacture them.
As with capitalism, the current predominant form of democracy is not the final, natural, or most decentralized state of affairs. It is important to understand the opportunities for more direct systems of democratic governance that are becoming possible because of new technologies.
Part of the reason representative democracy has been so appealing is that implementations of direct democracy have faced many technical and logistical difficulties. Now, with better communication technologies, such problems could be a thing of the past. This opens up new possibilities for grassroots organizing, crowdsourcing, workplace democracy, and even direct democratic political governance.
It is even possible to democratize science through the use of these new technologies. One idea that has been proposed is dubbed Wikiscience. Using the methodology of Wikipedia, it is possible to allow hundreds of scientists to work together in ways that were never before possible, and create a more transparent scientific process with peer review along every step of the way.
Technological progress undoubtedly has stemmed primarily from the capitalist system, yet that system is by no means perfect and should not be seen as the End of History. As the World Social Forum claims, Another World is Possible.
Throughout history, as new technologies have been created, there has usually been a mad rush to conquer and own them. When the printing press was invented, the manufacture of consent was born. At the same time, people began using this technology to print literature on radical topics like democracy.
With the advent of new technologies like the Internet and biotechnology, we have the same old story rehashed for the 21st century. Those of us who are using open source principles to decentralize industry, information, and politics are on the front lines of this age-old battle. We are the modern-day revolutionaries.
In this era of user-generated content it is becoming increasingly apparent that the public is far more critical and reflexive than it is given credit for. By utilizing this stored up power-knowledge via decentralization, there is great potential for creating positive, transformative feedback loops in society.
It is imperative that the methods of decentralization used are structured to be as incorruptible as possible, much like the model of Copyleft, with high reproductive fitness. That is the only way to ensure their success.
This document is licensed under a Creative Commons Attribution-ShareAlike 3.0 License