2011-12-21

Democracy is the Killer App for the Semantic Web


I've been harboring some big ideas for a long time, and over the last 20 years I have grown them through accretion, in historical and technological context. I'd hoped to make them my life's work, but I now realize it's probably more than I can do (I need help), and I wonder if the world can wait (people are beginning to express similar ideas). Furthermore, I don't really want my ideas or the value they might bring to be trapped in a proprietary, vertically integrated product; I think I'd prefer them to be expressed like Internet standards. So, I hereby release these ideas into the public domain on this day, the fifteenth of December, 2011.

What follows is complicated, but basically I think we can use existing technologies to implement direct and representative democracy at the organizational, local, community, and state levels. I found Rootstrikers after watching a Lawrence Lessig speech at Google on YouTube. His premise is that money controls congressional campaigns and therefore elections, so money controls Congress. In reality, low voter registration and even lower turnout allow small numbers of independent or "swing" voters (and the propaganda that swings them) to impersonate a representative majority. I believe well-designed and integrated technology can disintermediate politics in America and beyond.

I think (hope?) people can be re-engaged to form real consensus by giving them technology tools that analyze the practical implications of political platforms and present simple evaluations of how they might be harmed or might benefit. The problem is that the public exchange of politics today is too ideological. Politics needs to be consistent with people's principles, but the connections from political-philosophical principles to practical policy consequences are negotiated entirely by the lobbying industry behind closed doors. Even though mounting public participation in Internet discussion attempts to address this concern, it doesn't congeal into a clear likelihood of voter behavior (what they call "plusses" in political campaigns).

I think Summly has implemented some of what I had envisioned but hadn't been able to define concretely enough until recently: reducing/compressing content with ontology to make it more easily digestible by both people and computer systems. I think Hypothes.is plans to implement another aspect: adding logical, qualitative metadata to the presentation of existing Internet content to help evaluate the gist of that content against the best standards available. Google+ is doing some of what I have envisioned in the way it collects/defines each user's perspective. Git has implemented some of the persistence model I have been struggling to define for this system, and BitTorrent has implemented most of the network model I think will be necessary to provide fault tolerance that resists attempts at censorship.

The idea is that text found on the Internet (as Summly processes it), or discussion items supplied in context, contains some logical propositions (ignoring the ambiguous stuff). Using Natural Language Processing, these propositions can be reduced to ontologies, basically a table of subject-verb-object mappings like "Lassie is a dog," "(a) dog is a canine," and "(a) canine has a tail," from which logical inferences not explicitly in the text, such as "Lassie has a tail," can be deduced using what are called Expert Systems. In Expert Systems, sets of conditional rules are applied to given factual statements, like those an ontology provides, either to infer the unstated implications of a proposition or to prove its logical (in)consistency.
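To make that concrete, here is a minimal sketch of the kind of forward-chaining inference an Expert System performs over subject-verb-object triples. The facts are the toy examples above; the two rules (transitivity of "is a" and inheritance of properties) are illustrative assumptions, not the output of any particular NLP pipeline.

```python
# Toy forward-chaining inference over subject-verb-object triples.
# The facts are the examples from the text; the rules are illustrative only.

facts = {
    ("Lassie", "is_a", "dog"),
    ("dog", "is_a", "canine"),
    ("canine", "has", "tail"),
}

def infer(facts):
    """Repeatedly apply two rules until no new triples appear:
       1. is_a is transitive:            X is_a Y, Y is_a Z  =>  X is_a Z
       2. properties inherit along is_a: X is_a Y, Y has  Z  =>  X has  Z
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        new = set()
        for (s1, v1, o1) in facts:
            for (s2, v2, o2) in facts:
                if v1 == "is_a" and o1 == s2:
                    if v2 == "is_a":
                        new.add((s1, "is_a", o2))
                    elif v2 == "has":
                        new.add((s1, "has", o2))
        if not new <= facts:
            facts |= new
            changed = True
    return facts

inferred = infer(facts)
print(("Lassie", "has", "tail") in inferred)   # True: never stated, but deduced
```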

Apparently Summly demonstrates the ability to automatically break down some content into crystallized logical components (propositional logic), and existing Expert Systems can perform automatic logical inference to produce a hierarchical outline of any argument's premises and implications (either categorically or with variable probabilistic confidence). My proposal would be to provide an implementation and an interface that automatically outlines an argument, during or after it is composed, and provides logic checking, like spellchecking. The rules used to check logic would be faceted, like a social network. What any person sees is their perspective, represented by the stuff they read, watch, and listen to, ranked by how important they feel it is.
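As a sketch of what "logic checking, like spellchecking" might look like under the hood: each proposition extracted from a draft is compared against the triples a reader's accepted sources already imply, and conflicts get flagged the way a spellchecker underlines a word. The facts, the "has"/"lacks" conflict convention, and the perspective set here are all hypothetical.

```python
# Sketch of logic checking as spellchecking: flag a claim that conflicts with
# what a reader's accepted sources already imply. Facts and the "has"/"lacks"
# convention are made up to show the shape of the check, nothing more.

perspective = {                       # closure of the reader's accepted sources
    ("Lassie", "is_a", "dog"),
    ("dog", "has", "tail"),
    ("Lassie", "has", "tail"),        # inferred as in the previous sketch
}

CONTRARY = {"has": "lacks", "lacks": "has"}

def check(claim, accepted):
    """Return the accepted triples that directly contradict the claim, if any."""
    s, v, o = claim
    return [t for t in accepted if v in CONTRARY and t == (s, CONTRARY[v], o)]

print(check(("Lassie", "lacks", "tail"), perspective))
# [('Lassie', 'has', 'tail')]  -> underline this sentence for the writer
```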

Articles, audio, or even video could be shown side by side with an outline of the logical representation of that content. The patterns of logic are networks (the technical term is "graphs") which can be navigated with hyperlinks to reach content that represents similar premises or implications. How well two arguments fit together, or fail to, can be represented as a distance vector over their matching and conflicting premises and implications.
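One simple way such a distance could be scored, as a sketch: treat each argument as a set of premise triples, then measure overlap and count direct conflicts. The premises and the "has"/"lacks" convention are illustrative assumptions only.

```python
# Score how well two arguments (as sets of premise triples) fit together:
# share of overlapping premises plus a count of direct contradictions.
# The example premises are invented for illustration.

def fit(premises_a, premises_b):
    shared   = premises_a & premises_b
    union    = premises_a | premises_b
    conflict = {(s, v, o) for (s, v, o) in premises_a
                if (s, {"has": "lacks", "lacks": "has"}.get(v), o) in premises_b}
    return {
        "overlap":   len(shared) / len(union) if union else 1.0,  # 1.0 = identical premises
        "conflicts": len(conflict),                               # 0   = no direct contradictions
    }

arg1 = {("carbon_tax", "reduces", "emissions"), ("emissions", "cause", "warming")}
arg2 = {("carbon_tax", "reduces", "emissions"), ("carbon_tax", "lacks", "public_support")}
print(fit(arg1, arg2))   # {'overlap': 0.333..., 'conflicts': 0}
```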

This type of system would also accommodate the scientific community, given that scientists make assumptions about certain facts (data), perform some analysis, and infer some conclusions in the form of propositional logic (with statistical confidence). Theoretically, any and all scientific knowledge can be interpreted as a big list of probabilistic propositional logic statements. I really hope to invite politics into the world's largest science fair.

In Political Science and Economics, this system would also accommodate Game Theory. Given a list of rules about the players in a hypothetical situation and how they would form preferences to judge outcomes (which might be borrowed from, or compared to, the ontological representation of historical accounts), decision trees could be calculated and Nash equilibria or dilemmas identified. This provides the capability to discover political consensus and compromises. Peace in the Middle East? I can dream! Materially, I'd like to turn the vulgar popularity contest into a popularity contest of ideas based on robust predictions of the consequences. I think the needs of the world's poor will be relatively easy to model, ergo their political consensus can be easily optimized. At the very least, the necessary and sufficient conditions for corruption and strife will be more easily demonstrated. What can help people decide how to vote in elections can also be used to help people decide how to vote on juries.
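As a toy illustration of the game-theoretic piece, here is a minimal sketch that finds pure-strategy Nash equilibria in a two-player game by checking best responses. The payoff matrix is the textbook prisoner's dilemma, not a model of any real negotiation.

```python
# Find pure-strategy Nash equilibria of a 2-player game by checking best responses.
# The payoffs are the textbook prisoner's dilemma, used purely as an example.

from itertools import product

actions = ["cooperate", "defect"]
payoff = {  # (row action, column action) -> (row payoff, column payoff)
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def nash_equilibria(payoff, actions):
    eq = []
    for a, b in product(actions, actions):
        row_ok = all(payoff[(a, b)][0] >= payoff[(alt, b)][0] for alt in actions)
        col_ok = all(payoff[(a, b)][1] >= payoff[(a, alt)][1] for alt in actions)
        if row_ok and col_ok:        # neither player can do better by deviating alone
            eq.append((a, b))
    return eq

print(nash_equilibria(payoff, actions))   # [('defect', 'defect')] -- the dilemma
```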

It's not magical; as I see it, it just represents the process already carried out manually, in secret or in private, in the heads of the political/economic elite, and in quality public discourse for the rest of us. The more "plusses" a political candidate in a democratic system can get without spending money, the less influence money has on that candidate. The system would also provide a tremendous amount of data for social science. There are commercial implications, as anything that can model arbitrary game theory can also implement markets. Anything that can implement markets can also implement the democratic process itself.

The most powerful and dangerous idea I would like to share: law is code, and code is law. Executive functions of government might be fully implemented by machines, and the code those machines execute should be subject to the same participatory, democratic legislative process and consent that we expect of the laws today. What if tax policy and government finance were all just a system of fully automatic monetary transactions governed by algorithms and implemented in code that we could all read and consent to? Open source the IRS, or even the whole Department of the Treasury.
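As a sketch of what "open-sourcing" a piece of fiscal law might look like at its very simplest: the brackets are data, and the levy is a short function anyone can read, audit, and consent to. The brackets and rates below are invented for illustration and are not actual policy.

```python
# "Law is code" at its simplest: tax brackets as data, the levy as a readable function.
# These brackets and rates are made up for illustration, not any real tax schedule.

BRACKETS = [            # (upper bound of bracket, marginal rate)
    (10_000, 0.10),
    (50_000, 0.20),
    (float("inf"), 0.30),
]

def tax_owed(income):
    owed, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income > lower:
            owed += (min(income, upper) - lower) * rate   # tax only the slice in this bracket
        lower = upper
    return owed

print(tax_owed(60_000))   # 10,000*0.10 + 40,000*0.20 + 10,000*0.30 = 12000.0
```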

2010-11-14

Think Cloudy

When are you going to get in on Cloud Computing? It's the new black. It's Dot Com all over again. Think Cloudy™. You saw it here first.

Is anyone else tired of Cloud Computing? Is it really any better than Web 2.0? Where is my server? How do I know I'm not being gouged by someone preying on my ignorance? Plenty of people got burned the first time these trendsetters started prancing around the campfire.

2010-10-26

Why the US economy will be in the toilet for a long time: US Gini 0.542

Calculated using the corrected formula and sanitized Social Security/Medicare flat-tax data:
http://ssa.gov/cgi-bin/netcomp.cgi?year=2009

The blue line represents a perfectly flat distribution of income: under this ideal, the bottom 10% of earners gets 10% of the income, the bottom 30% gets 30%, and the bottom 70% gets 70%. The green line represents how much income those fractions of the population actually get. The more the green line sags, the greater the Gini coefficient.

0.542 (much better than the old 0.608 calculated with a mistake from erroneous data, but still) puts the US in league with Africa and Central America. A low Gini coefficient is a mark of civilization. A high one is trouble. Countries with bad income disparity don't have vibrant economies. The same supply-sider arguments that progressive taxation drags on the economy also apply to an oligarchy distorting markets. Now that the economy is not growing, there's no "increasing the whole pie" argument to stave off "why is my slice so small?"
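The corrected formula itself isn't reproduced in this post, but the standard discrete calculation is one minus twice the area under the Lorenz curve (the sagging green line above). A sketch, using made-up incomes rather than the SSA wage data linked above:

```python
# Gini coefficient as 1 minus twice the area under the Lorenz curve.
# The incomes here are invented examples, not the SSA data.

def gini(incomes):
    xs = sorted(incomes)
    n, total = len(xs), sum(xs)
    lorenz = [0.0]
    running = 0.0
    for x in xs:
        running += x
        lorenz.append(running / total)        # cumulative share of total income
    # Trapezoid rule; each step covers 1/n of the population.
    area = sum((lorenz[i] + lorenz[i + 1]) / 2 for i in range(n)) / n
    return 1 - 2 * area

print(gini([10, 10, 10, 10]))   # 0.0  -> perfectly flat (the blue line)
print(gini([1, 1, 1, 100]))     # ~0.72 -> nearly all income at the top
```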

2010-09-15

Take a ride on the Case Shiller index 1890-2010

I couldn't make contact with the author of the original rollercoaster at SpeculativeBubble.com, but I thought we'd all like to see the latest housing price excitement from the rollercoaster perspective.



Using inflation-adjusted data, biased to place the lows (1920s) near ground level, you can get a feel for the cost of home ownership across the various periods.

Spoiler: the latest (2007-2010) data drops too fast, so I took a little artistic license and switched from annual to quarterly data in 2007 to get a better camera angle. The timeline is annotated for perspective.

2010-09-11

Calculating the Bottom of Residential Real Estate

What's the least for which you could sell a house? If I have to get out, how far underwater is my mortgage? What should I offer? How should I negotiate?

Forget everything else. Credit is no longer free and easy. The only buyer competition you should worry about is a cash buyer, and this is how, in cold blood, they calculate the cash value of your dream home. Motivated sellers might need to take less if there aren't enough buyers. Buyers might have to pay more if other investors expect to get more rent.

Rule of thumb: $700/mo. in rent = $100,000 in price. If, in your market, you can get rent for the as-is home on offer totaling $1,400/mo., then it's worth $200,000. That's more or less the economic equilibrium point between the rents supported by jobs in the local economy and the cost of providing housing.

Change the number in either one of these blanks to see how comparable rentals translate to price:

$______ rent controls $______ purchase price.
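The same translation the calculator performs, written out as a sketch: the rule of thumb is a gross rent multiplier of roughly 143 ($100,000 / $700), or about an 8.4% annual gross rent yield. The function names are just for illustration.

```python
# The $700 rent = $100,000 price rule of thumb as a simple ratio.
# That's a gross rent multiplier of ~143, or roughly an 8.4% annual gross yield.

RENT_PER_100K = 700.0                      # $/month of rent per $100,000 of price

def price_from_rent(monthly_rent):
    return monthly_rent / RENT_PER_100K * 100_000

def rent_from_price(price):
    return price / 100_000 * RENT_PER_100K

print(price_from_rent(1_400))    # 200000.0 -- the $1,400/mo. example above
print(rent_from_price(250_000))  # 1750.0   -- what a $250,000 house must rent for
```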