Talking to Howard Rheingold about my previous post on how people might get sick of rating things, he said his thinking is that you'll opt in to tools that monitor very specific aspects of your behavior and report on them as part of your opinion on events or individuals.
For instance, he has this example of how people could carpool through a shared rating system: on your way into work, you'd use some device to see where people needed rides, and you might pick up anyone whom other drivers had ranked highly as a passenger. Likewise, people might refuse to accept rides from low-ranked drivers.
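The carpool idea boils down to filtering participants by their shared rating. A minimal sketch, assuming a simple 1-to-5 rating scale and a cutoff of 4.0 (both are my assumptions, not part of Rheingold's description):

```python
def acceptable(ratings, floor=4.0):
    """Return participants whose shared rating clears the floor.

    ratings: dict mapping a participant's name to their average rating.
    The names and ratings below are invented for illustration.
    """
    return sorted(name for name, rating in ratings.items() if rating >= floor)

passengers = {"alice": 4.8, "bob": 2.1, "carol": 4.5}
drivers = {"dave": 4.9, "erin": 1.7}

print(acceptable(passengers))  # riders a driver would stop for
print(acceptable(drivers))     # drivers a rider would accept
```

The same filter works symmetrically for both sides of the exchange, which is what makes the shared rating system attractive: one reputation score, two uses.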
In my initial understanding of this, I was thinking that you would have to take intentional steps to rank the driver or passenger, and that this would start to wear on people. Howard said that in his real vision, systems with permission would monitor the interaction and report your actions implicitly, rather than requiring action on your part.
This seems much more likely as an emergent property. For instance, if I control my financial networks, I could let my agent software comb Visa records and infer relationships: Glenn goes to these sushi restaurants once a week, so Glenn probably likes sushi.