GAME THEORY: The New Solution To The Iterated Prisoner's Dilemma
It's not tit for tat. It's trust.
If you are new to game theory, the most important game to understand for real-life application is the Prisoner's Dilemma. This well-tested game shows that in short-term relationships (or any kind of one-off collaboration), a scam artist will likely win. But only in the short term. When you engage in long-term games with other people over many iterations - such as building companies, families, societies, systems, or governments together - trust and cooperation will almost always win in the end. This long-term version is called the 'iterated' Prisoner's Dilemma, and its best-known solution is called 'tit for tat.'
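To make the setup concrete, here is a minimal sketch in Python of an iterated Prisoner's Dilemma match with a tit-for-tat player. The payoff values and the 'always defect' opponent are our own illustrative assumptions, chosen just to show the dynamic:

```python
# Payoffs for one round: (my_move, their_move) -> my_points.
PAYOFFS = {
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they defect (sucker's payoff)
    ("D", "C"): 5,  # I defect, they cooperate (temptation)
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first round, then mirror the opponent's previous move."""
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    """The 'scam artist' strategy: defect every round."""
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    """Run an iterated match and return each player's total score."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

print(play(always_defect, tit_for_tat))  # defector gains only on the first round
print(play(tit_for_tat, tit_for_tat))    # mutual cooperation scores far higher for both
```

Run against each other, the defector squeezes out a tiny first-round edge, but two tit-for-tat players end up with far higher scores - and that compounding is exactly what long-term relationships are built on.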
'Tit for tat' basically means you want to cooperate, so you start by cooperating and then do unto others what they last did unto you, with some room for forgiveness. One of our favorite thinkers on this subject is Naval Ravikant, and here he explains how he was able to build many successful teams and systems using trust, compound interest, and the tit-for-tat solution to the iterated Prisoner's Dilemma to create long-term relationships with people:
If you want to really understand the iterated Prisoner’s Dilemma, we suggest this video. It is simple and clear and has great visuals:
Groups of people collaborating, cooperating, and sharing knowledge are arguably the most powerful force on Earth. Think of nearly any amazing thing that has ever been built, and chances are that many people were involved. While some of those people were paid with money, forced into labor, or coerced with fear, that doesn't alter the fundamental premise: people collaborating is a powerful force. We would argue that the more voluntary and freedom-enriched that collaboration is, the more likely an amazing outcome becomes.
In a free world, if groups of people trust each other, they can work together and achieve much more than if they spend most of their time either working alone or assessing each other's trustworthiness in the first place. The amount of energy spent on gauging trust could have been invested in building things, creating new knowledge, or solving problems together.
But when people hear about trust in the online realm, their minds immediately jump to fears of Social Credit Scores and CBDCs (Central Bank Digital Currencies), and these fears are justified. We should absolutely fear allowing centralized and powerful entities to issue us credit scores or have the power to dictate our lives with programmable centralized digital currencies, or worse.
However, these fears often lead us to overlook two things. The first is the difference between centralized and decentralized systems that we wrote about HERE. The difference between a centralized trust system and a properly built decentralized and transparent one is the same difference you would find between CBDCs and, say, Bitcoin. Sure, they are both forms of 'crypto,' but they are worlds apart. That's because of the 'Last Hand on the Bat' theory (which we also explain in depth in that same article HERE). Basically: who has the final control? A centralized entity like the Chinese Communist Party? Or a decentralized entity like all the individual nodes and users on Etsy?
The second thing these fears sometimes cause us to overlook is that if we solved the trust problem that exists both online and in the real world, it would immensely improve our ability as humans and societies to embark on even bigger and more exciting projects together. If we are ever to become a more advanced civilization, we need a way to work together more effectively and to trust each other and our systems with high confidence. Trust is the cornerstone of creative collaboration, especially when it comes to mixing IRL and URL (In Real Life and Online).
Another major difference between centralized social credit scores (like what they have in China) and what could exist in a digital 'high trust' society is that the latter would be voluntary, while in China participation is mandatory. Imagine you want to join a group of high-trust individuals working on a project together. Is there a way to voluntarily enter this project with trust quantified in advance? If you are a high-trust person, shouldn't you want to be part of a high-trust society?
If we are trying to work with others - whether in person or online in groups such as Network States (like we wrote about HERE) or what we truly believe in, Collective Intelligence "Swarms" (like we wrote about HERE) - we will always have the same initial problem to solve: do we trust the other people involved?
This is also true in almost every interaction online.
While tit for tat seems to work very well, we believe there is an even better solution to the iterated prisoner's dilemma. What if the prisoners had a way to know in advance how the other person had played long-term iterated games in the past? This would be the difference between blind faith and having a good explanation of why you trust someone. If one prisoner could see the other's history, it would create that explanation. And if it were voluntary, people could avoid working with those who do not perform well in long-term iterated relationships.
Think of how, on eBay and Amazon, both buyers and sellers already have versions of this: peer-to-peer trust scores based on previous interactions/iterations.
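As a rough sketch of what that could look like (the ledger format, names, and the 0.8 cooperation threshold below are purely our illustrative assumptions), imagine every player's past iterated games being written to a transparent public record that others can consult before voluntarily opting in:

```python
from dataclasses import dataclass

@dataclass
class PublicRecord:
    """Transparent history of how someone has played past long-term, iterated games."""
    cooperations: int = 0
    defections: int = 0

    @property
    def cooperation_rate(self) -> float:
        total = self.cooperations + self.defections
        # An unknown player gets the benefit of the doubt in this toy version.
        return 1.0 if total == 0 else self.cooperations / total

# A toy public ledger; in practice this would be decentralized and tamper-evident.
LEDGER = {
    "alice": PublicRecord(cooperations=190, defections=10),
    "mallory": PublicRecord(cooperations=20, defections=180),
}

def record_move(player: str, move: str) -> None:
    """Append each move to the transparent record so future partners can see it."""
    record = LEDGER.setdefault(player, PublicRecord())
    if move == "C":
        record.cooperations += 1
    else:
        record.defections += 1

def willing_to_collaborate(counterparty: str, threshold: float = 0.8) -> bool:
    """Voluntarily enter the long-term game only if their public history clears the bar."""
    record = LEDGER.get(counterparty, PublicRecord())
    return record.cooperation_rate >= threshold

print(willing_to_collaborate("alice"))    # True  -> enter the iterated game
print(willing_to_collaborate("mallory"))  # False -> decline; avoid the scam artist
```

Whether an unknown newcomer gets the benefit of the doubt, as in this toy version, or has to earn a baseline of trust first is exactly the kind of design choice the ideas below try to address.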
We have been exploring solutions to this problem of trust in large groups, and nearly all of them revolve around a decentralized system. A few examples of them are:
Vouches for Meeting in Person - mixing IRL (In Real Life) with URL (online). Create a "vouching" system where, at the very least, people can meet others in the real world and "vouch" that they are real people. AI and bots would have a very tough time infiltrating this system with any level of trust, as their networks would not connect to enough real people.
Trust Tokens - imagine giving crypto tokens to people who perform trustworthy acts for you, and keeping all the interactions in a transparent ledger online. (We expand on this fascinating idea deeply HERE).
Weighted decentralized trust ratings - similar to trust tokens, but the more trustworthy you are, the more weight your tokens or ratings carry when you give them to others (a rough sketch of how this could work follows this list).
Interview Processes for High Trust Societies - have randomly selected, already-trusted members of the society interview new members in various formats. New members might even need to pass a test or perform an act of service for the community before being allowed in.
Trust Tests - Anyone who appears untrustworthy, such as new members of a high-trust society, could be subjected to random tests administered by other members until they reach a baseline level, allowing them to participate in missions or iterations with the rest of the network society.
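Here is the rough sketch promised above of how trust tokens and weighted ratings could fit together: every vouch is written to a transparent, append-only ledger, and the weight of a vouch scales with the voucher's own current trust. The names, starting scores, and damping factor are illustrative assumptions, not a specification:

```python
from collections import defaultdict

ledger = []                       # append-only, publicly auditable list of vouches
trust = defaultdict(lambda: 1.0)  # everyone starts with a small baseline of trust

def vouch(voucher: str, recipient: str, reason: str, damping: float = 0.1) -> None:
    """Record a trustworthy act; the boost is proportional to the voucher's own trust."""
    boost = damping * trust[voucher]
    ledger.append({"from": voucher, "to": recipient, "reason": reason, "boost": boost})
    trust[recipient] += boost

vouch("alice", "bob", "delivered the agreed work on time")
vouch("bob", "carol", "met in person at a meetup: verified real human")
vouch("carol", "dave", "repaid a loan early")

for person, score in sorted(trust.items(), key=lambda kv: -kv[1]):
    print(f"{person}: {score:.2f}")
# The full history stays open for anyone to audit:
print(ledger)
```

Because a vouch from a highly trusted member is worth more than one from a newcomer, bots and sock puppets that only vouch for each other never accumulate much weight - which is the same reason the in-person vouching idea above would be so hard to infiltrate.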
If we want to excel as problem solvers, builders, and knowledge creators (and use the internet to connect people to do so), then trust is something we should constantly be on the lookout for. Because once we have trust, we possess a force. Think of it like a SEAL team, where each member has the others' backs. Collective Intelligence swarms can be built on this trust.
The challenges the world currently faces will not be solved by our corrupt systems. We must build new high-trust systems, network states, and digital communities on our own - from the bottom up - if we are to combat corruption and foster cooperation. That starts with being able to lock out the bots, the AI, and the people doing the corrupting. Solving the Iterated Prisoner's Dilemma in a better way is the foundation on which all of this and more can be built.
Thanks for reading!
All problems that do not defy the laws of physics are solvable.
Solving problems is happiness.
And humans solve problems better in high-trust groups.
#CollectiveIntelligence
Please join our think tank message board at SwarmAcademy.ai where we continue conversations like this one and where you will be able to participate in swarm intelligence problem solving with us on SWARM FORCE once it is finished being built.
For over 3 billion years on this planet there were only single-celled organisms. Then one day they somehow learned to work together and make complex multi-celled creatures. Right now we are like those single-celled organisms. Our next evolution is learning how to work together, better… (like we wrote about here).
#SwarmAcademy #NetworkState #LEADERLESS #ResultsMatterMost #DecentralizeEverything #DemandTransparency
COMMENTS ARE FOR EVERYONE AS A PLACE TO THINK-TANK SOLUTIONS. They will never be for paid-only subscribers and we will never charge a subscription.
Add in a mechanism to flag one or two levels of someone's personal (and virtual) connections if that person is shown to be a calculating fraudster, since they may be promoting, or being promoted by, other bad actors. At the very least, leave flags showing that those connections are linked to a fraudster. People can then distance themselves by withdrawing their trust, the fraudster's trust falls further, and the wiser former friend gets upgraded from "friends with a fraudster" to "previously friends with a fraudster."
All this sounds great.
I would like to have an option to meet someone in person and exchange one-time code books with them that would allow us to communicate without the method being broken by mathematics (a quantum computing weakness).
Saw a report yesterday that someone can detect what is displayed on a computer from the radio noise that the DRAM is transmitting.
We need to air-gap our crypto terminals (R-Pi) and keep them some distance from our network terminals (an old mobile phone with Sailfish or Graphene OS, or another micro with just WiFi access), with message transfer only via optical means (a camera looking at a displayed QR code) to a machine that cannot be expected to read information across an air gap.
Where is the prototype?