Feeling sick this morning, but I got this so that's nice :) Really thankful to have Chiaki right now
I guess I'm doing some reading to catch up on all my notes that I haven't taken a look at yet.
First thing is Feynman's "There's Plenty of Room at the Bottom":
One thing I find interesting about this reading so far is that, to a large degree, we already have exactly what he is talking about here. With digital storage we can etch an enormous amount of information into a tiny space. However, as he notes, even this is massive compared to what he is describing. I guess it really speaks to the beauty of our world: the entire universe is capturing an immense amount of information, and we are just travelers along the way, like this big ball of Earth floating through the vastness of space.
He is about to get into DNA, and I am also interested in this space. I am not sure how I can contribute or have ideas, but I do think it's quite fascinating.
Do biological systems have autonomy? How do they 'know' to make proteins? Why does this happen? How did multicellular life form? How did cells form to begin with? What gave cells an advantage in the world? Fundamentally these questions come from Feynman: cellular life is complex, and DNA is perhaps a store of information, but it is also actively being acted upon. How and why did this occur?
Interestingly enough, Feynman basically outright predicts machine learning, specifically facial recognition, a technique many of us now use on a daily basis.
The rest of the paper was interesting, and I like the economic incentives he offers at the end.
My brain is struggling to focus for any period of time right now.
I guess now we can go read Scott Alexander's "Meditations on Moloch":
So far I'm enjoying the first ten examples, but the one about education resonates with me the most.
People ask why we can’t reform the education system. But right now students’ incentive is to go to the most prestigious college they can get into so employers will hire them – whether or not they learn anything
This criticism of the education system is nearly exactly what I think as well. From what I have seen from within, this is exactly what happens, and I am part of that. Yet I think the most dangerous piece is that it is not actually dependent on learning. I don't think people were taught to think critically in college.
In some competition optimizing for X, the opportunity arises to throw some other value under the bus for improved X. Those who take it prosper.
And it occurred to me that maybe there is no philosophy on Earth that would endorse the existence of Las Vegas
Eventually after testing numerous strategies, he might find his slaves got the most work done when they were well-fed and well-rested and had at least a little bit of time to relax. Not because the slaves were voluntarily withholding their labor – we assume the fear of punishment is enough to make them work as hard as they can – but because the body has certain physical limitations that limit how mean you can get away with being.
Business practices are set by Moloch, no one else has any choice in the matter.
Governments solve arms races within a country by maintaining a monopoly on the use of force, and it’s easy to see that if a truly effective world government ever arose, international military buildups would end pretty quickly.
The libertarians make a convincing argument for the one side, and the monarchists for the other, but I expect that like most tradeoffs we just have to hold our noses and admit it’s a really hard problem.
(or just invent a robot that doesn’t need food or sleep at all. What happens to the slaves after that is better left unsaid)
I hope it’s not too controversial here to say the same thing is true of religion. Religions, at their heart, are the most basic form of memetic replicator – “Believe this statement and repeat it to everyone you hear or else you will be eternally tortured”.
The worst-case scenario is that the ruling party learns to produce infinite charisma on demand. If that doesn’t sound so bad to you, remember what Hitler was able to do with a famously high level of charisma that was still less-than-infinite.
We could thus imagine, as an extreme case, a technologically highly advanced society, containing many complex structures, some of them far more intricate and intelligent than anything that exists on the planet today – a society which nevertheless lacks any type of being that is conscious or whose welfare has moral significance. In a sense, this would be an uninhabited society. It would be a society of economic miracles and technological awesomeness, with nobody there to benefit. A Disneyland with no children.
Competition and optimization are blind idiotic processes and they fully intend to deny us even one lousy galaxy.
Land argues that humans should be more Gnon-conformist (pun Gnon-intentional). He says we do all these stupid things like divert useful resources to feed those who could never survive on their own, or supporting the poor in ways that encourage dysgenic reproduction, or allowing cultural degeneration to undermine the state. This means our society is denying natural law, basically listening to Nature say things like “this cause has this effect” and putting our fingers in our ears and saying “NO IT DOESN’T”.
Instead of the destructive free reign of evolution and the sexual market, we would be better off with deliberate and conservative patriarchy and eugenics driven by the judgement of man within the constraints set by Gnon. Instead of a “marketplace of ideas” that more resembles a festering petri-dish breeding superbugs, a rational theocracy. Instead of unhinged techno-commercial exploitation or naive neglect of economics, a careful bottling of the productive economic dynamic and planning for a controlled techno-singularity. Instead of politics and chaos, a strong hierarchical order with martial sovereignty. These things are not to be construed as complete proposals; we don’t really know how to accomplish any of this. They are better understood as goals to be worked towards. This post concerns itself with the “what” and “why”, rather than the “how”.
Evolution doesn’t care. But we do care. There’s a tradeoff between Gnon-compliance – saying “Okay, the strongest possible society is a patriarchal one, we should implement patriarchy” and our human values – like women who want to do something other than bear children.
Too far to one side of the tradeoff, and we have unstable impoverished societies that die out for going against natural law. Too far to the other side, and we have lean mean fighting machines that are murderous and miserable. Think your local anarchist commune versus Sparta.
The opposite of a trap is a garden. The only way to avoid having all human values gradually ground down by optimization-competition is to install a Gardener over the entire universe who optimizes for human values.
To expect God to care about you or your personal values or the values of your civilization, that’s hubris.
To expect God to bargain with you, to allow you to survive and prosper as long as you submit to Him, that’s hubris.
To expect to wall off a garden where God can’t get to you and hurt you, that’s hubris.
To expect to be able to remove God from the picture entirely…well, at least it’s an actionable strategy.
I am a transhumanist because I do not have enough hubris not to try to kill God.
Somewhere in this darkness is another god. He has also had many names. In the Kushiel books, his name was Elua. He is the god of flowers and free love and all soft and fragile things. Of art and science and philosophy and love. Of niceness, community, and civilization. He is a god of humans.