davidgerard on Dec 2, 2019:
Nick Johnson from the Ethereum Foundation tweets what he thought the guy was doing: Virgil Griffith seriously, literally, thought that he personally could, singlehandedly, through the power of his own intellect, bring peace between North and South Korea... if only those dum-dums in the US Government would let him apply ENOUGH BLOCKCHAIN. https://twitter.com/nicksdjohnson/status/1201212127945605122

pvg on Dec 2, 2019:
That kind of use of 'Peace' is pretty standard communist-regime verbiage, though. Maybe Griffith thought that, but the word in the talk title is not a strong basis on which to draw this conclusion. Maybe Nick Johnson knows more; the tweet doesn't say much beyond musing on the title.

api on Dec 1, 2019:
I knew Virgil and considered him a friend long ago, when we were both very into evolution-inspired AI research. I met him at one of the ALife conferences (in Bloomington, IN at IU) and he later attended a group I helped found in Boston called Grey Thumb. Our archives are still at greythumg.org, including many talks on ALife, genetic programming, and theoretical biology.

Anyway, a few years ago I watched him get sucked into the same vortex as many others I knew back then. I am not talking about cryptocurrency but... I guess I'll call it "alt-right" for lack of a better term. His feeds started featuring race-and-IQ material and such, and I heard stories about him behaving like a misogynistic asshole, which is not the Virgil I met in the 2000s. It seems like a ton of people got their brains sucked out around 2012-2016. I know many others, including a once-brilliant writer and artist who now sounds like Vox Day and has done nothing but rant about it for years (and zero artistic output, of course).

Virgil understands a lot about evolution, so I spent some time working on a letter intended as an attempt to deprogram him. I took the approach of explaining from first principles why I reject this ideology for not only moral but also practical and theoretical reasons. (I now feel motivated to turn it into a blog post if I can find the time, but I have a startup and am time-poor these days.) When I saw him get into Ethereum stuff I congratulated him and hoped it would give him something sufficiently interesting to do.

Anyway, I wish I had some Earth-shattering point or revelation here, but this just saddens me. Virgil had a misanthropic streak I can empathize with, as someone else who grew up as a geek having an awful time in public school. Underneath that he struck me as a basically playful person with a powerful mind who could have done great things. I wish he had found something more productive to sink his brain into, like AI. Now I'll have to listen to the media trash him too, calling him a "techbro" when he was anything but that. Like I said, I was hoping Ethereum would be that, but that was before I saw how toxic that world was becoming. These events make it look worse than I thought.

Honestly, I partly blame whatever weird nexus he fell into. I feel like someone brainwashed him, strapped a bomb to him, and sent him off to do their dirty work. None of the other high-up bag holders in Ethereum went to the DPRK, and now they are washing their hands of it. Anything to pump the token value, I guess. I hope that if that's the case someone is held accountable, but they won't be. I bet they won't even contribute to his legal defense fund.

I also wish I was less time-poor. There is something deeply toxic that has infected our community. I am still not quite able to see it in its entirety, though I can see its edges, and when people get infected by it it's obvious. For years I've been straining to compose some magnum opus to deprogram people, but I can't grasp the essence of the thing quite well enough. Maybe this is how people felt when Scientology took over Hollywood.

davidgerard on Dec 1, 2019:
I know other people who know Virgil (surprised how many), and they talk about him getting "rat-pilled": getting into LessWrong Rationalism, neoreaction, and race-and-IQ theories. There are lots of LWers in Ethereum, but (of course) most are much more on the ball than to get into scientific racism.

api on Dec 1, 2019:
I remember being in the same circles circa 2012. I found LessWrong to be a bunch of not very interesting wank and pseudoprofound bullshit. I couldn't decide if neoreaction was serious or some kind of many-layered satire (I later learned it was 100% serious). I know too much about actual evolutionary theory to buy into the IQ cult much. When Gamergate hit, my reaction was "this is the stupidest thing I have ever seen... why would anyone actually care about this?"

Then they dropped like flies. Vast numbers of very bright people that I knew got "pilled" in one way or another. As it goes in the poem "Howl": I saw the great minds of my generation... something something something. I personally know at least a dozen formerly smart and creative hackers, scientists, and artists who got their brains sucked out in that era. I still don't get it. I feel like a survivor of some kind of zombie plague slowly making contact with other survivors.

It's so weird it's enough to make me consider the strongest versions of the Russia / underground Reich conspiracy theories and think we were hit by some kind of scientifically designed mind-control project... but Ockham's razor suggests simpler explanations unless I see evidence to the contrary. It was most likely just an outbreak of contagious nonsense that happened to spread virulently through a certain population. I think the hooks were male insecurity (common among male nerds) and a very catchy form of pseudoprofound bullshit that felt like deep thought and profound insight.

impeachgod on Dec 2, 2019:
I considered Virgil to be a good friend of mine for a very long time, and the last few years have been deeply saddening for me due to his increasingly bizarre beliefs and his irresponsible and inexcusable behavior, especially towards his female friends and colleagues. This is not the Virgil I knew when I first met him.

I cannot speak for other people, but for me personally, the pseudoscientific reactionary ideas you mention were very appealing at one point in time because I had, and still have, a very difficult time attracting women. There was a period of my life when I spent a lot of time reading pickup artist/RedPill/men's rights content on the Internet, until I managed to escape it, mostly with the help of supportive friends. I think there are a lot of misguided nerdy young men who fall into this frightening rabbit hole, where one bad ideology leads to another.

api on Dec 2, 2019:
The worst part is that it drives good women even further away. Now you're not just a nerd but also an asshole. The kinds of women you will get by being an asshole are the kinds who are assholes themselves. The result is usually an abusive relationship where she treats you like shit, which further reinforces the misogynistic ideology by "proving" once again how bad women are. So yes, it's an obvious downward spiral. Kudos for seeing it.

It sounds like Virgil slammed more Kool-Aid than I thought.

davidgerard on Dec 1, 2019:
Yeah, I think it's fully explained by a virulent plague of memetic brain worms and doesn't need a conspiracy. I recommend Neoreaction a Basilisk by Elizabeth Sandifer to get a handle on it (and not just because my name's in the first sentence).

api on Dec 1, 2019:
I've heard some great podcast interviews on it. I think the answer is some of both, since I do think political opportunists of many sorts manipulated and rode the wave. I also recall LW having a fever-ward stench when I visited it back then... not as overpowering as the 'chans, but bad. It's the stench of a lot of people thinking about and discussing nothing with many words. In any case, maybe we will emerge with a greater immunity to pseudoprofundity and to superficial appeals to ego wounds and personal triggers (like male insecurity).

lidHanteyk on Dec 2, 2019:
It's because, just like with Objectivism before it, once one starts to accept neoreactionary tenets, their conclusions seem unavoidable. They're less illogical than one would hope or imagine. The biggest inoculant, personally, was reading "Guns, Germs, & Steel" in my late teens. Other important concepts that have helped me design arguments against bullshit:

* Pedigree collapse (undermines race realism and the Great Replacement)
* Sex chromosomes/allosomes (undermines binary sexual theories and pre-queer gender theory)
* (Computational) complexity theory (undermines classical analyses of the Chinese Room, Simulation Arguments, Newcomb's Paradox, and other LW favorites)
* Plate tectonics/basic geology (undermines space denial)
* Worryingly: history of the USA (undermines Lost Cause, Deep State, etc.)

There is a big underlying current that I have noticed: those who committed to classical liberalism were the ones swept away, while neoliberals, metaliberals, feminists, and Marxists were not moved. This is the big change in the shape of the left wing, to the extent that such a thing exists, and probably a large part of why it seems so much further to the left than it used to be. The alt-right tide washed in, and when it went back out it took a large part of the center of the beach with it.

Also, as you point out, people aren't getting laid. It is depressing to ponder whether the real inoculant may have been the fact that I had a girlfriend in high school.

api on Dec 2, 2019:
Yeah, I also learned to talk to other humans in my late teens and had girlfriends (and friends in general, both male and female). I think a lot of this bollocks preys on people with severe social issues. Unfortunately it sucks them into hate-and-resentment circle jerks and keeps them from finding actual escape hatches.

BTW, I have lots of issues with Ayn Rand, but IMHO Objectivism is less loony than this stuff. She once wrote that "racism is the lowest form of collectivism," and I agree. She wouldn't be a friend of this stuff if she were still around, but she died in 1982. I actually think reading Rand in my teens and early 20s was an inoculant for me. I do think she was an original and very often misunderstood thinker, even though I don't think she was ultimately successful. I consider her a liberal heretic rather than a right-winger, and oddly enough I compare her to Marx as someone who had brilliant insights but whose prescriptions don't quite work.

This podcast is worth a listen: https://parallaxviews.podbean.com/e/ep-8-jeffrey-a-tucker-on...

davidgerard on Dec 3, 2019:
> Also, as you point out, people aren't getting laid. It is depressing to ponder whether the real inoculant may have been the fact that I had a girlfriend in high school.

I can't quickly find a cite, but I recall something about how, when the PLO didn't need the Black September guerrilla group any more, they deradicalised them by... finding them girlfriends.

tptacek on Dec 2, 2019:
You should absolutely write this post!

api on Dec 2, 2019:
Yes, I should. Here is one of several important bits:

Eugenics fundamentally won't work because you're replacing the environment as the learning target with the system's own model of fitness. You are saying "it is fit because the fit say it's fit." The result of eugenics over a long period of time would be random drift into oblivion as the goal function chases itself. Had Hitler won and the Nazis implemented their Aryan breeding program, the result after 5-10 generations would resemble the antagonists in the film "The Hills Have Eyes." Eugenics might seem to work great for 1-2 generations as you eliminate obviously unfit genotypes like severe metabolic disorders; then you'd top out and start spiraling down into a sea of ubermenschen drool.

This is also a problem, by the way, for the runaway self-improving AI nonsense of the sort that played a role in the LessWrong mass psychosis. How will Roko's Basilisk get so super-smart without a goal function rooted in the physical universe it must model? Evaluation of that goal function requires embodiment, so you must have many, many generations of little basilisks, and that takes quite a long time.

There are other problems with runaway self-improvement too, like P probably not equaling NP and the implications for the rate at which any learning system can go "where there are no roads." An AI could copy the human brain to achieve human-level AI, but beyond that there are no roads, so our hypothetical super-mind is confronted by bignum^bignum combinatorial search problems. The search domain is convex, so it's not as hard as cryptography, but that's like saying you only have to examine every sand grain along fifty miles of beach instead of the entire coast of Florida.

There is a reason nature took roughly four billion years to "invent" conceptual thinking and such, and it's not because "evolution is slow." Biology is the largest massively parallel computer in the known universe. Biological information systems outperform our own computers by multiple orders of magnitude; look at what the brain does on ~20W of power versus what our supercomputers do on megawatts. That "machine" has been running for billions of years, consuming many times more power than all of human civilization, to generate present-day lifeforms including ourselves. Some hacker is not going to assemble some code on a von Neumann machine that runs away and outperforms that in a matter of months or years. That's sci-fantasy nonsense on the order of Star Wars dogfighters making pew-pew noises in space. Evolution is incredibly efficient at finding and gobbling up any free lunches like that, so if such shortcuts existed they were probably found in the seas of the Cambrian epoch. (I don't dismiss all potential AI or cyborg-type scenarios, just the obviously impossible and/or silly ones.)

That's maybe 5% of my notes. The problem is you're arguing with a cult. The "pill" ideologies are not based on reason but on rationalization driven by emotional appeals to personal wounds (triggers) like male insecurity and compensatory narcissism. I'm not convinced my post wouldn't be an exercise in pissing into a hurricane.
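To make the self-referential goal function point concrete, here is a minimal toy sketch (my own illustration, not from my notes; every parameter in it is arbitrary). Two runs of the same truncation-selection loop evolve a one-dimensional trait: one scores fitness against a fixed external target (the "environment"), the other scores fitness against the population's own current mean (the system's model of itself). The first converges; the second just random-walks while its variance collapses.

    import random

    def evolve(pop, fitness, generations, mutation=0.1):
        """Truncation selection plus Gaussian mutation."""
        for _ in range(generations):
            scored = sorted(pop, key=fitness, reverse=True)
            parents = scored[:len(pop) // 2]          # keep the top half
            pop = [p + random.gauss(0, mutation)      # two mutated offspring each
                   for p in parents for _ in (0, 1)]
        return pop

    random.seed(1)
    pop0 = [random.gauss(0, 1) for _ in range(100)]

    # Run 1: fitness anchored in an external "environment".
    TARGET = 5.0
    anchored = evolve(list(pop0), lambda x: -abs(x - TARGET), generations=200)

    # Run 2: fitness defined by the population's own model of itself --
    # "it is fit because the fit say it's fit."
    pop = list(pop0)
    for _ in range(200):
        mean = sum(pop) / len(pop)
        pop = evolve(pop, lambda x, m=mean: -abs(x - m), generations=1)

    print("anchored mean: %5.2f (target %.1f)" % (sum(anchored) / len(anchored), TARGET))
    print("self-ref mean: %5.2f (goes nowhere in particular)" % (sum(pop) / len(pop)))

The numbers don't matter; the shape does. With an external referent, selection climbs; without one, "selection" is just drift wearing a uniform, which is the whole problem with eugenics as an optimization procedure.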
cyril_figgis on Dec 4, 2019:

> An AI could copy the human brain to achieve human-level AI, but beyond that there are no roads so our hypothetical super-mind is confronted by bignum^bignum combinatorial search problems.

If you can copy human minds into a machine-world, then you can copy the most intelligent human minds. The most intelligent humans are pretty darn smart, and with no obligations, a surfeit of time, and effective immortality, they may make short work of your claims that intelligence can't be effectively bootstrapped. It's difficult to imagine the inner lives of people much smarter than we are. So neither of us can imagine a world occupied by persons belonging to a 300-IQ race of supermen, but there are presently 200-IQ persons, if only as genetic anomalies, and technology already exists sufficient to replicate them an almost arbitrary number of times. The future's so bright I gotta wear shades.

tptacek on Dec 2, 2019:
All I can say is that I have some of the same frustrations dealing with otherwise productive and intelligent people who are being sucked into and spaghettified by the memetic black hole you're talking about. There aren't a lot of good counterprogramming resources, and coming up with that material in the moment is a hopeless drag.

AlexCoventry on Dec 2, 2019:
> There is something deeply toxic that has infected our community. I am still not quite able to see it in its entirety, though I can see its edges, and when people get infected by it it's obvious.

I think SlateStarCodex(!) said it well, recently:

> Rationalists wasted years worrying about various named biases, like the conjunction fallacy or the planning fallacy. But most of the problems we really care about aren't any of those. They're more like whatever makes the global warming skeptic fail to connect with all the evidence for global warming.

https://slatestarcodex.com/2019/11/26/mental-mountains/

davidgerard on Dec 2, 2019:
Given SSC, its subreddit, and its (Scott-recommended) offshoot /r/themotte, this is a case of the beam in one's own eye. Same problem with LessWrong: the best of Yudkowsky refutes the typical of Yudkowsky.

sbierwagen on Dec 4, 2019:
Are there global warming denialists on r/SSC? Are there so many global warming denialists on r/SSC that it repudiates the concept of rationality?

AlexCoventry on Dec 3, 2019:
Did he repudiate rationalism? Otherwise, if he's talking about rationalists, he's talking about the beam in his own eye.

unknownkadath on Dec 2, 2019:
Please do so! Based on this post, I believe you would do a great job.