10-15 years ago it was easy to think that the internet was going to set us all free because it's an inherently decentralizing, anti-authority technology.
Turns out we were just in a brief window of lag before companies learned to use it for winner-take-all monopolies, governments learned to use it for surveillance, and authoritarians learned to use it to mobilize reactionary mobs.
When trying to brush my 2-year-old's teeth I suddenly realized: she's the guerrilla force and I'm the conventional army. She can't win, but she can drag the conflict out and make it so expensive for me that I give up.
Also, it's impossible to bargain or negotiate with someone who doesn't yet have a concept of past or future. "If you do this now, then we can do the thing you like later" is too abstract for her to understand yet.
One of the biggest threats to humanity is our inability to imagine what comes after capitalism.
Capitalism isn't sustainable, communism didn't work. We figure out a new way or we collapse.
We lack the words or ideas to imagine or describe the next system. We're like feudal peasants trying to describe multinational corporations.
UBI is not the thing, but it feels like the first halting attempt to describe, using the words we have, the dim shadow of the future cast onto the wall of our cave.
4. "Oh right of course!": this is something I just realised, and the reason I'm writing this thread.
Often you know of a phenomenon, maybe you even deal with it routinely, but you don't realise its importance in your understanding of the trend.
Like the Hindu fundamentalist shopkeeper who doesn't hate the fact that the shop is in a Muslim area.
This last one is interesting, because it is usually read as hypocrisy, whereas the other three are read as simple stupidity/evil.
The answer, or at least part of it:
The world is complicated and it's hard to get "clean data," and so every observation is slotted somewhere on a spectrum in which I want to identify four points.
1. Signal: an observation that fits the trends. Used as evidence in conversations.
2. Noise: the world is complicated so not every observation is an example of a trend.
3. Exception: sometimes things go against the trends for very specific and contextual reasons.
First, in case you don't know:
Paradigms are interpretations of object-level science and they underlie all scientific research in that they shape acceptable questions, acceptable types of questions, etc.
When a paradigm reaches the end of its usefulness, there is a crisis period when there's a paradigm shift.
Beyond science, all explicit thought rests on something akin to paradigms.
The question: how does our reasoning get invisibly constrained by paradigms?
So I've been thinking about how paradigms are invisible to us, and I'd like to propose a quadrichotomy:
Signal/noise/exception/name to be decided.
My Windows OS is named after an apocryphal god of darkness, and therefore my Linux OS is named after a solar deity that definitely existed.
Harking back to the Thailand rescue story from last month, this thread (along with the article) offers up the contrasting natures of Silicon Valley's "move fast & break things" approach and a different culture of "move slowly & don't break things" - both schools of thought very useful in building expertise.
Remember the story of the pottery lessons?
I like this max allowed length. A lot can be said, and not much can be said.
Important point that a friend made:
Foundational/observational isn't a clear division. They're a feedback loop, sensitively affecting each other in detail and direction.
This is why a purely observational orientation (e.g. LW) has so much trouble with understanding different paradigms.
And why a purely foundational orientation (e.g. the straw-intellectual from above, also SJ people for some reason) takes so much umbrage at the first orientation.
I mean, grounding ain't just words.
It's often one concrete (for some definition of concrete) way in which the observations can be properly (for some definition of properly) grounded.
This is very much related to an important lesson I learnt in my time submerged in the SJ and LW memeplexes:
Never take people's justifications at their words, but do believe their feelings.
For me, this is an important piece of postrationality.
@vgr definitely has it, and maybe it's a defining characteristic of mystics in general.
(I often describe myself as a mystic contaminated by rationality.)
This is a beta thought, inspired by a lot of internal fuming about classical intellectuals.
One way in which I differ from most intellectual types: they take the grounding of a set of thoughts --- equally in philosophy, science, etc --- extremely seriously.
For example, people who read Kuhn, Feyerabend, Zizek, etc, often decide that there can't be a real world because of how much sense these guys are making.
I strongly believe, on the other hand, that the fundamental value of a framework is its observational implication, and grounding is just words.
I don't know anyone who's actually imagined the entangled bank.
Either that, or their imaginations are not wide enough to think that maybe other banks are entangled too.
"Her ContraPoints persona is decadent in the mold of Oscar Wilde by way of Weird Twitter."
- Katherine Cross on ContraPoints
You’ve heard about the ‘unreasonable effectiveness of mathematics’.
There’s also what you could call ‘the unreasonable depth of reality’.
Reality just has such mindboggling depth of mindless detail you can keep modeling to infinite weariness.
It just never ends. No matter how much artifice you impose on a piece of reality, 9/10 of it is still left, showing up as territory noise in your knowing map.
And knowing is so fragile. Poof and you’re liminally entangled in unfactored reality again.
Roger Ebert on Ocean's Twelve, demonstrating how engagement with the Real will always be worth a hundred theories:
> This isn't a caper movie at all, it's an improvisation on caper themes. If at times it seems like a caper, well, as the fellow said when he got up from the piano, it might not be Beethoven, but it has a lot of the same notes.
@vgr Story time: the people who discovered "high temperature" superconductors apparently purposefully mistyped the name of the compound in the manuscript and left the typo there till the very last step of the review process, so that the published paper was the first time anyone saw the name of the real compound.