The word "government" doesn't magically erase all the same individual & institutional incentives, ambitions, biases, & flaws that exist elsewhere.
And sometimes, the extant magical belief that "government" is different & immune lets those same human factors be ignored until they feed bigger, slower disasters that everyone is afraid to admit, because (ostensibly) "we all did this together".
Emacs, ofc :) Seriously, it should not be too much work, although org-mode syntax would be even easier. There is a markdown mode here: https://jblevins.org/projects/markdown-mode/
The email part is not something I have done myself, but it has been a feature for a very long time, and you can find plenty of guides online.
It would first require a standard for Markdown. After that there would be very little stopping anyone from implementing it. I guess a MIME type for standard Markdown would also be nice.
Pretty sure I've said it before, but it would be a nice middle ground between text and all the complexity HTML+CSS brings in (if you want to compete with other HTML clients).
AI is neither bored nor engaged with us. It's just a technology that we can use or abuse. I doubt it'll become conscious anytime soon, though the desire to invent God or deceive others will push us to invent many contraptions to make it appear conscious.
Should you be counting on confusion of an underpowered text-merge to catch such problems?
It'll fire on merge issues that aren't code problems under a smarter merge, while also missing all the things that merge OK but introduce deeper issues.
Post-merge syntax checks are better for that purpose.
And imminently: agent-based sanity-checks of preserved intent – operating on a logically-whole result file, without merge-tool cruft. Perhaps at higher intensity when line-overlaps – or even more-meaningful hints of cross-purposes – are present.
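As a minimal sketch of such a post-merge syntax check – assuming Python sources and a hypothetical `syntax_check` helper (a real setup would hook something like this into CI or a merge driver):

```python
import ast
from pathlib import Path

def syntax_check(paths):
    """Parse each merged file; return (path, error) pairs for files that
    no longer parse. Runs on the logically-whole result files, after the
    textual merge has already been applied."""
    failures = []
    for p in paths:
        try:
            ast.parse(Path(p).read_text(), filename=str(p))
        except SyntaxError as e:
            failures.append((str(p), f"line {e.lineno}: {e.msg}"))
    return failures
```

This only catches syntactic breakage, of course – the semantic-consistency cases discussed below need tests or review on top.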
> It'll fire on merge issues that aren't code problems under a smarter merge, while also missing all the things that merge OK but introduce deeper issues.
That has not been my experience at all. The changes you introduced are your responsibility. If you synchronize your working tree to the source of truth, you need to re-evaluate your patch for whether it introduces conflicts or not. In this case a conflict is a nice signal for where someone has interacted with files you've touched and possibly changed their semantics. The pros are substantial, and it's quite easy to resolve conflicts that are only due to syntactic changes (whitespace, formatting, equivalent statements, ...).
If you're relying on a serialized 'source of truth', against which everyone must independently ensure their changes sanely apply in isolation, then you've already resigned yourself to a single-threaded process that's slower than what improved merges aim to enable.
Sure, that works – like having one (rare, expensive) savant engineer apply & review everything in a linear canonical order. But that's not as competitive & scalable as flows more tolerant of many independent coders/agents.
Decentralization in this case means one can secede easily from the central authority. So anyone working on a project can easily split away from the main group at any time. But every project has a clear governance where the main direction is set and the canonical version of the thing under version control is stored.
That canonical version is altered following a process, and almost every project agrees that changes should be proposed against it. Even with independent agents, there should be a way to ensure consensus and decide the final version. And that problem is a very hard one.
And yet after all these years of git supporting no single source of truth, we still fall back on one. As long as you have an authoritative version and an authoritative release, then you have one source of truth. Linus imagined everyone contributing with no central authority, and yet we look to GitHub and GitLab to centralize our code. Git is already decentralized, and generally we find it impractical.
He's not saying you shouldn't have conflicts; just that it's better to have syntax-aware conflict detection. For example if two people add a new function to the end of the same file, Git will always say that's a conflict. A syntax-aware system could say that they don't conflict.
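A minimal sketch of that idea, assuming Python sources and restricted to the case where both branches only *add* top-level functions (the names here are hypothetical, not any real merge tool's API):

```python
import ast

def top_level_defs(source):
    """Map each top-level function name to its source segment."""
    tree = ast.parse(source)
    return {
        node.name: ast.get_source_segment(source, node)
        for node in tree.body
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef))
    }

def syntax_aware_merge(base, ours, theirs):
    """Merge two branches that only ADD top-level functions to `base`.

    Returns (merged_source, conflicts). Distinct functions appended by
    each side – which a line-based merge at end-of-file would flag –
    merge cleanly; only same-named functions with differing bodies are
    reported as real conflicts."""
    base_defs = top_level_defs(base)
    ours_new = {k: v for k, v in top_level_defs(ours).items() if k not in base_defs}
    theirs_new = {k: v for k, v in top_level_defs(theirs).items() if k not in base_defs}

    conflicts = [
        name for name in ours_new.keys() & theirs_new.keys()
        if ours_new[name] != theirs_new[name]
    ]
    merged = base.rstrip() + "\n\n" + "\n\n".join(
        src for name, src in sorted({**ours_new, **theirs_new}.items())
        if name not in conflicts
    ) + "\n"
    return merged, conflicts
```

The design point is just that the conflict decision is made on parsed structure (sets of definitions) rather than on overlapping line ranges.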
> Should you be counting on confusion of an underpowered text-merge to catch such problems?
This does not really follow from my statement.
I said that underpowered text merge should not silently accept such situations, not that it is the only way to catch them. It doesn't replace knowing something about what you are merging, but it is certainly a good hint that something may be wrong or unexpected.
> Post-merge syntax checks are better for that purpose.
Better, yes, but I was addressing semantic issues, not syntactical. I have seen syntactically valid merges result in semantic inconsistency, it does happen.
I do agree with your last statement... unit & integration tests, agent checks, or what have you: these all contribute to semantic checking, which is a good thing.
Can they be relied on here? Maybe? I guess the jury is still out. My testing philosophy is "you can only test for what you think of testing". And tests and agent checks have a signal-to-noise ratio, and are only as useful as their SNR allows.
There is no guaranteed way to stop bugs from happening; if there were, it likely would have been discovered by now. All we can do is take a layered approach that provides opportunities for them to get caught early. Removing one of those layers (merge conflicts) is not clearly a good thing, imho, but who knows... if agent checks can replace it, then sure, I'm all for it.
Yeah, screw Robert Reich! Always looking out for the workers who make up the majority of this country. Why won't he look out for the poor multi-national corporations, who have no one to advocate for them or their tax rates?
Your parent post isn’t suggesting it’s always the same user submitting, just that users submit a lot of posts from this person.
Can’t say I agree, though. I don’t recall ever having seen one of his posts on HN, and a cursory search suggests they’re not even upvoted that much. Highest I found was under 30 points. But my methodology is flawed, as I basically searched for the name.
It was income, you dicks. Someone above crunched the numbers. Why do you hate Rob Reich so much you're willing to make shit up and get mad at him about it?
Like partial courses of antibiotics, this will only relatively advantage those leading efforts best able to ignore this 'poison', accelerating what you aim to prevent.
"It had been a slow Tuesday night. A few hundred new products had run their course on the markets. There had been a score of dramatic hits, three-minute and five-minute capsule dramas, and several of the six-minute long-play affairs. Night Street Nine—a solidly sordid offering—seemed to be in as the drama of the night unless there should be a late hit."
– 'SLOW TUESDAY NIGHT', a 2600 word sci-fi short story about life in an incredibly accelerated world, by R.A. Lafferty in 1965
> A thoughtful-man named Maxwell Mouser had just produced a work of actinic philosophy. It took him seven minutes to write it. To write works of philosophy one used the flexible outlines and the idea indexes; one set the activator for such a wordage in each subsection; an adept would use the paradox, feed-in, and the striking-analogy blender; one calibrated the particular-slant and the personality-signature. It had to come out a good work, for excellence had become the automatic minimum for such productions. “I will scatter a few nuts on the frosting,” said Maxwell, and he pushed the lever for that. This sifted handfuls of words like chthonic and heuristic and prozymeides through the thing so that nobody could doubt it was a work of philosophy.
Sounds exactly like someone twiddling the knobs of an LLM.
Anecdotes of occasional problems, even at a low or unquantified rate, are valid & useful evidence that something negative is happening.
Anecdotes that sometimes those problems don't occur are nearly worthless. Of course that's true – the original anecdotal complaint already implicitly relies on, & grants, the idea that there's some default, "hoped for" ideal from which their experience has fallen short.
To chime in, "never had your problems" thus adds no info. Yes, people lucky enough not to hit those Signal limits that cause others to lose data exist, of course. But how does that testimony help those with problems? Should their frustration be considered less important or credible, because of your luck?
The as-if portrayal is one way your anecdote will be perceived, even if that wasn't your intent.