Hacker News | orbital-decay's comments

It encodes the data on top of locally optimal trajectories in the physical world that were learned in millions of years of evolution. Treat this as context, not weights.

>Way too risky to use Google services like this tied to your primary account.

Google has no problem correlating your accounts unless you know what you're doing and are ready to switch to cypherpunk mode.


There is realistically no way to evade the account correlation systems that Google (likely) has.

It's not that hard, actually, but separation has a maintenance/convenience cost and requires you to think before doing anything.

We're speculating here, but between time correlation, browser fingerprinting, and telemetry, the average user attempting to pull off a clean compartmentalisation of two accounts has no chance, even when they think they do.

I'm not speculating, but yes, ordinary users won't be able to do that; that's what my warning is about.

What would Microsoft do in such a case? Unlikely to be any different.

I have no context since the original post is deleted, but in general, it seems the supposed account of the son was not tied in any direct way to the other accounts...

The solution is to use a true business-oriented provider and not an ad agency for one's file and email hosting.


No idea, but at least Microsoft has some semblance of support where you can talk to a person.

> The most energetic voice of public dissent has been heat shield expert and Shuttle astronaut Charles Camarda, the former Director of Engineering at Johnson Space Center. [...] he ended up publishing a cri de coeur that I encourage everyone to read in full.

https://docs.google.com/document/d/1ddi792xdfNXcBwF8qpDUxmZz...


Some of that is rose-tinted glasses.

1. Sharing was never really safe; open source by default only became possible because of SaaS and rent-seeking behavior.

2. The early web (not the internet) wasn't hyperconnected. With the advent of global-scale social media, it was immediately obvious to many that this would lead to monoculture and reduced diversity. What was thought to be the information superhighway became the information superconductor: zero resistance, carrying infinite current. Also known as a short circuit.


Beware of confirmation bias; it works both ways. Reporting might be alarmist (it always is); actual research largely is not. This study doesn't discredit the entire field: it's pretty obvious that microplastics are everywhere and that different types are harmful to an unclear extent, even if the amount might be overestimated in some studies.

>This type of research requires very little creativity or study design -- just throw a dart in a room and try and find microplastics in whatever it lands on. Boom, you get a grant for your study

Precisely, and mapping of that kind is entirely valid and required in huge amounts to have the full picture. Somebody has to do the grunt work.


Unfortunately, too often the authors of non-alarmist research end up promoting alarmist interpretations of their work in science media.

Somehow I'm not surprised that Zero's software is terrible. I don't think being new has anything to do with it, they are just that type of company.

AI unreliability aside, Microsoft suing the hell out of them was always a concern. They do clean-room reimplementation to insulate themselves from legal risk as much as possible; giving Microsoft another incentive to sue is not what anyone wants.

Speaking of clean room: it almost got a haircut in Google v. Oracle, the Android-Java API dispute.

Unlikely. Games need a stable ABI and Win32 is the only stable ABI on Linux.

Proprietary software needs a stable ABI. Not games.

DOOM runs on any Linux system since forever because we had access to the source. You can build it for Linux 2.6 and it’ll probably still work today.

Sadly most games are proprietary


Even if all games were FOSS, without at least a stable API, most games would remain a hassle to run. DOOM doesn't suffer from this as much thanks to its large number of volunteers, but relying on community support for all games is just outsourcing labor to some unlucky fellows. At best, it's yet another pain for Linux users. At worst, it's the death of unpopular games. In either case, a hurdle for Linux adoption.

Not really. I actually tried building an "old" game (read: not updated since 2014 or so) on Linux when I used it. It didn't work: autotools had changed, make threw some weird errors, and the library APIs had changed too.

In the end I gave up and just used Proton on the Windows .exe. Unbelievable. :(


I should clarify: my original comment about stability only applies to glibc itself. Once we go outside glibc there will be varying degrees of API/ABI stability, simply because at that point it's just different groups of people doing the work.

In some cases such libraries are also cross-platform, so the same issues would be found on Windows (e.g. try to build an application that depends on OpenSSL 3 against OpenSSL 4 and it will not work on either Linux or Windows).

For future reference, if you ever need to do that again, it would be way easier to spin up a container with the build environment the software expects. Track down the last release date of the software, run podman run --rm -it ubuntu:$from_that_time, and build the software as usual.

You can typically link the dependencies statically during build time to create system independent binaries. So the binary produced inside the container would work on your host as well.


That sounds almost as easy as just copying an .exe file from Windows and running it.

/s


> Proprietary software needs a stable ABI.

Open source software also needs a stable ABI because:

a) I don't want to bother building it over and over (not everything is in my distro's repository, a ton of software has a stupid build process, and not every new version is better than the old one)

b) a stable ABI implies a stable API, and even if you have the source, it is a massive PITA to have to fix whatever the program's dependencies broke to get it running, especially if you're not the developer who made it in the first place

c) as an extension of "b", a stable API also means more widely spread information/knowledge about it (people won't have to waste time learning how to do the same tasks in a slightly different way with a different API), thus it's much easier for people to contribute to software that uses that API


People who keep parroting this clearly have no experience of gaming on Linux.

I play both modern and old games on Linux. Games outside a super narrow enthusiast realm are always closed-source (even indie ones), and it's going to stay that way for the foreseeable future; that's just a fact of life, given gamedev incentives and specifics.

Please elaborate.

Wine has constant regressions. What works fine today will completely fail next year. Which is why Steam lets you pick which Proton version you want to use.

Which means that a .exe without the exact version of Wine won't run.

Plus of course there's the whole Vulkan stuff. Older cards aren't well supported, but it will crash rather than just falling back to OpenGL, where it would work fine.


Those issues seem orthogonal to the stable-ABI issue from the OP, especially the OpenGL one (that is more of a hardware incompatibility issue). When apps fail to run due to Wine updates, they are considered bugs to be fixed. On the native side, apps may break because: 1) a required library is unavailable, normally because it is too old and unsupported; 2) a required library's path is different in distro A than in distro B. Neither of these is considered a bug and, as such, they are rarely addressed. I believe the Steam Linux Runtime is an attempt at fixing this, but I'm not sure about its effectiveness. Also, you are exaggerating about the "exact Wine version": it helps to know which versions don't have a regression by knowing which specific version an app used to run on.

> I believe the Steam Linux Runtime is an attempt at fixing this, but I'm not sure about its effectiveness.

It's effective enough for it to be practically a solved problem now.


In practice, Wine is constantly improving. It's in active development and not that stable, but regressions are mostly local. Treat its releases like bleeding edge.

>What works fine today will completely fail next year.

Usually not on the timescale of a year. I have many new games that worked a year ago, and none of them has stopped working. The worst breakage I had recently was some physics glitches in an old RPG (released in 2001) on Wine 11.0, and it was fixed in the next release.


Are you able to run any of the old Loki games on Linux these days?

With compat libraries and osspd it will run even under PulseAudio.

There's nothing recent about the most popular media being manipulated and/or biased. Discussions on this forum date back two decades; the specific narrative, however, depends on the context.
