Hacker News | Pannoniae's comments

Another incorrect factoid: "The original Xbox (2001) was built on familiar PC hardware (Pentium III derivative, Intel GPU, standard hard drive)"

(it was an NV GPU)


Good catch. Indeed the GPU was Nvidia’s NV2A, not an Intel GPU. I will correct that in the article. Thanks for pointing it out.

>When you want to cast a spell you have to enter the number of the spell from the manual, maybe because there was not enough memory to fit the names of the 94 spells into RAM

Probably not ;) "Enter things from the manual" was a tried-and-true old copy-protection technique. If you used the warez version, you presumably did not have the manual, so you got stuck. This didn't run on an 8008 or anything that constrained; I'm sure the game could have held the names of the spells in memory easily.


Ah, that makes more sense than my theory. It's a weak copy protection method, though, as you can just try and see what happens, and I think they dropped it in M&M3.

Yes, and it was pretty easily photocopied, since it had to be printed all in one place anyway. That's probably why even print-based protections tried to get cleverer, like the code wheels, although I remember those didn't take that much more effort: disassemble the original, copy all layers, cut out the right holes, put it back on a spindle.

I remember one game I had that tried to protect against it by having a manual of about 100 pages, with the passcodes being spread across all of them. I believe it was Gunship 2000.


I dunno I have 96GB of RAM and I still get the whole "system dies due to resource exhaustion" thing. Yesterday I managed to somehow crash DWM from handle exhaustion. Man, people really waste resources....

Sad to see you being downvoted, but you're exactly right. Well, almost - if you can afford to invest in a good integration test suite, that can catch many errors without requiring a human to regression-test every time.

At the same time, many quality attributes can't really be automatically tested, so automation shouldn't try to replace manual testing, it should be used to augment it.


Not really. I actually tried building an "old" game (read: not updated since 2014 or so) on Linux when I used it. It didn't work: autotools had changed, make threw some weird errors, and the library APIs had changed too.

In the end I gave up and just used Proton on the Windows .exe. Unbelievable. :(


I should clarify that my original comment about stability only applies to glibc itself. Once we go outside glibc, there will be varying degrees of API/ABI stability, simply because at that point it's just different groups of people doing the work.

In some cases such libraries are also cross-platform, so the same issues would be found on Windows (e.g. try to build an application that depends on OpenSSL 3 against OpenSSL 4 and it will not work on either Linux or Windows).

For future reference, if you ever need to do that again, it would be way easier to spin up a container with the build environment the software expects. Track down the last release date of the software, do podman run --rm -it ubuntu:$from_that_time, and just build the software as usual.

You can typically link the dependencies statically during build time to create system independent binaries. So the binary produced inside the container would work on your host as well.


That sounds almost as easy as just copying an .exe file from Windows and running it.

/s


Yeah, exactly. High-level people think the low-level stuff is magic, and those of us on the other side think the high-level stuff is magic (how can you handle all that complexity?...)


> how can you handle all that complexity?...

You don’t. Someone else smarter than you handled it already and you just need to integrate their solution.


"Someone else smarter than you manufactured microchips and you just need to integrate their solutions."


Yup. AbstractSingletonProxyFactoryBean and SimpleBeanFactoryAwareAspectInstanceFactory agree as well :)


Hey, this isn't entirely accurate!

The 4-bit stuff is a hangover from Notch doing it that way (I'd maybe even say he's a similar-calibre programmer to Chris Sawyer...). The sound has nothing to do with technical limits; that's a post-facto rationalisation.

The game never played MIDI samples; it was always playing "real" audio. The style was an artistic choice: many similar retro-looking games were using chiptune and the like. It's a deliberate juxtaposition...

The C++ variant doesn't really perform better anymore, either.


Fair enough, I mostly meant to point out some of those design decisions predate MS, as much as I love to hate on them. The music was just an interesting bit of trivia I read the other day.


Yeah, 100% :) Ironically, the design constraints are one of the big things that made it work so well! If it had been designed in a "traditional" way, it would have been much less ambitious.


Bedrock Edition has a smaller simulation distance, which is kind of the opposite of what you'd expect from the more "optimized" version.


This is all true, but IMO it misses the forest for the trees... For example, the compiler basically doesn't do anything useful with your float math unless you enable fast-math. Period. Very few transformations are done automatically there.

For integers the situation is better, but even there, it hugely depends on your compiler and how much it cheats. You can't replace trig calls with intrinsics in the general case (they set errno, for example), and inlining is at best an adequate heuristic which completely fails to take into account what the hot path is, unless you use PGO and keep it up to date.

I've managed to improve a game's worst-case performance by about 50% just by shrinking a method's code size from 3000 bytes to 1500. Barely even touched the hot path there, keep in mind. Mostly due to icache usage.

The takeaway from this shouldn't be that "computers are fast and compilers are clever, no point optimising" but more that "you can afford not to optimise in many cases, computers are fast."


I actually agree with you.

My point wasn't "don't optimize" it was "don't optimize the wrong thing".

Trying to replace a division with a bit shift is an example of worrying about the wrong thing, especially since that's a simple optimization the compiler can pick up on.

But as you said, it can be very worth it to optimize around things like the icache. Shrinking and aligning a hot loop can ensure your code isn't spending a bunch of time loading instructions. Cache behavior, in general, is probably the most important thing you can optimize. It's also the thing that can often make it hard to know if you actually optimized something. Changing the size of code can change cache behavior, which might give you the mistaken impression that the code change was what made things faster when in reality it was simply an effect of the code shifting.


I originally got into writing compilers because I was convinced I could write a better code generator. I succeeded for about 10 years in doing very well with code generation. But then all the complexities of the evolving C++ (and D!) took up most of my time, and I haven't been able to work much on the optimizer since.

Fortunately, D compilers gdc and ldc take advantage of the gcc and llvm optimizers to stay even with everyone else.


The thing which would really help, IMNSHO, is to nail down the IR to eliminate weird ambiguities where optimisation A is valid according to one understanding and optimisation B is valid under another, but alas, if we use both, sometimes it breaks stuff.


Yes, one of the unexpected problems I ran into is one optimization undoing another one, and the optimizer would flip-flop between the two states.


Yup :P

As in their post:

"The future of software is not open. It is not closed. It is liberated, freed from the constraints of licenses written for a world in which reproduction required effort, maintained by a generation of developers who believed that sharing code was its own reward and have been comprehensively proven right about the sharing and wrong about the reward."

This applies to open-source but also very well to proprietary software too ;) Reversing your competitors' software has never been easier!


If they really believed that their process eliminated any licensing conditions, why would they limit themselves to open source projects?

High quality decompilers have existed for a long time, and there's a lot more value in making a cleanroom implementation of Photoshop or Office than of Redis or Linux. Why go after such a small market?

I suspect the answer is that they don't believe it's legal; they just think that they can get away with it because they're less likely to get sued.

(I really suspect that they don't believe that at all, and it's all just a really good satire - after all, they blatantly called the company "EvilCorp" in Latin.)


>If they really believed that their process eliminated any licensing conditions, why would they limit themselves to open source projects?

Because this is satire by FOSS people :)


This is satire, but this is where things are heading. The impact on the OSS ecosystem is probably not a net positive overall, but don't forget that this applies to commercial software as well.

There will be many questions asked, like: why buy some SaaS with way too many features when you can just reimplement the parts you need? Why buy some expensive software package when you can point an LLM at the binary with Ghidra or IDA or whatever, then spend a few weeks reversing it?


This is going to bring back software patents.


I was discussing that very point yesterday with a colleague after telling him of recent events. I pointed out that leaning on copyright/copyleft for software has always been a risky move.


Considering my name's on a software patent submitted just last year, I don't think software patents have gone anywhere...


What did you patent?


The patent application hasn't been published yet, so I can't link it, but it's the integration of a bot management system with a queuing system (think: preventing bots from taking spots in the line to buy tickets from Ticketmaster when everyone's in the waiting room).


Where did they go?

