
Let's compile Quake like it's 1997

72 points by birdculture | 2026-02-08 | 20 comments
View original (fabiensanglard.net)

Summary

This article walks through recreating the 1997 development experience of compiling a 32-bit Windows binary of the game Quake. It covers setting up a virtual machine running Windows NT 4.0 or Windows 98SE and installing Visual C++ 6. The process includes specific steps for handling the assembly files and working around compatibility problems with the aged development tools, and it evokes some nostalgia for early game-development practices.

Comments (13)

webdevver 3 hours ago
love software archaeology like this.

there was another article where someone bootstrapped the very first version of gcc that had the i386 backend added to it, and it turns out there was a bug in the codegen. I'll try to find it...

EDIT: Found it; in fact there was an HN discussion about an article referencing the original article:

https://miyuki.github.io/2017/10/04/gcc-archaeology-1.html

https://news.ycombinator.com/item?id=39901290

clarity_hacker 3 hours ago
Build environment archaeology like this matters more than people realize. Modern CI assumes containers solve reproducibility, but compiler version differences, libc variants, and even CPU instruction sets can silently change binary output. The detail about needing to reinstall Windows NT just to add a second CPU shows how tightly coupled OS and hardware were — there was no abstraction layer pretending otherwise. Exact toolchain reproduction isn't nostalgia; it's the only way to validate that a specific binary came from specific source.
webdevver 2 hours ago
there is something to be said about old windows installation CDs being essentially modern-day equivalents of immutable docker layers - i don't think one could say that about modern windows, but then i'm not super clued in on ms stuff.
kelnos 2 hours ago
> The detail about needing to reinstall Windows NT just to add a second CPU shows how tightly coupled OS and hardware were — there was no abstraction layer pretending otherwise.

In this case there was: the reason you needed to reinstall to go from uniprocessor to SMP was that NT shipped with two HALs (Hardware Abstraction Layers): one supporting just a single processor, and one supporting more than one.

The SMP one had all the code for things like CPU synchronization and interrupt routing, while the UP one did not.

If they'd packed everything into one HAL, single-processor systems would have to take the performance hit of all the synchronization code even though it wasn't necessary. Memory usage would be higher too. I expect that you probably could run the SMP HAL on a UP system (unless Microsoft put extra code in to make it not let you), but you wouldn't really want to do that, as it would be slower and require more RAM.

So it wasn't that those abstraction layers didn't exist back then. It was that abstraction layers can be expensive. This is still true today, of course, but we have the cycles and memory to spare, more or less, which was very much not the case then.

knorker 2 hours ago
> The first batches of Quake executables, quake.exe and vquake.exe were programmed on HP 712-60 running NeXT and cross-compiled with DJGPP running on a DEC Alpha server 2100A.

Is that accurate? I thought DJGPP only ran on and for PC-compatible x86. id had Alphas for things like running qbsp, light, and vis (these took for--ever to run, so the SMP Alpha was really useful), but for building the actual DOS binaries, surely this was DJGPP on an x86 PC?

Was DJGPP able to run on Alpha for cross compilation? I'm skeptical, but I could be wrong.

Edit: Actually it looks like you could. But did they? https://www.delorie.com/djgpp/v2faq/faq22_9.html

qingcharles 1 hour ago
I thought the same thing. There wouldn't be a huge advantage to cross-compiling in this instance since the target platform can happily run the compiler?
bluedino 2 hours ago
I'd like to see someone build the Linux source code leak that came out not too far after Quake was released.
Maro 2 hours ago
Quake book incoming from Fabien?
torh 1 hour ago
I hope so. The other books have been great fun to read, with the detour of CP-SYSTEM as a nice surprise.
ErroneousBosh 2 hours ago
Funny, I've just been (re-)playing Quake 2 recently.
boznz 1 hour ago
On one particular project from 1995, where the hardware was very cost-optimised, the C program compiled to 1800 bytes, which meant we could save nearly a dollar by buying micro-controllers with 2KB flash rather than 4KB. We manufactured 20,000 units with the cheaper chip. Two years down the line we needed a simple code change to increase the UART baud rate to the host, a change that should have produced a same-sized binary, but instead it grew to 2300 bytes due to a newer C compiler. We ended up tweaking the assembly file and running an assembler, then praying there would be no more changes!

I have always over-specified the micro-controllers a little since then, and kept a copy of the original dev environment. Luckily all my projects are now EOL, as I am retired.