Gain Ground and Arcus Odyssey both got many hours of play on my Mega Drive back in the day. :)
I'm not too knowledgeable about the detailed workings of the latest hardware and APIs, but I'll outline a bit of history that may make things easier to absorb.
Back in the early 1980s, IBM was still setting the base designs and interfaces for PCs. The last video card they released which was an accepted standard was VGA. It was a standard because no matter whether the system your software was running on had an original IBM VGA card or a clone, you knew that calling interrupt X with parameters Y and Z would have the same result. You knew that in 320x200 mode (you knew that there would be a 320x200 mode) you could write to the display buffer at memory location ABC, and that what you wrote needed to be bytes that indexed a colour table at another fixed address in the memory space, and that the ordering of pixels in memory was left-to-right, then top-to-bottom. It was all very direct, without any middleware or software APIs.
But IBM dragged their feet over releasing a new video card to replace VGA. They believed that VGA still had plenty of life in it. The clone manufacturers started adding little extras to their VGA clones. More resolutions, extra hardware backbuffers, extended palettes, and the like. Eventually the clone manufacturers got sick of waiting and started releasing what became known as "Super VGA" cards. They were backwards compatible with VGA BIOS interrupts and data structures, but offered even further enhancements over VGA.
The problem for software support was that it was a bit of a wild west in terms of interfaces. The market quickly solidified around a handful of "standard" SVGA resolutions and colour depths, but under the hood every card had quite different programming interfaces, even between different cards from the same manufacturer. For a while, programmers figured out tricky ways to detect which card a user had installed, and/or let the user select their card in an ANSI text-based setup utility.
Eventually, VESA standards were created, and various libraries and drivers were produced that took a lot of this load off the shoulders of application and game programmers. We could make a standardised call to the VESA library, and it would have (virtually) every video card perform the same action (if possible, or return an error code if not). The VESA libraries could also tell us where and in what format the card expected to receive its writes, so we could keep most of the speed of direct access. This was mostly still in MS-DOS, although Windows also had video drivers (for its own use, not exposed to third-party software) at the time.
Fast-forward to the introduction of hardware 3D acceleration into consumer PCs. This was after the release of Windows 95 (sorry, I'm going to be PC-centric here, but 1: it's what I know, and 2: I doubt that Apple was driving much of this as they have always had proprietary systems), and using software drivers to support most hardware had become the norm. Naturally, the 3D accelerators used drivers as well, but we were nearly back to that SVGA wild west again; almost every hardware manufacturer was trying to introduce their own driver API as "the standard" for 3D graphics on PC, naturally favouring their own hardware's design. On the actual cards, data still had to be written to specific addresses in specific formats, but the manufacturers had recognized the need for a software abstraction layer.
OpenGL on PC evolved from an effort to create a unified API for professional graphics workstations. PC hardware manufacturers eventually settled on OpenGL as a standard which their drivers would support. At around the same time, Microsoft had seen the writing on the wall with regard to games in Windows (they sucked), and had started working on the "WinG" graphics API back in the Windows 3.1 days, and after a time that became DirectX. Originally, DirectX only supported 2D video operations, but Microsoft worked with hardware manufacturers to add 3D acceleration support.
So we still had a bunch of different hardware designs, but they still had a lot of fundamental similarities. That allowed for a standard API that could easily translate for all of them. And this is how the hardware and APIs have continued to evolve hand-in-hand. From fixed pipelines in early OpenGL/DirectX, to less-dedicated hardware units in later versions, to the extremely generalized parallel hardware that caused the introduction of Vulkan, Metal, and the latest DirectX versions.
To sum up, all of these graphics APIs represent a standard "language" for software to use when talking to graphics drivers, which then translate those API calls into the correctly-formatted writes and reads that actually make the graphics hardware jump. That's why we sometimes have issues when a manufacturer's drivers don't implement the API correctly, or the API specification turns out to have a point which isn't defined clearly enough and some drivers interpret it one way, while other drivers interpret the same API call slightly differently.
In my (admittedly limited) experience, SDL/SDL2 is more of a general-purpose library for dealing with different operating systems than an abstraction over graphics APIs. While it does include a graphics abstraction layer for doing simple 2D graphics, many people use it to have the OS set up a window, handle whatever other housekeeping is needed, and instantiate and attach a graphics surface to that window. Then they communicate with that graphics surface directly, using the appropriate graphics API rather than SDL. I've done it with OpenGL, and my impression is that using Vulkan is very similar.
SDL_gui appears to sit on top of SDL/SDL2's 2D graphics abstraction to draw custom interactive UI elements. I presume it also grabs input through SDL and runs the whole show, just outputting a queue of events for your program to process.
At the very least, please state which section you made small changes to, even if you are sure it's not worth mentioning what or why.
Maybe they believe that most of their customers don't really know much about computers beyond turning them on and "bigger numbers = better". They might not be wrong.
Maybe we could treat the appearances of recognizable, non-living entities in games (cars, buildings, airplanes, etc.) the same way we treat musical scores; the producer would be legally obligated to pay some reasonable, small, fixed fee per use to the original creator, and that creator wouldn't be allowed to object. And this wouldn't entitle the producer to use any trademarked brand or model name, just the form.
I'm not sure how common they are outside Japan, but I have a little (about 12" I think) Panasonic "Let's Note" that I use quite a lot as a lightweight coding (and retro/indie gaming :D) device that I can throw in even my smallest bag when there's a chance I'll have to kill more than a few minutes. They're designed to be a little bit rugged. I had Ubuntu on it previously, now Mint, and the only problem I've had is that Linux somehow sees two screen brightness systems, and by default it connects the screen brightness keys to the wrong (i.e. nonexistent) one. Once I traced the problem it was a quick and painless fix.
They seem to be sold worldwide, so you may be able to get one cheaply second-hand. One thing to be careful about is the fact that in order to keep the physical size down, the RAM is soldered to the board. Mine is an older model (5th gen iCore), and has 4GB soldered on but also one SODIMM slot, so I was able to upgrade to 12GB total. But I've noticed that on most later models they got rid of the RAM slots entirely, so whatever RAM it comes with is what you're stuck with.
Because the Game Boy logo check and the actual display of the logo happen separately, there were ways to pass the check while still displaying a different logo on the screen. Given that I bought cartridges from major retailers that did this, I'm guessing that Nintendo either didn't know about them, or didn't like their odds in court.
Sega was doing something conceptually similar around the same time, and that did get tested at trial (Sega v. Accolade), where the court ruled that Sega could go suck a lemon. So there's some doubt as to whether any of this is enforceable anyway, although Sega kept including a similar system in their hardware up to and including the Dreamcast.
Of course, a company as large as Nintendo could just bankrupt a lot of smaller companies with legal fees via delaying tactics.
It's weird to me how GIMP and Krita clearly share a large amount of code under the hood, and even some UI design, but at the same time it feels so much less painful to draw illustrations in Krita than in GIMP. I'm glad I gave it a try.
I think that like a great many game mechanics, the fact that it's been done badly many times doesn't mean that it can't be done well.
Child's play compared to what you'd need to do on a modern chip.
I don't think it's the chips, but the operating environments. Modern CPUs offer dozens of multipurpose registers and many more instructions and addressing modes compared to those old, low-cost CPUs, which should make things easier, not harder. But no-one's building old-style dedicated systems around modern CPUs; our code now has to play nice with firmware, OS, libraries, and other processes, including resource management and preempting.
Compare a single-gear go-kart to an automatic sedan. Getting top performance out of the go-kart on a closed track is difficult and requires nuance. If we could drive the automatic sedan around the same closed track, we could easily demolish the go-kart, and not just with raw engine power. The improved acceleration, braking assist, and power steering are enough. But when we drive the sedan we're usually doing it on public roads with traffic signals, intersections, speed limits, and other road users. That's what's more difficult.
Apparently the original game and Brood War expansion are free to install through the Battle.Net launcher these days.
If you have the original discs, the later official patches added the ability to copy the "mpq" files from the CD into the game's directory, so you no longer need the disc in the drive. Of course, you're still going to need a drive for the initial installation. That should work for single player (it's been a few years since I last did it) but I don't know about online multiplayer.
I haven't kept up with anime much for many years now, but I can easily imagine that this is the case. There had been mecha anime with angsty pilots and behind-the-scenes politics before, but Evangelion pushed it all to a whole new level by adding mysticism, massively flawed characters, and existential dread into the mix. I know that almost immediately following the initial release of Evangelion we got Gasaraki and RahXephon, both of which bear obvious influences from Evangelion.
I didn't know that generative AI could do things like this now.
I had a mini movie night with two colleagues, one is around middle age like me, and the other in their twenties. We were going through some DVDs and Blurays, and Die Hard came up. We two older folks said we liked it but the younger said that they'd never seen it. Well obviously we had to watch it right then.
Afterward, the young colleague said they found the movie boring and unoriginal. Talking it over, we came to the conclusion that while Die Hard had done so much in fresh and interesting ways at the time, it had since been copied so thoroughly by so many other films that, looking back, it offered little to an uninitiated modern audience.
Although I haven't played it myself, to read someone saying that Ultima 4 is derivative and lacking in originality feels a lot like that experience with Die Hard. Additionally, I think that the real old games usually expect a level of imagination and willingness to put up with discomfort that even I sometimes find a little offputting in 2025, despite the fact that I grew up with many of those games and had no issues with them at the time. If I don't remind myself of it, it can be easy to forget that old hardware wasn't limited only in audio-visual power, but also storage size and processing power.
I still search through old games, but I'm looking for ideas that maybe didn't work well or hit the market right the first time, but still deserve further consideration, especially in light of technological advances that have happened in the intervening years.
Node =/= JavaScript
I played this on PS2 and I remember thinking at the time that it was extremely adequate. As you say, the reviews at the time were lukewarm, but I think it's worth a look for anyone trying to scratch that itch who's already finished the bigger names in the genre.
I've never played the GBA games, and I still found Super Metroid bland.
I didn't have an NES or SNES growing up, so I came to those games a little later on. However, Super Metroid was still the most recent game in the franchise when I played it. There were plenty of rave reviews even then, so I looked forward to playing it once I got my hands on a copy. I even bought a new controller for it.
Initially I actually found the game somewhat frustrating, but once I got used to Samus' momentum and how the game had been designed to be played, I found it to be very well balanced. But I never felt like there was any real reason for me to go on other than to open new areas. Since it wasn't referenced in any way (that I noticed) outside of the manual, "The Mission" didn't seem important. And while the graphics were gorgeous for the time (and still are), that wasn't enough for me. People often talk about the haunting and creepy feeling of the game's world, but I didn't get that. I felt that way about the Prime games, but Super Metroid just seemed empty and abandoned to me, not atmospheric.
A few years ago I was able to play AM2R and stuck to it all the way to the end, even 100-percenting it, and enjoying it thoroughly. But I don't think I ever finished Super Metroid. I just put it down one day and never got back to it. And I don't feel like it's something I need to tick off some gaming bucket list. If you're not really enjoying it, stop playing and don't feel bad about it. There are already more good games in the world than anyone can complete before they die. You can't play them all, so stick to the ones that resonate with you personally.
My main concern is getting games in a form that I can store locally for 20 years and then reasonably expect to boot up and play. A secondary concern (ever since I moved permanently to another country) is going digital whenever possible because shipping stuff long distances is expensive. I had hundreds of physical books that it pained me to give away, but it simply wasn't economical to move them to my new home. I kept my physical games, CDs, and DVDs, because they're mostly thin discs and air-filled plastic cases (often replaceable once paper inserts have been removed) and I was able to bring them over affordably.
Over the last few years I'd say I've slowed down on physical retro collecting and only bought a couple dozen retro console games. More often I sail the high seas looking for them, because decades after release there's no sane moral argument that paying $50-100 to a private collector or dealer today has any impact on the developer's or publisher's profits; those are secondary or tertiary sales. The physical game media and packaging have ceased to be games and have become artifacts, almost independent of their content, like other vintage or antique items. Of course that doesn't apply if the game has been rereleased in more or less its original form, in which case I either buy it (if the price is reasonable) or don't play it at all (if the price is unreasonable). I actually have such a game in digital storage that I've been meaning to play for years, and I learned that it's quite recently been put up on GOG, so now I'm morally obligated to buy it if I still want to play it, heh. Luckily for me the price seems fair.
And speaking of GOG, the majority of my recent game purchases have been split pretty evenly between GOG and itch.io, which together account for about 95%. I basically haven't bought anything directly from Steam for more than a decade. I understand that many games there are actually DRM-free, but I'm not interested in trying to research every game before I make a purchase. If each game's store page indicated its true DRM status clearly (not just "third-party DRM"), I'd consider buying through Steam again. As it is, whenever I learn about an interesting game that's on Steam, I try to find it on itch.io or GOG, and if I can't, I generally don't buy it; I'll buy it on Steam only if it looks really interesting and it's dirt cheap.
Whenever I look at buying ("leasing with no fixed term," really) anything with DRM, I assume that it will be taken away from me or otherwise rendered unusable unexpectedly at some point in the future through no fault of my own. It's already happened to me a couple of times, and once bitten, twice shy. I know that everyone loves Gabe Newell, and he seems like a genuinely good guy, and he's said that if Steam ever closed its doors they'd unlock everything. However, the simple fact is that in the majority of situations where that might happen, the call wouldn't be up to Gaben, even for games published by Valve.

So yeah, I may put up with DRM in a completely offline context, but in any situation where my access terms can be changed remotely and unilaterally with a forced update, server shutdown, or removal, that's a hard pass from me.