I think that when RHoI wrote "3D", they meant "hardware accelerated 3D". Many early 3D DOS games either did their 3D entirely in software, or included hardware acceleration support as a kind of optional bonus. Software 3D shouldn't give DOSBox much more trouble than most 2D games. The original release of Quake didn't even have any accelerator support; it was patched in later.
A tiny number of original releases don't run properly or at all on some 2600 Juniors or 7800s, due to a reliance on quirks that were changed in later versions of the graphics chip. Probably not a major issue for classic collecting, but if you're interested in modern homebrew, it could be worth considering.
Yeah, Zero Tolerance is amazing from a technical standpoint, and a solid gaming experience as it goes, but I personally felt it dragged on for too long without enough variation. Then again I felt about the same regarding the original Doom and Doom 2, so it's probably more my tastes than the game itself.
So many great games. Some of my favourites (mostly action RPGs, exclusives marked *):
- Landstalker*
- The Story of Thor*
- Wonder Boy in Monster World (* virtually an exclusive)
- Soleil*
- Flashback
- The Immortal
- Quackshot*
- Light Crusader*
- Arcus Odyssey
- Sonic the Hedgehog 2* (the other main games are good too, but this is my favourite)
- Desert Strike and Jungle Strike (I don't remember playing Urban Strike, but it got excellent reviews)
- The Lost Vikings
- Sword of Vermilion*
This is another good point. I'd try turning off Fast Startup first, and if that alone doesn't clear the issue, try this (leaving Fast Startup off).
Make sure that Windows Fast Startup is turned off. I don't know if that's specifically the problem here, but in my experience quite a few "everything's fine, it should be working!" boot issues have been resolved by booting into Windows, turning off Fast Startup, and then doing a full shut down before going back to Linux, especially on laptops.
Alas, that's already the name of a game engine.
If you're not going to jailbreak a New 3DS (probably my choice if I were focusing on DS and 3DS games, because those real dual screens make a difference), then why not just get a controller to use with the smartphone you probably already have in your pocket? Even a mid-range smartphone will match or beat most inexpensive handhelds for retro game emulation.
Why not? I think it's just an interesting side project for the dev. This is a port of the original version for modern home computers, so it's not like they're limiting their audience.
But I would suggest that while it should be possible to make a good adaptation for 3rd generation systems (NES etc.), you're not going to be able to make a substantially similar product until you reach at least the 5th generation (PlayStation etc.), and perhaps not even then. The blurry parallax backgrounds, high particle counts, and overall level of detail demand that much hardware.
I was so triggered by the conversion from char-to-int-to-string-to-packedint that I had to write a version that just does char-to-packedint (and back again) directly, with bitwise operators.
As others have pointed out, there are probably better options for doing this today in most real-life situations, but it might make sense on old low-spec systems if not for all the intermediate conversion steps, which is why I wrote this.
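For anyone curious, the general shape of the idea looks something like this. This is only my own minimal sketch in Python (the original wasn't posted here), and the details — 8 bits per character, lowest byte first — are my assumptions:

```python
# Pack characters straight into one integer with shifts and masks,
# skipping any intermediate string conversion. Assumes 8-bit character
# values, stored lowest byte first.

def pack_chars(s: str) -> int:
    """Pack each character's byte value into one integer, 8 bits apiece."""
    packed = 0
    for i, ch in enumerate(s):
        packed |= (ord(ch) & 0xFF) << (8 * i)
    return packed

def unpack_chars(packed: int, length: int) -> str:
    """Reverse of pack_chars: shift each byte back out and rebuild the string."""
    return "".join(chr((packed >> (8 * i)) & 0xFF) for i in range(length))
```

On a real low-spec system you'd do the same thing in whatever language the machine speaks, but the shifts and masks carry over directly.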
"Windows has inconsistency with icons and design in some areas."
I prefer Linux, but what? Oh, hello pot! Have you met my friend kettle?
Outside the major cities, at least, video arcades in Japan are still hanging on in 2025 with a mix of games. There are a lot of pseudo-gambling token games (think prize tickets), crane-style prize games, and simple, highly physical games (big buttons and levers, controller and body tracking) aimed at the 5-to-10-year-old segment.
In terms of things we'd recognize as "real" games, almost everything is groups of locally networked terminals with some kind of physical gimmick that doesn't translate well to a home experience. There are still some racing games, music games, and the like, with uncommon controllers and layouts, but the most common format right now is probably a flat table with an embedded screen that has some way of scanning and tracking collectible trading cards. The cards aren't just scanned in once for use and then put aside, but actually moved around the table as tokens within the game. Obviously there are "Magic" style games, but also RPGs (both turn-based and action), MOBAs, real-time strategy, and more. Horse racing games are also popular, but to be clear, the players don't "ride" the horses; they raise, trade, manage, and "bet" on them, and watch simulated races.
And these days almost everything uses player profiles saved to IC cards, ranked across the country and sometimes even the world.
Occasionally you'll see four or six of the old sit-down "city" style cabinets (like the ones pictured in the article) in a corner, running 1-on-1 fighting games, but those are mainly found in the specifically "retro" arcades.
One thing that I discovered about charging PS3 pads, which doesn't seem to be mentioned a lot, is that they appear (my guess, unconfirmed) to require proper USB current negotiation before they will start charging. In fact, I've found multiple sources saying that they can be charged from any USB power source, which isn't true.
The original USB standard states that USB hosts should start a connection with 100mA of current, and the client can request increases in 100mA steps up to 500mA. I assume that the PS3 USB ports support this, as do pretty much all computer USB ports. But the majority of wall plug USB chargers don't; they just allow a maximum current draw of 500mA (or more) from the start and ignore increase requests.
It seems like the majority of equipment manufacturers ignored this part of the spec. The host needs current-limiting circuitry in any case, so many chargers don't bother with circuitry to respond to requests at all, and even a port that does respond is really just allowing the maximum draw the whole time and rubber-stamping every request.
However, I think that the PS3 pads actually wait for an "OK" response before continuing, which the majority of wall chargers (especially the cheap ones) never send. I had to use the PS3 or a PC (direct connection, not through a hub) to charge my pads until I found a cheap PS3 controller charging dock that works with any supply.
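To make my guess concrete, here's a toy model of that behaviour. Everything here (class names, the 500mA figure, the logic) is mine, purely for illustration — it's not how the actual firmware works:

```python
# Toy model: a spec-compliant host answers current-increase requests,
# a "dumb" wall charger supplies power but never answers, and a picky
# device refuses to start charging without an explicit OK.

class CompliantHost:
    """A PC or PS3 USB port: approves requests up to its limit."""
    def request_current(self, milliamps: int):
        return milliamps <= 500  # explicit OK or refusal

class DumbCharger:
    """A cheap wall charger: power is there, but requests go unanswered."""
    def request_current(self, milliamps: int):
        return None  # no response at all

def pad_starts_charging(port) -> bool:
    """My guess at the PS3 pad: only draws current after an explicit OK."""
    return port.request_current(500) is True
```

Under this model the pad charges from a computer port but sits dead on a dumb charger, which matches what I've seen in practice.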
I have a stack of Logitech F310 controllers, and I've never had them fail to work on any system (Windows, Linux, Android). They're not "pro gamer" or anything, fairly basic, but they've always responded smoothly for me even after many years of use. They're inexpensive, wired, and have an XInput/DirectInput switch on the back (at least mine do; that feature may have been removed by now).
The F310 (what I use) is wired and has no rumble feedback.
The F510 is wired and has rumble feedback, but I've never used one.
The F710 is wireless 2.4GHz (not Bluetooth) and has rumble feedback. I have two of these, and in my experience neither of them connects reliably, even under Windows with the official software installed.
I loved my MDs and Hi-MDs, but they had so many frills. All the frills. That was part of why I loved them!
The PlayStation 1 had a copy protection system that measured physical properties of the disc which couldn't be replicated by normal CD writers. There were a few ways to get around this, but to be able to put a burned CD into your console and boot directly from it into the game (as usual) required the installation of a fairly complex mod chip. A lot of people alternatively used the "swap trick", which is how I used to play my imported original games.
The Dreamcast's copy protection was heavily reliant on using dual-layer GD-ROM discs rather than regular CDs, even though they look the same to the naked eye. There were other checks in place as well, but simply using GD-ROMs was pretty effective in and of itself.
Unfortunately, Sega also added support for a thing called "MIL-CD" to the Dreamcast. MIL-CD was intended to allow regular music CDs to include interactive multimedia components when played on the console. However, MIL-CD was supported for otherwise completely standard CDs, including burned CDs, and had no copy protection, because Sega wanted to make it as easy as possible for other companies to make MIL-CDs, so the format could spread and hopefully become popular. Someone found a way to "break out" of the MIL-CD system and take over the console to run arbitrary code like a regular, officially released game, and that was the end of the Dreamcast's copy protection. People couldn't just copy an original game disc 1:1 and have it work; some work had to be done on the game to put it on a burned CD and still have it run (sometimes quite a lot of work, actually), but no console modification was needed. Anyone with a Dreamcast released before Sega patched this issue (which seems to be most of them) can simply burn a CD and play it on their console, provided they can get a cracked copy of the game.
I would also probably try to plug USB drives in once a year or so if I were being diligent, but in reality I recently found a handful of USB flash drives that I'd stored in a box in my parents' unattached garage, and every one of them could be read completely without any issues. They ran the gamut of build quality from expensive, name-brand drives to no-name dollar-store keychains. They'd been sitting in that box, untouched, for a little over nine years, and I'm pretty sure that some of them hadn't been used for several years even before that.
I wouldn't rely on it for critical data, but USB flash might not be so terrible.
Go for it, if it's to satisfy your own curiosity, but there's virtually no practical use for it these days. I had a personal interest in it at uni, and a project involving coding in assembly for an imaginary processor was a small part of one optional CS course. Over the years I've dabbled with asm for 32-bit Intel PCs and various retro consoles; at the moment I'm writing something for the Atari 2600.
In the past, assembly was useful for squeezing performance out of low-powered and embedded systems. But now that "embedded" includes SoCs with clock speeds in the hundreds of MHz and several megabytes of RAM, and optimizing compilers have improved greatly, the tiny potential performance gain (and you have to be very good before you can match, let alone beat, a modern optimizing compiler) is almost always outweighed by the cost of hand-writing and maintaining assembly.
I'm in a similar boat to you. I ripped almost all of my CDs to 320kbps mp3s for portability, but then I wanted to put all of them (a substantial number) plus a bunch more (my partner's collection) on a physically tiny USB stick (that I already had) to just leave plugged into our car stereo's spare port. I had to shrink the files somehow to make them all fit, so I used ffmpeg and a little bash file logic to keep the files as mp3s, but reduce the bitrate.
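My actual script was bash, but the same workflow sketched in Python looks roughly like this. The exact ffmpeg flags I used may have differed; `-codec:a libmp3lame -b:a 192k` is one standard way to force a constant mp3 bitrate, and the directory names here are made up:

```python
# Re-encode every mp3 in a folder at a lower constant bitrate via ffmpeg.
# Requires ffmpeg on the PATH; the flag choices are one common approach,
# not necessarily the only (or best) one.
import subprocess
from pathlib import Path

def build_cmd(src: Path, dst: Path, bitrate: str = "192k") -> list[str]:
    """Assemble the ffmpeg command line without running it."""
    return ["ffmpeg", "-i", str(src),
            "-codec:a", "libmp3lame", "-b:a", bitrate,
            str(dst)]

def shrink_all(src_dir: str, dst_dir: str, bitrate: str = "192k") -> None:
    """Run the re-encode over every *.mp3 in src_dir."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    for mp3 in sorted(Path(src_dir).glob("*.mp3")):
        subprocess.run(build_cmd(mp3, out / mp3.name, bitrate), check=True)
```

The bash version was just the same loop with `for f in *.mp3; do ffmpeg ...; done`.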
128kbps mp3 is passable for most music, which is why the commercial industry focused on it in the early days. However, if your music has much "dirty" sound in it, like loud drums and cymbals or overdriven electric guitars, 128kbps tends to alias them somewhat and make them sound weird. If you stick to mp3 I'd recommend at least 160kbps, or better, 192kbps. If you can use variable bit rate, that can be even better.
Of course, even 320kbps mp3 isn't going to satisfy audiophiles, but it sounds like you just want to have all your music with you at all times as a better alternative to radio, and your storage space is limited, similar to me.
As regards transcoding, you may run into some aliasing issues if you try to switch from one codec to another without also dropping a considerable amount of detail. But unless I've misunderstood how most lossy audio compression works, taking an mp3 from a higher to a lower bitrate isn't transcoding, and should give you the same result as encoding the original lossless source at the lower bitrate. Psychoacoustic models split a sound source into thousands of tiny component sounds, and keep only the top X "most important" components. If you later reduce that to the top Y most important components by reducing the bitrate (while using the same codec), shouldn't that be the same as just taking the top Y most important components from the original, full group?
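That reasoning can be checked with a toy example. This assumes a fixed, idealized importance ranking — real codecs re-run their psychoacoustic model on each encode, so the component scores and names below are entirely made up for illustration:

```python
# Toy check: if components are ranked by a fixed "importance" score,
# keeping the top Y of the top X equals keeping the top Y of everything
# (for Y <= X). Scores and names are invented for the example.

def top_k(components, k):
    """Keep the k components with the highest importance score."""
    return sorted(components, key=lambda c: c[1], reverse=True)[:k]

components = [("hihat", 0.2), ("vocal", 0.9), ("bass", 0.7),
              ("tape_hiss", 0.05), ("snare", 0.6), ("pad", 0.4)]

first_pass = top_k(components, 4)    # "higher bitrate": keep top 4
second_pass = top_k(first_pass, 2)   # re-encode lower: keep top 2 of those
direct = top_k(components, 2)        # encode straight from source at the low rate

print(second_pass == direct)  # True in this idealized model
```

In practice the second encode analyzes the already-lossy audio rather than reusing the first pass's component list, so the two paths can diverge a little, but the intuition holds up much better within one codec than across codecs.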
How about that worst of both worlds, the tutorial where the author starts out writing as if their audience only barely knows what a computer is, gets fed up partway through, and vomits out the rest in a more obtuse and less complete form than they would've otherwise?
Congratulations! You're ready to go!