Simiiformes is a clear and distinct clade.
Yes but who says that specific clade maps to the colloquial taxonomic word “monkey”?
Monkeys are a social construct. Like trees.
Yes, everything that can be expressed as letters is in the Library of Babel. Finding anything meaningful in that library, though, is gonna take longer than just writing it yourself.
Yeah, I think for most of what OP is describing, an earlier generation Pro with RAM and storage upgrade is a better bargain than spending the same amount of money on the newest processor. Not sure if OP can access the refurbished Apple store, but that’s where I’d be looking.
Yup, the base M chips can only support two displays, including the built-in, so a base MacBook Air can only support one external monitor. This was not a limitation of the Intel versions from before 2020.
Also, the main problem with LIDAR is that it really doesn’t see much more than cameras do. It uses near-infrared light, so it basically gets blocked by the same things that a camera gets blocked by. When heavy fog easily fucks up both cameras and LIDAR at the same time, that’s not really redundancy.
The spinning lidar sensors mechanically remove occlusions like raindrops and dust, too. And one important thing with lidar is that it involves active emission of lasers so that it’s a two way operation, like driving with headlights, not just passive sensing, like driving with sunlight.
Waymo’s approach appears to differ in a few key ways:
There’s a school of thought that because many of these would need to be eliminated for true level 5 autonomous driving, Waymo is in danger of walking down a dead end that never gets them to the destination. But another take is that this is akin to scaffolding during construction, that serves an important function while building up the permanent stuff, but can be taken down afterward.
I suspect that the lidar/radar/ultrasonic/extra cameras will be more useful for training the models necessary to reduce reliance on human intervention, and maybe eventually reduce the number of sensors. Not just by adding to the quantity of training data, but also by serving as a filtering/screening function that can improve the quality of the data fed into training.
BYD was just a cell phone battery company, and was like “well we’ve got the lithium supply chain locked down, you know what needs huge batteries: guess we’re doing cars now.”
Waymo chose the more expensive but easier option, but it also limits their scope and scalability.
I don’t buy it. The lidar data is useful for training the vision models, so there’s plenty of reason to believe that Waymo can solve the vision issues faster than Tesla.
I don’t think they’d go back to off-package RAM anymore. The benefits of putting it all on one package are too great, and give them just enough cover to be able to charge like crazy for it.
96GB of DDR5 laptop memory is $350
Maybe it’s better to compare LPCAMM2 form factor prices. For that, 64GB is $329. Still not quite the same as adding 16GB for $400, but it’s a better comparison.
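Spelling out the price-per-GB math behind this comparison (a quick back-of-the-envelope sketch; the three prices are the ones quoted above, nothing else assumed):

```python
# Price-per-GB comparison, using the figures quoted above.
options = {
    "DDR5 laptop memory 96GB": (350, 96),
    "LPCAMM2 64GB": (329, 64),
    "Apple +16GB upgrade": (400, 16),  # build-to-order RAM bump
}

# dollars per GB for each option
per_gb = {name: price / gb for name, (price, gb) in options.items()}
for name, rate in per_gb.items():
    print(f"{name}: ${rate:.2f}/GB")
```

Even against the pricier LPCAMM2 form factor (~$5/GB), the upgrade pricing (~$25/GB) is roughly a 5x markup.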
The thing is, if Intel doesn’t actually get 18A and beyond competitive, it might be on a death spiral towards bankruptcy as well. Yes, they’ve got a ton of cash on hand and several very profitable business lines, but that won’t last forever, and they need plans to turn profits in the future, too.
Compared to the AMD FX series, the Intel Core and Core 2 were so superior that it was hard to see how AMD could come back from that.
Yup, an advantage in this industry doesn’t last forever, and a lead in a particular generation doesn’t necessarily translate to the next paradigm.
Canon wants to challenge ASML and get back in the lithography game with nanoimprint lithography, a tooling shift they’ve been working on for 10 years. The Japanese “startup” Rapidus wants to get into the foundry game by starting with 2nm, and they’ve got the backing of pretty much the entirety of the Japanese electronics industry.
TSMC is holding onto finFETs a little bit longer than Samsung and Intel, as those two switch to gate-all-around FETs (GAAFETs). Which makes sense, because those two never got to the point where they could compete with TSMC on finFETs, so they’re eager to move on to the next thing a bit earlier while TSMC squeezes out the last bit of profit from its established advantage.
Nothing lasts forever, and the future is always uncertain. The history of the semiconductor industry is a constant reminder of that.
I just mean does it keep offline copies of the most recently synced versions, when you’re not connected to the internet? And does it propagate local changes whenever you’re back online?
Dropbox does that seamlessly on Linux and Mac (I don’t have Windows). It’s not just transferring files to and from a place in the cloud, but a seamless sync of a local folder whenever you’re online, with access and use while you’re offline.
Intel got caught off guard by the rise of advanced packaging, where AMD’s chiplet design could actually compete with a single die (while having the advantage of being more resilient against defects, and thus higher yield).
Intel fell behind on manufacturing when finFETs became the standard. TSMC leapfrogged Intel (and Samsung fell behind) based on TSMC’s undisputed advantage at manufacturing finFETs.
Those are the two main areas where Intel gave up its lead, both on the design side and the manufacturing side. At least that’s my read of the situation.
Does it do offline sync?
iCloud doesn’t have Linux or Android clients. It’s basically a non-starter for file sharing between users who aren’t on an Apple platform.
I don’t like the way Google Drive integrates into the OS file browsing on macOS, and it doesn’t officially support Linux. Plus it does weird stuff with the Google Photos files, which count against your space but aren’t visible in the file system.
OneDrive doesn’t support Linux either.
I just wish Dropbox had a competitive pricing tier somewhere below their 2TB for $12/month. I’d 100% be using them at $5/month for like 250 GB.
So with the case/mobo/power supply at $259 and the CPU/GPU at $329, you’ve got $11 left to buy RAM and an SSD if you want to be competitive with the base model Mac Mini.
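To make that budget arithmetic explicit (assuming the base Mac Mini at $599, which isn’t stated above; the component prices are the ones quoted):

```python
# Mini PC build budget vs. an assumed $599 base Mac Mini.
mac_mini_base = 599   # assumed base price, not from the thread
case_mobo_psu = 259   # case + motherboard + power supply, as quoted
cpu = 329             # CPU (with integrated graphics), as quoted

remaining = mac_mini_base - case_mobo_psu - cpu
print(f"Left for RAM and SSD: ${remaining}")  # $11
```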
That’s what I mean. If you’re gonna come close to competing with the entry level price of the Mac Mini (to say nothing of frequent sales/offers/coupons that Best Buy, Amazon, B&H, and Costco run), you’ll have to sacrifice and use a significantly lower-tier CPU. Maybe you’d rather have more RAM/storage and are OK with that lower performing CPU, and twice the power consumption (around 65W rather than 30W), but at that point you’re basically comparing a different machine.
OK, let’s put together a mini PC with a Ryzen 9700X for under $600. What case, power supply, motherboard, RAM, and SSD are we gonna get? How does it compare on power, sound, and form factor?
It’s an apples to oranges comparison, and at a certain point you’re comparing different things.
When I was last comparing laptops a few years back, I was seriously leaning towards the Framework AMD. It was clearly a tradeoff between Apple’s displays, trackpad, lid hinges, CPU/GPU benchmarks, and battery life, versus much more built-in memory and storage, a taller display form factor, and better Linux support. Price was kind of a wash, since I was just comparing what I could get for $1500 at the time. I ended up with an Apple again. I’m keeping an eye on progress with the Asahi project, though, and might switch OSes soon.
Or the untested hardware that isn’t guaranteed to be as good as the established player.