Posts: 1 · Comments: 547 · Joined: 3 yr. ago

  • Google has been quietly doing that for more than 10 years; we just didn't start calling this stuff AI until 2022. Google has had offline speech-to-text (and always-on local hotword detection for "hey Google") since the Moto X in 2013, and added hardware support for processing images in the camera app as they were captured.

    The tasks they offloaded onto the Tensor chip starting in 2021 opened up more image editing features (various algorithms for tuning and editing images), keyboard corrections and spelling/grammar suggestions that got better (and then worse), audio processing (better noise cancellation on calls, an always-on Shazam-like song recognition feature that worked entirely offline), etc.

    Apple went harder at applying those AI features to on-device language processing, and at making it obvious, but personally I think the tech industry as a whole has grossly overcorrected toward flashy AI, pushed beyond the limits of what the tech can competently do, instead of the quiet background stuff that just worked while using the specialized hardware that efficiently handles tensor math.

  • Deleted

    Permanently Deleted

  • It's a chain of trust; you have to trust the whole chain.

    Including the entire other side of the conversation. E2EE in a group chat still exposes the group chat if one participant shares their own key (or the chats themselves) with something insecure. Obviously any participant can copy and paste things, archive/log/screenshot things. It can all be automated, too.

    Take, for example, iMessage. We have pretty good confidence that Apple can't read your chats when you have configured it correctly: E2EE, no iCloud archiving of the chats, no backups of the keys. But do you trust that the other side of the conversation has done the exact same thing correctly?

    Or take, for example, the stupid case of senior American military officials accidentally adding a prominent journalist to their war plans Signal chat. It's not a technical failure of Signal's encryption, but a mistake by one of the participants inviting the wrong person, who then published the chat to the world.

  • Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale

    Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus

  • It wasn't the buffer itself that drew power. It was the need to physically spin the disc faster in order to read ahead and build up the buffer, so it drew more power even if you kept the player physically stable. And then, if it did actually skip while reading, it would need to seek back to where it was and build the buffer up again.
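    As a toy model (all numbers made up), the extra cost is in the faster reads, not in holding the buffer:

    ```python
    # Toy model of a portable CD player with a 40-second anti-skip buffer.
    # Holding the audio in RAM is nearly free; the power goes into reading
    # at ~2x to (re)build the buffer, including after every skip.
    track_seconds = 240      # one 4-minute track
    buffer_seconds = 40      # seconds read ahead at 2x to (re)build the buffer
    skips = 3                # bumps that empty the buffer mid-track

    normal_read_power = 1.0  # arbitrary units per second at plain 1x
    fast_read_power = 2.0    # units per second while spinning faster + seeking

    without_buffer = track_seconds * normal_read_power
    extra_refills = (1 + skips) * buffer_seconds * (fast_read_power - normal_read_power)
    with_buffer = without_buffer + extra_refills

    print(without_buffer, with_buffer)  # 240.0 vs 400.0 in these made-up units
    ```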

  • Locked

    lemm.ee is shutting down at the end of this month

  • I'm not sure that would work. Admins need to manage their instance users, yes, but they also need to look out for the posts and comments in the communities hosted on their instance, and be one level of appeal above the mods of those communities. That includes the ability to actually delete content hosted in those communities, or cached media on their own servers, in response to legal obligations.

  • Yes, it's the exact same practice.

    The main difference, though, is that Amazon as a company doesn't rely on this "just walk out" business in a capacity that is relevant to the overall financial situation of the company. So Amazon churns along, while that one insignificant business unit gets quietly shut down.

    The company in this post, though, doesn't have a trillion-dollar business subsidizing the losses from its AI scheme.

  • NSFW Deleted

    Permanently Deleted

  • They're actually only about 48% accurate, meaning that they're more often wrong than right and you are 2% more likely to guess the right answer.

    Wait, what are the Bayesian priors? Are we assuming that the baseline is 50% true and 50% false? And what is its error rate in false positives versus false negatives? Because all of these matter for determining, after the fact, how much probability to assign to the test being right or wrong.

    Put another way, imagine a stupid device that just says "true" literally every time. If I hook that device up to a person who never lies, then that machine is 100% accurate! If I hook that same device to a person who only lies 5% of the time, it's still 95% accurate.

    So what do you mean by 48% accurate? That's not enough information to do anything with.
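    As a rough sketch (made-up numbers) of why a bare accuracy figure can't be interpreted without the priors and the error types:

    ```python
    # Illustrative sketch, made-up numbers: "accuracy" alone says nothing
    # without the base rate of lying and the split between error types.

    def accuracy(p_lie: float, tpr: float, tnr: float) -> float:
        """Overall accuracy given the prior probability of a lie, the
        true-positive rate (lies caught) and true-negative rate (truths passed)."""
        return p_lie * tpr + (1 - p_lie) * tnr

    # The dumb device above that always answers "truthful":
    # it catches no lies (tpr=0) but passes every truth (tnr=1).
    print(accuracy(0.00, 0.0, 1.0))  # 1.00 -> "100% accurate" on someone who never lies
    print(accuracy(0.05, 0.0, 1.0))  # 0.95 -> "95% accurate" if they lie 5% of the time

    # So "48% accurate" can't be turned into a probability of any single
    # answer being right without the priors and the FP/FN breakdown.
    ```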

  • Yeah, from what I remember of what Web 2.0 was, it was services that could be interactive in the browser window, without loading a whole new page each time the user submitted information through HTTP POST. "Ajax" was a hot buzzword among web/tech companies.

    Flickr was mind-blowing in that you could edit photo captions and titles without navigating away from the page. Gmail could refresh the inbox without reloading the sidebar. Google Maps was impressive in that you could drag the map around and zoom within the window, while it fetched the graphical elements it needed on demand.

    Or maybe Web 2.0 included the ability to implement state on top of the stateless HTTP protocol. You could log into a page and it would only show you the new/unread items for you personally, rather than showing literally every visitor the exact same thing for the exact same URL.

    Social networking became possible with Web 2.0 technologies, but I wouldn't define Web 2.0 as inherently social. User interaction with a service was the core, and whether the service connected user to user through that service's design was kinda beside the point.

  • Teslas will (allegedly) start on a small, low-complexity street grid in Austin, exact size TBA. Presumably, they're mapping the shit out of it and throwing compute power at analyzing their existing data for that postage stamp.

    Lol where are the Tesla fanboys insisting that geofencing isn't useful for developing self driving tech?

  • Wouldn't a louder room raise the noise floor, too, so that any quieter signal couldn't be extracted from the noisy background?

    If we were to put a microphone and recording device in that room, could any amount of audio processing extract the sound of the small server from the background noise of all the bigger servers? Because if not, then that's not just an auditory processing problem, but a genuine example of destruction of information.

  • taking a shot at installing a new OS

    To be clear, I had been on Ubuntu for about 4 years by then, having switched when 6.06 LTS came out. And several years before that, I had installed Windows Me, the XP beta, and the first official XP release on a home-built machine, the first computer that was actually mine, bought with student loan money paid out because my degree program required all students to have their own computer.

    But the freedom to tinker with software by no means implied the flexibility to acquire spare hardware. Computers were really expensive in the '90s and still pretty expensive in the 2000s, especially laptops, at a time when color LCD technology was still pretty new.

    That's why I assumed you were a different age from me, either old enough to have been tinkering with computers long enough to have spare parts, or young enough to still live with middle class parents who had computers and Internet at home.

  • That's never really been true. It's a cat and mouse game.

    If Google actually used its 2015 or 2005 algorithms as written, but on a 2025 index of webpages, that ranking system would be dogshit because the spammers have already figured out how to crowd out the actual quality pages with their own manipulated results.

    Tricking the 2015 engine using 2025 SEO techniques is easy. The problem is that Google hasn't actually been on the winning side of properly ranking quality for maybe 5-10 years, and quietly outsourced its ranking to the ranking systems of the big user-driven sites: Pinterest, Quora, Stack Overflow, Reddit, even Twitter to some degree. If a result is responsive and ranks highly on those user-voted sites, then it's probably a good result. And they got away with switching to that methodology just long enough for each of those services to drown in its own SEO spam, so those services are all much worse than they were in 2015. Now, ranking search results based on those sites no longer produces good results.

    There's no turning back. We need to adopt new rankings for the new reality, not try to return to a time when we were able to get good results.

  • I can't tell if you were rich, or just not the right age to appreciate that it wasn't exactly common for a young adult, fresh out of college, to have spare computers lying around (much less the budget to spare on a $300-500 secondary device for browsing the internet). If I upgraded computers, I sold the old one used if it was working, or for parts if it wasn't. I definitely wasn't packing up secondary computers to bring with me when I moved cities for a new job.

    Yes, I had access to a work computer at the office, but it would've been weird to bring in my own computer to work on after hours, while using the Internet from my cubicle for personal stuff.

    I could've asked a roommate to borrow their computer or to look stuff up for me, but that, like going to the office or a library to use the internet, would've been a lot more friction than I was willing to put up with for a side project at home.

    And so it's not that I think it's weird to have a secondary internet-connected device before 2010. It's that I think it's weird to not understand that not everyone else did.

  • Getting a smartphone in 2010 was what gave me the confidence to switch to Arch Linux, knowing I could always look things up on the wiki as necessary.

    I also think the first computer I had that could boot from USB was the one I bought in 2011. For everything before that, I had to physically burn a CD.

  • Plus, if the front end is hashing with each keystroke, I feel like the final hash is far, far less secure against any observer/eavesdropper.

    If the password is hunter2 and the front end sends a hash for h, then hu, then hun, etc., then someone observing all those hashes only has to check each hash against a single keystroke, then move on to the next hash with all but the last character known. Each hash becomes a much smaller search space that way.
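    A quick sketch of that attack (assuming, purely for illustration, that the front end sends an unsalted SHA-256 of the textbox contents on every keystroke):

    ```python
    import hashlib
    import string

    # The eavesdropper sees one digest per keystroke and only ever has to
    # brute-force the single newest character, since the prefix is already known.
    password = "hunter2"
    observed = [hashlib.sha256(password[:i].encode()).hexdigest()
                for i in range(1, len(password) + 1)]

    recovered = ""
    for digest in observed:
        for ch in string.printable:
            if hashlib.sha256((recovered + ch).encode()).hexdigest() == digest:
                recovered += ch
                break

    print(recovered)  # hunter2 -- ~100 guesses per keystroke instead of a full brute force
    ```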

  • You're like, so close.

    Don't reuse passwords between different services, or after a password reset. You're aware of exactly why that's a bad practice (a compromise of any one of those services, or an old database from one of them, will expose that password), so why knowingly bear that risk?

  • My gigabit connection is good enough for my NAS, as the read speeds on the hard drive itself tend to be limited to about a gigabit/s anyway. But I could see some kind of SSD NAS benefiting from a faster LAN connection.
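    Back-of-the-envelope numbers (ballpark consumer figures, not measurements):

    ```python
    # Rough unit conversion: why gigabit Ethernet and a single spinning disk
    # are reasonably matched, while an SSD would leave the LAN as the bottleneck.
    gbe_mb_per_s = 1000 / 8      # ~125 MB/s of raw gigabit Ethernet bandwidth
    hdd_mb_per_s = 150           # ballpark sequential read of one consumer HDD
    sata_ssd_mb_per_s = 550      # ballpark sequential read of a SATA SSD

    print(gbe_mb_per_s)                      # 125.0
    print(hdd_mb_per_s / gbe_mb_per_s)       # ~1.2x the link: roughly matched
    print(sata_ssd_mb_per_s / gbe_mb_per_s)  # ~4.4x the link: the network becomes the limit
    ```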

  • Yeah, you're describing an algorithm that incorporates data about the user's previous likes. I'm saying that any decent user experience will include prioritization and weighting of different posts on a user-by-user basis, so the provider has no choice but to put together a ranking/recommendation algorithm that does more than simply sort all available elements in chronological order.
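    A minimal sketch of what that ends up looking like (hypothetical fields and weights, not any particular platform's algorithm):

    ```python
    from datetime import datetime, timezone

    # Toy ranking: even a "simple" feed ends up scoring each post per user
    # instead of just sorting everything by timestamp.
    def score(post: dict, user: dict, now: datetime) -> float:
        age_hours = (now - post["created"]).total_seconds() / 3600
        recency = 1 / (1 + age_hours)                           # newer is better
        affinity = user["topic_likes"].get(post["topic"], 0.0)  # past likes for this topic
        return 0.6 * recency + 0.4 * affinity                   # made-up weights

    now = datetime.now(timezone.utc)
    posts = [
        {"id": 1, "topic": "linux", "created": now},
        {"id": 2, "topic": "cats",  "created": now},
    ]
    user = {"topic_likes": {"cats": 0.9}}
    feed = sorted(posts, key=lambda p: score(p, user, now), reverse=True)
    print([p["id"] for p in feed])  # [2, 1] -- the liked topic outranks pure recency
    ```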

  • All the other answers here are wrong. It was the Boeing 737-Max.

    They fit bigger, more fuel-efficient engines on it, which changed the flight characteristics compared to previous 737s. And so rather than have pilots recertify on it as a new model (lots of flight hours, can't switch back), they designed software to basically make the aircraft seem to behave like the old model.

    And so a bug in the cheaper version of the software, combined with a faulty sensor, would cause the software to take over and try to override the pilots and dive downward instead of pulling up. Two crashes happened within 5 months, to aircraft that were pretty much brand new.

    It was grounded for a while as Boeing fixed the software and hardware issues and, more importantly, updated all the training and reference materials for pilots so that they were aware of this basically secret system that could kill everyone.