It's more like:
Traditional search pipes the first page of results to the bot. The bot reads the pages and tries to identify an answer, or the best result, from the set. Both the bot's summary and the adjusted ranking are returned. This can give the user a better experience, because they don't have to read all the pages themselves to find the answer they were looking for. However, there's a huge margin for error, since the bot is underpowered: Google has to balance what they pay for each search against what they earn from it. So you end up with misinterpretations, hallucinations, biased content, etc.
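To make the shape of that pipeline concrete, here's a toy sketch: the "bot" is replaced by naive keyword scoring (a deliberately underpowered stand-in, much like the comment describes), and all names are illustrative, not any real search API.

```rust
// Hypothetical sketch of the pipeline above: take the first page of
// results, score them, and return both a summary and a re-ranking.
struct SearchResult {
    title: String,
    body: String,
}

// Stand-in for the underpowered bot: naive keyword overlap with the
// query, instead of an actual language model.
fn relevance(query: &str, result: &SearchResult) -> usize {
    query
        .split_whitespace()
        .filter(|w| result.body.to_lowercase().contains(&w.to_lowercase()))
        .count()
}

// Mirrors the two outputs described: a summary plus adjusted ranking.
fn summarize_and_rerank(
    query: &str,
    mut results: Vec<SearchResult>,
) -> (String, Vec<SearchResult>) {
    results.sort_by_key(|r| std::cmp::Reverse(relevance(query, r)));
    let summary = results
        .first()
        .map(|r| format!("Best match: {}", r.title))
        .unwrap_or_else(|| "No results".to_string());
    (summary, results)
}

fn main() {
    let results = vec![
        SearchResult { title: "Cooking pasta".into(), body: "boil water, add salt".into() },
        SearchResult { title: "Rust FFI".into(), body: "rust calls c through extern".into() },
    ];
    let (summary, ranked) = summarize_and_rerank("rust extern c", results);
    println!("{}", summary);        // prints "Best match: Rust FFI"
    println!("{}", ranked[0].title);
}
```

A real deployment would swap `relevance` for a model call, which is exactly where the cost-per-search tradeoff the comment mentions comes in.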
If they used a top-end model like Claude 3.7 Sonnet and piped it enough contextual information, the AI summaries would be quite accurate and useful. They just can't afford to do that, and they want to use their own Gemini BS.
It takes a lot of time and effort to bring a platform to maturity. If you aren't a developer then it's hard to grasp that. Lemmy has had years to get where it is.
As a programmer who integrates many languages in the same product, this is a pretty clear line in the sand: where the languages interface, it's up to the new language to adopt the interfaces offered by the older language. The Rust guys said they would do this; the C guys asked why the Rust guys don't assume this responsibility (they already are). This is either a miscommunication, or deliberate scapegoating and deflection. There is no good reason why two languages can't work together through interfaces. I think the C guys are old, grumpy and fearful.
Disclaimer: I don't even like Rust as a language. Just calling it how I see it.
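For anyone who hasn't seen what "adopting the older language's interfaces" looks like in practice, here's a minimal Rust FFI sketch binding to libc's `strlen`: the C signature is declared as-is on the Rust side, and the unsafe call is wrapped in safe Rust (the wrapper name `c_strlen` is mine).

```rust
use std::ffi::CString;
use std::os::raw::c_char;

// Rust adopts the existing C interface: declare libc's strlen with
// its C signature, unchanged.
extern "C" {
    fn strlen(s: *const c_char) -> usize;
}

// Safe wrapper: Rust handles the NUL-terminated string conversion
// and confines the unsafety to one spot.
fn c_strlen(s: &str) -> usize {
    let c = CString::new(s).expect("string contains interior NUL");
    unsafe { strlen(c.as_ptr()) }
}

fn main() {
    println!("{}", c_strlen("hello")); // prints 5
}
```

The new language does the adapting; C didn't have to change a thing. That's the division of responsibility I mean.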
If you run executables from pirate sites, 9 times out of 10 you will get malware.