There's been a string of misogynistic "men vs. women" posts lately... and the image you sent fits that profile very well (even if the text content goes a different way).
Travel (providers like Saily let you buy a temporary SIM for the country you're visiting, basically what the people selling SIMs at the airport used to do. eSIM makes it less shady and basically one-click.) [Maybe multiple of these if they travel often, since different providers are better in certain regions.]
I think it's 5. for the most part. What could be done is forcing a mandatory tag like [TextOnly], or maybe the platform the screenshot is from: [Reddit], [Twitter], etc. That way people who don't like those types of posts can just filter them out.
They're pretty much just hating to hate, or going off very outdated information. "Missing critical features" is a joke: if a feature actually were critical, it would've been implemented already (plus Firefox is very extensible, with many plugins existing and forks adding specific features). If they actually had a point, they would've given at least a single example.
Weirdly, the point about implementing some web standards did kinda apply until a few years ago, when all the big browser engine developers got together and pinned down the standard. If something still breaks, that probably means the website used some out-of-spec workaround that only works in Chrome. Some things do indeed behave differently between Firefox and Chrome (an example of my own: file input fields that accept multiple types, e.g. both video and image, are handled differently, at least in the mobile apps). Again, if they had a point, an example would've been great.
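For reference, the kind of input I mean is just this (a minimal sketch; the `accept` attribute is only a hint per the HTML spec, so each browser's file/camera picker is free to interpret a multi-type hint differently, which is where the mobile Firefox/Chrome behavior diverges):

```html
<!-- A file input accepting both images and videos.
     "accept" is a hint, not a restriction: browsers decide how to
     present it (gallery, camera, generic file picker), and mobile
     Firefox and Chrome build noticeably different pickers from it. -->
<input type="file" multiple accept="image/*,video/*">
```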
Weird user agent styles?...?? I'm just confused honestly.
This, but at my first job, for six months. Straight out of school I got hired by a semi-consultancy/semi-try-and-hire firm... but the market for my skill set went from high demand to no demand in the span of about a year. So for that half year I did nothing except learn and work on some small internal projects.
Had this once; it turned out to be some driver update software for a gaming mouse (or at least something like that). Sucks for non-technical people, since it's quite hard for them to figure out without involving the "family IT guy".
The only thing one-handed mode does for me is an on/off toggle that basically halves the screen (so everything can be reached with one hand, I suppose). [OnePlus 13R]
Real funny they coloured it differently, because Flanders literally shares a language with The Netherlands.
To be fair, half the world seems to forget Belgium is not all French, or puts French as the default, even though Flanders' population is almost twice that of Wallonia. Even adding Brussels' population to Wallonia's, Flanders still has more people. (Numbers come from Statbel.)
The problem is that it's impossible to carve out just this one application. There don't need to be any actual nude pictures of children in the training set for the model to figure out that a naked child is basically a naked adult, but smaller. (Of course I'm simplifying a bit.)
Going even further and removing all nudity from the dataset has been tried... and what they found is that removing such a significant source of detailed pictures with a lot of skin decreased the quality of any generated image involving anatomy.
So the solution is not a simple "remove it from the training data". (Not to mention that existing models able to generate these kinds of pictures are impossible to globally disable, even if you could affect future ones.)
As for what could actually be done: apply and keep evolving scanning for such pictures (not on people's phones though [looking at you here, EU]). That's the big problem here: it got shared on a very big social app, not some fringe privacy-protecting app. (There's little you can do on that end, short of eliminating all privacy.)
Regulating this at the image generation level could also be rather effective. There aren't that many 13-year-olds savvy enough to set up a local model to generate these, so further checks at the places where images are generated would also help to some degree. Local generation is getting easier to set up by the day though, so while this should be implemented, it won't do everything.
In conclusion: it's very hard to eliminate this, but ways exist to make it harder.
Not as conclusive as the black/white of the original image, but I find the best spot in traffic to judge people is a roundabout.
A normal lane change can be dangerous to the driver themselves if they don't use the turn signal. When exiting a roundabout, however, the only reason to signal is to notify other drivers; there's no benefit to yourself. So if someone doesn't signal there (and that's quite common), it's a pretty good indicator of what kind of person they are (self-centered, or half-asleep).