So I've used it a few times, but it absolutely chokes on 4K footage on my rig. I think maybe it's the lack of hardware acceleration for pre-rendering?
I have a big project coming up and I'm considering downscaling all the source files, editing with those, and then swapping the full-resolution files back in when I'm done. Is that a thing people do?
Final Cut Pro, Logic Pro, Pixelmator Pro, Motion, Compressor, and MainStage — plus new AI features and premium content in Keynote, Pages, and Numbers — come together in a single subscription.
Not quite rage-quit, but I worked as a cashier at a local grocery store chain until one day I was asked to clean up broken eggs that had dripped into the dusty-ass vents at the bottom of the dairy section.
By my math, $250/day for every day since Dec 31, 2021 would be $380,000, which could mean they actually tracked each individual infraction (i.e., no fines on days they weren't home), which means there's zero chance of contesting this. I'm curious how much warning she got along the way before the fines really stacked up.
Also really curious what all the libertarians moving to Florida think of this.
Think of how much people whine about printer ink without A) looking for alternatives and B) questioning why their printer was fucking free (with rebate).
Not quite. When Nvidia releases a new chip, they release a bunch of documentation that indicates how the chip needs to be used: how to configure it, cooling requirements, etc. Each board manufacturer (like MSI) uses this documentation to manufacture the physical boards that you buy.
What OP was referring to is that Nvidia doesn't physically have a plant with a bunch of people in bunny suits walking around a clean room. Instead, they contract with manufacturers and send them the designs to build.
It's like how Apple designs phones but Foxconn actually builds them.
Can you elaborate? This is my first time dealing with higher level languages in the workplace (barring some Python scripts), and I feel like I'm losing my mind.
Thank you. Dude checked in a shitload of code before going on PTO for three weeks. We get pretty live plots of data out of it, but he broke basically every hardware driver in the process.
I’m in a nightmare scenario where my new job has a guy using Claude to pump out thousands of lines of C++ in a weekend. I’ve never used C++ (just C for embedded devices).
He’s experienced, so I want to believe he knows what he’s doing, but every time I have a question, the answer is “oh that’s just filler that Claude pumped out,” and some copy pasted exposition from Claude.
So I have no idea what’s AI trash and what’s C++ that I don’t know.
Like a random function was declared as a template. I had to learn what function templates are for. So I do, but the function is only defined once, and I couldn’t think of why you would need to templatize it. So I’m sitting here barely grasping the concept and syntax and trying to understand the reasoning behind the decision, and the answer is probably just that Claude felt like doing it that way.
Changing a monitor's resolution from 1024x768 to something useable.