submitted 9 months ago* (last edited 9 months ago) by therealjcdenton@lemmy.zip to c/linux@lemmy.ml

For example, theming shouldn't have to be a 10-step process. Make Flatpaks use your themes correctly. Another thing is Qt theming: why can't you use the Breeze style outside of KDE? It's the best and most consistent application style for Qt apps. And the final point: when is the org.foo.bar naming scheme going to be fixed to use the actual name of the package rather than the technical name? Flatpak remove and flatpak install both work without the full name, so why doesn't flatpak run? The naming is the only thing Snaps have over Flatpak. If nothing is being done, how can I contribute?

[-] TCB13@lemmy.world 2 points 9 months ago

See if there are cracked versions of MacOS that I can boot in a VM and see if I like it. I have to think about that.

You don't need any cracks. The issue with running macOS in a VM is that the VM won't provide a compatible GPU and it will lag a lot. Yes, it's painful, and there aren't decent workarounds unless you can pass through an entire GPU natively supported by macOS.

The whole thing looked completely mental (to me.) And I think there is something like that on Windows, too.

Yes, there's vTask (proprietary) and AutoIt for Windows. The second one is very good and very reliable.

I can only imagine things like that break easy and you’re never able to change things if people actually rely on it. But I’m really not an expert on this. Might have valid use-cases. Or it’s just a silly way of doing things.

AutoIt doesn't break as much as you think if the developer knows what he's doing. "Unfortunately" I spent the better part of 2010 writing AutoIt to automate exporting data from a very proprietary piece of Siemens software, and after a few months you just learn how to do it properly :P It can target Win32 controls directly, and you can bind code to UI events by their internal Windows IDs. Another interesting thing it can (sometimes) do is explore a program's DLLs and internal function calls and call those directly from your code instead of simulating button clicks.
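To give an idea of what that looks like, here's a minimal sketch using Notepad as a stand-in (the window titles and control descriptions are just assumptions for the example, not the real Siemens tool):

; Drive an app through its Win32 controls instead of blind mouse clicks.
Run("notepad.exe")
WinWaitActive("[CLASS:Notepad]")

; Address the edit control by class/instance, not by screen position.
ControlSend("[CLASS:Notepad]", "", "[CLASS:Edit; INSTANCE:1]", "exported data goes here")

; Read it back to check the result.
Local $sText = ControlGetText("[CLASS:Notepad]", "", "[CLASS:Edit; INSTANCE:1]")

; And the DLL side: call a Win32 function directly instead of clicking anything.
DllCall("user32.dll", "int", "MessageBox", "hwnd", 0, "str", $sText, "str", "Export check", "uint", 0)

The same idea scales up to reading grids, pushing buttons by control ID and so on; position-based clicking is what usually breaks, and this avoids it.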

What Apple does with AppleScript is a less advanced version of AutoIt: you can call their frameworks' functions directly from it (hence the ability to build entire applications) and interact in robust ways with the GUI of some applications. Applications can also load "plugins" into the editor and provide methods for certain tasks the developer decided might be important to someone.

Might have valid use-cases.

In macOS land the use case is allowing anyone without much coding experience to automate some GUI task. While not perfect, this is a large win for a lot of people, especially because you can just click "record" > do your repetitive task > "finish" and it will translate the task into code. The best part is that this "record" feature doesn't actually record click positions; it finds out the IDs of the buttons and menus you clicked and writes optimized, reliable code for the task.

In my case with the Siemens software the use case was very simple: we needed access to data that was only made available either through their software, at about 300 €/month, OR with a special license and another tool (that exposed the data through a local API via socket) that would cost around 50 000 €/month. When you see a price like that, I believe it's totally justifiable and okay to automate the UI. Note that this was in 2010, and from what I've been told my code is still running the same task today without changes (AutoIt is compiled and they don't even have the source). I believe this speaks volumes about how reliable AutoIt can be.

Something I don’t agree with is Windows and MacOS succeeding because of solid and stable APIs. Theoretically this might be the case for developers

And developers create software that people use. Large companies, without being given stable APIs and good documentation, won't ever feel like developing for Linux. They couldn't justify a very expensive development process with large maintenance costs for such a small market share. If the APIs were more stable and there were better frameworks, it could be easier to justify.

So while in theory the Windows Kernel API might enjoy a good development model, it has little to zero effect on the end-user

It's not just about the kernel, it's about the higher-level APIs and frameworks that let developers build things quickly. It's about having C# and knowing the thing is very well supported in every corner of Windows and whatnot. It's about having entire SDKs with everything integrated in an IDE made by them, where everything works on the first try.

[…] I’ve tried. Installing the old dotnet or c++ runtimes and directx versions is a hassle, sometimes impossible. Some games crap out entirely. I can’t do it the other way around and install an old version of Windows on modern hardware.

It seems you're picking the hard case - games. But for instance you can install Office 2003 and Photoshop 6 on Windows 11 and they'll run without hacks - Linux desktop (not CLI) never offered this kind of long term support. Recently I had an experience with an old game on modern Windows that might interest you: https://lemmy.world/post/10112060.

Apple changed the entire kind of processor architecture, and then again. With them things also don’t stay the same. They solve that with other techniques. And a Macbook won’t be thrown to the garbage after a few years because it’s gotten so slow. I see people keeping them for quite some time. But they usually don’t run the latest version of MacOS any more. At least that’s what I’ve seen.

Apple simply obliterates the old and doesn't care much about it; Microsoft is usually way better at this. BUT... still, as you've noticed, their Rosetta 2 compatibility layer allows you to run Intel software on ARM machines without issues, even games and heavy stuff.

But they usually don’t run the latest version of MacOS any more. At least that’s what I’ve seen.

Yes, they have restrictions because they usually want to clean older hardware support out of their kernel and some system components, and this seems to be a big advantage when it comes to the performance and reliability of their OS. Either way, those machines with older macOS versions keep working and keep getting at least most of the software for a reasonable time.

[-] rufus@discuss.tchncs.de 2 points 9 months ago* (last edited 9 months ago)

Yes, I Googled a bit and found out how to virtualize macOS; the install already did its first reboot. Seems they took inspiration from Scotty on the TOS Enterprise: it suggested 2h50 at first, but the minutes are coming down fast.

We'll see about that graphics acceleration. The laptop doesn't have a dedicated GPU anyway. Either QEMU/KVM does it, or I can pass through half the Intel iGPU, or it'll just be slow.

I can empathize with your story about the GUI automation. Sometimes you just need a solution for your problem. If it's still running more than 10 years later, it probably was the right call. Sometimes crazy workarounds stick and do the trick. You can always calculate whether buying software/a license or paying someone to come up with a solution is cheaper. 13 × 12 × 50 000 € is a good amount of money.

It just gets a bit messy once you're forced to rework a hacked-together solution in production. But it really depends on the circumstances. I've seen old machines that did crazy jobs and broke down or had to be integrated into something else at some point. And then you have a 10-year-old operating system you can't change much on, the employee who cobbled together that solution has long left, and the company that initially sold the expensive and specialized software/hardware has changed the product twice in the meantime... Might turn a few of your hairs grey, especially if someone absolutely needs to use it on Wednesday, but somehow it usually works out. If it's tastefully done and documented, everything might be perfectly alright.

Thank you for the Midtown Madness 2 link. I need that, too. Spent quite some time in that blocky version of San Francisco when I was a kid.

I don't really have a better use case for Windows on my laptop at home. I use it to update stuff like the GPS and probably one or two other things. I moved a few games there after the SSD with Linux on it was filled up.

(Edit: The install is done. You were right, the desktop is totally sluggish and I don't have any sound. And I skipped the AppleID. I've closed it for now. Maybe I can find better settings on the weekend and try to install something on it.)

[-] TCB13@lemmy.world 2 points 9 months ago

We’ll see about that graphics acceleration. The laptop doesn’t have a dedicated GPU anyway. Either QEMU/KVM does it, or I can pass through half the Intel iGPU, or it’ll just be slow.

Assuming your GPU is supported by macOS, you might be able to get good results by treating it like a Hackintosh: https://dortania.github.io/OpenCore-Install-Guide/

I'm not sure how macOS plays with GVT-g / SR-IOV / sharing slices of hardware, but this guy says he got it to work: https://www.reddit.com/r/VFIO/comments/innriq/successful_macos_catalina_with_intel_gvtg/. I personally never got macOS with GPU acceleration working fine in a VM because my host GPU is NVIDIA and unsupported. However, I did have very good results on HP Mini computers running macOS by following the links above.

Thank you for the Midtown Madness 2 link. I need that, too. Spent quite some time in that blocky version of San Francisco when I was a kid.

I believe the hacks work with other games from that time as well, as they solve the DirectX and GPU issues nicely without permanent changes to your system.
