• 0 Posts
  • 27 Comments
Joined 1 year ago
Cake day: August 2nd, 2023

  • PrefersAwkward@lemmy.world to linuxmemes@lemmy.world · unused is wasted
    14 points · edited · 7 months ago

    Using swap isn’t always a sign you need more RAM. Typically, if you use a computer for a while or have a lot of IO operations going on, Linux will decide to swap some things to make more room for cache.

    Sometimes Linux just finds that you have a bunch of inactive app memory and swaps it out so it can cache way more stuff. That’s just good memory management, but it’s not worth buying more RAM over.
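    For a concrete picture of this behavior, here is a minimal Python sketch (my illustration, assuming a Linux system with a standard /proc/meminfo) that prints swap use alongside available memory and page cache. Seeing swap in use while MemAvailable and Cached are still large is exactly the “swapped idle pages to make room for cache” case, not a sign of RAM pressure.

        # Minimal sketch, assuming Linux and the standard /proc/meminfo fields
        # (MemAvailable, Cached, SwapTotal, SwapFree); values are reported in kiB.

        def read_meminfo(path="/proc/meminfo"):
            """Return /proc/meminfo values (in kiB) as a dict."""
            info = {}
            with open(path) as f:
                for line in f:
                    key, rest = line.split(":", 1)
                    info[key] = int(rest.strip().split()[0])
            return info

        m = read_meminfo()
        swap_used_mib = (m["SwapTotal"] - m["SwapFree"]) / 1024
        print(f"Swap used:     {swap_used_mib:.0f} MiB")
        print(f"Page cache:    {m['Cached'] / 1024:.0f} MiB")
        print(f"Available RAM: {m['MemAvailable'] / 1024:.0f} MiB")
        # Swap in use while available RAM and cache are both large usually means
        # the kernel traded idle anonymous pages for cache, not that RAM ran out.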



  • I wouldn’t put swap on an SD card, no. Even if it had NVMe storage, allocating at least a double-digit percentage of RAM as swap seems like it would be more effective than 1%.

    Also, since kernel 6.1, swap has been a lot better thanks to MGLRU. ChromeOS gets away with paltry amounts of RAM by swapping. So classic overcommitting seems fine as long as you don’t run into situations where more RAM is active at once than the hardware actually has.


  • I think the question is: if a person is going to make such a tiny swap, why even use swap?

    Such a small swap is unlikely to save a system from memory problems, and it does not seem likely to make a noticeable difference in performance when it can only swap out small amounts of memory.

    Why wouldn’t one just set up a larger zram device, or a larger swap with reduced swappiness?

    If I have a Raspberry Pi with 1 GB of RAM, I don’t think a 2 MB swap is worth bothering with.
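    To put rough numbers on that (my own back-of-the-envelope arithmetic; the 2:1 zram compression ratio is an assumption, and real ratios vary by workload):

        # Back-of-the-envelope comparison for a 1 GiB Raspberry Pi.
        # The 2:1 zram compression ratio below is an assumption, not a measurement.

        RAM_MIB = 1024
        TINY_SWAP_MIB = 2                  # the 2 MB swap from the comment
        ZRAM_DISKSIZE_MIB = RAM_MIB // 2   # zram "disksize" = uncompressed capacity
        ZRAM_RATIO = 2.0                   # assumed compression ratio

        tiny_pct = 100 * TINY_SWAP_MIB / RAM_MIB
        # RAM actually consumed by a full zram device at the assumed ratio
        zram_ram_cost_mib = ZRAM_DISKSIZE_MIB / ZRAM_RATIO

        print(f"A 2 MiB swap is only {tiny_pct:.1f}% of RAM -- negligible headroom")
        print(f"{ZRAM_DISKSIZE_MIB} MiB of zram swap can hold up to {ZRAM_DISKSIZE_MIB} MiB of idle pages")
        print(f"while costing roughly {zram_ram_cost_mib:.0f} MiB of RAM at {ZRAM_RATIO}:1 compression")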




  • I remember about a decade ago, when I was a student, I helped a small company with some office work. An office admin showed me various things on her computer, such as their QuickBooks data. The bevy of ads I saw suggested to me that she was a religious, dog-owning, single parent of about 50 who was clearly seeking a partner. I got all of this just from the front page of her Yahoo homepage and was blown away at how specific and personal those ads were. Some of the ads were in Spanish, so she was probably bilingual too.

    It totally creeped me out, and I wasn’t sure she even knew that her computer was broadcasting all her personal info to her office. She regularly collaborated with countless others, even new faces, using this same computer, where anyone could see what I saw.





  • PrefersAwkward@lemmy.world to Linux@lemmy.ml · Thoughts on this?
    19 points · edited · 10 months ago

    You aired my frustrations really well. He spent a lot more time making claims and discussing his own background than demonstrating Wayland’s alleged issues and showing that they’re egregious. It’s an entertaining rant at best, but that doesn’t make his points valid nor does it make anything actionable.








  • I agree the desktop is not a top priority, and I know their money largely comes from outside the desktop. In fact, I would be surprised if their consumer products came close to their B2B products. I’m just saying they have more than zero incentive to care about the Linux desktop. And apparently Nvidia agrees, because they are finally putting more effort in.

    I still use and recommend AMD for the Linux desktop, and I’m hoping Intel will become competitive in that space so we have more options and competition. I personally don’t like how closed off, uninvolved, and impassive Nvidia has been, and given their history I don’t trust them to collaborate much.


  • Well, they do lose some business in the Linux world due to their issues, and it will probably take some time to recover their reputation in the Linux desktop community. I know not everyone hates them and the Linux desktop community isn’t huge right now, but there is some incentive to show the world you care about your customers.

    And if the Linux desktop ever gets super popular and easy for everyone except Nvidia users, that’s an unnecessary risk for Nvidia to have taken. Catching up later could be really slow and painful if Nvidia lets itself fall even further behind. GPUs are among the most complicated hardware components to support and to develop drivers and other software for.


  • I also have 12 GB. There are usage patterns where additional RAM will be useful or even necessary on a phone. When you have more RAM, the phone can sleep tasks and leave background apps alone without having to discard their contents from RAM, which means fewer cold startups. More content can also be cached, which means faster app startups. Both of these reduce CPU usage and improve battery life. You can also keep more browser tabs open and run more and bigger apps at the same time. More RAM also means fewer situations where swapping is done or needed, so additional CPU and disk cycles are saved and battery usage drops. Some apps will actually require more RAM or churn more when memory is scarce; examples include advanced content-creation apps for audio, video, or photography, and some games, especially at high settings.
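    As a toy illustration of the cold-start point (my own sketch, not how Android or iOS actually manage app lifecycles), treating resident background apps as an LRU cache shows how more slots mean fewer evictions and therefore fewer cold starts:

        # Toy model (not how Android/iOS actually work): treat background apps as
        # an LRU cache. More "RAM" (cache slots) means fewer evictions, so fewer
        # cold starts when the user switches back. App names here are made up.

        from collections import OrderedDict

        def simulate(slots, usage_pattern):
            """Count how many launches were cold starts for a given cache size."""
            resident = OrderedDict()          # app -> None, ordered by recency
            cold_starts = 0
            for app in usage_pattern:
                if app in resident:
                    resident.move_to_end(app)  # warm start: already in memory
                else:
                    cold_starts += 1           # cold start: app must be reloaded
                    resident[app] = None
                    if len(resident) > slots:
                        resident.popitem(last=False)  # evict least-recently-used app
            return cold_starts

        pattern = ["mail", "browser", "camera", "mail", "maps", "browser",
                   "music", "mail", "camera", "browser", "maps", "mail"]
        for slots in (2, 4, 6):
            print(f"{slots} resident apps -> {simulate(slots, pattern)} cold starts")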

    Are these additional GBs necessary? No, and most people would not notice them; even 6 GB is overkill for quite a number of people’s usage patterns. Your phone does maybe 95% of what it does just about as well even with a low-to-midrange CPU and GPU from a few years ago and just 4 or 6 GB of RAM.

    This holds true for iOS and Android. They’ve both done a fair bit of housekeeping and software improvements to rein in excessive resource usage generation over generation. I think Android was doing some catch-up here for a while, but I don’t know how the two compare on this anymore, and it’s difficult to evaluate them empirically in this area.