Alt Text

A screenshot of a file manager properties window for my ~/.cache folder, which takes up 164.3 GiB and contains 246,049 files and 15,126 folders. The folder was first created about 1.75 years ago, along with my system.

  • Dog@lemmy.world · 1 year ago

    Question: could you have cron/crontab do it monthly or something? By “do it monthly” I mean delete everything in ~/.cache every month or so.

      • BaroqueInMind@kbin.social · 1 year ago

        This is the good shit I miss from reddit. Thank you for posting a systemd service config, I’m going to implement this.

      • Zangoose@lemmy.world (OP) · 1 year ago

        Thanks for this! I’ve been meaning to start learning more about systemd and writing services; this is super detailed and gives me a pretty good starting point!

    • bizdelnick · 1 year ago

      Don’t. You don’t need to clean it unless the cache of some buggy program grows uncontrollably.

    • cmnybo@discuss.tchncs.de · 1 year ago

      You could have a cron job run something like find /home/user/.cache -type f -atime +30 -delete, which finds files that haven’t been accessed in the last 30 days and deletes them. Make sure your home partition is not mounted with the noatime option, though.
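
      If you want to automate it, a rough sketch of a matching crontab entry (the schedule and path here are just examples; adjust as needed) could look something like:

        # At 03:00 on the 1st of every month, delete cached files not accessed in 30+ days
        0 3 1 * * find /home/user/.cache -type f -atime +30 -delete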

    • Zangoose@lemmy.world (OP) · 1 year ago

      I just found this today. I don’t really know anything about cron jobs, but this will probably incentivize me to learn lol

      • SuperIce@lemmy.world · 1 year ago

        Did you happen to see which subdirectory was using up this much space? I don’t think I’ve ever seen .cache go above 10GB, so this may be a bug in a piece of software you use.

          • Zangoose@lemmy.world (OP) · 1 year ago

            Looks like yay is storing every previous binary for AUR bin packages. (Also, excuse the unreadable terminal theme; it doesn’t play very well with a lot of TUI apps unless they support custom theming.)
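
            If anyone else wants to check their own machine, something like this should list the biggest entries (assuming yay’s default cache location under ~/.cache/yay):

              du -sh ~/.cache/yay/* | sort -h | tail -n 15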

            • Bronco1676 · 1 year ago

              You should run yay -Sc from time to time. This cleans (a) your pacman cache (which is normally done by running pacman -Sc) and (b) your AUR build cache, which is what’s taking up 160 GB here. That said, this much seems rather unusual; I use paru (which has the equivalent paru -Sc), so I can’t really tell whether this is normal for yay.

              The command also asks you, for every directory, whether you want to delete it or not, so it’s completely safe to run.
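
              If you want to see which of the two caches is actually the problem before cleaning, a quick size check (assuming the default pacman and yay cache paths) is something like:

                du -sh /var/cache/pacman/pkg ~/.cache/yay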

              • Zangoose@lemmy.world (OP) · 1 year ago

                Something I noticed was that it was mostly the binary packages that were taking up so much space. It may be because of how yay stores the programs (does it use git?), but the packages compiled from source code usually took up the least space, while the binary ones were taking up tens of gigabytes.

                • Bronco1676 · 1 year ago

                  Indeed, yay utilizes the AUR, which essentially serves as a Git repository for each package. These repositories typically include a PKGBUILD file and a .SRCINFO file, along with possible additional files like patches, desktop, or service files.

                  For example, take a look at IntelliJ Ultimate: [https://aur.archlinux.org/cgit/aur.git/tree/?h=intellij-idea-ultimate-edition]. It contains the .SRCINFO and PKGBUILD, as well as a .desktop file. These files themselves do not occupy much space.

                  The PKGBUILD specifies the sources for dependencies. For instance:

                  source=("https://download.jetbrains.com/idea/ideaIU-$pkgver.tar.gz"
                          "jetbrains-idea.desktop")
                  

                  The PKGBUILD is essentially a Bash script with predefined functions and variables. You can learn more about it here: [https://wiki.archlinux.org/title/PKGBUILD].

                  This script primarily downloads and extracts the tar file. In this specific case, it only relocates the files to their intended installation locations, like moving the desktop file to /usr/share/applications.
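
                  As a rough sketch (not the real intellij-idea-ultimate-edition PKGBUILD, just the usual shape of a binary package’s package() function), it ends up being little more than:

                    package() {
                      # makepkg has already downloaded and extracted everything into $srcdir
                      install -d "$pkgdir/opt/$pkgname"
                      cp -r "$srcdir"/idea-IU-*/. "$pkgdir/opt/$pkgname/"
                      # install the launcher entry that ships alongside the tarball
                      install -Dm644 "$srcdir/jetbrains-idea.desktop" \
                        "$pkgdir/usr/share/applications/jetbrains-idea.desktop"
                    }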

                  With such packages, there’s a possibility of wasting significant space since the tar file is downloaded and possibly retained in the cache.

                  However, other packages, especially those compiled from source, usually involve Git clones. These clones bring the Git repository into a subdirectory of the already cloned AUR package Git repo. Some might also have source tarballs. These types of packages generally do not consume much space in the cache, as they are often just text files, like C source code or Python scripts. These packages frequently rely on external libraries and packages, which are not included in this package’s cache.

                  Binary packages, on the other hand, often bundle all necessary libraries and other components in their source tarballs.

                  The AUR cache is mostly beneficial if you’re rebuilding the same version or can reuse components from a previous version. For example, a package might depend on a large, static file that doesn’t change often.

                  In Paru, I’ve enabled the “CleanAfter” option to prevent my cache from overflowing. Given my relatively fast internet speed, redownloading large files isn’t a major concern for me.
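
                  For reference, that’s just an uncommented CleanAfter line in paru’s config (yay has a similar --cleanafter option, if I remember correctly):

                    # /etc/paru.conf (excerpt)
                    [options]
                    CleanAfter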

            • neonred@lemmy.world · 1 year ago

              Wow, I’ve never seen something like this.

              Is it “allowed”? I mean, there are quotas for user homes.