Did you know most coyotes are illiterate?


  • 0 Posts
  • 18 Comments
Joined 6 months ago
Cake day: June 7th, 2025



  • Yeah, that sounds about right. It also depends on which indexers you’re using, as I imagine the more public indexers have a higher chance of getting takedowns from trolls. It’s worth noting that the running theory, as I understand it, is that a lot of 2021-2023 articles were voluntarily deleted to save space, resulting in issues even for .nzbs that were never taken down. It’s also theorized (and sometimes outright stated) that providers silently delete data that is rarely or never accessed, again to save space, so that can be a random issue too.

    Personally, I lean more into torrent technology because usenet can be fickle for these reasons even if you’re in the secret indexers, whereas if you’re in at least some semi-good private torrent trackers you’ll never have completion issues (just potentially slower downloads). I also feel like usenet’s scalability, future, and pricing are sort of uncertain.




  • Worth noting that when What died, ~4 new sites popped up immediately and invited all the old members, and everyone raced to re-upload everything from What onto them, which was actually pretty effective. At this point, RED and OPS have greatly surpassed What in many ways, aside from some releases that never made it back (you can actually find out which releases used to exist because What’s database was made available after its death). Users and staff are a lot more prepared if it happens again, e.g. keeping track of all metadata via “gazelle-origin”.

    If by “in” you mean how to get into them, generally you’re supposed to have a friend invite you. If you don’t have anyone you know on private trackers, you’ve gotta get in from scratch. Luckily, RED and OPS both do interviews to test your knowledge on the technicals of music formats, though I’ve heard RED’s interview queues are long and OPS’s interviews are often just not happening: https://interviewfor.red/en/index.html https://interview.orpheus.network/

    Alternatively, you can interview for MAM, which is IMO the best ebook/audiobook tracker. They’re super chill and have a very simple interview e.g. “what is a tracker”: https://www.myanonamouse.net/inviteapp.php. After that, you can just hang around there for a while until you can get into their recruitment forums to get invites to other entry-level trackers, and then on those entry-level trackers you can get recruited into slightly higher-level trackers, and so on, and eventually RED/OPS should be recruiting from somewhere.

    This can feel a little silly and convoluted, but I guess I’d just appreciate that these sites put the effort into conducting interviews for new people at all, since the alternative is that you will just never get into anything without a friend. Reddit’s /r/trackers wiki is unfortunately one of the better places for information about private trackers if you want to do further reading.



  • Yes, it’s allowed and encouraged between RED<->OPS. There are a few tools on the RED and OPS forums to automate most of the process (e.g. Transplant, REDCurry, Takeout, Orpheus-Populator, etc.). Cross-posting torrents on many sites is allowed and fine, you just have to be aware of the rules of the source site, e.g. some places don’t want their internals to be shared, or some have a literal timer countdown before cross-posting is allowed. On the other hand, most sites are not going to enforce other sites’ exclusivity demands (PTP explicitly has a note about this). If an exclusive file is cross-posted onto PTP, PTP isn’t going to take it down on anyone’s behalf.

    I’ll note that private tracker culture has warmed up quite a bit in the past decade and a half that I’ve been on them. Trackers (and their users) don’t usually see other trackers as rivals/competitors anymore, release groups are respectful of each other, there are a ton of tutorials and help forums around to help low-skill members learn how to do the advanced stuff, and so on. There are recognizable usernames everywhere, and the general vibe is to cross-upload as much as possible and help build everyone’s trackers together. Cross-seed (the program) has helped a lot with this, and seedbases have become very strong even on smaller trackers as a result.





  • Yeah, h264 is the base codec (also known as AVC), and x264 is the dominant encoder for that codec. So the base BDs are just plain h264, and remuxes take that h264 and put it into an mkv container. Colloquially, people tag WEB-DLs and BDs/remuxes as “h264” since they’re raw/untampered-with, and anything that’s been re-encoded by a person as “x264”. Same thing for h265/HEVC and x265, and same for h266/VVC and x266.
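
    If it helps to see the distinction concretely, here’s a minimal sketch of the two operations using ffmpeg called from Python; the filenames and the CRF/preset values are placeholders I made up for illustration, not anyone’s actual release settings:

    ```python
    import subprocess

    # "Remux": copy the existing h264 stream into an mkv container, no re-encode.
    # The video bits stay exactly as they were on the disc, so it's still tagged "h264".
    subprocess.run([
        "ffmpeg", "-i", "bluray_stream.m2ts",
        "-map", "0",    # keep all streams (video, audio, subtitles)
        "-c", "copy",   # copy streams as-is; only the container changes
        "movie.remux.mkv",
    ], check=True)

    # "Encode": actually re-compress the video with the x264 encoder.
    # This is the kind of release people colloquially tag as "x264".
    subprocess.run([
        "ffmpeg", "-i", "movie.remux.mkv",
        "-c:v", "libx264", "-crf", "16", "-preset", "slow",  # example quality settings
        "-c:a", "copy",  # leave the audio untouched
        "movie.x264.mkv",
    ], check=True)
    ```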


  • As an idea, I use an SSD as the “Default Download Directory” within qBittorrent itself, and then qB automatically moves the finished download to an HDD once it’s fully done. I do this because I want the write into my ZFS pool to be sequential, since ZFS has no defragmentation capabilities.

    Hardlinks are only important if you want to continue seeding the media in its original form and also have a cleaned-up/renamed copy in your Jellyfin library. If you’re going to continue to seed from the HDD, it doesn’t matter that the initial download is done on the SSD. The *arr stack will make the hardlink only after the download is finished (and a hardlink only works within a single filesystem, so the final download location and the library need to live on the same pool).
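
    For anyone unfamiliar, a hardlink is just a second filename pointing at the same data on disk, which is why the *arr rename doesn’t double your storage. A tiny sketch (the paths here are made up):

    ```python
    import os

    src = "/mnt/hdd/torrents/Some.Show.S01E01.mkv"    # file qBittorrent keeps seeding
    dst = "/mnt/hdd/jellyfin/Some Show/S01E01.mkv"    # renamed copy for the library

    os.makedirs(os.path.dirname(dst), exist_ok=True)
    os.link(src, dst)  # fails across filesystems; both paths must be on the same pool

    # Both names now point at the same data on disk, so no extra space is used.
    print(os.stat(src).st_ino == os.stat(dst).st_ino)  # True: same inode
    print(os.stat(src).st_nlink)                       # 2: two names, one file
    ```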


  • Yep, fully agree. At least BluRays still exist for now. Building a beefy NAS and collecting full BluRay discs lets us brute-force the picture quality through sheer bitrate, at least. There are a number of other problems to think about as well before we even get to the encoder stage, such as many (most?) 4k movies/TV shows being mastered in 2k (aka 1080p) and then upscaled to 4k. Not to mention a lot of 2k BluRays are upscaled from 720p! It just goes on and on. As a whole, we’re barely using the capabilities of true 4k today. Most of this UHD/4k “quality” craze is being driven by HDR, which also has its own share of design/cultural problems. The more you dig into all this stuff the worse it gets. 4k is billed as “the last resolution we’ll ever need”, which IMO is probably true, but they don’t tell you that the 4k discs they’re selling you aren’t really 4k.


  • The nice thing is that Linux is always improving and Windows is always in retrograde. The more users Linux has, the faster it will improve. If the current state of Linux is acceptable enough for you as a user, then it should be possible to get your foot in the door and ride the wave upwards. If not, wait for the wave to reach your comfort level. People always say <CURRENT_YEAR> is the year of the Linux desktop but IMO the real year of the Linux desktop was like 4 or 5 years ago now, and hopefully that captured momentum will keep going until critical mass is achieved (optimistically, I think we’re basically already there).


  • CoyoteFacts@piefed.ca to Linux@lemmy.ml · AOMedia To Release AV2 Video Codec At Year's End · edited · 3 months ago

    To be fair, it’s also basically impossible to have extremely high quality AV1 video, which is what a lot of P2P groups strive for. A lot of effort has gone into trying, and the results weren’t good enough compared to x264, so it’s been ignored. AV1 is great at compression efficiency, but it can’t make fully transparent encodes (i.e., indistinguishable from the source). It might be different with AV2, though even if it’s possible, it may still be ignored because of compatibility; to this day, groups use DTS-HD MA over the objectively superior FLAC codec for surround sound because of hardware compatibility (for 1.0/2.0 channels they use FLAC, since players usually support that).
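
    To give a concrete idea of what “trying for transparency” looks like, the experiments were roughly along these lines (SVT-AV1 through ffmpeg; the filenames and settings are illustrative guesses on my part, not any group’s actual recipe), and even encodes like this were judged to fall short on grainy sources:

    ```python
    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "movie.remux.mkv",
        "-c:v", "libsvtav1",
        "-crf", "10",                # very low CRF = very high quality target
        "-preset", "4",              # slower preset for better efficiency
        "-pix_fmt", "yuv420p10le",   # 10-bit, the norm for AV1 encodes
        "-c:a", "copy",              # pass the audio through untouched
        "movie.av1.mkv",
    ], check=True)
    ```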

    As for HEVC/x265, it too is not as good as x264 at very high quality encoding, so it’s also ignored when possible. Basically the breakdown is that 4k encoding uses x265 in order to store HDR and because the big block efficiency of x265 is good enough to compress further than the source material. x264 wouldn’t be used for 4k encoding even if it could store HDR because its compression efficiency is so bad at higher resolutions that to have any sort of quality encode it would end up bigger than the source material. Many people don’t even bother with 4k x265 encodes and just collect the full disc/remuxes instead, because they dislike x265’s encoder quality and don’t deem the size efficiency worth its picture quality impact (pretty picky people here, and I’m not really in that camp).

    For 1080p, x265 is only used when you want to have HDR in a 1080p package, because again x265’s picture quality can’t match x264, but most people deem HDR a bigger advantage. x264 is still the tool of choice for non-HDR 1080p encodes, and that’s not a culture thing, that’s just a quality thing. When you get down into public P2P or random encoding groups it’s anything goes, and x265 1080p encodes get a lot more common because x265’s efficiency is pretty great compared to x264; but the very top-end quality just can’t match x264 in the hands of an experienced encoder, which is why the top-tier groups only use x265 when they have to.
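
    As a rough sketch of that 1080p split (placeholder values; a real HDR encode would also pass mastering-display/CLL metadata through x265 params, which I’m omitting here):

    ```python
    import subprocess

    def encode_1080p(src: str, dst: str, hdr: bool) -> None:
        if hdr:
            # HDR 1080p: x265 in 10-bit so the HDR transfer/metadata can be carried.
            video = ["-c:v", "libx265", "-crf", "18", "-preset", "slow",
                     "-pix_fmt", "yuv420p10le"]
        else:
            # SDR 1080p: x264 is still the pick for top-end quality.
            video = ["-c:v", "libx264", "-crf", "16", "-preset", "slow"]
        subprocess.run(["ffmpeg", "-i", src, *video, "-c:a", "copy", dst], check=True)
    ```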

    Edit: All that to say, we can’t entirely blame old-head culture or hardware compatibility for the unpopularity of newer formats. I think the home media collector use case is actually a complete outlier in terms of what these formats are actually being developed for. WEB-DL content favors HEVC and AV1 because they’re very efficient and display a “good enough” picture to viewers. Physical Blu-Rays don’t have to worry about HDD space or bandwidth and just pump the bitrate up insanely high on HEVC so that the picture quality looks great. For the record, VVC/x266 is already on the shortlist for being junk for the use cases described above (x266 is too new to fully judge), so I wouldn’t hold my breath for AV2 either. If you’re okay with non-transparency, I’d just stick with HEVC WEB-DLs or try to find good encoding groups that target a more opinionated quality:size ratio (some do actually use AV1!). Rules of thumb for WEB-DL quality are here, though it will always vary on a title-by-title basis.


  • One of my pet peeves is that adage about how the modern internet is just 4 websites stealing content from each other. Because like, the websites themselves don’t make the content - the users do. There is no difference between a meme originating from instagram or ifunny; it was made by a person who wanted to create a unit of culture, and it’s your duty to make sure it spreads around.


  • I’m not a security expert by any means, but here are a few things I know as a regular user:

    Always keep your system up-to-date and only download and execute software from the official Arch repository if you can help it. Malware often takes advantage of outdated systems that don’t have the latest security patches, so by staying as up-to-date as possible you’re making yourself a very difficult target. The AUR is a user-based repository and is not inherently trusted/maintained like the official Arch repos, so be careful and always read PKGBUILDs before you use AUR software. Don’t use AUR auto-updaters unless you’re reading the PKGBUILD changes every time. Ideally try not to use the AUR at all if you can help it; official Arch Linux is usually quite stable, but AUR software is often responsible for a lot of the “breakages” people tend to get with Arch. If you have to run sketchy software, use a virtual machine for it, as a 0-day VM escape is almost certainly not going to happen with any sort of malware you’d run into. ClamAV or VirusTotal may also help you scan specific files that you’re wary of, but I wouldn’t trust that a file is clean just because it passes an AV check. Also, never run anything as root unless you have a very specific reason, and even then try to use sudo instead of elevating to a full root shell.
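
    One habit that makes “read the PKGBUILD” easier is pulling it down for review before you ever run makepkg or a helper. A small sketch, assuming the AUR’s cgit “plain” URL still follows this pattern:

    ```python
    import sys
    import urllib.request

    pkg = sys.argv[1]  # e.g. "some-aur-package"
    url = f"https://aur.archlinux.org/cgit/aur.git/plain/PKGBUILD?h={pkg}"
    with urllib.request.urlopen(url) as resp:
        print(resp.read().decode("utf-8"))  # read this in full before building anything
    ```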

    Don’t open up any network ports on your system unless you absolutely have to, and if you’re opening an SSH port, make sure it isn’t on the default port number, requires a keyfile for login, doesn’t allow logging in directly as root, and limits authentication attempts to a low number. If you’re opening ports for other services, try to use Docker/Podman containers with minimal access to your system resources, not running as root. Also consider using something like CrowdSec or fail2ban for blocking bots that crawl open ports.
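
    As a quick-and-dirty way to check your own box against those SSH points, something like this works (only a sketch: it ignores Include files, Match blocks, and compiled-in defaults):

    ```python
    wanted = {
        "port": lambda v: v != "22",                            # not the default port
        "passwordauthentication": lambda v: v.lower() == "no",  # keyfile-only login
        "permitrootlogin": lambda v: v.lower() == "no",         # no direct root login
        "maxauthtries": lambda v: v.isdigit() and int(v) <= 3,  # few attempts allowed
    }

    found = {}
    with open("/etc/ssh/sshd_config") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            key, _, value = line.partition(" ")
            found[key.lower()] = value.strip()

    for key, check in wanted.items():
        value = found.get(key)
        status = "OK" if value is not None and check(value) else "CHECK"
        print(f"{status:5} {key} = {value}")
    ```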

    As far as finding out if you’re infected, I’m not sure if there’s a great way to know unless they immediately encrypt all your stuff and demand crypto. Malware could also come in the form of silent keyloggers (which you’d only find out about after you start getting your accounts hacked) or cryptocurrency miners/botnets (which probably attempt to hide their CPU/GPU usage while you’re actively using your computer). At the very least, you’re not likely to be hit by a sophisticated 0-day, so whatever malware you get on your computer probably wants something direct and uncomplicated from you.
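
    The closest thing I can offer to a detection tip is watching for processes that stay busy while the machine should be idle, since that’s the usual tell for a miner. A tiny sketch using the third-party psutil package (the 50% threshold is arbitrary):

    ```python
    import time
    import psutil

    procs = list(psutil.process_iter(["name"]))
    for p in procs:
        try:
            p.cpu_percent(None)  # prime the per-process counter
        except psutil.NoSuchProcess:
            pass

    time.sleep(5)  # measure over a few seconds while you're not doing anything

    for p in procs:
        try:
            usage = p.cpu_percent(None)
            if usage > 50:  # anything this busy on an idle box deserves a look
                print(f"{p.info['name']:<25} {usage:5.1f}% CPU")
        except psutil.NoSuchProcess:
            pass
    ```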

    Setting up a backup solution to a NAS running e.g. ZFS can help keep malware from pwning your important data, as a filesystem like ZFS can roll back to an earlier snapshot and restore the data from before it was encrypted (even if the ransomware hits the copies on the NAS directly). 2FA’ing your accounts (especially important ones like email) is a good way to prevent keyloggers from being able to replay your username+password into a service and get access. Setting up a resource monitoring daemon can probably help you find out if you’re leaking resources to some kind of crypto miner, though I don’t have specific recommendations as I haven’t done this before.
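
    For the ZFS part, the core of it is just periodic snapshots you can roll back to; in practice you’d use a tool like sanoid or zfs-auto-snapshot rather than rolling your own, but as a sketch (the dataset name is a placeholder):

    ```python
    import subprocess
    from datetime import datetime, timezone

    DATASET = "tank/backups"  # hypothetical dataset your backups land on

    def take_snapshot() -> str:
        name = f"{DATASET}@auto-{datetime.now(timezone.utc):%Y%m%d-%H%M%S}"
        subprocess.run(["zfs", "snapshot", name], check=True)
        return name

    def roll_back(snapshot: str) -> None:
        # Restores the dataset to the snapshot's state, e.g. from before
        # ransomware encrypted the files that were backed up onto it.
        # -r also discards any snapshots newer than the one you roll back to.
        subprocess.run(["zfs", "rollback", "-r", snapshot], check=True)
    ```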

    In the case of what to do once you’re pwned, IMO the only real solution is to salvage and verify your data, wipe everything down, and reinstall. There’s no guarantee that the malware isn’t continually hiding itself somewhere, so trying to remove it yourself is probably not going to solve anything. If you follow all the above precautions and still get pwned, I’m fairly sure the malware will be news somewhere, and security experts may already be studying the malware’s behavior and giving tips on what to do as a resolution.