• 0 Posts
  • 8 Comments
Joined 5 months ago
Cake day: February 8th, 2025

  • That’s a terrible way to put it and sincerely misguided in my opinion. I have a handful of public indexers and they work fine for my needs in 99.99% of cases. In fact, I had never had this issue until recently, with two unreleased episodes that turned out to be fake files. For me, not allowing unreleased episodes is just another layer of security. In other words, using your example, I don’t want the water filter for my car because I use a bad gas station; I want the water filter to make sure that if water ever gets in, by accident or not, it won’t reach the engine… If I see an indexer or tracker start publishing a lot of fake stuff, it gets removed, but with public indexers I accept that something may occasionally slip past, and I don’t want the devs of some software deciding that my requiring a show to have aired before I even try to download it is dumb.


  • Precisely. Just make it optional, hell, even keep the current behaviour as the default, but give the option to those who prefer it. Yet every time it is requested it gets shot down immediately, and when people ask why not make it optional, no one answers.

    The needed change is not even that complex, and someone provided the link to the pull request that implemented the similar function in Radarr (actually an even more complex one, since it has more options for movies). I’ve even considered trying to do it myself, but it’s quite the effort to prepare the dev environment, make the change, test it and open the pull request, just to have the same dev shoot it down out of spite. If the feature request were at least still open, even if the usual devs don’t want to do it themselves, it would show that they’d accept the change…
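
    Conceptually the whole feature boils down to a check like the rough sketch below. The names (is_grabbable, air_date_utc) are made up purely for illustration; this is not Sonarr’s actual code or API, just the kind of gate being asked for, applied before a release is grabbed:

```python
from datetime import datetime, timedelta, timezone

# Rough sketch only: "release" and "air_date_utc" are made-up names;
# this is NOT Sonarr's actual code or API, just the gate the request asks for.
def is_grabbable(release: dict, now: datetime | None = None) -> bool:
    """Skip releases for episodes that have not aired yet (optional safety layer)."""
    now = now or datetime.now(timezone.utc)
    air_date = release.get("air_date_utc")
    if air_date is None:
        return True  # unknown air date: keep today's behaviour and allow the grab
    return air_date <= now

# An episode "airing" tomorrow would be rejected as a likely fake:
fake = {"title": "Show.S01E09.1080p",
        "air_date_utc": datetime.now(timezone.utc) + timedelta(days=1)}
print(is_grabbable(fake))  # False
```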



  • I went down the rabbit hole on this the other day while trying to find a way to block unreleased episodes. The resistance they put up against such a simple feature is unbelievable to me. Nobody is asking to force it on everyone, just to provide the option.

    The two reasons I saw for closing the feature request (and its many duplicates) are mainly “use better trackers”, but also that shows are released or leaked early so often that this setting would block you from getting them faster… Those are the dumbest reasons ever for refusing a setting that people are literally asking for over and over again.

    The change is already done for Radarr, so it might not be terribly hard to adapt it for Sonarr. Being open source, I would have expected someone to have made the change already, but if they fight against it so much on principle, who would expect them to approve it…


  • This has been my path so far, nearly to a T. Got an old laptop, installed EndeavourOS with a very light DE, attached an external drive and started messing with the *arrs, Jellyfin and a bunch of other things.

    The only downside is that my family now relies on it for watching, so I have to be more careful not to break the stuff that works.

    Got another laptop that had no use, so I started playing around with it. Installed Debian with CasaOS on top to test whether that would be a nice alternative.

    The only real issue is the lack of time to spend on this.


  • yyprum@lemmy.dbzer0.com to Technology@lemmy.world · *Permanently Deleted* · 3 months ago

    And I could rebut that: if someone is interested enough to check it with AI, they would likely have tried to check it anyway without AI; maybe it would take longer and be harder to find… but they’d still be the intended audience, just redirected elsewhere now.

    To quote myself:

    It’s a really complex topic that no simple straight answer would solve.

    We could rebut back and forth again and again and get nowhere, because either option is hard to discuss when it is simply impossible to provide proper data to prove anything. And worse, defending the use of AI for it can lead to being told you are allowing it in the first place, and that’s not even counting how many people still believe that AI needs real sample images to produce those (whether or not the algorithm was trained on CP is irrelevant on this particular point, as it is not needed for such images to be created).


  • yyprum@lemmy.dbzer0.com to Technology@lemmy.world · *Permanently Deleted* · 3 months ago

    As a counterpoint, the fact that it is so easy and simple to get those AI images, compared to the risk and extra effort of doing it for real, could make actual child abuse less common and less profitable for mafias and assholes in general. It’s a really complex topic that no simple straight answer would solve.

    Normalising it would be horrible and should be avoided, but there will always be some number of people looking for that content. I’d rather have them using AI to create it than going out searching for real content. Going after the AI content is not only very inefficient, it might also be harmful, because the only content left would be the real kind, and those who make it are much harder to catch.


  • Don’t pay any attention to that kind of stupid comment. Anyone posting that kind of misinformation about AI is either trolling or incapable of understanding how generative AI works.

    You are right that it is a victimless crime (as far as the creation of the content goes). I could create porn with minions without using real minion porn, to take the most random example I could think of. There’s the whole defamation issue of publishing content without someone’s permission, but that feels like a discussion independent of AI (we could already create nasty images of someone before AI; AI just makes it easier). But using such content for personal use… it is victimless. I have a hard time arguing against it. Would the availability of AI-created content with unethical themes let people get that out of their system without creating victims? Would that make the far riskier and more horrible business of creating illegal content with real, unwilling people disappear, or at the very least become much more uncommon? Or would it make people more willing to consume the content, creating a feeling of false safety around content that was previously illegal? There are a lot of implications that we should really be thinking about, and how they would affect society, for better or worse…