• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: October 17th, 2023

  • Not 100%. Stremio is a front end for a debrid service: the debrid service downloads the torrent, adds it to its cache, and Stremio users then access the downloaded file directly from the debrid service’s servers.

    Only the initial download may slow down a torrent. I don’t know exactly how they distribute the file to their CDN. If all the servers in their CDN downloaded the same file at once it could cause a temporary slowdown of torrents, but I would assume they don’t download directly on each server; instead they download on a server close to the requesting user and then use some kind of file synchronization technique to propagate the file through the network.

    Their cache is pretty huge and for most shows they already have tons of links cached and wouldn’t need to keep downloading very often.

    Stremio isn’t the first front end for these services, and like all the rest it will eventually get shut down too; this will continue long after Stremio.

    The real issue, in my opinion, isn’t bandwidth hogging by debrid services; it’s that if everyone migrates to them, the majority of the network will be leechers. With fewer seeders, the remaining seeders will need more powerful computers to support the torrents, and if they can’t afford the upgrade the whole system could collapse.


  • I don’t think Stremio does either, technically. Stremio is typically used as a front-end application for debrid services, mainly Real-Debrid, AllDebrid, and Premiumize.

    I believe the first two only download, while Premiumize seeds, but I’m not 100% sure.

    That being said, if a file gets added to these services it is not constantly leeching like OP said. The Real-Debrid servers, for example, will download a torrent and distribute the downloaded file throughout their CDN, leaving it in their cache for 30 days. I believe each time it is accessed by a user, that 30-day clock is reset.
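
    To picture that cache behaviour, here is a tiny hypothetical sketch (this is not Real-Debrid’s actual code, just the general idea of an expiry timer that resets whenever a cached file is accessed):

    ```python
    # Hypothetical sketch of a cache whose 30-day expiry resets on access.
    import time

    CACHE_TTL = 30 * 24 * 60 * 60  # 30 days in seconds

    cache = {}  # info_hash -> {"path": ..., "expires_at": ...}

    def add_to_cache(info_hash, path):
        cache[info_hash] = {"path": path, "expires_at": time.time() + CACHE_TTL}

    def get_cached_file(info_hash):
        entry = cache.get(info_hash)
        if entry is None or entry["expires_at"] < time.time():
            return None  # not cached (or expired) -- a fresh download would be needed
        entry["expires_at"] = time.time() + CACHE_TTL  # access resets the 30-day clock
        return entry["path"]
    ```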

    Stremio typically only shows cached torrents in its app, so to force a download a user would need to go to their debrid provider directly and add the torrent, causing it to be added to the cache.

    Is it bad for the torrenting network? Yes, because they don’t seed. Is Stremio using up all of the seeders’ bandwidth? Probably not.


  • So there are multiple technologies at play. One is an indexer program (Jackett, Prowlarr, etc.). These basically hook up to public trackers (1337x, TPB, etc.).

    Then you have Sonarr/Radarr, which are connected to the indexer. Sonarr and Radarr basically work from an RSS feed (which is basically a list of content; podcast and YouTube apps use this to show you new episodes/videos).

    I think they use TMDB or something as their source of RSS feeds. They also let you select which shows to monitor, and that information is stored in a database. So every so often Sonarr will reach out to TMDB and request the latest RSS feed for each show in its database. If an episode that Sonarr is supposed to download is listed on the RSS feed, it will then send a request to its indexer and tell it what show, what season, what episode, etc.
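
    As a rough illustration of that monitoring loop (not Sonarr’s real code; the feed URL and show names below are made up), it boils down to polling a feed and checking entries against the monitored list:

    ```python
    # Simplified sketch of "poll the feed, spot wanted episodes".
    # The feed URL and show names are placeholders.
    import feedparser  # pip install feedparser

    monitored = {"Some Show", "Another Show"}  # stored in a database in reality

    def check_feed(feed_url):
        feed = feedparser.parse(feed_url)
        return [entry.title for entry in feed.entries
                if any(show.lower() in entry.title.lower() for show in monitored)]

    # Each matching title would then be turned into a search request to the indexer.
    print(check_feed("https://example.org/new-episodes.rss"))
    ```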

    The indexer then searches each tracker it is connected to for that show/season/episode combo and returns a list of links to Sonarr/Radarr.
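
    Jackett and Prowlarr expose a Torznab-style API for this. A minimal sketch of such a search (the base URL and API key are placeholders; use whatever your own indexer shows in its settings):

    ```python
    # Sketch of a Torznab "tvsearch" query, as exposed by Jackett/Prowlarr.
    # Base URL and API key are placeholders; copy the real ones from your indexer's UI.
    import requests
    import xml.etree.ElementTree as ET

    INDEXER_URL = "http://localhost:9117/api/v2.0/indexers/all/results/torznab/api"
    API_KEY = "your-api-key"

    def search_episode(show, season, episode):
        params = {"apikey": API_KEY, "t": "tvsearch",
                  "q": show, "season": season, "ep": episode}
        resp = requests.get(INDEXER_URL, params=params, timeout=30)
        resp.raise_for_status()
        root = ET.fromstring(resp.content)
        # The response is an RSS feed; each <item> is one release found on a tracker
        return [(item.findtext("title"), item.findtext("link"))
                for item in root.iter("item")]
    ```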

    Sonarr then has a set of rules in its database to filter these links (i.e. minimum quality, language, etc.) to determine which link to pick. Finally, in its settings Sonarr/Radarr has a location where it should save the files.
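
    A toy version of that filtering step might look like this (not Sonarr’s actual scoring logic, just the idea of enforcing a minimum quality and then ranking the rest):

    ```python
    # Toy release filter: enforce a minimum quality, then prefer quality and seeders.
    QUALITY_RANK = {"2160p": 3, "1080p": 2, "720p": 1}

    def quality_of(title):
        return next((rank for tag, rank in QUALITY_RANK.items() if tag in title.lower()), 0)

    def pick_release(releases, min_quality="1080p"):
        # releases: list of dicts like {"title": ..., "link": ..., "seeders": ...}
        good = [r for r in releases if quality_of(r["title"]) >= QUALITY_RANK[min_quality]]
        return max(good, key=lambda r: (quality_of(r["title"]), r.get("seeders", 0)),
                   default=None)
    ```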

    Now, Sonarr/Radarr can’t download by themselves; instead they are also hooked up to a torrent client, for example qBittorrent, which has an API that allows you to programmatically download torrents (i.e. it has a command to add a torrent, and Sonarr/Radarr sends that command along with additional information like the link and where to save the files).
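
    As a concrete sketch, qBittorrent’s Web API (v2) works roughly like this; the host, credentials, magnet link and save path below are placeholders for your own setup:

    ```python
    # Sketch of handing a torrent to qBittorrent over its Web API (v2).
    import requests

    QBIT = "http://localhost:8080"  # placeholder host/port

    def add_torrent(magnet_or_url, save_path):
        s = requests.Session()
        # Log in first; the auth cookie is kept on the session
        s.post(f"{QBIT}/api/v2/auth/login",
               data={"username": "admin", "password": "adminadmin"}).raise_for_status()
        # Add the torrent and tell qBittorrent where to save the files
        s.post(f"{QBIT}/api/v2/torrents/add",
               data={"urls": magnet_or_url, "savepath": save_path}).raise_for_status()

    add_torrent("magnet:?xt=urn:btih:<hash>", "/data/tv/Some Show/Season 02")
    ```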

    This is the basic setup, but there are other tools sometimes used, like Unpackerr, which decompresses files that get downloaded. Unpackerr watches a folder for new files, and if it finds a file in a compressed format (7z, rar, zip, etc.) it will automatically decompress it so that a media program like Jellyfin can play it without you having to do so manually.
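
    A very simplified stand-in for that watch-and-extract loop (the real Unpackerr is a separate program with far more logic around retries, cleanup and *arr integration; the folder path here is a placeholder):

    ```python
    # Simplified "watch a folder and extract archives" loop.
    import shutil
    import time
    from pathlib import Path

    WATCH_DIR = Path("/data/downloads")  # placeholder path
    seen = set()

    while True:
        for archive in WATCH_DIR.glob("**/*.zip"):  # shutil handles zip/tar out of the box
            if archive not in seen:
                shutil.unpack_archive(archive, archive.parent)  # extract next to the archive
                seen.add(archive)
        # rar/7z would need an external tool (e.g. 7-Zip or unrar) instead of shutil
        time.sleep(60)  # poll once a minute
    ```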

    Programs like Jellyfin are media servers: you specify folders for movies/TV shows/etc., and any playable file in those folders can be streamed through their app/web interface. These kinds of programs are really just easy-to-set-up graphical front ends built on top of more technical programs like FFmpeg, which does the transcoding and streaming.
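
    Under the hood that boils down to FFmpeg calls along these lines (a rough illustration; real media servers build far more elaborate command lines and usually stream segments rather than whole files):

    ```python
    # Rough illustration of the kind of FFmpeg call a media server makes for a transcode.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-i", "episode.mkv",        # source file from the library folder
        "-c:v", "libx264",          # re-encode video to H.264
        "-c:a", "aac",              # re-encode audio to AAC
        "-movflags", "+faststart",  # move metadata up front so playback can start sooner
        "episode.mp4",
    ], check=True)
    ```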

    Then there are also programs like FlareSolverr. You would integrate this into your indexer because some trackers use Cloudflare to block bots (it requires you to click a checkbox and watches the movement of the cursor to see if it is robotic). FlareSolverr uses something called Selenium WebDriver, a tool that can automate a web browser; you can program it to open web pages, click things, etc. I assume the code uses randomization to make Cloudflare think a person is moving the mouse to click the button, so you can access those trackers.
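
    Integration-wise, FlareSolverr runs as its own small HTTP service that the indexer calls. A minimal sketch of such a request, assuming it is running on its default local port and using a made-up tracker URL:

    ```python
    # Sketch of asking FlareSolverr to fetch a Cloudflare-protected page.
    import requests

    resp = requests.post("http://localhost:8191/v1", json={
        "cmd": "request.get",
        "url": "https://some-tracker.example/search?q=some+show",
        "maxTimeout": 60000,  # milliseconds
    })
    resp.raise_for_status()
    solution = resp.json()["solution"]
    html = solution["response"]    # page HTML after the challenge is solved
    cookies = solution["cookies"]  # clearance cookies the indexer can reuse
    ```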

    In simple terms, that’s how it works. All these programs expose a web interface and an API, and they send each other HTTP requests to communicate.