
Usenet 'suckup' feed?

FreekFreek Member
edited December 2012 in General

This is just another brainfart of mine, so feel free to say 'You're nuts'.
I read somewhere* that a guy managed to set up a 'Usenet suckup feed' which turns your server into a 'usenet feed/mirror'. It basically just downloads the files from another usenet provider and forwards them directly via your server to the end user.
I'm curious how this guy accomplished this. What's the correct terminology for this, i.e. what should I search for?
Giganews seems to call it an 'IHave and Suck Feed service', but I'm curious whether this can be accomplished with other providers as well.

*Source: http://filesharingtalk.com/threads/444373-GIBInews?s=096a1daddffa62d1186cce83004fc1a6&p=3649238&viewfull=1#post3649238

Comments

  • If you have enough bandwidth and harddrives to suck up 10tb/day go for it!

  • Creating an index with Newznab is one thing - mirroring all the content seems like quite another.

  • FreekFreek Member
    edited December 2012

    @gsrdgrdghd said: If you have enough bandwidth and harddrives to suck up 10tb/day go for it!

    Not planning on deploying this commercially/large scale or even setting this up. Just interested in the technology behind this :)
    'For science' ;)

    @abat said: Creating an index with Newznab is one thing - mirroring all the content seems like quite another.

    You're not mirroring everything. You're basically a tunnel to their service; nothing is permanently stored on your server, just a small cache.
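    The 'small cache' idea above can be sketched roughly like this: keep recently requested articles in a bounded LRU cache and pull anything else from the upstream provider on demand, so nothing lives on the box permanently. A minimal illustration, not any provider's actual implementation; the `fetch_from_upstream` callback and the size limit are assumptions:

    ```python
    from collections import OrderedDict

    class ArticleCache:
        """Tiny LRU cache for Usenet articles keyed by Message-ID.

        Nothing is stored permanently: once max_articles is exceeded,
        the least recently used article is evicted.
        """

        def __init__(self, max_articles=1000):
            self.max_articles = max_articles
            self._store = OrderedDict()

        def get(self, message_id, fetch_from_upstream):
            # Serve from the local cache if we already have the article...
            if message_id in self._store:
                self._store.move_to_end(message_id)
                return self._store[message_id]
            # ...otherwise pull it from the upstream provider and cache it.
            body = fetch_from_upstream(message_id)
            self._store[message_id] = body
            if len(self._store) > self.max_articles:
                self._store.popitem(last=False)  # evict the oldest entry
            return body
    ```

    A real tunnel would sit behind an NNTP listener, but the caching logic is the whole trick.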

  • I'm not sure if it does what you think it does. From the Giganews description it seems like this is just an ordinary Usenet feed but instead of having to make peering agreements with all the big providers you just make one with Giganews
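  • For context, the 'IHave' part of Giganews' name refers to the NNTP IHAVE command (RFC 3977, section 6.3.2): a peer offers an article by Message-ID, and the receiving server answers 335 (send it) or 435 (not wanted, e.g. already have it). A toy sketch of that decision, with `known_ids` standing in for a real article database (an assumption for illustration):

    ```python
    def handle_ihave(message_id, known_ids):
        """Decide how to answer an NNTP IHAVE offer (RFC 3977, 6.3.2).

        The offering peer sends: IHAVE <message-id>
        Reply 335 to request transfer, or 435 if the article
        is not wanted (e.g. we already carry it).
        """
        if message_id in known_ids:
            return "435 Article not wanted"
        return "335 Send it; end with <CR-LF>.<CR-LF>"
    ```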

  • Read up on the archive.org people

  • How about, you grab an RSS feed of a particular group (or groups) you want to 'mirror'.
    i.e. http://rss.binsearch.net/rss.php?max=50&g=alt.binaries.british.drama
    Feed that into your newsgroup client, and make it request the feed every hour or so, and download everything.
    Make the completed folder accessible via HTTP (with directory listings?)
    Cron a command to delete all files that are over x days old within the web directory.

    Seems like a waste of bandwidth to me though :(
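  • The cron cleanup step above could be a one-liner with `find`, or a small script like this sketch (the directory layout and retention period are whatever you chose for your web directory):

    ```python
    import os
    import time

    def prune_old_files(directory, max_age_days):
        """Delete files older than max_age_days from the web directory
        (the cron step described above). Returns the paths removed."""
        cutoff = time.time() - max_age_days * 86400
        removed = []
        for name in os.listdir(directory):
            path = os.path.join(directory, name)
            # Only touch regular files; compare against modification time.
            if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
                os.remove(path)
                removed.append(path)
        return removed
    ```

    Run it from cron, e.g. daily: `0 3 * * * python3 /opt/prune.py`.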

  • @gsrdgrdghd said: I'm not sure if it does what you think it does. From the Giganews description it seems like this is just an ordinary Usenet feed but instead of having to make peering agreements with all the big providers you just make one with Giganews

    Ouch, that is indeed a big difference from what I thought it was!

    @bamn said: Read up on the archive.org people

    They don't index usenet, do they?

    @ElliotJ said: Seems like a waste of bandwidth to me though :(

    VPSes come with tons of BW nowadays. I never even managed to exceed 250GB in a month so 'why can't I hold all these TBs' ;) ?
    Nice idea; sadly not very practical, as your usenet client can't pull from it.
