Need some help - SaaS Provider is dead but my site is still up, how should I grab it?

Hey guys,

DevPort, the SaaS that powers my homepage, has died. I can no longer log in, and their Twitter/Facebook/Blog are long gone.

How can I grab my site (http://0xdragon.com) before it too vanishes into the abyss?

Thanks for all your help!

Comments

  • wget -mk -w 3 -e robots=off http://0xdragon.com
    
    Thanked by ehab, 0xdragon, netomx
  • wget recursive or HTTrack

    Thanked by 0xdragon, flopv
  • Why you no backup?

    Thanked by ehab, rm_, J1021, doughmanes
  • @budi1413 said:
    Why you no backup?

    No backups available :)

    @GStanley said:

    > wget -mk -w 3 -e robots=off http://0xdragon.com
    > 

    Thanks, it's running now.

  • Get ArchiveTeam on the case?

    Thanked by netomx
  • @0xdragon said:
    No backup available

    Damn son.

  • Looks like the earlier command doesn't grab all the assets, such as images. Now to figure out how to include the AWS domains...

  • hawc Moderator, LIR

    @0xdragon, want me to try and get a copy of it into the Internet Archive, so you can rebuild from that?

  • dopuloss Member
    edited October 2015

    @GStanley said:

    > wget -mk -w 3 -e robots=off http://0xdragon.com
    > 

    What exactly does this command do? If the site were a dynamic site, let's say powered by PHP and MySQL, how would you get a full database backup in such a situation?

  • Clouvider Member, Patron Provider

    @dopuloss if you don't have access to the files or the database directly, you simply won't be able to do that.

  • @hawc said:
    0xdragon, want me to try and get a copy of it into the Internet Archive, so you can rebuild from that?

    It's still online :) So if it comes to that, I'll do that. Thanks though!

  • dopuloss said: What exactly does this command do? If the site were a dynamic site, let's say powered by PHP and MySQL, how would you get a full database backup in such a situation?

    In this situation, you aren't going to be able to get the full database backup. He wants a static mirror of his site.

    Change it to this; it should get all of your site, including the Amazon assets. Look for folders like devport-prod.s3-us-west-1.amazonaws.com:

    wget -mk -w 3 -e robots=off --show-progress --span-hosts http://0xdragon.com
     

    As for what the options do:

           -m
           --mirror
               Turn on options suitable for mirroring.  This option turns on recursion and
               time-stamping, sets infinite recursion depth and keeps FTP directory listings.
               It is currently equivalent to -r -N -l inf --no-remove-listing.

           -w seconds
           --wait=seconds
               Wait the specified number of seconds between the retrievals.  Use of this option
               is recommended, as it lightens the server load by making the requests less frequent.

           -k
           --convert-links
               Make links in downloaded HTML or CSS point to local files.

    @0xdragon, since not all of the files were downloaded, perhaps play with some flags; the command above should work. I've added --show-progress and --span-hosts (a more restricted variant is sketched after the flag descriptions below). I've never had that issue.

           -H
           --span-hosts
               Enable spanning across hosts when doing recursive retrieving.
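
    If host spanning starts pulling in unrelated sites, a more restricted variant should keep the crawl to your own domain plus the S3 bucket. This is only a sketch - the bucket name is taken from the folder mentioned above, and -p (page requisites) is an extra guess to make sure images and CSS come along; adjust to whatever actually shows up in your pages:

    wget -mkp -w 3 -e robots=off --show-progress --span-hosts \
         --domains=0xdragon.com,devport-prod.s3-us-west-1.amazonaws.com \
         http://0xdragon.com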
    
    Thanked by 0xdragon
  • joepie91 Member, Patron Provider
    edited October 2015

    I've at least added it to ArchiveBot, which will download the entire (public) site and put it in the Wayback Machine, as static pages. You can follow the progress here while it is running.

    Of course it can't find anything that isn't publicly linked :)

    EDIT: I'm seeing a 502 error on http://0xdragon.com/projects/ox.developerportfolio.com. This might indicate that infrastructure is already failing?

    EDIT2: The site of DevPort seems to be up? At least it redirects to http://developerportfolio.com/.

    Thanked by hawc, netomx
  • @joepie91 said:
    I've at least added it to ArchiveBot, which will download the entire (public) site and put it in the Wayback Machine, as static pages. You can follow the progress here while it is running.

    Of course it can't find anything that isn't publicly linked :)

    EDIT: I'm seeing a 502 error on http://0xdragon.com/projects/ox.developerportfolio.com. This might indicate that infrastructure is already failing?

    EDIT2: The site of DevPort seems to be up? At least it redirects to http://developerportfolio.com/.

    Yes, their site is up, but try logging into it via GitHub.

  • @0xdragon so did you try HTTrack? It pulls everything visible to the public, and you can specify additional URLs that may be hidden from the public.

    I used it to make HTML hard copies of a site built with Joomla CMS; a rough command-line example is sketched below.
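
    For reference, an HTTrack command line for this kind of mirror might look roughly like the following; the output directory and the filter patterns (including the S3 bucket mentioned earlier in the thread) are only illustrative guesses:

    httrack "http://0xdragon.com/" -O ./0xdragon-mirror \
        "+*.0xdragon.com/*" "+devport-prod.s3-us-west-1.amazonaws.com/*" -v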

    Thanked by netomx
  • What makes you think they are dead?

    joepie91 Member, Patron Provider

    @0xdragon said:
    Yes, their site is up, but try logging into it via GitHub.

    Aha, I see. I've added it to the list of proposed projects here anyway... not sure whether somebody will have the time to follow up on it as a whole. Either way, anything public-facing on your site should be saved by now :)

  • @joepie91 said:
    Aha, I see. I've added it to the list of proposed projects here anyway... not sure whether somebody will have the time to follow up on it as a whole. Either way, anything public-facing on your site should be saved by now :)

    Awesome! Is there a place I can view/download it?

  • @MarkTurner said:
    What makes you think they are dead?

    Aren't you getting 502 errors?

    Anyway, to answer your question: the OP said that their blog, Facebook, and Twitter are gone and that he cannot log in.

  • netomx Moderator, Veteran

    joepie91 said: You can follow the progress here while it is running.

    WOW, nice page! I loved it!

  • @Hidden_Refuge - the blog is there (last entry Friday, May 8, 2015, 3:00 AM), and Twitter and Facebook are fine.

  • ItsChrisG Member
    edited October 2015

    If anyone has the contact information for whoever ran/runs DevPort / DeveloperPortfolio, can you please share it with me or have them contact me?

    If they are indeed dying/shutting down, I want to keep it online and running for free as it is now.

  • @ItsChrisG said:

    If they are indeed dying/shutting down, I want to keep it online and running for free as it is now.

    Maybe try the whois info: https://gist.githubusercontent.com/anonymous/bfb80e0ac231781c81b0/raw

  • Hey 0xdragon, I'm the creator of DevPort. ItsChrisG reached out to me regarding this issue.

    Sorry you're having trouble with the site, there were some issues with redis (session store). I pushed a fix, you should be able to log into your account now.

    I'd also like to assure you that I won't be taking DevPort down anytime soon, and if I did, I would allow you to export all your data.

    Best!
    Ian

  • 0xdragon Member
    edited October 2015

    @ianjennings said:
    Hey 0xdragon, I'm the creator of DevPort. ItsChrisG reached out to me regarding this issue.

    Sorry you're having trouble with the site, there were some issues with redis (session store). I pushed a fix, you should be able to log into your account now.

    I'd also like to assure you that I won't be taking DevPort down anytime soon, and if I did, I would allow you to export all your data.

    Best!
    Ian

    Thanks for the reassurance, you had me worried for a month or so :)

    Trying to add new projects and modify data keeps giving me 502 Bad Gateway issues.

    502 Bad Gateway
    nginx/1.4.6 (Ubuntu)

  • joepie91 Member, Patron Provider
    edited October 2015

    @0xdragon said:
    Awesome! Is there a place I can view/download it?

    Yeah, anything saved by ArchiveBot (or well, our instance of it anyway) goes into the Wayback Machine directly. Generally takes anywhere between an hour and a day, depending on how much is being archived - it's batch-imported in fixed-size chunks.
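
    For reference, one quick way to check whether a snapshot has shown up yet is the Wayback Machine's availability API; the command below is only an illustration, not anything ArchiveBot-specific:

    curl "https://archive.org/wayback/available?url=0xdragon.com"
    # returns JSON; "archived_snapshots" lists the closest capture, if any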

    @ianjennings said:
    Hey 0xdragon, I'm the creator of DevPort. ItsChrisG reached out to me regarding this issue.

    Sorry you're having trouble with the site, there were some issues with redis (session store). I pushed a fix, you should be able to log into your account now.

    I'd also like to assure you that I won't be taking DevPort down anytime soon, and if I did, I would allow you to export all your data.

    Best!
    Ian

    Sounds like you need better error reporting ;) Good to see that it's still alive, though.
