Need some help - SaaS Provider is dead but my site is still up, how should I grab it?

Hey guys,

DevPort, the SaaS that powers my homepage, has died. I can no longer log in, and their Twitter, Facebook, and blog are long gone.

How can I grab my site (http://0xdragon.com) before it too vanishes into the abyss?

Thanks for all your help!

This signature wasted 121 bytes of your data allocation.

https://nixstats.com/report/56b53d6465689e44598b4567

Comments

  • wget -mk -w 3 -e robots=off http://0xdragon.com
    
    Thanked by: ehab, 0xdragon, netomx

    True wisdom comes to each of us when we realize how little we understand about life, ourselves, and the world around us.

  • cassa Member, Provider

    wget recursive or HTTrack

    Thanked by: 0xdragon, flopv

    I need to poop

  • Why you no backup?

    Thanked by: ehab, rm_, J1021, doughmanes
  • @budi1413 said:
    Why you no backup?

    No backups available :)

    @GStanley said:

    > wget -mk -w 3 -e robots=off http://0xdragon.com

    Thanks, it's running now.


  • Get ArchiveTeam on the case?

    Thanked by: netomx

    Here lies Nekki. He loved massive amounts of storage, K-Pop and calling people cunts.

  • @0xdragon said:
    No backup available

    Damn son.


  • Looks like the earlier command doesn't grab all the assets, such as images. Now to figure out how to include AWS domains...


  • hawc Member, Moderator, LIR

    @0xdragon, want me to try and get a copy of it into the Internet Archive, so you can rebuild from that?

  • dopuloss Member
    edited October 2015

    @GStanley said:

    > wget -mk -w 3 -e robots=off http://0xdragon.com

    What exactly does this command do? If the site was dynamic, let's say powered by PHP and MySQL, how would you get a full database backup in such a situation?

    Freelance developer for desktop apps; check my iPhone recovery tutorial to restore deleted files

  • Clouvider Member, Provider

    @dopuloss if you don't have access to the files or the database directly, you simply won't be able to do that.

    Clouvider Limited - Leading Hosting & Connectivity Partner || Dedicated Server Sale from £39/m - Our Latest LET Offer
    Cloud Web Hosting | SSD & SAS HA OnApp VPS | US, UK, NL & DE Dedicated Servers | Network Services | Colocation | Managed Services

  • @hawc said:
    0xdragon, want me to try and get a copy of it into the Internet Archive, so you can rebuild from that?

    It's still online :) So if it comes to that, I'll do that. Thanks though!


  • dopuloss said: What exactly does this command do? If the site was dynamic, say powered by PHP and MySQL, how would you get a full database backup in such a situation?

    In this situation, you aren't going to be able to get the full database backup. He wants a static mirror of his site.

    Change to this; it should get all of your site, including Amazon assets. Look for folders like devport-prod.s3-us-west-1.amazonaws.com:

    wget -mk -w 3 -e robots=off --show-progress --span-hosts http://0xdragon.com
     

    As for what the commands do:

           -m
           --mirror
               Turn on options suitable for mirroring.  This option turns on recursion and time-stamping, sets
               infinite recursion depth and keeps FTP directory listings.  It is currently equivalent to
               -r -N -l inf --no-remove-listing.
    
           -w seconds
           --wait=seconds
               Wait the specified number of seconds between the retrievals.  Use of this option is recommended, as it lightens the server load by making the requests less frequent.  
    
           -k
           --convert-links
               Make links in downloaded HTML or CSS point to local files.
    

    @0xdragon, since not all files were downloaded, perhaps play with some flags; the above should work. I've added --show-progress and --span-hosts. I've never had that issue.

           -H
           --span-hosts
               Enable spanning across hosts when doing recursive retrieving.
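
To see which extra hosts --span-hosts would need to cover, one rough approach is to list the external domains referenced by a saved copy of a page. Here is a minimal sketch; the `AssetHostCollector` helper name and the HTML snippet are illustrative (the bucket name is taken from the example above):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class AssetHostCollector(HTMLParser):
    """Collect the hostnames referenced by src= and href= attributes."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value:
                host = urlparse(value).netloc
                if host:  # relative URLs like /local.css have no host
                    self.hosts.add(host)

# Illustrative snippet standing in for a saved page from the site.
page = """
<html><body>
<img src="https://devport-prod.s3-us-west-1.amazonaws.com/img/shot.png">
<a href="http://0xdragon.com/projects/">Projects</a>
<link href="/local.css">
</body></html>
"""

collector = AssetHostCollector()
collector.feed(page)
print(sorted(collector.hosts))
# → ['0xdragon.com', 'devport-prod.s3-us-west-1.amazonaws.com']
```

Any external hosts found this way could then go into wget's --domains= list so that --span-hosts stays limited to them.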
    
    Thanked by: 0xdragon


  • joepie91 Member, Provider
    edited October 2015

    I've at least added it to ArchiveBot, which will download the entire (public) site and put it in the Wayback Machine, as static pages. You can follow the progress here while it is running.

    Of course it can't find anything that isn't publicly linked :)

    EDIT: I'm seeing a 502 error on http://0xdragon.com/projects/ox.developerportfolio.com. This might indicate that infrastructure is already failing?

    EDIT2: The site of DevPort seems to be up? At least it redirects to http://developerportfolio.com/.

    Thanked by: hawc, netomx
  • @joepie91 said:
    I've at least added it to ArchiveBot, which will download the entire (public) site and put it in the Wayback Machine, as static pages. You can follow the progress here while it is running.

    Of course it can't find anything that isn't publicly linked :)

    EDIT: I'm seeing a 502 error on http://0xdragon.com/projects/ox.developerportfolio.com. This might indicate that infrastructure is already failing?

    EDIT2: The site of DevPort seems to be up? At least it redirects to http://developerportfolio.com/.

    Yes, their site is up, but try logging into it via GitHub.


  • @0xdragon so did you try HTTrack? It pulls everything visible to the public, and you can specify additional URLs that may be hidden from the public.

    I used it to make HTML hard copies of a site built with Joomla CMS.
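
For reference, a minimal HTTrack invocation along those lines might look like this; the output directory and the filters are illustrative, not taken from the thread:

```
# Mirror the site into ./0xdragon-mirror; the first +filter also allows any
# subdomain of 0xdragon.com, and further +filters can admit external asset
# hosts such as the S3 bucket mentioned above. -v prints progress.
httrack "http://0xdragon.com/" -O ./0xdragon-mirror \
    "+*.0xdragon.com/*" \
    "+devport-prod.s3-us-west-1.amazonaws.com/*" -v
```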

    Thanked by: netomx

    I'm on vacation in Belize.

  • What makes you think they are dead?

  • joepie91 Member, Provider

    @0xdragon said:
    Yes, their site is up, but try logging into it via GitHub.

    Aha, I see. I've added it to the list of proposed projects here anyway... not sure whether somebody will have the time to follow up on it as a whole. Either way, anything public-facing on your site should be saved by now :)

  • @joepie91 said:
    Aha, I see. I've added it to the list of proposed projects here anyway... not sure whether somebody will have the time to follow up on it as a whole. Either way, anything public-facing on your site should be saved by now :)

    Awesome! Is there a place I can view/download it?


  • @MarkTurner said:
    What makes you think they are dead?

    Aren't you getting 502 errors?

    Anyway, to answer your question: OP said that their blog, Facebook, and Twitter are gone and that he cannot log in.


  • netomx Member, Moderator

    joepie91 said: You can follow the progress here while it is running.

    WOW, nice page! I loved it!

  • @Hidden_Refuge - The blog is there (last entry Friday, May 8, 2015, 3:00 AM), and Twitter and Facebook are fine.

  • ItsChrisG Member
    edited October 2015

    If anyone has the contact information for whoever ran/runs DevPort / DeveloperPortfolio, can you please share it with me or have them contact me?

    If they are indeed dying/shutting down, I want to keep it online and running for free as it is now.

    Email Me: [email protected]
    | http://SwiftInternet.de

    320TB High Bandwidth Servers on 1Gbps, 2Gbps & 10Gbps! PCCW, NTT, Psychz Network Blend (+Addtl Upstreams Coming!)

  • @ItsChrisG said:

    If they are indeed dying/shutting down, I want to keep it online and running for free as it is now.

    Maybe try the whois info: https://gist.githubusercontent.com/anonymous/bfb80e0ac231781c81b0/raw


  • Hey 0xdragon, I'm the creator of DevPort. ItsChrisG reached out to me regarding this issue.

    Sorry you're having trouble with the site, there were some issues with redis (session store). I pushed a fix, you should be able to log into your account now.

    I'd also like to assure you that I won't be taking DevPort down anytime soon, and if I did, I would allow you to export all your data.

    Best!
    Ian

  • 0xdragon Member
    edited October 2015

    @ianjennings said:
    Hey 0xdragon, I'm the creator of DevPort. ItsChrisG reached out to me regarding this issue.

    Sorry you're having trouble with the site, there were some issues with redis (session store). I pushed a fix, you should be able to log into your account now.

    I'd also like to assure you that I won't be taking DevPort down anytime soon, and if I did, I would allow you to export all your data.

    Best!
    Ian

    Thanks for the reassurance, you had me worried for a month or so :)

    Trying to add new projects and modify data keeps giving me 502 Bad Gateway issues.

    502 Bad Gateway
    nginx/1.4.6 (Ubuntu)


  • joepie91 Member, Provider
    edited October 2015

    @0xdragon said:
    Awesome! Is there a place I can view/download it?

    Yeah, anything saved by ArchiveBot (or well, our instance of it anyway) goes into the Wayback Machine directly. Generally takes anywhere between an hour and a day, depending on how much is being archived - it's batch-imported in fixed-size chunks.

    @ianjennings said:
    Hey 0xdragon, I'm the creator of DevPort. ItsChrisG reached out to me regarding this issue.

    Sorry you're having trouble with the site, there were some issues with redis (session store). I pushed a fix, you should be able to log into your account now.

    I'd also like to assure you that I won't be taking DevPort down anytime soon, and if I did, I would allow you to export all your data.

    Best!
    Ian

    Sounds like you need better error reporting ;) Good to see that it's still alive, though.
