LowEndPing queries

NikkiNikki Member
edited November 2012 in General

I'm trying to get a few ideas for redoing LowEndPing's request methods. A few options I've thought of so far:

  1. use shell_exec/exec to fork into the background if allowed, then post back to the master server
  2. write a daemon (Python, PHP, Node.js whatever) to handle requests, seems like the cleanest way right now
  3. use the current system, hanging a PHP process while the request executes (pretty bad if it's a production server, usually isn't)

Any other ideas? It needs to be as efficient as possible; the whole system relies on a 'query id', which is calculated by the master server and will probably be sent to each of the sub-servers.
It also needs to be secure; it can't just be executing commands randomly.
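One way to keep the sub-servers from "executing commands randomly" would be to have the master sign each query id with a shared secret, and have each sub-server verify the signature before running anything. A minimal sketch (the function names and the shared-secret scheme here are my own assumptions, not how LowEndPing actually works):

```python
import hashlib
import hmac

# Hypothetical shared secret, deployed to the master and every sub-server.
SECRET = b"replace-with-a-long-random-key"

def sign_query(query_id: str) -> str:
    """Master side: compute an HMAC tag for a query id."""
    return hmac.new(SECRET, query_id.encode(), hashlib.sha256).hexdigest()

def verify_query(query_id: str, tag: str) -> bool:
    """Sub-server side: refuse any request whose tag doesn't match."""
    expected = hmac.new(SECRET, query_id.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking the match through timing
    return hmac.compare_digest(expected, tag)
```

With this, a sub-server only acts on query ids the master actually issued, whatever transport ends up carrying them.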

Comments

  • i like node.js

  • A python daemon please :>

  • @jcaleb the only problem I see with node.js is that the servers responding to the queries will need it installed =\

  • @Nikki said: @jcaleb the only problem I see with node.js is that the servers responding to the queries will need it installed =\

    Write it in something most environments will have (perl/python).

  • Exactly why I recommended Python; it's about 100x easier to install.

    Or if you want something 'default,' try Perl.

  • NikkiNikki Member
    edited November 2012

    @gubbyte That's what I was thinking, I might have to learn a bit more python then >.>

    I was actually kinda bummed when I found out I couldn't pcntl_fork from CGI/web requests in PHP; that would've made this job 100x easier
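For what it's worth, Python can get the same effect without anything like pcntl_fork: spawn the worker as a detached process and return immediately. A rough sketch using only the stdlib (the worker command is just a stand-in):

```python
import subprocess
import sys

def spawn_worker(args):
    """Launch a worker in its own session so the web request can return
    immediately; the child keeps running even after the parent exits."""
    return subprocess.Popen(
        [sys.executable] + args,
        start_new_session=True,        # detach from our process group
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
```

The request handler just calls `spawn_worker(...)` and responds; the worker posts its results back to the master on its own time.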

  • can you explain how #3 currently works?

  • @jcaleb the frontend forks a CLI script into the background on the master; the CLI uses curl_multi functions to request responses from all the servers, pushing each response into memcache, where the ajax controller sees it and pushes it to the client.

    The remote side is simply shell_exec on ping/ping6/traceroute, validating input, etc.
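Whatever language the remote side ends up in, that validation step is the part worth getting right before anything reaches shell_exec. A sketch of a strict allowlist check, using only Python's stdlib (my own take, not LowEndPing's actual code):

```python
import ipaddress
import re

# Hostnames: labels of letters/digits/hyphens separated by dots (RFC 1123-ish).
_HOSTNAME_RE = re.compile(
    r"^(?=.{1,253}$)"
    r"[a-zA-Z0-9]([a-zA-Z0-9-]{0,62}[a-zA-Z0-9])?"
    r"(\.[a-zA-Z0-9]([a-zA-Z0-9-]{0,62}[a-zA-Z0-9])?)*$"
)

def is_safe_target(target: str) -> bool:
    """Accept only a plain IP address or hostname; reject anything that
    could smuggle shell metacharacters into the command line."""
    try:
        ipaddress.ip_address(target)   # covers both IPv4 and IPv6
        return True
    except ValueError:
        pass
    return bool(_HOSTNAME_RE.match(target))
```

Anything that fails the check never gets near the shell, which matters far more than which daemon framework sits in front of it.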

  • how about having the JavaScript call each server on your list directly?

  • @jcaleb was thinking about that, but it'd require external connections...

  • @Nikki said: write a daemon

  • imma just leave this here

    http://pypi.python.org/pypi/sh

  • NikkiNikki Member
    edited November 2012

    @fly WHERE. HAVE. YOU. BEEN.

    That's AWESOME, I'll look into that

    Tornado + sh possibly? Just accept /ping/(.*), /ping6/(.*) and /trace/(.*) (it looks like sh filters input variables to make sure nobody messes with them)

    So far: http://paste.ee/r/zynxm (Ignore the useless import for escape)
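Along those lines, here's a rough sketch of the dispatch side, with the stdlib's subprocess standing in for the sh module. The route table is my guess at the /ping, /ping6 and /trace endpoints, not the code from the paste:

```python
import re
import subprocess

# Hypothetical route table: URL prefix -> argv template. Passing a list to
# subprocess (no shell=True) means the target is never shell-interpreted.
ROUTES = {
    "ping":  ["ping", "-c", "4"],
    "ping6": ["ping6", "-c", "4"],
    "trace": ["traceroute"],
}

# Only allow characters that can appear in an IP address or hostname.
_PATH_RE = re.compile(r"^/(ping6?|trace)/([A-Za-z0-9.:-]+)$")

def build_command(path):
    """Map a request path like /ping/8.8.8.8 to a safe argv, or None."""
    m = _PATH_RE.match(path)
    if not m:
        return None
    return ROUTES[m.group(1)] + [m.group(2)]

def run_query(path, timeout=30):
    cmd = build_command(path)
    if cmd is None:
        raise ValueError("bad request path")
    return subprocess.run(cmd, capture_output=True, text=True,
                          timeout=timeout).stdout
```

A Tornado handler would just call `run_query(self.request.path)` (ideally off the IO loop); because the argv is built from an allowlisted match, a path like /ping/8.8.8.8;ls simply fails to route.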
