Revision as of 12:24, 26 March 2012 by Stattrav (talk | contribs)



  Users could just transfer/submit their logs to the public FTP server.


Developers could transfer the logs to a specific folder on the server.


Users mail the benchmark file to ""

HTTP API

There is an HTTP API which receives the file as a POST request, which could be sent using curl or an equivalent tool.



An HTTP API has to be made which accepts the benchmark file as a POST message, or uses a file upload mechanism. The HTTP call is embedded into the benchmark shell script, or into a separate script through which the user can just run "benchmark-post" or some such command. The file upload could be automated using urllib2 (Python) or an equivalent library in another language. If this mechanism is included in the benchmark shell script, one could just add an extra argument such as "--push-result-to-web=true".
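The client side of the push described above can be sketched as follows. This is a minimal sketch only: the endpoint URL and the function names are hypothetical, and it uses Python 3's urllib.request (urllib2 fills the same role on Python 2, as the page suggests).

```python
import urllib.request  # urllib2 on Python 2

# Hypothetical endpoint; the real URL would point at the project's server.
UPLOAD_URL = "http://benchmarks.example.org/upload"

def build_upload_request(path, url=UPLOAD_URL):
    """Build a POST request carrying the raw benchmark file as its body."""
    with open(path, "rb") as f:
        data = f.read()
    return urllib.request.Request(
        url, data=data,
        headers={"Content-Type": "application/octet-stream"})

def push_result(path, url=UPLOAD_URL):
    """What a --push-result-to-web=true code path might call."""
    with urllib.request.urlopen(build_upload_request(path, url)) as resp:
        return resp.status
```

A "benchmark-post" wrapper command would then be little more than a call to push_result() with the benchmark file's path.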

FTP sync

Similar to the HTTP push, one can also implement FTP sync at the user end. The files are submitted to a queue folder, and the mechanism here is a polling script which checks for any new files and introduces them into the db and the file storage folder. The polling script (say, at a frequency of 5 minutes) can find the files created after the last poll, then check whether they have already been introduced to the db by comparing their md5sums (which could be stored in a separate table). The script dumps a log to a file, which can be used to check that it has worked properly. A separate cleaning script could check the logs, push into the db and file storage folders any log files that were not moved, and send an email if there are any discrepancies. The log files in the file storage folder could be stored as .gz.
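One pass of the polling script above might look like this. A sketch only: the function names are made up, and the md5 table in the db is stood in for by a plain set of digests.

```python
import gzip
import hashlib
import os
import shutil

def md5sum(path):
    """Hex md5 digest of a file, read in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def poll_queue(queue_dir, storage_dir, seen):
    """One polling pass: ingest unseen files from the queue folder.

    `seen` stands in for the separate md5 table in the db (a set of
    hex digests here).  New files are gzipped into the storage folder
    and removed from the queue; duplicates are just removed.
    Returns the list of newly ingested file names.
    """
    ingested = []
    for name in sorted(os.listdir(queue_dir)):
        src = os.path.join(queue_dir, name)
        digest = md5sum(src)
        if digest in seen:  # already introduced to the db
            os.remove(src)
            continue
        dst = os.path.join(storage_dir, name + ".gz")
        with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
            shutil.copyfileobj(f_in, f_out)
        seen.add(digest)  # in the real script: insert a db row here
        os.remove(src)
        ingested.append(name)
    return ingested
```

Running this from cron every 5 minutes, with the return value appended to the script's own log file, would give the cleaning script something concrete to verify against.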

scp sync

Works like the FTP sync: files are picked up from the queue folder in the same way, with scp used in place of the FTP client.

Mail server

Similar to the FTP/scp sync, a polling script could be written to check the IMAP server and bring in the attachments.
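The attachment-extraction half of that polling script can be sketched with Python's standard email module. The imaplib FETCH itself is omitted here since it needs a live server; `extract_attachments` is a hypothetical helper that takes the raw RFC 822 bytes such a fetch would return.

```python
import email
from email import policy

def extract_attachments(raw_message):
    """Pull file attachments out of one fetched message.

    `raw_message` is the raw bytes of the message (what imaplib's
    FETCH with RFC822 returns); the result is a list of
    (filename, payload_bytes) pairs ready for the queue folder.
    """
    msg = email.message_from_bytes(raw_message, policy=policy.default)
    out = []
    for part in msg.iter_attachments():
        name = part.get_filename()
        if name:  # skip attachments without a usable file name
            out.append((name, part.get_payload(decode=True)))
    return out
```

The surrounding polling loop would log in with imaplib, fetch unseen messages, feed each through this function, and drop the payloads into the same queue folder the FTP/scp verifier already watches.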


Values to be maintained in the db

flat file

Db Schema

Things needed to be scraped from the user for the analysis

User-end tools

Other data needed for analysis

Handling of data for various nodes


FTP/scp sync

FTP/scp verifier

IMAP sync

IMAP verifier

files to db