
Author Topic: usf distributed computing project  (Read 2245 times)

vh

  • formerly mudkipz
  • Posts: 1140
  • "giving heat meaning"
usf distributed computing project
« on: February 22, 2014, 09:37:00 PM »
do you know those things you have to run 24/7 on a computer, like imgget, but inevitably there's downtime, computer crashes, power outages, and shit like that? well, i figured if two people each had a dedicated machine to run these programs on, then if one machine went down the other could take over.

but that's not the whole scope of this thread. you might also request that someone else back up a few important files on their computer in case something happens to yours (encrypt sensitive info first), or run a script (like imgget) for a few hours if your power is about to go down.

so below, list what your project is, and post if you're running someone else's project, so they know.

an entry should consist of the project's:
ram usage
disk usage
cpu usage
dependencies
any other notes
and include the code or exe of course
« Last Edit: February 22, 2014, 09:53:38 PM by vh »

vh

  • formerly mudkipz
  • Posts: 1140
  • "giving heat meaning"
Re: usf distributed computing project
« Reply #1 on: February 22, 2014, 09:41:04 PM »
here's a first post:

ram usage: 10MB average, maximum around 50MB
disk usage: 2MB per week on average, maximum 2MB per hour
cpu usage: eats up one core if given that, about 9% while the cpu is under load; this can be lowered significantly if needed
dependencies: python and urllib2, which is part of the python 2 standard library (so it comes with python)
other notes: needs an internet connection, backs itself up every 10 kiloseconds, and should go in its own folder because it spits out data dumps. the script prints a . if everything is working, and an x if it isn't (probably an internet issue)
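the actual script isn't attached here, so purely as an illustration of the behaviour described above, here's a minimal sketch of that kind of loop. the url, filenames, polling interval, and backup interval are placeholder assumptions, not the real imgget code:

# not the actual script -- a minimal sketch of the loop described above.
# IMAGE_URL, DATA_FILE, and the intervals are placeholder assumptions.
import os
import shutil
import sys
import time
import urllib2

IMAGE_URL = 'http://example.com/latest.png'   # assumed source url
DATA_FILE = 'dump.dat'                        # data dump written into the working folder
BACKUP_EVERY = 10000                          # seconds -- "backs itself up every 10 kiloseconds"

last_backup = time.time()
while True:
    try:
        data = urllib2.urlopen(IMAGE_URL, timeout=30).read()
        with open(DATA_FILE, 'ab') as f:
            f.write(data)
        sys.stdout.write('.')                 # . means everything is working
    except Exception:
        sys.stdout.write('x')                 # x usually means the internet is down
    sys.stdout.flush()
    if time.time() - last_backup > BACKUP_EVERY and os.path.exists(DATA_FILE):
        shutil.copy(DATA_FILE, DATA_FILE + '.bak')   # periodic self-backup
        last_backup = time.time()
    time.sleep(60)                            # assumed polling interval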

Darvince

  • Posts: 1842
  • 差不多 ("more or less")
Re: usf distributed computing project
« Reply #2 on: February 22, 2014, 09:42:24 PM »
pygcm

atomic7732

  • Global Moderator
  • Posts: 3848
  • caught in the river turning blue
    • Paladin of Storms
Re: usf distributed computing project
« Reply #3 on: February 23, 2014, 02:25:47 AM »
i had an idea for a really minimal python script that just loads a page to have imgget run on my website, but then i realized that would require entirely recoding it to work in php, and that sounds not fun. so instead i'll probably just package up imgget, make it throw its images into folders, and ask people to upload craptons of images every week
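purely as an illustration of the "throw its images into folders" part (the folder naming and function names are my own assumptions, not part of imgget):

# sketch: sort downloaded images into one folder per iso week for the weekly upload
import datetime
import os

def weekly_folder(base='imgget_output'):
    # name the folder after the current iso year and week, creating it if needed
    year, week, _ = datetime.date.today().isocalendar()
    path = os.path.join(base, '%d-week%02d' % (year, week))
    if not os.path.isdir(path):
        os.makedirs(path)
    return path

def save_image(data, filename):
    # write raw image bytes into this week's folder
    path = os.path.join(weekly_folder(), filename)
    with open(path, 'wb') as f:
        f.write(data)
    return path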

Yqt1001

  • Posts: 0
    • Airline Empires
Re: usf distributed computing project
« Reply #4 on: February 23, 2014, 10:13:09 AM »
"recoding it to work in php and that sounds not fun"

file_put_contents('filename you want the image saved as', file_get_contents('url where the image comes from'));

Was that so hard? The only problem is that it takes a lot of work to make it so you can create directories (with the proper permissions), so organizing the files is tedious.
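For comparison, since imgget is python anyway, the equivalent fetch plus directory creation isn't much longer in python either. This is just a sketch with placeholder paths and urls:

# python equivalent of the php one-liner, with directory creation -- paths and urls are placeholders
import os
import urllib2

def fetch_to(path, url):
    folder = os.path.dirname(path)
    if folder and not os.path.isdir(folder):
        os.makedirs(folder, 0755)     # create parent folders with the desired permissions
    with open(path, 'wb') as f:
        f.write(urllib2.urlopen(url).read())

fetch_to('images/2014-02/latest.png', 'http://example.com/latest.png')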

atomic7732

  • Global Moderator
  • Posts: 3848
  • caught in the river turning blue
    • Paladin of Storms
Re: usf distributed computing project
« Reply #5 on: February 23, 2014, 10:50:29 AM »
well no, i need it to compare the images and not download them if they are the same and stuff and aaaaa

copying and moving files is really easy yeah but i don't want like 6 copies of the same file

vh

  • formerly mudkipz
  • Posts: 1140
  • "giving heat meaning"
Re: usf distributed computing project
« Reply #6 on: February 23, 2014, 11:00:18 AM »
convert the image to a string and compare md5s?
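a quick sketch of that idea: hash the downloaded bytes and only save when the md5 hasn't been seen before (the names are placeholders, not the real imgget code):

# sketch of md5-based duplicate detection: skip saving if the image hasn't changed
import hashlib

seen_hashes = set()

def is_new_image(data):
    # return True the first time an image (identified by the md5 of its bytes) is seen
    digest = hashlib.md5(data).hexdigest()
    if digest in seen_hashes:
        return False
    seen_hashes.add(digest)
    return True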