Joined: Jun 2006
Posts: 136
member
I have written some PHP code -- about 400 lines -- that lets me pull data off particular websites (nothing too naughty; it's public financial data from the central bank).
The code takes up to 60 seconds to finish executing, mainly because it pulls data down from about four rather slow sites.
It finally produces a small HTML file of the collected data, which is imported and displayed in a sidebar custom island.
My issue is this: I want to run this code about every 30 minutes to keep my site up to date with local financial data. Where should I place it? Can I call it from the body section of the custom islands? And if I do, will there be any speed consequences for my users?
I'd be thankful for anyone's thoughts on this.
Joined: Jun 2006
Posts: 16,304 Likes: 116
You could start running into the PHP max execution limit if it takes that long to run... Also, I'd think about doing a cron job to pull the data rather than loading it in a content island, as the lag it takes to generate the data could hold up the forum from loading... I'd be interested in looking at your code, though.
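To keep the forum itself fast, the island can do nothing but read the HTML file the background job already wrote. A minimal sketch -- the cache path is a placeholder, not anything UBB-specific:

```php
<?php
// The island only reads the file the background job wrote earlier,
// so the page load never waits on the slow remote sites.
$cache = '/path/to/financial.html';   // placeholder: wherever the scraper writes
if (is_readable($cache)) {
    readfile($cache);                 // emit the pre-generated HTML as-is
} else {
    echo 'Financial data temporarily unavailable.';
}
```

If the file is missing (e.g. the job hasn't run yet), visitors see a short fallback message instead of a 60-second stall.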
Joined: Jun 2006
Posts: 136
member
> You could start running into the php max execution limit if it takes that long to run...

Yes, I've already come across the timeout problem. Most systems are set to 30 seconds; I used ini_set("max_execution_time", "60") to fix it.

> Also, I'd think of doing a cronjob to pull the data vs loading it in a content island, as the lag it takes to generate the data could hold the forum from loading...

You mean call PHP from the command line, invoked from a cron job? As in:

php < code.php

I can't seem to bring PHP up like this ... (Unix is not really my forte).

> I'd be interested at looking at your code though

Well, if you show me yours, I'll show you mine. Actually, it's nothing too special: just a lot of data munging to extract data from particular locations within tables, plus a few cleanup routines to put the information in a usable format. I'm not sure it would be of generic interest.
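Besides raising the overall limit, it can help to cap how long each slow site is allowed to stall the run. A sketch, assuming the fetches go through file_get_contents() or a similar stream wrapper (an assumption -- the original code isn't shown; the URL is a placeholder):

```php
<?php
// default_socket_timeout bounds how long a stream read (e.g. an http
// fetch via file_get_contents) may block, per connection.
ini_set('default_socket_timeout', '15');   // at most 15 s per remote read
set_time_limit(120);                       // overall budget; note the CLI
                                           // default is 0 (no limit)

$html = @file_get_contents('https://example.org/rates.html');  // placeholder URL
if ($html === false) {
    // log and skip this source instead of letting it sink the whole run
}
```

That way one unresponsive site costs at most 15 seconds rather than eating the entire execution budget.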
Joined: Jun 2006
Posts: 136
member
OK, I can now run PHP from the Unix command line, and the program executes successfully that way. Any advice on how to set up a cron job to run this every 30 minutes? The last time I set up a cron job was in 1993, and I seem to recall some angry sysops ...
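A sketch of the crontab setup -- the PHP binary location and script path are assumptions, so adjust to the host:

```
# Edit your personal crontab (opens $EDITOR):
#   crontab -e
# and add one line; the */30 minute field fires at :00 and :30 every hour:
*/30 * * * * /usr/bin/php /path/to/code.php >> $HOME/code.log 2>&1
# Verify afterwards with: crontab -l
```

Redirecting output to a log file keeps cron from mailing you every run and gives you something to check when a fetch fails.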
Joined: Jun 2006
Posts: 136
member
Another problem ... it seems my ISP has not given me permission to use crontab. Maybe they heard about the last time I used it.
So it seems I could be back to having to execute the PHP code from within UBB.
Would embedding an exec() call in the PHP section of the custom island setup be a problem?
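If crontab stays off-limits, a common fallback is a "poor man's cron" inside the island: only regenerate when the cached file has gone stale. A sketch, with both paths as placeholders:

```php
<?php
// Regenerate only when the cached HTML is older than 30 minutes. The one
// visitor who trips the refresh pays the ~60 s cost; everyone else gets
// the cached file instantly.
$cache  = '/path/to/financial.html';   // placeholder: file the scraper writes
$maxAge = 30 * 60;                     // 30 minutes, in seconds

if (!file_exists($cache) || time() - filemtime($cache) > $maxAge) {
    require '/path/to/code.php';       // placeholder: the 400-line scraper
}
readfile($cache);
```

The trade-off is that refreshes only happen when someone visits, and one unlucky visitor gets the slow page; a lock file around the refresh would also stop two simultaneous visitors from both triggering it.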
Joined: Jun 2006
Posts: 136
member
The exec() didn't work out, so I inserted

require_once("http://jakchat.com/code.php");

at the top of the PHP section in the custom island concerned -- which I presume imported all 400 lines of code, with all its functions -- and this seems to work. I have noticed no speed hit when using the system, with an update time of 4 minutes for testing purposes.
Nonetheless, it makes me a bit nervous putting so much complex code in that section ...
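One caution on that approach: an http:// include only works when allow_url_include is enabled, and it executes whatever the web server *returns* for that URL -- usually the script's rendered output, not its source code. Including by local filesystem path avoids both issues. A sketch (the path is a guess at the layout, not a known value):

```php
<?php
// A local include runs the actual PHP source, needs no allow_url_include,
// and doesn't depend on the web server fetching from itself.
$path = $_SERVER['DOCUMENT_ROOT'] . '/code.php';   // placeholder path
if (is_file($path)) {
    require_once $path;
} else {
    error_log("scraper include missing: $path");
}
```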
Joined: Jun 2006
Posts: 16,304 Likes: 116
> yes, i've already come across the problem of timeout. most systems are set to 30 seconds. i used ini_set("max_execution_time", "60") to fix it.

Still not really a good answer; perhaps splitting it up into several separate scripts would be better. If it's taking that long to run, that's way too long, imo...

> you mean, call php from the command line, called from cronjob? php < code.php ... i can't seem to bring php up like this ... (unix is not really my forte).

No, more like setting a cron task to execute:

php /path/to/code.php

> well, if you show me yours, i'll show you mine ... i'm not sure it would be of generic interest.

Which one did you want to see? lol... As for using exec or system: some ISPs block them out. If you were to run the script from the command line, it could work fine in the back end of things; cron would execute it on a set schedule, and you wouldn't have to worry about any potential forum lag. BTW, you may not notice any lag now, but trust me, you will, lol...
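A quick way to confirm whether a host blocks exec() and friends is to inspect the disable_functions setting -- a sketch you could run once from the CLI:

```php
<?php
// ini_get('disable_functions') returns the host's comma-separated blacklist.
$raw      = (string) ini_get('disable_functions');
$disabled = array_filter(array_map('trim', explode(',', $raw)));

foreach (['exec', 'system', 'shell_exec'] as $fn) {
    $blocked = in_array($fn, $disabled, true) || !function_exists($fn);
    printf("%-10s %s\n", $fn, $blocked ? 'blocked' : 'available');
}
```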