tommytx

I have asked this before in other PHP forums and have not yet gotten a workable answer...

When I use wget in the cron to fire a PHP file on any remote server, it not only sends an email report to you, it also dumps a report file at the root, actually one level above the root. This is aggravating: if I fired the cron 1,000 times, it would dump 1,000 files on the root. It is inefficient to have to delete these files daily... I could build a cron-run auto-deleter, but surely there is a command to suppress this...
I tried the >> /dev/null thing at the end of the command, and it quieted the email reporting but did not stop the root report..
When I use the cron on the server itself, without going out to the net with wget, I don't get this file, but I need to send this cron to lots of other sites... hence the wget command, since it allows wget http://www.mydomain.com/autoping.php etc..

I am having no problem -q / --quiet-ing all responses, with the exception of the one going to the root...

If this is not the place for this question, or if anyone knows a better place to post it, please let me know..
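For reference, the kind of cron entry I mean is roughly this (the domain is just an example, not my real site):

* * * * * wget http://www.mydomain.com/autoping.php

and every run leaves another autoping.php, autoping.php.1, autoping.php.2, and so on sitting above public_html.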

Thanks
Tom

perkiset

nope, this is the place... but we need to wait for TheDarkness in AU or VS in Philadelphia to come back online as they are the best bet... I have never used WGet but do not have the problems that you outline.

I've had trouble in the past when I had carriage returns OUTSIDE of the <? and ?> in a PHP file... it forced output even when I thought there was none. You might make sure that all of your files *and includes* have NO text AT ALL outside of the PHP boundaries.
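For example, an include like this (names made up, but the shape is the thing) looks harmless:

<?php
// helper.php
function doPing() {
    // does its work without printing anything
}
?>

but if there is so much as a blank line or a space after that closing ?>, PHP sends it straight to output. Leaving the closing ?> off entirely in pure-PHP include files is one way to make sure nothing can leak.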

I wonder if the trouble is with wget, because I get no email to root or me on any of my servers, and I run an absolute boatload of cron and spawned tasks, and the vast majority are all PHP now...

nutballs

I have the same problem. It's wget that's doing it, and it's the invisible magic characters perk is talking about.

I came up with a typical ghetto nutballs solution, since I am too lazy to actually figure out the right way...
I just added a cron to delete the files every so often, like you said. I would like to know the "correct way" though.
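Something along these lines anyway (the path and schedule are just examples):

0 * * * * rm -f /home/username/autoping.php*

Once an hour it just nukes whatever copies wget has dropped into the home directory.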

tommytx

Thanks for the response, Nutballs. Yes, I fully understand it is a normal function of wget and always has been, but there must be a simple suppression command. If I could use the cron just on its own server I wouldn't need wget for the command, all would be well, and it would not print the file above the root. And I don't know of any way to address a PHP file that is not on your server without wget. If the commands were only on my server, which they normally are, it would not be a problem at all.

I am an avid PHP programmer, but I am not sure how to address above the root. I know how to write a program to delete files at the root, but this is above the root (above /public_html) and I am not sure how to command a deletion there. Would you be willing to share your idea... so maybe, as a temporary fix, I could duplicate it..
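(Reaching that directory from PHP on the same server is just a matter of an absolute path; a rough sketch, with an example home path and filename pattern:

<?php
// cleanup.php - delete whatever wget has dropped one level above public_html
$home = '/home/username';                      // example path; one level above /public_html
foreach (glob($home . '/autoping.php*') as $junk) {
    unlink($junk);
}
?>

run from its own cron entry on that server, though it is still only a workaround.)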

Thanks for your input, Perkiset. I see you also are heavy into crons... you might be interested in checking out a new idea about cron activity at http://autocron.info, interesting to say the least.
<quote>
I get no email to root or me on any of my servers and I run an absolute boatload of cron and spawned tasks, and the vast majority are all PHP now...
</quote>

tommytx

Oh, by the way, if you try that link and it's dead, just do a refresh and it picks up just fine.. I left a message that the page is slow to load and sometimes needs a refresh... maybe it's a simple problem.. I have had lots of problems with those .info pages loading... guess there are just getting to be too many .info out there.

vsloathe

Initially, I would say that you could just write a PHP script that does file_get_contents (or fopen, or whatever) to your remote scripts, and call that locally via PHP on the command line. I have done this in the past.
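Something like this (URLs and path are placeholders, and it assumes allow_url_fopen is on):

<?php
// remote_ping.php - run from cron with the PHP CLI instead of wget
$urls = array(
    'http://www.mydomain1.com/autoping.php',
    'http://www.mydomain2.com/autoping.php',
);
foreach ($urls as $url) {
    // hitting the URL fires the remote script; the output is simply ignored
    @file_get_contents($url);
}
?>

and then the crontab calls the local script instead of wget:

* * * * * /usr/bin/php /home/username/remote_ping.php > /dev/null 2>&1

Nothing gets written to disk, so there is no junk file to clean up.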

For another solution, let me not speak too soon. I need to go look at my wget docs and see if there's a simpler way for you to do this.

vsloathe

Ok, for a simpler (though decidedly less elegant) solution using wget from cron, use the "no clobber" flag; -nc is the shorthand.

If you set this flag, your cron job will create one copy of the file, and subsequent requests will neither overwrite it nor create new copies of it. So -nc should work if you just want to "activate" your remote scripts without downloading anything (except the first time).
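In other words, something like this in the crontab (URL is a placeholder):

* * * * * wget -nc http://www.mydomain.com/autoping.php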

P.S. I am not sure this will solve your problem, but I am wondering - is it the cron output or the output from wget that is creating this report? The reason I've answered the way I have is that I'm guessing it's actually wget downloading the full contents of your PHP scripts each time. Give it a shot and let me know if this works for you.

tommytx

Thanks for your response vsloathe.

However, here is what happened... it did prevent the buildup of files above the root.. I don't know the name of that area, but it is one level above the /public_html root...

I left the email sender running to see what is going on and the email sends me the following each minute. I set the cron to fire every minute with the wget command...

File `tommytx.php' already there; not retrieving.

Also, the data in the file is not the actual PHP source, it's simply whatever the printed output of the PHP file is... Just thought of a possibility... if I suppress all output of the PHP file, would it then skip making the file above the root? That is possible... as it is not necessary for my PHP to print anything to the screen, or to stdout at all... but I have a feeling it will just dump an empty file there... I will go try that now and let you know...

So, bottom line, it stopped the file buildup... but it also stopped firing off the PHP file... so it killed ops.. and that will never do... it's a do-only-once now, and it never activates again unless you delete the file above the root..

Anyway, on my way to suppress all output of the PHP file to see if that will stop the file buildup...
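The plan is basically this (just a sketch; the real work goes where the comment is):

<?php
ob_start();                // buffer everything the script would normally print

// ... the actual pinging work goes here ...

ob_end_clean();            // discard the buffer, so the response body is empty
?>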

Will let you know


tommytx

Now that really sucks... no joy... I made certain that no output was printed to the screen during PHP activation, and lo and behold the damn thing still drops an empty file of 0 bytes... what a piece of shit... there must be a command to suppress the output file and still allow the script to activate..

I am hoping nutballs will feel sorry for me and send me a copy of his deleter.... I guess I can build my own, but I'm not sure how to address the above-root location.... It's ridiculous to have to write a program to suppress a simple thing like that... but I fire several hundred crons a day, so the buildup will be too much.... At least with output suppressed it does not use much disk space, since the files are zero bytes, but it will still put tons of the same item in my directory..

Shoot... thanks for the help so far; if you can think of anything else I can try, just let me know..

Tom

perkiset

As a side question, do you have to use wget? file_get_contents is similar and completely under PHP's control... cURL is really strong... even the webRequest2 class will do it... what is the nature of your app?

tommytx

I am not aware of any command you can use with a cron, with the exception of wget, when you need a cron on one server to activate a PHP file on another server. Is there something else? I was not aware that cURL or file_get_contents or webRequest2 had any capability to fire off a file on another server.

If the files I am lighting off were on the same server it would not be a problem.. but I know of no command to fire a file on a different server but wget... does anyone know any other... maybe an example..

The application is a program that must be run at specific intervals, which is the primary purpose of a cron. The cron fires the PHP program, which does the job and then resets the cron to the time for the next firing.. in other words, when the cron fires, it activates a PHP program on a different server, the job gets done by that PHP program, and as a final step it sends a command back to the cron telling it when it should fire again...

Now here is how I run a cron to a file when it's on the same server:

cd /home/username/public_html/; /usr/bin/php ping1.php

And this will not generate the file on the root at all... so all is well, but this direct command cannot be sent to another server... the only way I know of to send a cron signal to another server is via the wget command.
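The remote version has to go through HTTP instead, something like (the domain is just an example):

*/5 * * * * wget http://www.otherdomain.com/ping1.php

and that is exactly the form that leaves a downloaded copy behind on every run.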

Does anyone know any other method..

perkiset

Ahhhh I get it.

Consider using Apache to fire off the job... call a "web page" that's just a PHP file on another server and you can fire off jobs all day long. Now this might have some user permissions implications, but those can be gotten around as well.

I use Apache a lot to fire off my processes because it handles threading and memory usage much better for me - for instance, if you fire off 100 concurrent .php files via Apache, you're still in a single instance of PHP, rather than 100 shells and 100 instances of PHP.

Use wget, or cURL, or the webRequest2 class (here in the document repository), or even just file_get_contents('http:// ...') on one server via a cron job that calls Apache to fire it off on another.
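A cURL version would look roughly like this (the URL is a placeholder; webRequest2 usage is analogous):

<?php
// fire_remote.php - cron runs this locally; Apache on the far server runs the real job
$ch = curl_init('http://www.otherdomain.com/ping1.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // keep the response in memory, don't write a file
curl_setopt($ch, CURLOPT_TIMEOUT, 30);           // don't let a slow remote hang the cron run
curl_exec($ch);
curl_close($ch);
?>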

I'm pretty strong in this arena if you choose this path and want help.

/p

thedarkness

wget http://localhost/ -o /dev/null -O /dev/null

OR for the intensely anal

wget http://localhost/ -o /dev/null -O /dev/null > /dev/null 2>&1

ANY output in a cron job will end up in an email being sent to root by default. This can be adjusted by changing

MAILTO=root

to, say,

MAILTO=foo@bar.com

at the top of your cron file (use "crontab -e" or edit /etc/crontab directly).
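Put together, a crontab along these lines (address, URL and schedule are only examples):

MAILTO=foo@bar.com
*/5 * * * * wget http://www.mydomain.com/autoping.php -o /dev/null -O /dev/null

-o sends wget's log to /dev/null and -O sends the downloaded body there, so nothing lands in the home directory and there is nothing left over to mail.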

Cheers,
td

vsloathe

Yeah, ok there's a way to specify that the output should "clobber" your 'wgot' (tee hee) file. I will look it up if I get time later this morning.

Sorry, didn't realize it wouldn't do the fetch at all if -nc was set, thought it would still activate the script but not download the output.

tommytx

Hey,

I love that clobber idea, as no-clobber won't even run the file... clobber would be good if it means the darn thing will just save over itself... then I only have one file and I can deal with that... it's the 3,456,789 files I will have in a year that I don't like, plus the wasted space, as the file is usually around 5 to 10k unless I suppress all output printing, and I would rather not do that, as it would not allow me to run the PHP file directly from the browser occasionally and see if it's working OK by viewing the output..

Thanks for the offer to help, I might just take you up on that... later on, when I get a minute, I will detail exactly what I am doing to see if you have ideas...

However, clobber will do the trick just fine if it still fires the PHP each time...

Thanks for the input Mr Darkness,

But I have tried all that and much more...

<quote>
wget http://localhost/ -o /dev/null -O /dev/null
OR for the intensely anal
wget http://localhost/ -o /dev/null -O /dev/null > /dev/null 2>&1
ANY output in a cron job will end up in an email being sent to root by default. This can be adjusted by changing;
</quote>

None of that will stop it...
wget http://localhost/ -o /dev/null -O /dev/null
The above command and the second one do control and stop the email.. I have used them for years. The one going to the root is not being sent by email; it is internal to the server... I am using a VPS, so the server has access to all 100 of my domains and smartly places the junk file on whichever domain the cron was sent from..

So I can use wget from mydomain7.com and fire PHP on mydomain1,2,3,4.com, or whatever, on any server in the world, and one file will end up on the root of mydomain7.com for every single cron that I fire to anywhere....

So again.. clobber is super if it will clobber and then fire the PHP remotely... -nc no-clobber took care of the file buildup... but as we know, I could have deleted my cron and accomplished as much, since deleting the cron would stop the file buildup but also stop the firing of the PHP.. which it did.... So I am hoping vsloathe can find the clobber command.... got my fingers crossed anyway..

Thanks
Tom

tommytx

<quote>
Sorry, didn't realize it wouldn't do the fetch at all if -nc was set, thought it would still activate the script but not download the output.
</quote>

Yeah... I think the damn thing looks at you funny and says "You told me NC not to dupe or write over, so I ain't, and there ain't no use in me even going to mess with the file since you told me not to download it and I won't... goodbye!" He he..

I really thought I had it whipped when it began to say "You got this one... so I won't download it," until I looked and found that the job did not get done at all... Sure enough, the file buildup went away... but so did job accomplishment..

vsloathe

In here somewhere should be your answer.

From the GNU Wget manual:

<quote>

`-nc'
`--no-clobber'
If a file is downloaded more than once in the same directory, Wget's behavior depends on a few options, including `-nc'. In certain cases, the local file will be clobbered, or overwritten, upon repeated download. In other cases it will be preserved.

When running Wget without `-N', `-nc', `-r', or `-p', downloading the same file in the same directory will result in the original copy of file being preserved and the second copy being named `file.1'. If that file is downloaded yet again, the third copy will be named `file.2', and so on. When `-nc' is specified, this behavior is suppressed, and Wget will refuse to download newer copies of `file'. Therefore, “no-clobber” is actually a misnomer in this mode—it's not clobbering that's prevented (as the numeric suffixes were already preventing clobbering), but rather the multiple version saving that's prevented.

When running Wget with `-r' or `-p', but without `-N' or `-nc', re-downloading a file will result in the new copy simply overwriting the old. Adding `-nc' will prevent this behavior, instead causing the original version to be preserved and any newer copies on the server to be ignored.

When running Wget with `-N', with or without `-r' or `-p', the decision as to whether or not to download a newer copy of a file depends on the local and remote timestamp and size of the file (see Time-Stamping). `-nc' may not be specified at the same time as `-N'.

Note that when `-nc' is specified, files with the suffixes `.html' or `.htm' will be loaded from the local disk and parsed as if they had been retrieved from the Web.
</quote>


Sounds like -r or -p, but without -nc should do the trick. Have you tried that?

thedarkness

I guess the real question here is why the hell is it creating the "junk" file in the first place?

The command I posted produces no output, none whatsoever, and no files on my system.

Cheers,
td

