The Cache: Technology Expert's Forum
 
Author Topic: How to reset a MYSQL PHP data base... for recount and reselect..  (Read 7105 times)
tommytx (Expert, 123 posts)
« on: July 31, 2011, 02:44:25 PM »

My first question in the MySQL forum....
Something is confusing me... it seems to work fine on some servers but not on others. Does anyone know of a workaround, like a database reset or something, that would allow a second count or second selection?

The first pass works fine on most servers, but when the loop comes back around it simply hangs and will not proceed on the second pass.

On average the query selects about 35,000 records... then count($links) says 35k found, and all the other work goes down fine. But when it's time to loop back to select and count again, it hangs at the following command: $links=$this->db->get_results($query);

So how do I know where it's stuck, since no indicators are given? The echo of the query prints fine, but the print_r fails the second time around... and after a program restart it all works fine, once...

So my question: is there a database reset command that would ensure it could select and recount again? And what is weird is that it works fine on some servers but not others....

Quote
$query="SELECT * FROM wplb_comments_blogs WHERE blog_keyword ='-1' and campaign_id = '$camp->id' and status='0'";

echo $query;

$links=$this->db->get_results($query);

print_r($links);

if (count($links) ==0 ){

Oh... if someone is looking for the loop, it's not shown, but I am sure you understand what I meant: it loops for the total number of keywords I happen to be working with at the time. The first loop works, but all the others fail due to the lockup at the point I showed.
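If the first query is only needed for that count, one way to sidestep hauling ~35,000 rows into PHP is to let MySQL do the counting. This is only a sketch: build_count_query() is a made-up helper, and get_var() is the ezSQL-style single-value call the $this->db wrapper appears to follow, not something confirmed by the post.

```php
<?php
// Sketch only: ask MySQL for the count instead of fetching every row
// with get_results() and counting them in PHP.
function build_count_query($campId) {
    $campId = (int)$campId; // crude sanitization, sketch only
    return "SELECT COUNT(*) FROM wplb_comments_blogs "
         . "WHERE blog_keyword = '-1' AND campaign_id = '$campId' AND status = '0'";
}

// Inside the class this would look something like:
// $n = $this->db->get_var(build_count_query($camp->id));
```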

Thanks for any ideas...
perkiset (Administrator, 10096 posts)
« Reply #1 on: August 03, 2011, 10:49:27 PM »

Tommy I'm afraid your question confuses the heck out of me.

Private message me the real code and a snippet of some data, lemmee have a look.

It is now believed, that after having lived in one compound with 3 wives and never leaving the house for 5 years, Bin Laden called the U.S. Navy Seals himself.
tommytx
« Reply #2 on: August 03, 2011, 11:20:33 PM »

Thanks for your interest... but I solved it tonight. Makes no sense at all, but that is the way it is...
When we loaded up MySQL with 37,000 URLs, that is when it would only run through one cycle and stop. I was assuming it was server related because it worked on some servers and not on others, when in fact it was whether the server was flat full... I suppose when overloaded it would stop. So we keep the URLs below 20,000 and all is well... problem solved.... but I am a little surprised that 37,000 items using 2.5 MB broke MySQL. Thanks for the interest.
Will be happy to send it to you if you still want to look, but we can deal with the reduced number of URLs.
perkiset
« Reply #3 on: August 03, 2011, 11:27:55 PM »

LOL no interest mate, I'd have given'r a look just to assist. Glad you've got it sorted.

I think there's more to the story though than the number of URLs, that's not enough to choke MySQL running on a 386 with a Zip drive. What about total timeout? Is the procedure just running too long or something and getting kicked out?

Another thought: I don't like big honkin' routines that do a sh!tload. I'd rather write a procedure that does some then quits ... then cron kicks it off again. Since it runs with just 20K URLs I'm just wondering if you're hitting a time or memory limit on your server, rather than a limit of MySQL. Natch, I keep millions and millions of rows in my tables and have had nary a trouble.
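The "does some then quits" idea could be sketched like this. Everything here is an assumption for illustration: build_batch_query(), the batch size, and the cron cadence are not part of the plugin, just one way to carve the work into slices a shared host won't mind.

```php
<?php
// Sketch of the batch-and-quit pattern: each run pulls one LIMITed
// slice of unposted links rather than the whole table.
function build_batch_query($campId, $offset, $batchSize = 100) {
    $campId    = (int)$campId;
    $offset    = (int)$offset;
    $batchSize = (int)$batchSize;
    return "SELECT * FROM wplb_comments_blogs "
         . "WHERE campaign_id = '$campId' AND status = '0' "
         . "LIMIT $offset, $batchSize";
}

// A cron entry reruns the script every minute; each run processes
// one batch and exits, so no single request runs long or grows big.
```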
tommytx
« Reply #4 on: August 04, 2011, 12:00:52 AM »

What is even stranger is that the only thing that happens is it has to count all the URLs with count($links), and that is what would not get done on the second sweep.
In other words, it comes into the first loop and counts how many URLs it has on hand. Now it can go through and use as many URLs as it needs to get the job done; normally 5 or 6 would be used. Then it loops again, and instead of using common sense and subtracting the 5 or 6 from the 37,000, the dummy counts them again... and then uses 5 or 6 again, and on and on. As long as the total URLs are under 20,000, all is well... and there is not much going on.

As a matter of fact, it counts the total URLs available for posting, then grabs the first one and tries to make a post on the blog URL. If the post fails it will grab another and try again; normally somewhere between 2 and 6 URLs it will get a successful post. Then the dummy counts all the URLs again, like I said... and counting 37,000 URLs is more work than the posting attempt.

Makes no sense... but what I see is what I see. Nothing was changed but lowering the total number of URLs to count.

It totally blew my mind, as at first I thought it related to the server or which version of PHP was being run. But then I found out by accident that it matters not which server; it is only affected by how darn many URLs got dumped into it.
tommytx
« Reply #5 on: August 04, 2011, 12:04:02 AM »

And that is where my title came from: I wondered if MySQL on some servers required a reset of some sort before the next recount could be done, like the counter hit bottom and needed to be moved back to the top.
I know I was grasping at straws......
perkiset
« Reply #6 on: August 04, 2011, 12:04:58 AM »

Everything you're saying points to the local execution instance to me.

Have you tried running it on another PHP machine...?
perkiset
« Reply #7 on: August 04, 2011, 12:06:15 AM »

I wonder also if you are leaking. PHP instances have a memory allotment. Also - are you triggering this via Apache or from the console?
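One quick way to see the allotment perkiset means is to compare php.ini's memory_limit with what the script is actually holding. A minimal check (the 32M-128M range is a typical shared-host assumption, not something from the thread):

```php
<?php
// Report the PHP memory ceiling versus current usage. On shared
// hosts memory_limit is often in the 32M-128M range, and a
// 35,000-row get_results() array can plausibly blow through it.
$limit = ini_get('memory_limit');
$used  = memory_get_usage(true);
printf("memory_limit=%s, in use=%.1f MB\n", $limit, $used / 1048576);
```

Note that CLI and Apache often run different php.ini files, so the limit can differ between console and web runs of the same script.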
tommytx
« Reply #8 on: August 04, 2011, 12:08:43 AM »

It is running on many different servers, a lot of them HostGator, but at least 10 different hosts, so the chance of more than one version of PHP is great.
tommytx
« Reply #9 on: August 04, 2011, 12:17:47 AM »

Take a look; it's very generic and straightforward. The problem is when it loops through the keywords.
I have marked below where it stops.


Quote
/* ------------------------------------------------------------------------*
* Processing Single Campaign Function
* ------------------------------------------------------------------------*/   
function process_campaign($camp){
   // ini
   $max_comments_per_keyword=get_option( 'alb_max_comm', '2' );

   //reading keywords that need to be processed
   $keywords=split(',',$camp->keywords);

   //print_r($keywords);
   foreach($keywords as $keyword){
// This loop
      if (trim($keyword) != ''){
         echome('
---->Processing Keyword;'.$keyword.'
');
         // fetching the links to comment from the table wplb_comments_blogs
      
         //getting keywords from files first
         $query="SELECT * FROM wplb_comments_blogs WHERE blog_keyword ='-1' and campaign_id = '$camp->id' and status='0'";

//does this ok...
echo $query;


         $links=$this->db->get_results($query);

// do not print here it will be 37,000 long...wow
// print_r($links);
          

// does this on 1st loop then no more
// unless the links are less than 20,000
// hangs here never moves to next command
print count($links);
         if (count($links) ==0 ){
         
// never makes it here but one..
// echome is just a print to screen routine.
echome( '
-------------------------------------count files links=0-----------------------
');
            $query="SELECT * FROM wplb_comments_blogs WHERE blog_keyword ='$keyword' and campaign_id = '$camp->id' and status='0'";
            $links=$this->db->get_results($query);
         }
         //$query="SELECT * FROM wplb_comments_blogs WHERE blog_keyword ='$keyword' and campaign_id = '$camp->id' and status='0'";
         //$links=$this->db->get_results($query);
      

      
         $i=0;
         $processed=0;
         while( ($processed < $max_comments_per_keyword) ){

         // if the keyword doesn't have any links available, call Google to get 100 links
         if( ( ($i + 1) > count($links) )){
            if($this->fetch_links($keyword,$camp) == false) break;
            $query="SELECT * FROM wplb_comments_blogs WHERE blog_keyword ='$keyword' and campaign_id = '$camp->id' and status='0'";
            $links=$this->db->get_results($query);
            $i=0;
         }

            $link=$links[$i];
            echome('
--------->should now process:'.$link->blog_url);
            $blog_id=$link->blog_id;
            if($this->place_comment($link,$camp) == 'true'){
               $processed++;
               echome("
Processed:$processed
");
               //updating the status of the comment to 1
               $query="UPDATE  wplb_comments_blogs  SET  status  = '1' WHERE blog_id ='$blog_id'";
               $this->db->query($query);
               // Incrementing the posted flag number
               $query="UPDATE wplb_comments_post SET posted = posted + 1 WHERE id='$camp->id'";
                echome($query);
               $this->db->query($query);
               $this->log('New Comment Posted on :'.$camp->id,$link->blog_url.';'.$keyword);
            }else{
               //hard remove the link from the queue
               $query="DELETE FROM  wplb_comments_blogs  WHERE  blog_id  = $blog_id";
               $this->db->query($query);
            }
            $i++;
         }
      }
   }

}


tommytx
« Reply #10 on: August 04, 2011, 12:21:41 AM »

It really hangs at the $links= line, in my opinion.

Quote
// actually I think it hangs here in get_results, since I have no proof it makes it past the $links= below..

         $links=$this->db->get_results($query);

// do not print here it will be 37,000 long...wow
// print_r($links);
         

// does this on 1st loop then no more
// unless the links are less than 20,000
// hangs here never moves to next command
print count($links);
         if (count($links) ==0 ){

perkiset
« Reply #11 on: August 07, 2011, 02:16:15 PM »

Tommy that all looks RAM problemish to me, since you're essentially trying to bring back the entire result set. That's a really bad way to do things.

What you want to do is create a cursor or result set then walk through one by one, if the set is that large. My guess is that the default PHP configuration has a memory max that is being used at all your hosts which is why it looks consistent.

Here is some code using my database class and the way I look at it:

$db = new dbConnection($host, $username, $password, $db);
$db->query('select * from aReallyBigTable');
while ($db->fetchAssoc())
{
    if ($db->row['field_name'] == 'some value')
    {
        // Do Something
    }
    echo '.';
}

echo "\nAll Done\n";

The downside here is that you lose the speed uptick of just getting essentially a huge blob of data in one shot, but it doesn't matter if you have 19 bazillion records, this will work. Note that this also allows for the handy little advancing dot mechanism to show my progress (assuming you're doing this from the console and not a webpage).

Another element of this methodology allows you to store a pointer in a file (something like "I was at record 17,426") and then create this procedure as a CRON job so that it does, like, 100 records in one shot then dies, and gets restarted by CRON every minute. This is more friendly to your host and is less likely to get you smacked.
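The pointer-in-a-file idea sketched in plain PHP. The file path, slice size, and function names here are made up for illustration; only the pattern (resume, process a slice, record position, exit) comes from the post.

```php
<?php
// Sketch of the "pointer in a file" restart pattern for cron jobs.
function read_checkpoint($file) {
    // No file yet means we are starting from the beginning.
    return is_file($file) ? (int)file_get_contents($file) : 0;
}

function write_checkpoint($file, $pos) {
    file_put_contents($file, (string)$pos);
}

// Each cron-triggered run: resume where the last run stopped,
// process a fixed slice of rows, record the new position, exit.
$ckpt  = sys_get_temp_dir() . '/poster.checkpoint';
$start = read_checkpoint($ckpt);
// ... process rows $start .. $start + 99 here ...
write_checkpoint($ckpt, $start + 100);
```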