The Cache: Technology Expert's Forum
 
Author Topic: tracking user stats using PHP  (Read 5253 times)
skyts
Rookie
Posts: 22
« on: July 11, 2007, 07:26:14 AM »

When I use my PHP cloaking script to redirect users out of my site (based on IP, and very similar to the one posted here: http://www.syndk8.net/forum/index.php/topic,7864.0.html), I get an annoying problem: my Awstats just doesn't record any information, like the keywords people are coming in on, so I don't know shit besides what the affiliate company is telling me, and I'm sure they're shorting me.

Does anyone have any idea how to enable tracking when using a PHP script like this, or should I build a small tracking application into the existing cloaking script?

Thanks for any help Smiley
Logged
perkiset
Olde World Hacker
Administrator
Lifer
Posts: 10096
« Reply #1 on: July 11, 2007, 09:19:56 AM »

Absolutely, build a little tracking app inside the cloaking script.

At that moment in time, you have everything: they're in your script, you know everything about them, and you have access to the DB. Grab whatever you want to know about them and toss it into the database BEFORE executing either the surfer or spider side of your cloak.

Every single app I have, even straight-up WH sites, does exactly the same thing: the moment the surfer arrives, I track what I want from them, then I produce their page.
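A minimal sketch of that pattern: grab everything interesting from the request, log it, then run the cloak. The table, columns, and credentials below are hypothetical, so the mysqli part is commented out and only the pure helper is live.

```php
<?php
// Sketch of the idea above: capture what you know about the visitor the
// moment they arrive, BEFORE running the surfer/spider branch of the cloak.

// Pure helper: build the row to log from the server environment.
function collectVisitorRow(array $server, int $now): array
{
    return [
        'ip'         => $server['REMOTE_ADDR']     ?? '',
        'user_agent' => $server['HTTP_USER_AGENT'] ?? '',
        'referer'    => $server['HTTP_REFERER']    ?? '',
        'uri'        => $server['REQUEST_URI']     ?? '',
        'hit_time'   => gmdate('Y-m-d H:i:s', $now),
    ];
}

// At request time: log first, then cloak (hypothetical table/credentials).
// $db   = new mysqli('localhost', 'user', 'pass', 'stats');
// $row  = collectVisitorRow($_SERVER, time());
// $stmt = $db->prepare('INSERT INTO hits (ip, user_agent, referer, uri, hit_time)
//                       VALUES (?, ?, ?, ?, ?)');
// $stmt->bind_param('sssss', $row['ip'], $row['user_agent'],
//                   $row['referer'], $row['uri'], $row['hit_time']);
// $stmt->execute();
// ... now decide surfer vs. spider and emit the page.
```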

/p
Logged

It is now believed, that after having lived in one compound with 3 wives and never leaving the house for 5 years, Bin Laden called the U.S. Navy Seals himself.
skyts
Rookie
Posts: 22
« Reply #2 on: July 11, 2007, 01:16:54 PM »

Yeah... you're probably right. I might as well use this opportunity to build a central database for all of my sites, using a remote MySQL database, and get all the stats in one place.

Thanks Smiley
Logged
skyts
Rookie
Posts: 22
« Reply #3 on: July 13, 2007, 10:27:19 AM »

OK, so now I've written a small app to track all the users I redirect.
I'm doing IP delivery against a remote MySQL database, and I was thinking I should use the same database that I query for the IP delivery to also record my websites' stats. It would be done in the same query, which should be the fastest way. Of course, I'd open a new table for that info.

Now I'm logging a lot of traffic into that MySQL database. What do you think of this setup? Will I run into problems with MySQL if I keep logging all traffic from all of my sites into one MySQL database?
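For what it's worth, the combined setup might look like this: one connection, two tables. The host, table, and column names here are all hypothetical, so the DB calls are commented out and only the small decision helper is live.

```php
<?php
// Sketch of one remote MySQL connection serving both jobs: the IP-delivery
// lookup reads one table, the stats logging writes another.

// Pure helper: turn the lookup's match count into a cloaking decision.
function isSpider(int $matchingRows): bool
{
    return $matchingRows > 0;
}

// $db = new mysqli('db.example.com', 'user', 'pass', 'cloak');
//
// // 1. IP delivery: is this address in the known-spider list?
// $stmt = $db->prepare('SELECT COUNT(*) FROM spider_ips WHERE ip = ?');
// $stmt->bind_param('s', $_SERVER['REMOTE_ADDR']);
// $stmt->execute();
// $stmt->bind_result($count);
// $stmt->fetch();
//
// // 2. Stats: log the hit into a separate table on the same connection.
// $log    = $db->prepare('INSERT INTO hits (ip, referer, is_spider)
//                         VALUES (?, ?, ?)');
// $spider = (int) isSpider($count);
// $log->bind_param('ssi', $_SERVER['REMOTE_ADDR'],
//                  $_SERVER['HTTP_REFERER'], $spider);
// $log->execute();
```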
Logged
perkiset
Olde World Hacker
Administrator
Lifer
Posts: 10096
« Reply #4 on: July 13, 2007, 10:32:29 AM »

Hey Skyts -

Over time that table might get pretty large, but other than that I don't see any immediate problems.

My sites do a two-stage cleanup. First, after state records have expired, I distill them into my database, keeping basic data as well as click-by-click data. This helps me understand how my users are experiencing my sites.

Every night, another process takes every state record that is older than 60 days and accumulates it into a single daily traffic record (the extreme detail is no longer necessary). That leaves me with a single record per day on each site, which is very manageable; if I didn't do this I'd have enormous, unwieldy tables, and this way I still have data for month-over-month and year-over-year comparisons.
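As a sketch, that nightly roll-up can be two statements: an INSERT...SELECT that summarizes, then a DELETE that drops the detail. All table and column names below are made up for illustration, so the queries are commented out and only the cutoff helper is live.

```php
<?php
// Sketch of the nightly roll-up described above: detail rows older than the
// retention window collapse into one summary row per site per day, then the
// detail rows are dropped.

// Pure helper: the cutoff date for a given retention window, in UTC.
function cutoffDate(int $retentionDays, int $now): string
{
    return gmdate('Y-m-d', $now - $retentionDays * 86400);
}

// $cutoff = cutoffDate(60, time());
// $db->query(
//     "INSERT INTO daily_traffic (site, day, hits, uniques)
//      SELECT site, DATE(hit_time), COUNT(*), COUNT(DISTINCT ip)
//        FROM hits
//       WHERE hit_time < '$cutoff'
//       GROUP BY site, DATE(hit_time)");
// $db->query("DELETE FROM hits WHERE hit_time < '$cutoff'");
```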

Well done and hope this helps,
/p
Logged

skyts
Rookie
Posts: 22
« Reply #5 on: July 13, 2007, 10:38:27 AM »


That's a wonderful idea. I did make a script that runs as a cron job deleting records past a given date, but I hadn't thought of a script that accumulates the data for each day. I think I'll add that as well.

Always full of good ideas Smiley thanks perkiset


Logged
perkiset
Olde World Hacker
Administrator
Lifer
Posts: 10096
« Reply #6 on: July 13, 2007, 11:28:11 AM »

Most kind. Good luck,

/p
Logged

skyts
Rookie
Posts: 22
« Reply #7 on: July 15, 2007, 02:44:07 AM »

Fish. I've just looked at the stats being created, and I'm not satisfied at all. I used QUERY_STRING, and I got 0 results from it. Even in a session where I had an HTTP_REFERER, QUERY_STRING just didn't give me the keywords people are coming in on.

I guess I should build my own parser for HTTP_REFERER if I want to get the keywords people are using to reach my pages?

Or am I doing something wrong...
« Last Edit: July 15, 2007, 02:55:20 AM by skyts » Logged
perkiset
Olde World Hacker
Administrator
Lifer
*****
Offline Offline

Posts: 10096



View Profile
« Reply #8 on: July 15, 2007, 09:38:36 AM »

Yeah, probably... the keywords live in the referrer URL's query string, not in your own page's QUERY_STRING, which is why that variable comes up empty. I'd build a little class lib that understands each of the inbounds you want to parse, plus an "other" bucket.

For example, since you know what a Google URL looks like, if the referrer lists Google, process the URL with code specific to them and post it to the database. Also mark a little "handled" column true.

Any referrer that you DON'T have a handler for goes into the same table, in a column called "raw" or something, with handled set to false.

Then you can look at the spread of all referrers in one table, handled or not, and once you see enough of a domain that you want to handle, you can write code for it and go back and "handle" those.
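A sketch of that dispatcher, assuming the circa-2007 keyword parameters (Google and MSN used q=, Yahoo used p=); the engine list and the return shape are just illustrative:

```php
<?php
// Sketch of the handler-per-engine idea: known engines get their keywords
// extracted and handled = true; everything else is kept raw with
// handled = false so a handler can be written for it later.

function classifyReferrer(string $referer): array
{
    $host  = strtolower((string) parse_url($referer, PHP_URL_HOST));
    $query = (string) parse_url($referer, PHP_URL_QUERY);
    parse_str($query, $params);

    // Engine hostname fragment => query parameter carrying the search terms.
    $engines = ['google' => 'q', 'msn' => 'q', 'yahoo' => 'p'];

    foreach ($engines as $needle => $param) {
        if (strpos($host, $needle) !== false && isset($params[$param])) {
            return ['handled' => true, 'keywords' => $params[$param], 'raw' => $referer];
        }
    }
    // No handler for this referrer: store it raw for later inspection.
    return ['handled' => false, 'keywords' => null, 'raw' => $referer];
}
```

From there, handled rows feed the keyword columns and unhandled ones land in the raw column, exactly as described above.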


/p
Logged

skyts
Rookie
Posts: 22
« Reply #9 on: July 15, 2007, 02:33:23 PM »

Nice. I'm only interested in traffic from the big 3, so what I did was leave the QUERY_STRING logging in (no results from it so far) and parse Google, Yahoo and MSN for the keywords. All other referrers? They can kiss my ass Smiley



Thanks perk


Logged
skyts
Rookie
Posts: 22
« Reply #10 on: July 17, 2007, 10:23:30 AM »

Now I have a new problem, perk you've gotta help me here...

I made a PHP script that extracts all the data from the database and creates an XLS file for every domain. Then I set up a Linux cron to execute the script every few hours, so I stay up to date.

But now I get errors:

Fatal error:  Maximum execution time of 30 seconds exceeded

And so, I set:
max_execution_time = 200

Guess what? I got this error:

Fatal error:  Maximum execution time of 30 seconds exceeded


So it will only keep growing... What should I do? Set max_execution_time = 1000000000 and forget about it? Or will that have implications for the other scripts I run?

Or maybe I can set the PHP timeout from within the script?

Appreciate any help on this.
Logged
perkiset
Olde World Hacker
Administrator
Lifer
Posts: 10096
« Reply #11 on: July 17, 2007, 09:45:37 PM »

Is this your own box? If it is (or a VPS), set the maximum execution time to 0 so that you're not limited at all. You can also do this with set_time_limit(0), provided your box is not running in safe mode. (If your php.ini edit didn't seem to take, check that you edited the file the cron's PHP binary actually reads; the CLI often uses a different php.ini than Apache does.)

Beyond that, if your job is that huge, you might consider breaking it up into pieces rather than processing it all in one big go. I do this with my clients' eblasts, for example: every minute a cron job looks to see if there's anything to do and, if so, processes it; otherwise it goes back to sleep...
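A sketch of both suggestions together; set_time_limit is real PHP, but the batch size and the fetch/export/mark helpers are hypothetical placeholders:

```php
<?php
// Sketch: lift the time limit for the cron job, then work in fixed-size
// batches so no single run has to chew through everything at once.

set_time_limit(0); // 0 = no limit; ignored in safe mode, the default on CLI

// Pure helper: split the pending rows into batches of $size.
function makeBatches(array $rows, int $size): array
{
    return array_chunk($rows, $size);
}

// foreach (makeBatches(fetchUnexportedRows($db), 500) as $batch) {
//     appendToXls($batch);        // hypothetical: write one chunk to the file
//     markExported($db, $batch);  // so the next cron run skips these rows
// }
```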

Hope this helps,
/p
Logged

skyts
Rookie
Posts: 22
« Reply #12 on: July 18, 2007, 04:07:41 AM »

The only thing I have to tell you is -   Praise

Thanks perk Smiley
Logged