Thread: js ideas.
nop_90

Started a new thread so I don't clutter up perk's and nb's cloak thread.

do an

<img src="http" onerror="alert('no image')">

the error will always occur if a "real" browser is running.
but should not occur if G is running.
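
A minimal working version of the trick (the handler name and the hidden styling are just illustration):

<script>
function browserOnly() {
  // only a visitor that actually executes JS ever gets here
  alert('real browser');
}
</script>
<!-- the src is deliberately unloadable, so any real browser fires onerror;
     a bot that only parses the HTML never runs the handler -->
<img src="http://0.0.0.0/nothing.gif" style="display:none" onerror="browserOnly()">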

perkiset

onerror - that's pretty cool nop. NB - perhaps this could create the instantiation of the ajax request, rather than a set timer or something... I'd bet that the onerror would be considerably more overlooked than setTimer and setInterval...

nice nop

nutballs

very interesting idea. obviously I want the process to be as compact as it can be. so using this method the steps would be (a quick sketch of the cookie check follows the list):

this at least could take the cookie-dropping img out of the equation.

user lands
set a cookie from server
page loads
img triggers the ajax request (bots won't hit this till later and the context is gone.)
ajax then requests the destination URL
ajax checks that the cookie exists (just to double up the browser confirmations)
if all is good, redirect the user to the URL.
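
For the cookie-confirmation step, a client-side check might look like this (the cookie name is a placeholder, not anything from the thread):

// true once the server-set beacon cookie is visible to the page
function hasBeaconCookie() {
  return document.cookie.indexOf('mycookie=') !== -1;
}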

only downside I can see is that the image would be broken for bots. whereas doing a "real" beacon, you can still reply with even a relevant image. the ajax-image method could show a real image to a user (i assume) but not a bot, since all the supporting ajax code wouldn't be available when the bot tries to request the image. I wonder if that would be a good thing? or bad thing?

perkiset

I was seeing the beacon and this onerror jazz as two different points to the same spear. I think that a straight-ahead beacon that drops a cookie is one piece, then doing this onerror is actually what throws the ajax request - this will stop 'bots that might try to execute "root" code... in the onerror handler you call the Requestor for the redirection url. That last piece is the setInterval piece that is watching for a variable containing <something> and a cookie dropped:

[ pseudocode ]
surfer lands
clearCookies();
var theRedir = '';
setInterval starts watching for (cookie[mycookie] > ' ') and (theRedir != '')
image is called that contains the myCookie
errant image tag fires onError routine
onError handler kicks off ajax routine requesting URL
... waiting ...
ajax handler gets url and puts it into var theRedir
interval handler sees cookie has arrived from beacon, theRedir > ' '
interval handler sez: top.location=theRedir

I think that about does it
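
A rough runnable version of that pseudocode - the cookie name and the /geturl endpoint are placeholders, and the XHR wiring is just one way to do it:

<script>
var theRedir = '';

function hasCookie(name) {
  return document.cookie.indexOf(name + '=') !== -1;
}

// the errant image's onerror kicks this off
function requestRedir() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/geturl', true);   // hypothetical endpoint that returns the redirect URL
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      theRedir = xhr.responseText;    // ajax handler puts the url into theRedir
    }
  };
  xhr.send(null);
}

// the watcher: redirect only once both confirmations are in
var watcher = setInterval(function () {
  if (hasCookie('mycookie') && theRedir !== '') {
    clearInterval(watcher);
    top.location = theRedir;
  }
}, 250);
</script>

<!-- the beacon: server sets the cookie but returns a non-image, so onerror fires too -->
<img src="/beacon" style="display:none" onerror="requestRedir()">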

nop_90

It is a very old trick that was resurrected.
And probably the guy I saw it from had stolen it from someone else.

Where I saw it was in a "crackme", a program which you are supposed to crack for fun.
The code was encrypted, no biggie, just a matter of "unpacking" it.
Encrypting also serves the purpose of hiding strings so they cannot be sniffed.

But here was the cute part.
If you ran the program outside the debugger (SoftICE) it would do what it was supposed to.
But if you ran the proggie in SoftICE it would do another thing.
Anyway, in a nutshell he was setting an error handler.
Then he would trigger an exception, but if an exception occurs while in the debugger it will be ignored
(since the debugger works by catching exceptions when it single-steps).
But the actual "program" was inside the exception handler.
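
A loose JS analog of that shape (the crackme was native code under SoftICE; this just shows the real work living in the exception handler):

function realProgram() {
  // ... the code you actually wanted to run ...
}

try {
  // deliberately throws a ReferenceError; a tool that never really
  // executes the script (or that eats exceptions) never gets past here
  thisFunctionDoesNotExist();
} catch (e) {
  realProgram();   // the actual "program" lives in the handler
}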

Encrypting the JS is a good idea, but the weakness is that the decryptor leaves a footprint.
Solution to that is to use a well-known encryptor/decryptor which u can find on the net.
That way u get mixed up with legit sites.
Maybe http://scriptasylum.com/tutorials/encdec/javascript_encoder.html
Then u can do ur nasty inside.
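
Encoders like that typically reduce to the escape/eval pattern, which is exactly the footprint he means (generic shape, not that site's exact output):

// the payload is %-escaped and only becomes readable source when the
// page actually evaluates it - but the unescape/eval wrapper itself
// is a recognizable fingerprint
var payload = unescape('%61%6C%65%72%74%28%31%29');  // decodes to: alert(1)
eval(payload);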

The only real defense you have is stealth.
Once it has been determined that there is some sort of protection it can be removed given enuff time.

This would require a lot of work so not really a serious idea.
But find out "quirks" that various browsers can do,
then hide ur cloaking using that.
For example IE can execute JS in CSS.
If whatever requests the ajax is not an IE browser ......
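
The IE quirk in question is presumably CSS expression(), which only IE's engine (pre-IE8 standards mode) will evaluate - a sketch of using it as a tell:

/* only IE executes expression(), so only IE sets the flag; a later
   script (or the ajax endpoint check) can look for window.__ranCssJs */
body {
  width: expression((window.__ranCssJs = true), 'auto');
}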

I personally think that G parses JS, but does not actually execute it.
Possibly it might change in the future.
But if G actually executes JS this opens up all sorts of possibilities.
(that is why i doubt they execute).

Anyway, take my ideas with a grain of salt.

perkiset

Quote from nop_90:

But here was the cute part.
If you ran the program outside the debugger (SoftICE) it would do what it was supposed to.
But if you ran the proggie in SoftICE it would do another thing.
Anyway, in a nutshell he was setting an error handler.
Then he would trigger an exception, but if an exception occurs while in the debugger it will be ignored
(since the debugger works by catching exceptions when it single-steps).
But the actual "program" was inside the exception handler.

That's hot - I love it.

Quote from nop_90:

Encrypting the JS is a good idea, but the weakness is that the decryptor leaves a footprint.
Solution to that is to use a well-known encryptor/decryptor which u can find on the net.
That way u get mixed up with legit sites.
Maybe http://scriptasylum.com/tutorials/encdec/javascript_encoder.html
Then u can do ur nasty inside.


Actually, in this case NBs is not trying to cloak the way you might be thinking - he wants ONLY valid (read, "dumb") surfers to float through to money pages... anyone else (for the most part) stops at one point or another, particularly bots. This is 180 degrees from me - I only show my spider food to known SE spiders... no one else. It's kind of backwards but makes sense.

Quote from nop_90:

I personally think that G parses JS, but does not actually execute it.
Possibly it might change in the future.
But if G actually executes JS this opens up all sorts of possibilities.
(that is why i doubt they execute).

I think that it is possible that they do - but perhaps it's only if it's deemed necessary. Running JS on a page would take considerably longer than just grabbing the HTML, stuffing it into a DB and running... but it would not be that hard. Consider: if you simply attach to the back end of IE you could tell it what to do, and JS would be executable by default. If you then were watching screen memory you could do a simple comparison to see if <certain things> were visible or not... it really wouldn't be that hard. There's enough creepiness happening that I think there must be *some* execution going on, but perhaps not always. Or something, or perhaps not. I don't fishing know.

/p

nop_90

Quote from perkiset:

Actually, in this case NBs is not trying to cloak the way you might be thinking - he wants ONLY valid (read, "dumb") surfers to float through to money pages... anyone else (for the most part) stops at one point or another, particularly bots. This is 180 degrees from me - I only show my spider food to known SE spiders... no one else. It's kind of backwards but makes sense.

Basically right now I have been throwing stuff up on free servers.
So basically my goals are to use the JS to remove their ads.
Then I place my ads on the page, remove from the page all of the spider food, and remove from the page all links so the user is not distracted.
Or sometimes I redirect the user to a signup page depending on what I am doing.
So I am hiding from 2 things: the free host operator and G.
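
A minimal sketch of that kind of rewrite - the element ids are made up, since every free host marks up its ad blocks differently:

// hypothetical ids: whatever the host wraps its ads / you wrap your food in
var hostAd = document.getElementById('host-ad');
if (hostAd) hostAd.parentNode.removeChild(hostAd);

var food = document.getElementById('spiderfood');
if (food) food.parentNode.removeChild(food);

// strip every link so the user isn't distracted (walk backwards while removing)
var links = document.getElementsByTagName('a');
for (var i = links.length - 1; i >= 0; i--) {
  links[i].parentNode.removeChild(links[i]);
}

// then drop in your own ad markup
document.body.innerHTML += '<div>... your ads here ...</div>';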

Quote from perkiset:
There's enough creepiness happening that I think there must be *some* execution going on, but perhaps not always. Or something, or perhaps not. I don't fishing know

I am starting to think SEs in general play a variant of diceman.
Sometimes I can get like 200/300 pages of a site indexed in 2 days after dropping a link.
But sometimes it is slow, like a week or more.
I thought it was dependent on the freehost, as in maybe it is getting PR from it or something.

So I started making my sites in pairs, both of them with the same KWs just different stuffing
(stuffing is what I call the stuff I stick around the KWs, usually 2 random sentences).
And then I drop the links in pairs to the same site.

I did this 2 days ago.
1 site: 200 pages indexed.
The other site appears to be banned.
I am going to do more experiments but...

so I have no idea

perkiset

I dig how scientifically you attack this stuff. You're really rare because the majority of the BHs I've met are marketers or understand the net, but are not coders and do not have that kind of thinking - or perhaps they think that way, but they cannot execute with that kind of clarity.

thedarkness

Another js encoder, different approach, took me a while to unravel it.

http://hivelogic.com/enkoder/form

A couple of interesting takes on the javascript debate:

http://www.johnon.com/105/google-proof-javascript-redirect.html
http://rathamahata.blogspot.com/2007/01/how-do-search-engines-bots-handle.html

With the image onerror causing a missing img src for the spiders, how about a third image at the top of the page with an onload="change_to_non_existent_image()" that changes the src of the onerror image to something that doesn't exist? So when the spiders try to get the image it is there, but when a surfer comes it changes the src so the image isn't there and triggers the onerror condition. Hope this is clear.
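
Something like this, in rough form (the filenames and the startTheAjax() hook are placeholders for the ajax kick-off described above):

<script>
// swap the trigger's src to something that 404s, so its onerror fires -
// but only for visitors that actually executed this onload handler
function change_to_non_existent_image() {
  document.getElementById('trigger').src = '/does-not-exist.gif';
}
</script>

<!-- both images really exist, so a spider fetching them sees nothing odd -->
<img src="/real.gif" onload="change_to_non_existent_image()">
<img id="trigger" src="/also-real.gif" onerror="startTheAjax()">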

HTH,
td

nutballs

i dig that Dark. so now even another level of blockage.

so the code is now this (a combined sketch follows below):
1 user lands
2 page loads, with a bad image, or better yet, a real image (but not the beacon), where the beacon should be.
3 onload = change image SRC to beacon image URL.
4 beacon drops cookie
5 ajax sees cookie exists now
6 ajax requests redirection URL from server.
7 ajax receives URL and triggers the redirect

so 99% of bots will fail at 3
the remaining smarter bots (if any exist) then have 4 and 6 to contend with. i leave out 5 and 7 because if they can do 4 and 6, the other two are semantic.

that's triple redundant. me likey.
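
Wiring the pieces together, steps 2-7 could look like this (names and paths are placeholders; it reuses the watcher idea from Perk's pseudocode):

<script>
var theRedir = '';

// step 3: a real browser's onload swaps the src over to the beacon
function swapToBeacon() {
  document.getElementById('gate').src = '/beacon';  // step 4: server sets cookie, returns a non-image
}

// the beacon reply isn't an image, so onerror fires next - step 6
function fetchRedir() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/geturl', true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) theRedir = xhr.responseText;
  };
  xhr.send(null);
}

// steps 5 and 7: go only when both the cookie and the URL have arrived
var t = setInterval(function () {
  if (document.cookie.indexOf('mycookie=') !== -1 && theRedir !== '') {
    clearInterval(t);
    top.location = theRedir;
  }
}, 250);
</script>

<!-- step 2: a real image (not the beacon) where the beacon should be -->
<img id="gate" src="/real.gif" onload="swapToBeacon()" onerror="fetchRedir()">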


BTW the reason this is important is because i am getting a LOT of bots that do not appear in my IP databases. I also need more "bullet proof" redirection and shielding of my destination sites from being connected to my source sites.

perkiset

I like it a lot as well.

Be totally willing to work through the entire code base with you NBs... I think it's really worthwhile. Not really that tough either...

nutballs

no i know. i will need the help.

i just need to get other things squared away first.

thedarkness

My javascript is nowhere near the level of Perk's (that shit rocks) but I am happy to offer my humble services if they are required. There's going to be very few cloaking systems out there like this and, for the moment at least, I would say 0 bots that can parse it to the extent that anyone knows what's going on.

Cheers,
td

nutballs

that's why i think this could be hot. if done to a high enough level, this could actually eliminate the IP database from my system. I would really have no use for it anymore. And i just re-upped my subscription, lol.

thedarkness

lol, sell it!

