jp29997

hello smart guys:

I have a 515 MB SQL dump. I can probably figure out a way to open it manually and split it up (it's got a lot of data I don't need to import), but is there a command-line shortcut to split this thing up semi-intelligently, you know, like between blocks of statements?

perkiset

I don't know of any simple commands, but some immediate ideas come to mind:

Can you SELECT into another "temp" table that has precisely what you want, then dump that? (Roughly the sketch below.)
Is it a download-size problem? SSH to that box, store the SQL there, tar.gz it for the trip home.
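
A rough sketch of that first idea, assuming a hypothetical database dbname and table big_table (the names and the WHERE clause are placeholders for whatever you actually need to keep):

  -- on the source box: copy only the rows you care about into a scratch table
  CREATE TABLE export_tmp AS
    SELECT * FROM big_table WHERE keep_me = 1;

  # dump just that one table, then tar.gz it for the trip home
  mysqldump -u youruser -p dbname export_tmp > export_tmp.sql
  tar czf export_tmp.sql.tar.gz export_tmp.sql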

/p

jp29997

nah, download is OK, I have it on my box - it's mainly the pain of opening it. I'm moving this dump to a VPS, and I might as well edit it on my local machine before dumping it, OR split it intelligently to make it easier to deal with as an iterative task.

I can split it with a program like HJSplit for Windows, or split -b or split -l in Linux, and then clean it up, but I was hoping someone might have a little shell script or batch file that did this smarter, like splitting it just before a SQL file comment block:

--
-- dumping...
--
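
Something like this awk sketch is roughly what I'm picturing - it starts a new output file each time it hits one of those "-- Dumping data for table" comment lines (that pattern and the chunk_ filenames are guesses on my part, adjust them to whatever your dump actually says):

  awk 'BEGIN { n = 0; out = "chunk_0.sql" }
       /^-- Dumping data for table/ { close(out); n++; out = "chunk_" n ".sql" }
       { print > out }' dump.sql

The bare -- line just above each header lands at the tail of the previous chunk, which is harmless. csplit can do much the same thing with a /pattern/ instead of a byte or line count, if you'd rather stay with coreutils.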

No big deal, I'll just futz with it - my specialty.

Or, alternatively, does anyone use a text editor that handles gigantic files well? I'm using Notepad++, which is pretty challenged by a file this size.

edit: please don't suggest emacs :rofl:

perkiset

Quote from: jp29997

edit: please don't suggest emacs :rofl:


Fair enough. Use VI. :applause:

