Thread: Processing large files

dirk
Files are normally processed using the for loops below:
    use Carp;
    use English qw( -no_match_vars );    # provides $OS_ERROR

    # Example 1: slurp the whole file into an array, then loop over it
    open my $INPUT, '<', $filename or croak "Can't open '$filename': $OS_ERROR";
    my @lines = <$INPUT>;
    close $INPUT;
    foreach my $line (@lines) {
        # processing
    }

    # Example 2: the for loop also reads the whole file into a temporary list
    open my $INPUT, '<', $filename or croak "Can't open '$filename': $OS_ERROR";
    for my $line (<$INPUT>) {
        # processing
    }
    close $INPUT;

If the file is a large one, for example 100 MB or more, the script can stop with a memory allocation failure:

    Out of memory!

The reason is that the array @lines in the first example, or the temporary list built by the for loop in the second example, requires too much memory. Therefore a while loop should be used, which reads and processes only one line at a time:

    open my $INPUT, '<', $filename or croak "Can't open '$filename': $OS_ERROR";
    while ( my $line = <$INPUT> ) {
        # processing
    }
    close $INPUT;

This way even a 2 GB file can be processed without any memory problems.

perkiset
Such important stuff Dirk - lots of n00bs do not consider streaming data anymore - the PHP function file_get_contents() makes it so easy to grab the whole file into a string variable that people are afraid of the fopen, fread, fseek and fclose style functions - which, if you're an oldie (like you 'n me), you were raised on... Thanks! Nice one

/p
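For anyone who wants to see that streaming style spelled out, here is a minimal PHP sketch of the same line-at-a-time idea - assuming a plain text file, with a made-up path in $filename:

    <?php
    // Stream the file line by line instead of slurping it all at once
    // with file_get_contents().
    $filename = 'huge_log.txt';    // hypothetical example path

    $handle = fopen($filename, 'r');
    if ($handle === false) {
        die("Can't open '$filename'");
    }

    while (($line = fgets($handle)) !== false) {
        // processing - only one line is held in memory at a time
    }

    fclose($handle);

fgets() reads up to the next newline, so peak memory stays around the length of the longest line rather than the size of the whole file - the same trick as Dirk's while loop.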