FarligOpptreden
Executive Member
- Joined: Mar 5, 2007
- Messages: 5,396
OH OH OH! This is gonna be good... Deq getting on his war-horse... 
ROUND 1: Fight!
That was also my knee-jerk reaction to the 2GB file. What about randomizing, say, 500 lines at a time and writing them into the new file, and then, if performance permits, doing a second (or even third) pass over the new file, but offsetting the starting position by about 100-200 lines to mix up the "chunked" output?
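The chunk-and-offset idea above could be sketched roughly like this (a minimal Python sketch; the function name, chunk size, and offset values are my own choices, not anything from this thread, and repeated passes only approximate a true uniform shuffle):

```python
import random

def chunked_shuffle(src, dst, chunk_size=500, offset=0):
    """Shuffle a large file in fixed-size chunks without loading it all.

    Reads `chunk_size` lines at a time, shuffles each chunk in memory,
    and writes it out. A non-zero `offset` shortens only the first chunk,
    so a second pass straddles the chunk boundaries of the previous pass
    and lines can migrate between neighbouring chunks.
    """
    with open(src) as fin, open(dst, "w") as fout:
        chunk = []
        limit = chunk_size - offset if offset else chunk_size
        for line in fin:
            chunk.append(line)
            if len(chunk) >= limit:
                random.shuffle(chunk)
                fout.writelines(chunk)
                chunk = []
                limit = chunk_size  # only the first chunk is shortened
        if chunk:  # flush the final partial chunk
            random.shuffle(chunk)
            fout.writelines(chunk)

# First pass, then a second pass offset by ~150 lines:
# chunked_shuffle("big.txt", "pass1.txt", chunk_size=500)
# chunked_shuffle("pass1.txt", "pass2.txt", chunk_size=500, offset=150)
```

Memory use stays bounded at one chunk of lines, so this works on a 2GB file, at the cost of each extra pass being another full read/write of the data.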
So reading in the entire file is not an option.
Do you need the whole file randomized and the pieces, or could you just output the pieces? Max size of each piece?