- June 18, 2010 at 9:40 am #8627 | A-D-T (Participant)
hi!
I just encountered a problem with large files and the find-replace function:
I’ve got a big file with different datasets like:

1|asdf|12348|asdf|…
2|1234855|asdf|238…
3|238ahsh|asdf|238…

and so on. I just want to keep all the datasets starting with “2|”.
So I delete every other dataset with the regex “^[1345]\|.*$”. This works well and leaves me a big file with only “2|” datasets and many empty lines. To optimize the ongoing use of the new file I also want to delete the empty lines with “^\n”. And now comes the problem: it works, but it is AWFULLY SLOW (approx. 1% of the file per 30 seconds).
Am I doing something wrong or is this a problem within EmEditor/Regex++? Is there another way to get rid of the empty lines?
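For illustration, here is a rough single-pass equivalent of this workflow as a small Python script (the filenames input.txt and output.txt are placeholders, not anything EmEditor-specific); because non-matching lines are skipped rather than blanked out, no empty lines are left to clean up afterwards:

import re

# Placeholder filenames; point these at the real input and output files.
KEEP = re.compile(r"^2\|")  # same idea: keep only the "2|" datasets

with open("input.txt", "r", encoding="utf-8") as src, \
        open("output.txt", "w", encoding="utf-8") as dst:
    for line in src:
        if KEEP.match(line):
            dst.write(line)  # copy matching datasets unchanged
        # anything else is simply dropped, so the output has no empty lines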
cheers
A-D-T

- June 18, 2010 at 4:50 pm #8630 | Yutaka Emura (Keymaster)

Hello A-D-T,
I am sorry, but when EmEditor opens a large file using a temporary file (that is, a file larger than the size specified on the Advanced tab of the Customize dialog box), it becomes slow when it needs to add or remove lines. If possible, increase this size so that it is a little larger than the size of the file you are trying to open.
Please let me know if you have further questions.
Thank you,

- June 19, 2010 at 5:57 pm #8642 | A-D-T (Participant)

ok. That explains it!
After raising the size, it worked fine.
thx!