On 13-10-09 07:15 PM, Lex Trotman wrote:
[...]
- For non-recursive searches, grep does not allow a directory to be specified instead of a file. To work around that, we read the directory, back-parse all --include=*.x patterns, and match them manually.
Proposition: grep -rl --include=*.c --exclude-dir=[^.]* --exclude-dir=.?* void . :)
We pass our Directory as a working directory to spawn, and the recursive search already uses . as a grep FILE argument. --exclude-dir has been supported for more than 5 years.
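For illustration, the spawn side could look roughly like this (the argv contents and variable names are assumptions for the sketch, not the actual Geany code):

    /* Hypothetical sketch: spawn grep with the search directory as the
     * working directory and "." as the FILE argument. */
    gchar *argv[] = { "grep", "-rl", "--include=*.c",
                      "--exclude-dir=[^.]*", "--exclude-dir=.?*",
                      "void", ".", NULL };
    GError *error = NULL;
    GPid pid;

    if (!g_spawn_async (search_directory /* our Directory */, argv, NULL,
                        G_SPAWN_SEARCH_PATH, NULL, NULL, &pid, &error))
    {
        g_warning ("%s", error->message);
        g_error_free (error);
    }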
Is there any reason we cannot just walk the directory/subdirectories ourselves and search the files using GRegex stuff? The pattern syntax would then be exactly the same as in the other search dialogs, and it would not only save spawning a subprocess but also remove the dependency on grep (only a problem on Windows). Also, as a future optimization, if any of the files to search are open in Geany, we could search their document buffer directly from memory rather than having to do any file IO for them. GIO/GFile has all the stuff needed to walk a directory tree and open files both asynchronously and portably IIRC.
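Something along these lines is what I have in mind; a rough synchronous sketch only (GIO also has async variants), with error handling omitted and all names made up for illustration:

    #include <gio/gio.h>

    /* Hypothetical: walk a directory tree with GIO and match file contents
     * against a GRegex, printing the path of each matching file. File-name
     * filtering (the --include=*.c equivalent) would slot in here too. */
    static void
    search_dir (GFile *dir, GRegex *regex)
    {
        GFileEnumerator *en;
        GFileInfo *info;

        en = g_file_enumerate_children (dir,
            G_FILE_ATTRIBUTE_STANDARD_NAME "," G_FILE_ATTRIBUTE_STANDARD_TYPE,
            G_FILE_QUERY_INFO_NONE, NULL, NULL);
        if (en == NULL)
            return;

        while ((info = g_file_enumerator_next_file (en, NULL, NULL)) != NULL)
        {
            GFile *child = g_file_get_child (dir, g_file_info_get_name (info));

            if (g_file_info_get_file_type (info) == G_FILE_TYPE_DIRECTORY)
                search_dir (child, regex); /* recurse into subdirectories */
            else
            {
                gchar *contents = NULL;
                gsize len;

                /* TODO: if the file is open in Geany, search its buffer
                 * in memory instead of re-reading it from disk */
                if (g_file_load_contents (child, NULL, &contents, &len, NULL, NULL))
                {
                    if (g_regex_match (regex, contents, 0, NULL))
                    {
                        gchar *path = g_file_get_path (child);
                        g_print ("%s\n", path);
                        g_free (path);
                    }
                    g_free (contents);
                }
            }
            g_object_unref (child);
            g_object_unref (info);
        }
        g_object_unref (en);
    }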
*Technically* the only issue I know of is that, last time I looked, PCRE was much less optimised than grep (partly because PCRE is more general), so searching a big tree of big files may be noticeably slower.
But this is more code that needs to be written, debugged and maintained. Pull requests (with an ongoing support pledge) are welcome.
I'll gladly write and help maintain it if I don't have to bikeshed over using stupid old versions of GIO which might not have the features needed[1] to implement it properly without dirty hacks or #ifdef stuff :)
The benefits of control and consistency (and, as you pointed out on IRC, being able to search the buffers rather than the files for open documents) may not outweigh the cost of doing it.
Cheers,
Matthew Brush
[1]: I haven't investigated deeply, the docs don't list "since" versions for some of the functions that would be used.