> there may be similar problems in other parsers too, and it is better to prevent complete parser freezes at the hash table level for all parsers than to rely on the parser logic.
Firstly, I'm definitely in favour of applying this to limit the impact of issues with badly written parsers. After all, they are written for offline ctags, not interactive Geany.
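For concreteness, a minimal sketch of the idea as I understand it (the names `MAX_KEYWORD_LEN` and `keywordTableLookup()` are mine for illustration, not the actual patch; the 1000-character cap is the figure mentioned below): reject over-long candidates before the hash table ever sees them.

```c
/* Minimal sketch, not the actual ctags code: a candidate longer than
 * any plausible keyword is rejected before hashing, so a pathological
 * token cannot freeze the parser inside the keyword table. */
#include <string.h>

#define MAX_KEYWORD_LEN 1000   /* assumption: no language has keywords this long */
#define KEYWORD_NONE    (-1)

/* Stand-in for the real hash-table lookup. */
static int keywordTableLookup (const char *candidate)
{
    return strcmp (candidate, "begin") == 0 ? 0 : KEYWORD_NONE;
}

int lookupKeywordCapped (const char *candidate)
{
    /* Stop counting after MAX_KEYWORD_LEN + 1 characters, so even the
     * guard itself does bounded work on an arbitrarily long token. */
    size_t n = 0;
    while (candidate[n] != '\0' && n <= MAX_KEYWORD_LEN)
        n++;
    if (n > MAX_KEYWORD_LEN)
        return KEYWORD_NONE;
    return keywordTableLookup (candidate);
}
```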
But the point was that the real problem is the parser, and that should be fixed to be smarter, as you say:
1. see `$$`
2. scan until the first non-keyword character
3. now look it up in the keyword table
4. if not a keyword, skip the string to the closing `$$`
Of course that's not infallible, `$$PLSQL_LINE is a PL/SQL keyword in a PostgreSQL string$$`, but then that will be misinterpreted now anyway.
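To make the four steps concrete, a rough sketch (hypothetical helper names, not the real parser; step 1, seeing the opening `$$`, is assumed to have been done by the caller):

```c
#include <ctype.h>
#include <stdio.h>

/* Assumed identifier characters for $$NAMES. */
static int isKeywordChar (int c)
{
    return isalnum (c) || c == '_';
}

static void handleDollarDollar (FILE *in, int (*isKeyword) (const char *))
{
    char word[256];   /* plenty for any real keyword */
    size_t len = 0;
    int c, prev;

    /* Step 2: scan until the first non-keyword character. */
    while ((c = getc (in)) != EOF && isKeywordChar (c))
        if (len < sizeof word - 1)
            word[len++] = (char) c;
    word[len] = '\0';

    /* Step 3: look the word up in the keyword table. */
    if (isKeyword (word))
    {
        if (c != EOF)
            ungetc (c, in);   /* give back the lookahead character */
        return;               /* e.g. $$PLSQL_LINE: parse on normally */
    }

    /* Step 4: not a keyword, so treat it as a dollar-quoted string and
     * skip everything up to the closing "$$". */
    prev = c;
    while ((c = getc (in)) != EOF)
    {
        if (prev == '$' && c == '$')
            break;
        prev = c;
    }
}
```

The point of step 4 is that a huge dollar-quoted body is skipped in one linear pass instead of being handed around as a candidate keyword.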
> …be no languages with 1000 character keywords). So storing works as before and is unaffected by this patch.
Well, PL/SQL `$$THINGS` are really a type of compile-time identifier that users can also define IIRC (it's been [mumble] decades since I did PL/SQL), so they _could_ be any length.
> This line won't be reached because of this check before.
Sorry for not being clear: I was talking about the existing code that would be case-folding a whole `$$` string, which I think is the cause of the slowness, not the hashing. Certainly the test will constrain it (unless an evil :smiling_imp: user made a long name, see above).
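As a hypothetical illustration of that distinction (not the real code): in a case-insensitive lookup the candidate is case-folded character by character on its way into the hash, so the work grows with the candidate's length, and handing it an entire `$$` body is what hurts; the length test bounds how much of this can ever run per candidate.

```c
#include <ctype.h>

/* Sketch only: folding happens per character, so hashing a candidate
 * costs O(its length) -- cheap for real keywords, painful for a
 * multi-megabyte $$ string body. */
static unsigned long foldedHash (const char *candidate)
{
    unsigned long h = 5381;                        /* djb2-style hash */
    for (; *candidate != '\0'; candidate++)
        h = h * 33 + (unsigned char) tolower ((unsigned char) *candidate);
    return h;
}
```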