There may be similar problems in other parsers too, and it is better to prevent complete parser freezes at the hash-table level for all parsers than to rely on each parser's logic.
Firstly, I'm definitely in favour of applying this to limit the impact of issues with badly written parsers. After all, they are written for offline ctags, not interactive Geany.
But the point was that the real problem is the parser, and that should be fixed to be smarter, as you say:
Of course that's not infallible, `$$PLSQL_LINE is a PL/SQL keyword in a PostgreSQL string$$`, but then that will be misinterpreted now anyway.
be no languages with 1000-character keywords). So storing works as before and is unaffected by this patch.
Well, PL/SQL $$THINGS are really a type of compile-time identifier that users can also define, IIRC (it's been [mumble] decades since I did PL/SQL), so they could be any length.
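Which is why capping at lookup time is safe even if user-defined $$identifiers can be arbitrarily long: an over-long token simply cannot be a keyword. A minimal sketch of the idea (hypothetical names and cap value, not the actual ctags code; `lookupRaw` stands in for the existing hash-table search):

```c
#include <stdbool.h>
#include <string.h>

/* Hypothetical cap: no language has keywords anywhere near this long,
 * so a longer token can fail fast without ever being hashed. */
#define MAX_KEYWORD_LEN 64

/* Placeholder for the parser's existing hash-table search;
 * here it pretends "select" is the only keyword. */
static bool lookupRaw(const char *token)
{
    return strcmp(token, "select") == 0;
}

bool lookupKeyword(const char *token)
{
    /* Scan at most MAX_KEYWORD_LEN + 1 bytes, so a pathological
     * multi-megabyte "token" costs O(cap), not O(n). */
    size_t len = 0;
    while (token[len] != '\0' && len <= MAX_KEYWORD_LEN)
        len++;
    if (len > MAX_KEYWORD_LEN)
        return false;
    return lookupRaw(token);
}
```

Storing entries is untouched; only the lookup path rejects early, which is what bounds the damage a runaway token can do.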
This line won't be reached because of the check before it.
Sorry for not being clear: I was talking about the existing code that would be case-converting a whole $$ string, which I think is the cause of the slowness, not the hashing. Certainly the test will constrain it (unless an evil
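To illustrate the suspected cost (a hypothetical sketch, not the real parser code): if the accumulated token buffer is re-cased after every appended character, reading an n-character $$ string touches on the order of n²/2 characters, which is exactly the kind of slowness that shows up on long strings even when each individual operation is cheap.

```c
#include <ctype.h>
#include <string.h>

/* Upper-case a NUL-terminated buffer in place. */
static void toUpperInPlace(char *s)
{
    for (; *s; s++)
        *s = (char) toupper((unsigned char) *s);
}

/* Simulate the suspected hot spot: append one character at a time and
 * re-case the ENTIRE buffer each time.  Returns the total number of
 * character touches, which grows as n*(n+1)/2 for an n-char input. */
unsigned long naiveScanCost(const char *input, char *buf)
{
    unsigned long work = 0;
    size_t len = 0;
    for (const char *p = input; *p; p++) {
        buf[len++] = *p;
        buf[len] = '\0';
        toUpperInPlace(buf);   /* re-cases the whole accumulated token */
        work += len;           /* count the characters touched this pass */
    }
    return work;
}
```

Casing each character once as it is appended would make the same scan linear; the point of the sketch is only to show where the quadratic factor comes from.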