Hi,
How do I get Geany to recognize (Linux text) files as UTF-8 encoded?
The files in question are legacy Windows .txt files written in French (i.e. with lots of accented characters), which I have converted to Unix (LF) line endings and UTF-8 encoding with a Perl script that runs
"iconv -f CP1252 -t UTF-8 --output=$tempfile $infile"
and "dos2unix -n -f $tempfile $outfile"
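In case it matters, the two steps could also be done in a single pipeline (a sketch; the file names are placeholders, and it assumes iconv and a POSIX shell — the sample input line is made-up test data, not one of my actual files):

```shell
# Hypothetical CP1252 input with accented characters and CRLF line endings
# (\351 = e-acute, \340 = a-grave in CP1252)
printf 'caf\351\r\nd\351j\340 vu\r\n' > infile.txt

# One pass: CP1252 -> UTF-8, then strip the carriage returns for Unix (LF)
iconv -f CP1252 -t UTF-8 infile.txt | tr -d '\r' > outfile.txt
```

The result should be byte-identical to the iconv + dos2unix two-step version.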
-f as in 'force conversion of binary files'? Geany can't handle binary files anyway.
It appears that if the infile ends with a final \x{0A} character, then this also arrives in the outfile.
\x0A is \n; it's hard to imagine this really confuses Geany that much.
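If in doubt, inspecting the raw bytes at the end of the file shows exactly what Geany is being given (a sketch; the file name and contents are made-up placeholders):

```shell
# Hypothetical sample: UTF-8 "voila" with a-grave, followed by a single LF
printf 'voil\303\240\n' > sample.txt

# Dump the last bytes in hex; a clean Unix text file ends in a lone 0a,
# while a leftover Windows line ending would show up as 0d 0a
tail -c 3 sample.txt | od -An -tx1
```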
I can open these files with jEdit or Kate, no problem. But Geany's behaviour with such files is inconsistent:
Sometimes Geany refuses to do anything, saying "... does not look like a text file, or the file encoding is not supported";
sometimes Geany renders the file as UTF-16 LE, which makes it look as if it were written in Mandarin Chinese;
and sometimes Geany opens such 'problem' files correctly, as UTF-8. As far as I can see, this tends to happen when several other txt files are already open.
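A quick way to rule out an invalid byte sequence in such a file is to round-trip it through iconv, which exits non-zero on the first invalid UTF-8 byte (a sketch; the file name and contents are made-up placeholders):

```shell
# Hypothetical sample file containing one valid UTF-8 line
printf 'caf\303\251\n' > check.txt

# Round-trip UTF-8 -> UTF-8: succeeds only if every byte sequence is valid
if iconv -f UTF-8 -t UTF-8 check.txt > /dev/null 2>&1; then
    echo "valid UTF-8"
else
    echo "invalid UTF-8"
fi
```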
I have tried putting the line /* geany_encoding=utf-8 */ as line 1 of a problem file, but that does not seem to have any consistent effect.
Without having had a look at the code, I would have assumed in-file headers take precedence over guessed encodings.
Anyway, it's quite hard to help without knowing what files we are talking about. Could you share some of the problematic files? If that's not possible in public, then at least via private mail?
Regards, Enrico