I have a CSV file, and running

file -bi myCsv.txt

tells me that its character set is iso-8859-1.
Now I want to read this file in JavaScript from an input tag of type "file", without manipulating the encoding.
I tried several options: FileReader with readAsText() and readAsArrayBuffer(), and the fetch API with new Uint8Array(arrayBuffer). But every time I send the file (as a string or byte[]) to the backend via an AJAX POST request, I see UTF-8 there and not iso-8859-1 anymore.
I thought that if I read just the bytes of this file, I would not manipulate the encoding. But with FileReader and fetch(URL) the encoding is changed every time. How can I read a text file without changing its encoding, so that I can check the encoding in the backend (the backend check itself works fine)?
Update: here is what I am doing when sending the file as a byte[]:
async function sendFileToValidation() {
    const file = document.getElementById("uploadFile").files[0];
    try {
        const fileURL = URL.createObjectURL(file);
        const response = await fetch(fileURL); // no idea which encoding is set here
        const arrayBuffer = await response.arrayBuffer();
        const byteArray = new Uint8Array(arrayBuffer);
        $.ajax({
            url: '/validateOnBackend',
            type: 'POST',
            contentType: 'application/octet-stream',
            processData: false, // keep jQuery from serializing the bytes into a query string
            data: byteArray,
            success: function (response) {
                alert("we can process this encoding!!!!" + response);
            },
            error: function (xhr, status, error) {
                alert("Error: " + xhr.responseText);
            }
        });
        return byteArray;
    } catch (error) {
        console.error('error', error);
    }
}
2 Answers
I solved my problem. I used FileReader.readAsArrayBuffer, converted the result to a byte array with new Uint8Array(arrayBuffer), and then sent it as JSON: JSON.stringify(Array.from(byteArray));
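A minimal sketch of that approach (bytesToJsonPayload and sendRawBytes are hypothetical helper names; the element ID and endpoint are taken from the question). The key point is that Array.from(byteArray) produces plain numbers 0-255, so JSON serialization never runs any character decoding on the file's bytes:

```javascript
// Pure helper: serialize raw bytes to a JSON array string.
// No text decoding happens here, so iso-8859-1 bytes survive untouched.
function bytesToJsonPayload(byteArray) {
    return JSON.stringify(Array.from(byteArray));
}

// Browser side: read the File as raw bytes and POST them as JSON.
function sendRawBytes(file) {
    const reader = new FileReader();
    reader.onload = () => {
        const payload = bytesToJsonPayload(new Uint8Array(reader.result));
        fetch('/validateOnBackend', {
            method: 'POST',
            headers: { 'Content-Type': 'application/json' },
            body: payload,
        });
    };
    reader.readAsArrayBuffer(file);
}
```

The backend can then rebuild the original bytes from the JSON array and run its encoding detection on them.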
You need the second, optional encoding parameter to readAsText, detailed here.
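A short sketch of what that looks like, assuming the file really is iso-8859-1 (readLatin1 and decodeLatin1 are hypothetical helper names). Without the second argument, readAsText decodes as UTF-8 and the Latin-1 bytes get mangled:

```javascript
// Decode a File as iso-8859-1 text via FileReader's encoding parameter.
function readLatin1(file, onText) {
    const reader = new FileReader();
    reader.onload = () => onText(reader.result); // result is already decoded text
    reader.readAsText(file, 'iso-8859-1');
}

// The same encoding labels work with TextDecoder if you already hold raw bytes:
function decodeLatin1(byteArray) {
    return new TextDecoder('iso-8859-1').decode(byteArray);
}
```

Note this decodes the file in the browser; if the goal is to let the backend inspect the original bytes, sending the raw ArrayBuffer is still the safer route.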