I’ve got a C++/Unreal Engine client that talks to both an existing C# server and an existing JavaScript server (all on Windows).
My problem is that when I Base64-encode Guids in C++, I get different results from C# and JS. From what I can tell it’s directly related to how the Guid strings are encoded, as mentioned in this SO answer.
I can’t really change the C# and JS servers as they have other dependent systems, so what would I need to do in C++ (or Unreal) to ensure I get the same output?
//The Guid as a literal string
"3F73B3E6-7351-416F-ACA3-AE639A3F587F"
//C# & JS Encoded
"5rNzP1Fzb0Gso65jmj9Yfw"
//C++ Encoded
"P3Oz5nNRQW+sowAArmOaPw"
Additionally, using online converters (to sanity-check myself) I get the same two different results, so it seems there’s no real standardized way.
//https://rcfed.com/Utilities/Base64GUID
//https://toolslick.com/conversion/data/guid
"5rNzP1Fzb0Gso65jmj9Yfw"
//https://www.fileformat.info/tool/guid-base64.htm
"P3Oz5nNRQW+sowAArmOaPw"
The code used for the conversions is below:
//In C#
Guid myGuid = new Guid("3F73B3E6-7351-416F-ACA3-AE639A3F587F");
//Encodes / & + for URL & truncates the trailing "==" padding
string strBase64Guid = Convert.ToBase64String(myGuid.ToByteArray()).Substring(0, 22).Replace("/", "_").Replace("+", "-");
//In JavaScript
//https://stackoverflow.com/questions/55356285/how-to-convert-a-string-to-base64-encoding-using-byte-array-in-javascript
var strGuid = GuidToBase64("3F73B3E6-7351-416F-ACA3-AE639A3F587F");
function GuidToBase64(guid) {
    //Guid to ByteArray
    var buffer = [];
    guid.split('-').map((number, index) => {
        //The first three groups are stored little-endian in the binary format, so reverse their byte pairs
        var bytesInChar = index < 3 ? number.match(/.{1,2}/g).reverse() : number.match(/.{1,2}/g);
        bytesInChar.map((byte) => { buffer.push(parseInt(byte, 16)); });
    });
    var base64String = btoa(String.fromCharCode.apply(null, new Uint8Array(buffer)));
    //Encodes / & + for URL & truncates the trailing "==" padding
    return base64String.slice(0, 22).replace("/", "_").replace("+", "-");
}
//In C++ (Unreal Engine)
FGuid myGuid = FGuid("3F73B3E6-7351-416F-ACA3-AE639A3F587F");
FString strGuid = myGuid.ToString(EGuidFormats::Short);
//Copyright Epic Games, Inc. (AFAIK I'm allowed to paste snippets from the source, just not the whole thing; UE's source is all on GitHub anyway.)
//Truncated [Function overloads, & Guid format handling]
uint32 Bytes[4] = { NETWORK_ORDER32(A), NETWORK_ORDER32(B), NETWORK_ORDER32(C), NETWORK_ORDER32(D) };
TCHAR Buffer[25];
int32 Len = FBase64::Encode(reinterpret_cast<const uint8*>(Bytes), sizeof(Bytes), Buffer);
TArrayView<TCHAR> Result(Buffer, Len);
//Truncated [Sanitizes '+' & '/' and cuts the '==' padding]
}
2 Answers

5rNzP1Fzb0Gso65jmj9Yfw decodes to:
e6 b3 73 3f 51 73 6f 41 ac a3 ae 63 9a 3f 58 7f
P3Oz5nNRQW+sowAArmOaPw decodes to:
3f 73 b3 e6 73 51 41 6f ac a3 00 00 ae 63 9a 3f
Your C++ code has reversed the first 4 bytes, reversed the next two pairs of bytes, and broken the last 6-byte group completely.
The base64 in question is the result of encoding a Guid in its binary format, not in the original string format.
A Guid in binary format consists of:
- one 4-byte integer (stored little-endian)
- two 2-byte integers (stored little-endian)
- 8 individual bytes
It does not consist of:
- 16 individual bytes
as depicted in the string format.
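For reference, that binary layout matches the Win32 GUID struct (guiddef.h); a rough sketch of it with fixed-width types, annotated with the example Guid from the question:

#include <cstdint>

//Mirrors the Win32 GUID struct. C#'s Guid.ToByteArray() writes Data1..Data3
//in little-endian byte order (on typical x86/x64 Windows) and Data4 byte-for-byte,
//which is why the first three hex groups of the string appear byte-reversed in binary.
struct GuidLayout
{
    uint32_t Data1;    // "3F73B3E6"          -> bytes E6 B3 73 3F
    uint16_t Data2;    // "7351"              -> bytes 51 73
    uint16_t Data3;    // "416F"              -> bytes 6F 41
    uint8_t  Data4[8]; // "ACA3-AE639A3F587F" -> bytes AC A3 AE 63 9A 3F 58 7F
};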
Your C++ code is base64-encoding an array of four 4-byte integers (because that is how FGuid stores them – I do not know why), which is not the same Guid format that C# and Javascript are using. You need to make sure you are using the correct number of integers, as well as the correct byte size and endianness for them, to match what C# and Javascript are using. IIRC, Javascript uses Big-Endian integers, and C# uses Big- or Little-Endian integers depending on the system (see BitConverter.IsLittleEndian).

5rNzP1Fzb0Gso65jmj9Yfw decodes as bytes:
e6 b3 73 3f 51 73 6f 41 ac a3 ae 63 9a 3f 58 7f
Which is the following Guid values in binary format:
e6 b3 73 3f = 0x3F73B3E6
51 73 = 0x7351
6f 41 = 0x416F
ac a3 ae 63 9a 3f 58 7f
So, you need something equivalent to the following in your C++ code:
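A minimal sketch of what that could look like in Unreal C++. The helper name GuidToShortBase64 is just for illustration, and it assumes FGuid::Parse maps the string's hex groups left-to-right into the A/B/C/D members (worth verifying against your engine version):

#include "Misc/Guid.h"
#include "Misc/Base64.h"

//Re-orders FGuid's four uint32 members into the same 16 bytes that C#'s
//Guid.ToByteArray() produces, then Base64-encodes with the same URL-safe
//substitutions and padding trim as the C#/JS code above.
FString GuidToShortBase64(const FGuid& Guid)
{
    uint8 Bytes[16];

    //Data1 (e.g. 0x3F73B3E6) -> E6 B3 73 3F (little-endian)
    Bytes[0] = (Guid.A      ) & 0xFF;
    Bytes[1] = (Guid.A >>  8) & 0xFF;
    Bytes[2] = (Guid.A >> 16) & 0xFF;
    Bytes[3] = (Guid.A >> 24) & 0xFF;

    //Data2 (high half of B, e.g. 0x7351) -> 51 73 (little-endian)
    Bytes[4] = (Guid.B >> 16) & 0xFF;
    Bytes[5] = (Guid.B >> 24) & 0xFF;

    //Data3 (low half of B, e.g. 0x416F) -> 6F 41 (little-endian)
    Bytes[6] = (Guid.B      ) & 0xFF;
    Bytes[7] = (Guid.B >>  8) & 0xFF;

    //Data4: the last 8 bytes taken as-is (C and D in big-endian byte order)
    Bytes[8]  = (Guid.C >> 24) & 0xFF;
    Bytes[9]  = (Guid.C >> 16) & 0xFF;
    Bytes[10] = (Guid.C >>  8) & 0xFF;
    Bytes[11] = (Guid.C      ) & 0xFF;
    Bytes[12] = (Guid.D >> 24) & 0xFF;
    Bytes[13] = (Guid.D >> 16) & 0xFF;
    Bytes[14] = (Guid.D >>  8) & 0xFF;
    Bytes[15] = (Guid.D      ) & 0xFF;

    //Encodes / & + for URL & truncates the trailing "==" padding
    FString Base64 = FBase64::Encode(Bytes, sizeof(Bytes));
    return Base64.Left(22).Replace(TEXT("/"), TEXT("_")).Replace(TEXT("+"), TEXT("-"));
}

//Usage (e.g. inside some function):
FGuid MyGuid;
FGuid::Parse(TEXT("3F73B3E6-7351-416F-ACA3-AE639A3F587F"), MyGuid);
FString strBase64Guid = GuidToShortBase64(MyGuid); //"5rNzP1Fzb0Gso65jmj9Yfw"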