I have a vector<BYTE> that represents the characters in a string. I want to interpret these characters as ASCII characters and store them in a Unicode string (UTF-16)
You should use std::vector<BYTE> only when working with binary data. When working with text, use std::string instead. Note that a std::string object may contain characters encoded as sequences of one or more bytes (so-called multi-byte characters, as in UTF-8), and such data is not plain ASCII.
Once you have a std::string, you can use the Win32 function MultiByteToWideChar to write a conversion function from std::string (holding UTF-8 bytes; ASCII is a subset of UTF-8) to std::wstring (holding UTF-16 code units):
// Convert a UTF-8 encoded std::string to a UTF-16 std::wstring.
std::wstring s2ws(const std::string& str)
{
    if (str.empty())
        return std::wstring(); // MultiByteToWideChar rejects a zero-length input
    int size_needed = MultiByteToWideChar(CP_UTF8, 0, &str[0], (int)str.size(), NULL, 0);
    std::wstring wstrTo(size_needed, 0);
    MultiByteToWideChar(CP_UTF8, 0, &str[0], (int)str.size(), &wstrTo[0], size_needed);
    return wstrTo;
}
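If you know the bytes in your vector<BYTE> are strictly 7-bit ASCII, you do not even need the Windows API: each ASCII byte maps to the identical UTF-16 code unit, so you can widen the bytes directly. A minimal, portable sketch (the function name ascii_to_utf16 is my own, and std::uint8_t stands in for the Windows BYTE typedef):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Widen 7-bit ASCII bytes to UTF-16: each byte becomes one code unit.
// This assumes every byte is < 0x80; anything else is not ASCII and
// needs a real conversion such as MultiByteToWideChar.
std::wstring ascii_to_utf16(const std::vector<std::uint8_t>& bytes)
{
    std::wstring out;
    out.reserve(bytes.size());
    for (std::uint8_t b : bytes)
        out.push_back(static_cast<wchar_t>(b));
    return out;
}
```

This avoids a Windows dependency, but only for the pure-ASCII case; for arbitrary multi-byte input you still need s2ws above.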