I have a problem with the CharsetDecoder class.
First code example (which works):
    final CharsetDecoder dec = Charset.forName("UTF-8").newDecoder();
    final ByteBuffer b = ByteBuffer.allocate(3);
    final byte[] tab = new byte[]{(byte) -30, (byte) -126, (byte) -84};
Result: a€a
But when I execute this code:
    final CharsetDecoder dec = Charset.forName("UTF-8").newDecoder();
    final CharBuffer chars = CharBuffer.allocate(3);
    final byte[] tab = new byte[]{(byte) -30, (byte) -126, (byte) -84};
Result: a
Why isn't the result the same?
How do I use the decode(ByteBuffer, CharBuffer, endOfInput) method of CharsetDecoder to get the result a€a?
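(For reference, the general pattern for incremental decoding with decode(ByteBuffer, CharBuffer, endOfInput) looks roughly like the sketch below. The byte array used here, 'a' followed by the three UTF-8 bytes of the euro sign and another 'a', is an assumption chosen to produce a€a, since the arrays above only contain the euro-sign bytes; the step size and buffer sizes are illustrative as well.)

    import java.nio.ByteBuffer;
    import java.nio.CharBuffer;
    import java.nio.charset.Charset;
    import java.nio.charset.CharsetDecoder;
    import java.nio.charset.CoderResult;

    public class IncrementalDecode {
        public static void main(String[] args) {
            final CharsetDecoder dec = Charset.forName("UTF-8").newDecoder();
            // Assumed input: 'a', the euro sign (0xE2 0x82 0xAC in UTF-8), 'a'
            final byte[] tab = new byte[]{(byte) 97, (byte) -30, (byte) -126, (byte) -84, (byte) 97};
            final CharBuffer chars = CharBuffer.allocate(tab.length);
            final ByteBuffer in = ByteBuffer.allocate(tab.length);

            final int step = 2; // feed the decoder two bytes at a time
            for (int pos = 0; pos < tab.length; pos += step) {
                int len = Math.min(step, tab.length - pos);
                in.put(tab, pos, len);
                in.flip();                                 // switch the buffer to read mode
                boolean endOfInput = (pos + len >= tab.length);
                CoderResult cr = dec.decode(in, chars, endOfInput);
                if (cr.isError()) {
                    throw new IllegalStateException("decode failed: " + cr);
                }
                in.compact();                              // keep the bytes of an unfinished character
            }
            dec.flush(chars);                              // drain any remaining decoder state
            chars.flip();
            System.out.println(chars);                     // prints a€a
        }
    }

The important parts are passing endOfInput = false for every chunk except the last one, and calling compact() so that the bytes of a partially received multi-byte character are kept in the input buffer for the next decode() call.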
- EDIT -
So, based on Jesper's code, I did this. It is not perfect, but it works with step = 1, 2 and 3:
    final CharsetDecoder dec = Charset.forName("UTF-8").newDecoder();
    final CharBuffer chars = CharBuffer.allocate(6);
    final byte[] tab = new byte[]{(byte) 97, (byte) -30, (byte) -126, (byte) -84, (byte) 97, (byte) 97};
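(The loop that drives these declarations was not included above; the following is only an approximation of the kind of loop that works for step = 1, 2 and 3, creating a fresh decoder for each step size.)

    import java.nio.ByteBuffer;
    import java.nio.CharBuffer;
    import java.nio.charset.Charset;
    import java.nio.charset.CharsetDecoder;

    public class StepDecode {
        public static void main(String[] args) {
            final byte[] tab = new byte[]{(byte) 97, (byte) -30, (byte) -126, (byte) -84, (byte) 97, (byte) 97};

            for (int step = 1; step <= 3; step++) {
                final CharsetDecoder dec = Charset.forName("UTF-8").newDecoder();
                final CharBuffer chars = CharBuffer.allocate(6);
                final ByteBuffer in = ByteBuffer.allocate(6);

                for (int pos = 0; pos < tab.length; pos += step) {
                    int len = Math.min(step, tab.length - pos);
                    in.put(tab, pos, len);
                    in.flip();
                    dec.decode(in, chars, pos + len >= tab.length);
                    in.compact();            // carry over the bytes of an incomplete character
                }
                dec.flush(chars);
                chars.flip();
                System.out.println("step = " + step + " -> " + chars); // a€aa every time
            }
        }
    }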