Treat UTF-16 strings in binary VDF as little-endian
Integers in binary VDF are already treated as little-endian (least significant byte first) regardless of CPU architecture, but the 16-bit units in UTF-16 strings didn't get the same treatment. This led to a test failure on big-endian machines.

Resolves: #33
Signed-off-by: Simon McVittie <[email protected]>
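A minimal sketch of the idea, assuming a Python-style parser (the function name and framing here are illustrative, not the project's actual code): binary VDF stores wide strings as NUL-terminated UTF-16 with little-endian 16-bit units, so the decoder must name the byte order explicitly instead of relying on the host CPU's native order.

```python
def read_wide_string(data: bytes, offset: int) -> tuple[str, int]:
    """Read a NUL-terminated UTF-16LE string starting at offset.

    Returns the decoded string and the offset just past the
    two-byte terminator. (Hypothetical helper for illustration.)
    """
    end = offset
    # Scan 16-bit units for the 0x0000 terminator.
    while data[end:end + 2] != b"\x00\x00":
        end += 2
    # Decoding with the explicit 'utf-16-le' codec, rather than plain
    # 'utf-16' (which honours a BOM or falls back to native order),
    # gives identical results on little- and big-endian machines.
    return data[offset:end].decode("utf-16-le"), end + 2
```

The same reasoning applies in any language: the wire format fixes the byte order, so the parser must byte-swap on big-endian hosts rather than reinterpret the bytes in place.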