Date
2009-09-03.00:00:00
Message id
1122

Content

The footnote for int_type in [char.traits.typedefs] says that

If eof() can be held in char_type then some iostreams operations may give surprising results.

This implies that int_type should be able to represent every char_type value plus a distinct eof(). However, the requirements for char16_t and char32_t define int_type to be uint_least16_t and uint_least32_t respectively. uint_least16_t is likely to be the same size as char16_t, so every int_type value, including eof(), can be held in a char16_t. That invites exactly the surprising behavior the footnote warns about: eof() is required not to be a valid UTF-16 code unit, but nothing prevents a char16_t object from holding that value. The standard should not prescribe surprising behavior, especially without saying what it is (it is apparently not undefined, just surprising). The same applies to the 32-bit types.

I personally recommend that the behavior be undefined if eof()'s value is representable in char_type, and that another type be chosen for int_type (my personal favorite has always been a struct {bool eof; char_type c;}; a sketch of such a traits class appears below). Alternatively, the exact results of such a situation should be defined, at least far enough that I/O can be conducted on these types as long as the code units remain valid. Note that the argument that no one streams char16_t or char32_t does not really hold, as it would be perfectly reasonable to use a basic_stringstream in conjunction with UTF character types.
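To make the suggestion concrete, here is a minimal sketch of a traits class built around such a struct. The name utf16_traits and the member bodies are hypothetical; a real replacement would have to satisfy all of the char_traits requirements, and existing iostreams implementations may assume an integral int_type, so this only illustrates the idea that eof() can be made unrepresentable as a code unit:

    #include <iostream>
    #include <string>   // std::char_traits

    // Hypothetical traits whose int_type always distinguishes "end of file"
    // from every possible code unit.
    struct utf16_traits : std::char_traits<char16_t> {
        struct int_type { bool eof; char16_t c; };  // eof flag + code unit

        static int_type to_int_type(char_type ch) { return {false, ch}; }
        static char_type to_char_type(int_type i) { return i.c; }
        static int_type eof() { return {true, u'\0'}; }
        static int_type not_eof(int_type i) {
            return i.eof ? int_type{false, u'\0'} : i;
        }
        static bool eq_int_type(int_type a, int_type b) {
            return a.eof == b.eof && (a.eof || a.c == b.c);
        }
    };

    int main() {
        // No code unit, not even the one matching the implementation's
        // char_traits<char16_t>::eof() value, can ever compare equal to
        // utf16_traits::eof().
        char16_t c = u'\xFFFF';
        std::cout << std::boolalpha
                  << utf16_traits::eq_int_type(utf16_traits::to_int_type(c),
                                               utf16_traits::eof())
                  << '\n';  // prints: false
    }

The cost of this design is that int_type is no longer an arithmetic type, which is why the alternative of simply defining the results of the collision may be more palatable to implementers.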