First of all, kids may think that decimal representations of real numbers are themselves real numbers, rather than merely representations of them. That misconception should be cleared up before anything else.
When you divide 1 by 3 using long division, you are guaranteed at each step that the result lies in the intervals [0.3, 0.4), [0.33, 0.34), [0.333, 0.334), and so on. Kids get introduced to infinite decimals through long division, so it's only natural for them to think that 0.333... means the real number in the intersection of [0.3, 0.4), [0.33, 0.34), [0.333, 0.334), and so on. And this works for any infinite decimal except the ones ending in infinitely many 9s. A standard definition would instead take 0.333... to mean the real number in the intersection of [0.3, 0.4], [0.33, 0.34], [0.333, 0.334], and so on. Notice the closed intervals in place of the half-open ones.
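To make the two readings concrete, write $s_n$ for the $n$-digit truncation 0.33...3 ($s_n$ is notation introduced just for this sketch). The long-division reading and the standard reading are then

$$
0.333\ldots \;=\; \text{the number in } \bigcap_{n \ge 1} \bigl[\, s_n,\; s_n + 10^{-n} \,\bigr)
\qquad\text{versus}\qquad
0.333\ldots \;=\; \text{the number in } \bigcap_{n \ge 1} \bigl[\, s_n,\; s_n + 10^{-n} \,\bigr].
$$

For 1/3 the two definitions agree: $s_n = \tfrac{1}{3}(1 - 10^{-n})$, so $s_n \le \tfrac{1}{3} < s_n + 10^{-n}$ for every $n$, and both intersections pick out exactly the real number 1/3. They can only disagree for decimals ending in infinitely many 9s, where the candidate value is the right endpoint of every interval.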
If we do not adopt the standard definition or an equivalent, and instead define 0.999... as the number lying in the intersection of [0.9, 1.0), [0.99, 1.00), [0.999, 1.000), and so on, then there are three possibilities. The first is that 0.999... does not exist, because no number lies in that intersection. The second is that exactly one such number does. The third is that 0.999... is ambiguous, because several such numbers do.
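For reference, here is a quick check of which possibility holds in the standard reals:

$$
\bigcap_{n \ge 1} \bigl[\, 1 - 10^{-n},\; 1 \,\bigr) = \varnothing \quad\text{in } \mathbb{R},
$$

since $x = 1$ fails the strict inequality $x < 1$, and any real $x < 1$ fails $x \ge 1 - 10^{-n}$ as soon as $10^{-n} < 1 - x$.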
The first option amounts to nothing more than an alternate notation for infinite decimals in the standard real number system, one in which decimals ending in infinitely many 9s are simply not allowed. The second option can be disproven, provided our numbers obey the usual laws of algebra, by considering (0.999... + 1) / 2 or 2 * 0.999... - 1: both would satisfy the definition whenever 0.999... does, yet both differ from 0.999..., since under this definition 0.999... is necessarily different from 1 (every number in [0.9, 1.0) is less than 1).
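Spelled out, with $x$ standing for any number that meets the half-open definition of 0.999... and $y = (x + 1)/2$ (assuming the numbers form an ordered field, so the usual algebra and comparisons apply): for every $n$,

$$
x \ge 1 - 10^{-n}
\;\Longrightarrow\;
y = \frac{x + 1}{2} \;\ge\; \frac{(1 - 10^{-n}) + 1}{2} \;=\; 1 - \tfrac{1}{2} \cdot 10^{-n} \;\ge\; 1 - 10^{-n},
$$

and $x < 1$ gives $y < 1$, so $y$ lies in every interval $[\,1 - 10^{-n},\, 1\,)$ as well. But $x < 1$ also forces $y = (x + 1)/2 > x$, so $y \ne x$ and the definition cannot single out exactly one number. The same computation works for $2 \cdot 0.999\ldots - 1$, using the interval for $n + 1$ in place of the one for $n$.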
[continued]