Surely defining a byte as 16 bits completely throws out common terms like "megabyte".
The DCPU doesn't redefine a byte as a word. It can only address data aligned on word boundaries, but that doesn't mean it defines a byte as 16 bits. A byte is, and always has been, 8 bits (as a de facto standard, at least). The size of a word, by contrast, is defined by the system. x86 allows any alignment, but word-aligned access is (or at least should be) marginally faster.