Beware the anonymous authorities and internet cult groupthinkers. A mistake becomes disinformation and gets parroted loudly by the aforementioned cult of groupthink, who are very loud on the internet and often very wrong. Remember, the majority can be wrong, especially when they don't know what they're talking about (and 'gaffers rarely know what they are talking about).
DDR3 has a 64-bit interface as standard: http://en.wikipedia.org/wiki/Memory_bandwidth
I don't know where those guys came up with the "16-bit chip" thing. They're right about the power consumption being lower (and thus the RAM being more expensive for Nintendo), but these are still 1600 chips (the memory runs at 800 MHz). All of the different manufacturers above are using x16 'wide' chips, and I presume this is the source of the "16-bit" disinformation. The x16 designation means each chip's data interface is 16 bits wide; internally, a 4Gb (gigabit, not gigabyte) chip is organized as 256M addresses x 16 bits, which works out to 512MB (megabytes) per chip. x8 and x4 are also common widths: a chip of the same total capacity then exposes a narrower interface with greater address depth (512M x 8 or 1G x 4 instead of 256M x 16). The per-chip width alone doesn't tell you the bus width, though; that depends on how many chips are grouped together into each channel.
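The capacity arithmetic above can be sanity-checked quickly. This is just a sketch under the four-chip assumption (four 4Gb x16 parts, which is what the Micron/Hynix part numbers suggest), not anything official:

```python
# Capacity sanity check for four 4Gb (gigabit) DDR3 chips,
# each organized as 256M addresses x 16 bits (the "x16" width).
CHIP_DENSITY_MBIT = 4096   # 4Gb = 4096 megabits per chip
CHIPS = 4                  # assumed chip count on the board

per_chip_mbyte = CHIP_DENSITY_MBIT / 8   # megabits -> megabytes: 512 MB per chip
total_mbyte = per_chip_mbyte * CHIPS     # 2048 MB = 2GB total

print(per_chip_mbyte)   # 512.0
print(total_mbyte)      # 2048.0
```

That 2GB total matches the known 1GB OS reservation plus 1GB for games.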
Judging by the motherboard traces, the RAM is split into two pools. We already know that the OS reserves 1GB for itself while the other 1GB is available for games. There are traces connecting each pair of chips, but the pairs themselves are not connected to each other outside of the CPU/GPU. This could mean the reservation is also a physical separation; in other words, 1GB of system RAM and 1GB of GPU RAM. I don't know where those traces all go, but they are closer to the GPU die than the CPU die.
Anyway, a single 64-bit channel of DDR3-1600 would have the bandwidth being touted around the internet: 12.8 GB/sec. There is also a 17 GB/sec figure floating around, and I'm not sure who came up with it. Maybe they were assuming it was running as DDR3-1866? (Even then, a single 64-bit channel of DDR3-1866 only reaches about 14.9 GB/sec.) But DDR3 can be run in dual channel for double the bus width, i.e. a 128-bit bus. All that is required is at least two identical 64-bit sets of chips. 64-bit + 64-bit = 128-bit, and a 128-bit bus at DDR3-1600 speeds = 25.6 GB/sec of bandwidth.
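All of these figures come from the same simple formula: peak bandwidth = effective transfer rate times bus width in bytes. A minimal sketch, using only the DDR3 speeds discussed in this post:

```python
# Peak DDR bandwidth = effective transfer rate (MT/s) * bus width in bytes,
# divided by 1000 to express the result in GB/sec.
def ddr_bandwidth_gb_s(transfer_rate_mt_s: int, bus_width_bits: int) -> float:
    return transfer_rate_mt_s * (bus_width_bits // 8) / 1000.0

print(ddr_bandwidth_gb_s(1600, 64))    # 12.8   -> the single-channel figure
print(ddr_bandwidth_gb_s(1600, 128))   # 25.6   -> the dual-channel figure
print(ddr_bandwidth_gb_s(1866, 64))    # 14.928 -> DDR3-1866 still doesn't reach 17
```

Note that DDR3-1600 means 1600 MT/s (mega-transfers per second) on an 800 MHz clock, since DDR transfers data on both clock edges.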
The source of this disinformation is apparently Beyond3D. Beyond3D used to have contributors from within the industry, but it's largely a ghost town today. The members don't like Nintendo because Nintendo is not the latest tech. I don't think this was deliberate disinformation; more like confirmation bias against Nintendo coupled with ignorance. Sadly, this information is being parroted by AnandTech and the aforementioned gaffer cult. That doesn't mean it's true. You can see for yourselves:
http://www.micron.com/~/media/Documents/Products/Data%20Sheet/DRAM/4Gb_1_35V_DDR3L.pdf
https://www.skhynix.com/products/computing/view.jsp?info.ramKind=19&info.serialNo=H5TQ4G63MFR
It's true that GDDR is superior to DDR, but even NVIDIA and AMD/ATI use plain old DDR sometimes, especially in laptops and in the middle and low ends. For example:
Notice how it is called DDR3 and not GDDR3, notice the memory speed (800 MHz = DDR3-1600), notice the bus width (128-bit), and notice the bandwidth (25.6 GB/sec). The Xbox 360 has a main memory bandwidth of 22.4 GB/sec, along with bandwidth advantages from things like the eDRAM, which the Wii U would also have.
So the Wii U has more RAM (2GB vs 512MB), faster RAM (800 MHz vs the 360's 700 MHz), more eDRAM, as well as a split RAM pool (1GB for the OS, 1GB for games). From a RAM perspective, this is a much more efficient design. So why do Batman's textures have to 'load' and 'stream' more or less in realtime at the start of a new sequence? Because that's how the engine works. Both Arkham Asylum and Arkham City load textures like that even on the PC. I know because, looking for the 'sweet spot', I ran the benchmark demo dozens of times and it happened at the start every time, and my PC is pretty darn good. It's the engine, not the Wii U's RAM.
So why is everybody claiming something ridiculous to the contrary? Because of Alstrong from B3D and also AnandTech, both of whom are wrong despite their 'reputation'. Also because of NeoGAF and other groupthink internet cults who have no idea what they are talking about and are just repeating information from some other anonymous internet authority who made a mistake. I have no idea where their "16-bit chip and 16-bit x 4 = 64-bit bus width" math comes from, but it's from somebody who doesn't know what they are talking about.