ddddd wrote:The only problem I have is that an Arkham City dev (the one that leaked the full specs of the console) said that it was indeed "1GB of slow RAM", and this was said months before E3. Maybe you could send this post as a tip to RMC so it gets attention.
It is slow RAM by modern PC standards. By console standards it's a tiny bit better than the 360's and equal to the PS3's 256MB of XDR RAM (the other 256MB is slower). Even a modern low end PC graphics card with only a 64-bit bus width (no multi-channel utilization) using GDDR5 RAM has higher bandwidth (28.8 GB/s). The high end is around 288 GB/s or something like that.
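The figures above fall out of the standard back-of-the-envelope formula: bus width (in bytes) times effective transfer rate. A quick sketch; the transfer rates below are illustrative round numbers for a low end and a high end GDDR5 card, not confirmed specs of any particular model:

```python
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mts):
    """Theoretical peak bandwidth in GB/s.

    bus_width_bits: memory bus width in bits
    transfer_rate_mts: effective data rate in MT/s
    """
    return bus_width_bits / 8 * transfer_rate_mts / 1000

# Low end: 64-bit bus, GDDR5 at an effective 3600 MT/s
print(peak_bandwidth_gbs(64, 3600))    # 28.8 GB/s

# High end: 384-bit bus at an effective 6000 MT/s
print(peak_bandwidth_gbs(384, 6000))   # 288.0 GB/s
```

The same formula is what everyone is (implicitly) arguing about in the Wii U case: the contested number isn't the chip speed, it's the bus width you plug in.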
Here is a neogaf thread spreading around erroneous "hard facts" about the Wii U: http://www.neogaf.com/forum/showthread.php?p=44512184#post44512184
Hard facts (either publicly disclosed, or a non-public leak which can be vouched by somebody trustworthy on this very forum):
- 2 GB of gDDR3 memory @800MHz (DDR3-1600), organized in 4x 4Gb (256Mx16) modules, sitting on a 64bit bus (@800MHz). That gives a net BW of 12800MB/s (12.5GB/s). We can conveniently refer to this pool as 'MEM2'. Currently 1GB of that pool is reserved for the OS.
I've only listed one of their "facts", specifically the "fact" in question (along with quoting the lead sentence that refers to these as 'hard facts').
While they understand that the chips are each 4Gb chips made up of 16 256Mb 'sub-chips', they are wrongly assuming a 64-bit bus based on mistaken "experts" like B3D and Anandtech. Like hergipotter mentioned, on a PC several chips are combined on a DRAM module, in configurations like x4, x8, x16, double-sided, single-sided, low density, high density, etc. The individual chips made these days, like the ones on the Wii U, also have "sub-chips" underneath their single chip casing. So this neogaf poster got the 256Mx16 part right but is repeating the 64-bit, 12.5GB/s disinformation.
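For what it's worth, the disputed figure falls straight out of the bus-width assumption. A quick sketch, using the neogaf post's own numbers (DDR3-1600, i.e. 1600 MT/s) rather than anything confirmed, shows how the 12800MB/s number was derived, and how it would change if the bus were actually wider:

```python
def ddr3_peak_gbs(bus_width_bits, data_rate_mts):
    """Theoretical peak DDR3 bandwidth in GB/s from bus width and data rate."""
    return bus_width_bits / 8 * data_rate_mts / 1000

# The neogaf assumption: 64-bit bus, DDR3-1600
print(ddr3_peak_gbs(64, 1600))    # 12.8 GB/s -- their "12800MB/s" figure

# The same chips on a hypothetical 128-bit bus
print(ddr3_peak_gbs(128, 1600))   # 25.6 GB/s
```

In other words, everything hinges on the bus-width claim; the chip data rate itself isn't in dispute.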
This is why people shouldn't just accept "hard facts" from some random anonymous internet authority-type. Especially sites like Anandtech and Digital Foundry, which should know better. When they repeat the disinfo, it makes the disinformation 'reputable', and it then gets cited as if from a reliable source. Also, notice the traces in between the pairs of RAM chips:
There are no traces connecting the two pairs, however. That might be another knock against the neogaf "hard facts", because they treat the total RAM as a single pool when, from a CPU/GPU perspective, the pairs might be wired separately. In other words, 1GB isn't just software-reserved; 1GB may be physically dedicated system RAM, somewhat like the PS3's 256MB. And if that's the case, the GPU's 1GB may also be dedicated. This might mean nothing though; it's just a guess. If it were the case, it would improve bandwidth over the 360 design, where all RAM is shared within a single pool.

The reason the split pool is a problem on the PS3 is that it's split into two 256MB chunks, each with different bandwidths for reads and writes. Sometimes that's a good thing for the PS3, but for 360 ports it isn't, because each pool is small. This affects games such as Skyrim, and it's the primary reason for its large save game problem; if the PS3 had more RAM it wouldn't be an issue. The Wii U has 4x the RAM of the PS3, so it should have no RAM problem with a game like Skyrim. Skyrim relies on the CPU though, the shadow maps especially are CPU-intensive, so it would need some optimization to run on the Wii U, and Bethesda aren't very good at things like that.