they better not make 4070 160bit 10GB
Why would this keep you up at night?
Historically, the 70-class cards have been among the best resource-balanced cards in most of Nvidia's generations. If they choose to put 10 GB on a 160-bit bus, that's likely to work out well.
I think 160-bit is fairly unlikely. While it's not impossible, memory buses are usually enabled in multiples of 64 bits, because each 64-bit slice is a separate memory controller (a 256-bit bus is four controllers). Each controller connects to 32-bit memory chips, so it is technically possible to enable "half" of a controller, but it's fairly rare.
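A quick sketch of the arithmetic above, assuming the 64-bit-per-controller and 32-bit-per-chip figures from the post and typical 2 GB GDDR6 chips (the chip density is my assumption, not from the thread):

```python
# Each memory controller is 64 bits wide; each GDDR chip exposes a 32-bit
# interface, so a fully enabled controller drives two chips.
CONTROLLER_WIDTH = 64
CHIP_WIDTH = 32

def memory_config(bus_width_bits, chip_capacity_gb=2):
    """Return (controllers, chips, total capacity in GB) for a given bus width."""
    controllers = bus_width_bits / CONTROLLER_WIDTH  # fractional => "half" controller
    chips = bus_width_bits // CHIP_WIDTH
    return controllers, chips, chips * chip_capacity_gb

print(memory_config(160))  # (2.5, 5, 10) -- the odd "half controller" 10 GB case
print(memory_config(192))  # (3.0, 6, 12)
print(memory_config(256))  # (4.0, 8, 16)
```

The fractional controller count for 160-bit is exactly why that configuration looks unusual.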
Anyhow, how would 12 GB and 16 GB configurations work? Surely you can only get 16 GB on a 256/512-bit bus and 12 GB on a 192/384-bit bus. Would the higher-end card have a smaller bus? It would need massively faster memory to offer more bandwidth, or a much larger cache, like AMD's Infinity Cache.
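To put numbers on the "needs massively faster memory" point: peak bandwidth is just bus width (in bytes) times the per-pin data rate, so a narrower bus has to make it up in clocks. The 18 Gbps figure below is a typical GDDR6 speed, chosen for illustration:

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

wide = bandwidth_gbs(256, 18)        # 256-bit at 18 Gbps -> 576 GB/s
needed = wide / (192 / 8)            # rate a 192-bit bus needs to match it
print(wide, needed)                  # 576.0 GB/s, 24.0 Gbps
```

So a 192-bit card needs memory a third faster just to tie a 256-bit card at 18 Gbps, before any cache helps.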
It is technically possible to put different amounts of memory on different memory controllers, as the GTX 660 Ti did back in the day. This effectively makes part of the memory slower, since more memory is connected to that one controller.
There is no reason for a high-end card like this to have only 12 GB of VRAM in 2022. The 1080 Ti had 11 GB, and that was back in 2017! 16 GB is pretty much the minimum for future-proofing at 4K.
Future proofing with extra VRAM has never panned out in the past.
As I've explained many times before, as future games get more demanding, bandwidth requirements and computational load increase faster than VRAM usage, so those become bottlenecks long before VRAM does. The only exception would be if you gradually sacrifice FPS for max details in future games, pushing the frame rate low and the VRAM usage artificially high. But even then, memory bandwidth will probably bottleneck you first.