Where do the problems begin?
When you begin creating patches.
First off, it turns out that the tools that "cook" the data don't always produce binary-identical output. The result after cooking is always functionally identical, but the bit-level content may differ slightly. Items in unordered lists change order, uninitialized fields contain random data, that sort of stuff.
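To make that concrete, here is a toy sketch (Python, nothing to do with the real cooking tools) of the two effects just mentioned: iteration order of an unordered container, and uninitialized padding bytes leaking into the output. The file layout is completely made up.

    import os
    import struct

    def cook_material_list(names):
        # Source data is a *set* of material names; the cooked format writes
        # them in whatever order the container yields them. With hash
        # randomization (or an unordered container in C++), that order can
        # differ from run to run -> different bytes, same meaning.
        blob = b""
        for name in set(names):
            blob += name.encode() + b"\0"
        return blob

    def cook_header(version, flags):
        # Hypothetical 16-byte header with 6 bytes of padding. If the padding
        # is never cleared (think uninitialized memory in a C struct written
        # straight to disk), those bytes are garbage that varies per run.
        padding = os.urandom(6)          # stands in for uninitialized memory
        return struct.pack("<IH6s4s", version, flags, padding, b"COOK")

    a = cook_material_list(["grass", "mud", "concrete"]) + cook_header(3, 1)
    b = cook_material_list(["grass", "mud", "concrete"]) + cook_header(3, 1)
    print(a == b)   # almost always False, yet both blobs are functionally identical

Both blobs describe exactly the same game content; they just don't compare equal byte for byte.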
Why didn't this get caught sooner? Because you only notice problems of this kind when re-cooking the same data several times, from scratch. And re-cooking all BC2 data takes over 48 hours on a high-end computer. Locating and correcting every place where cooking isn't deterministic down to the bit level would also take a lot of time (both calendar time and effective development time). Perhaps that time is better spent elsewhere?
So. If different "cooking" runs produce slightly different results, it suddenly becomes difficult to look at version A and version B of the data and answer the question "What are the differences between these two datasets?" It's easy when looking at the source data, but in the cooked data there are a lot of changes which have no effect on the final game experience.
There are about 40,000 source files, and these result in well over 100,000 cooked files. Going through those by hand is not an option. Writing a perfect filter, one which knows which differences are benign and which are for real, would take as much time and effort as making the cooking 100% deterministic. That is not an option either.
So you make a filter which does something in between: be smart, create rules for the file types you know about, and when in doubt, assume that the change is for real. Err on the side of caution.
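In rough terms, such a filter boils down to something like the sketch below. The file extensions, the idea of skipping a 16-byte header, and the function names are all invented for the sake of the example; the real rules are per-format and considerably more involved.

    # Per-file-type rules decide whether a byte difference is benign;
    # anything we don't have a rule for is treated as a real change.

    def compare_meshes(old, new):
        # Hypothetical rule: ignore a timestamp in the first 16 bytes.
        return old[16:] == new[16:]

    def compare_exact(old, new):
        return old == new

    RULES = {
        ".mesh":    compare_meshes,
        ".texture": compare_exact,
        # ...one entry per file type we actually understand...
    }

    def is_real_change(path, old_bytes, new_bytes):
        for ext, rule in RULES.items():
            if path.endswith(ext):
                return not rule(old_bytes, new_bytes)
        # Unknown file type: err on the side of caution and call it real.
        return old_bytes != new_bytes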
Then you realize that those shader databases were never designed to be extendable. What happens when a new object is added to a level in a patch? Its mesh & textures are included, no sweat, but what about the shader combinations? How does one add something to the shader database, when the shader database is an opaque binary block whose entire contents may change when just one object is added to the level?
(One shader database is about 5MB. There are three shader databases per level - one each for DX9, DX10 and DX11.)
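I won't describe the exact on-disk layout here, but to illustrate why an opaque, tightly packed blob is so hostile to patching, imagine a database stored as sorted entries with absolute offsets in a header (all details below are hypothetical). Add a single entry and every later offset and payload shifts, so a byte-level diff touches most of the file:

    def pack_database(entries):
        # Hypothetical format: header with absolute offsets, payloads packed
        # back to back in sorted key order.
        items = sorted(entries.items())
        payloads = b"".join(payload for _, payload in items)
        offsets, pos = [], 0
        for key, payload in items:
            offsets.append((key, pos))
            pos += len(payload)
        header = "".join(f"{k}:{o};" for k, o in offsets).encode()
        return header + payloads

    base  = pack_database({"rock": b"A" * 40, "water": b"B" * 40})
    added = pack_database({"rock": b"A" * 40, "tree": b"C" * 40, "water": b"B" * 40})

    # Adding "tree" shifts everything behind it and rewrites the header, so
    # only a small prefix of the file survives unchanged.
    prefix = next(i for i, (x, y) in enumerate(zip(base, added)) if x != y)
    print(prefix, len(base))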
And finally, the patch system itself. Yes, it can replace portions of files on disk. But due to its heritage (from BF Heroes), it is not able to open BFBC2's archive files and apply differences to individual files within those archives.
The only straightforward option is to ship all patched-in content in separate archives that sit alongside the original archives.
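In practice that presumably means the game resolves file lookups against the side archives first and falls back to the originals, rather than the original archives ever being modified. A minimal sketch of that idea, with invented names and plain dicts standing in for real archive files:

    class LayeredArchives:
        def __init__(self, original_archives, patch_archives):
            # Later patch archives are searched before earlier ones, and all
            # patch archives are searched before the original DVD archives.
            self.layers = list(reversed(patch_archives)) + list(original_archives)

        def read(self, virtual_path):
            for archive in self.layers:
                if virtual_path in archive:
                    return archive[virtual_path]
            raise FileNotFoundError(virtual_path)

    dvd    = {"levels/harbor/terrain": b"original terrain data"}
    patch1 = {"levels/harbor/terrain": b"patched terrain data"}

    fs = LayeredArchives([dvd], [patch1])
    assert fs.read("levels/harbor/terrain") == b"patched terrain data"

The price is that a patched file ships whole and shadows its original, instead of the archive containing it being updated in place.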
Given all of the above, we end up in the situation we have today.
Each patch gets larger than the previous, because the game drifts ever further away from what was shipped on the DVD. Changes that require shader database updates make the patch balloon in size. And we have to be careful and clever when selecting which file types to include and which to ignore when creating the patch.
And that's where we finally ran into real problems. It was too difficult for one person to identify which changes were required and which were not, and how to update the patch generation process to accommodate the latest set of changes. Most of the delay of Client R8 came from the fact that there are very few people at DICE who have the in-depth knowledge of the far corners of the game engine *and* the cooking tools *and* the patch generation process needed to work out what is going wrong, why, and how to fix it.
The new content did work a good while ago, but back then the patch was approximately 7GB in size. The patch had to get below 1GB, or else some customers in Australia and South Africa would not be able to download it due to bandwidth caps.