No software developer worth their salt will optimise an application for a specific number of CPUs/cores; they will code it to use as many threads as it needs to work efficiently. Such an application would theoretically run faster on a 6-core CPU than on a 2-core CPU, but in practice it may show no noticeable performance difference. (As an example, consider an application that writes to 2 files simultaneously. Since it only ever has 2 threads doing work, running it on a CPU with more than 2 cores won't improve performance at all.)
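To illustrate what I mean by "as many threads as necessary", here's a rough C++ sketch (the fallback value and the worker body are just placeholders, not anyone's actual code) of an app sizing its worker pool to whatever the hardware reports instead of hard-coding it for a given core count:

```cpp
#include <iostream>
#include <thread>
#include <vector>

int main() {
    // Ask the runtime how many hardware threads are available.
    // It may return 0 if it can't tell, so fall back to 2 (assumption).
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 2;

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < cores; ++i) {
        workers.emplace_back([i] {
            // Each worker would handle its own slice of the real work here.
            std::cout << "worker " << i << " running\n";
        });
    }
    for (auto& t : workers) t.join();
}
```

On a dual-core machine this spawns 2 workers, on a 6-core machine 6, without the developer ever caring which CPU it lands on.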
For anyone who's wondering, yes, I am a software engineer by trade.
The main problem, however, is that the vast majority of applications available today are coded to use only 2 threads (often because it isn't feasible for them to use more than that - a web browser, for example). Right now, the only people who will really benefit from a quad-core CPU are the crazy multi-taskers, hardcore gamers, and manic overclockers (IMO).
Finally, does it really matter how Intel designs its chips, as long as they offer excellent performance? AMD tried the "native" quad-core approach and look at how badly that turned out... Intel's approach is inelegant, but it works well, and that's what the consumer cares about.