I'm sorry to do this, but this may be lengthy. The TL;DR is that a completely objective review is impossible.
Now that that's out of the way, let's dissect a TPU review.
1) Initial specifications are given for the device, and they are compared to similarly priced and specified items.
2) Unboxing is done and pictured.
3) The device is stripped, and pictured.
4) The testing conditions are set, and defined.
5) A slew of games is tested at various resolutions, and the FPS is graphed (graphs compare aforementioned "similar" devices).
6) Power consumption is logged and graphed.
7) Fan noise is measured and graphed according to a strict and defined test system.
8) Performance summaries are tallied. I'm counting this as one section because, despite spanning multiple pages and graphing everything, it's all calculated from data already gathered earlier in the review (a rough sketch of how such a tally might work follows this list).
9) Overclocking results and figures.
10) Other factors that aren't standard to all reviews. Sometimes this is observations on technology, sometimes it's an opportunity for the author to digress and explain why something was done.
11) Conclusions.
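To make item 8 concrete, here's a minimal sketch of how a relative performance summary might be tallied from the per-game FPS results. The cards, games, numbers, and the simple normalized-average math are all my own invented illustration, not TPU's published methodology.

```python
# Hypothetical sketch of a "performance summary" (item 8) built from
# the per-game FPS data gathered earlier in a review.
# Cards, games, and numbers are made up purely for illustration.

avg_fps = {
    "Card A": {"Game 1": 62.0, "Game 2": 88.0, "Game 3": 45.0},
    "Card B": {"Game 1": 60.0, "Game 2": 91.0, "Game 3": 47.0},
}

baseline = "Card A"  # every card is expressed relative to this one

def relative_performance(card: str) -> float:
    """Average each game's FPS as a percentage of the baseline card."""
    ratios = [
        avg_fps[card][game] / avg_fps[baseline][game]
        for game in avg_fps[baseline]
    ]
    return 100.0 * sum(ratios) / len(ratios)

for card in avg_fps:
    print(f"{card}: {relative_performance(card):.1f}% of {baseline}")
```

The point being: nothing new is measured at this stage, it's just arithmetic over the earlier graphs, which is why I count it as one section.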
What do we have to remove in order to be 100% non-biased?
1) Can't keep this. You arbitrarily choose a few items that are "similarly" specified in both camps. In a truly unbiased situation you could only compare offerings from one camp, as the AMD vs. Intel four-core CPU debacle demonstrated.
2) This is unbiased. You could argue pictures can be biased, but the shots are generally without bias.
3) Same as 2.
4) Can't do this. You've set a testing condition that not everyone can duplicate with their current hardware. This is a bias that basically means reviewing is impossible.
5) See 4.
6) See 4.
7) The card you receive isn't representative of 100% of the cards on the market, and the test rig isn't representative of most consumers. It is therefore biased.
8) See 4.
9) Silicon lottery. You've only got one sample, so your figures are biased by a sample size insufficient to represent the product line as a whole.
10) Extra information; that's a severe bias based upon what the author is thinking.
11) By nature, a biased summary of all figures.
So an "unbiased" review would amount to pictures of what came in the box and pictures of the card. As a review, that's a pretty crappy basis for judging a product like a GPU.
Anyone asking for 100% unbiased reviews either doesn't understand what they are asking for, or their definition of unbiased would result in a 1,000-page manual of data that the average consumer would find absolutely useless. Heck, people are still running cards going back to the 2xx (Nvidia) and 6xxx (AMD) generations. If those cards still have some market presence, why aren't they in every review? Even the various custom board and cooler options should be explored, because they make a difference in performance, and most definitely in performance per dollar. What you are asking for is a flood of information, which negates the purpose of a review.
Addressing some concerns: the review conditions aren't realistic. Absolutely true. You don't have the same rig as the tester, and even if you do, you don't have the exact same cards. Barring some magical coincidence of identical hardware, you've still got software and environmental conditions to deal with. In case you missed it, this means that, scientifically speaking, these reviews aren't 100% reproducible. The thing is, it doesn't matter. Testing has a margin of error that some people continue to forget. A test showing AMD beating Nvidia by 2 frames in one game, then losing by 3 in another, actually shows they are equal performers (see the toy numbers after this paragraph).
Additionally, not all games are created equal. There's plenty of software out there which Nvidia has had a large hand in creating; "The way it's meant to be played" is a phrase any older gamer should know. In those games you'll likely never see an AMD product perform as well as a similarly specified Nvidia card. Demonstrating that performance in a review isn't bias, it's only showing the impact less-than-competitive business practices can have. It's the same drum Red Team fans banged on when the async shader testing came out, with AMD in a huge lead. Nvidia did the dumbest thing possible and tried to have those results buried. The results of these tests are not biased, and the reason they were conducted was real-world application, not the bias of the author.
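To put rough numbers on that margin-of-error point, here's a toy sketch. The per-run FPS figures are invented; the point is only that a lead of a couple of frames, smaller than the run-to-run spread, doesn't separate two cards.

```python
# Hypothetical illustration of benchmark margin of error. Each card is
# "benchmarked" five times in the same game; the run-to-run spread is
# larger than the gap between the two averages. All numbers are made up.
from statistics import mean, stdev

runs_amd    = [61.8, 63.5, 60.9, 62.7, 62.1]   # FPS per run (invented)
runs_nvidia = [60.4, 62.9, 61.6, 59.8, 61.3]   # FPS per run (invented)

for name, runs in (("AMD", runs_amd), ("Nvidia", runs_nvidia)):
    print(f"{name}: {mean(runs):.1f} FPS +/- {stdev(runs):.1f}")

# A ~1-2 FPS lead sitting inside the +/- spread of either card means
# the two are effectively equal performers in that title.
```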
In short, AMD isn't distributing these to everyone for a good reason. It isn't reviewer bias, because that's demonstrably a crappy premise. It isn't a history with media outlets, because they've stated that samples of these products are in extremely short supply. What we've got is AMD trying to craft a social media PR war against Nvidia. They're choosing outlets that address the technologically ignorant masses, where data doesn't trump cool factor. They're trying to get this product out the door as something PR sells, well in advance of hard numbers. I honestly believe this is AMD admitting that Fiji isn't a home run. It might be a solid runner on base, but they're not selling Fiji as that, nor are they focused on the future with DX12. AMD is trying to get a PR win because they've got a small form factor and good performance with a new technology. I believe the phrase is "desperation play," not "victory lap." We love you, AMD, but you've really got to see that this is cutting off your nose to spite your face. A $650 card isn't an impulse buy, and trying to sell it on social media frames it as such.
Who is the target demographic, AMD? It isn't common people, because a console is cheaper and works from the moment you plug it in. It isn't enthusiasts, who drool over the numbers. It isn't gamers, who long ago accepted big cases to fit the GPUs they need. Heck, it isn't even the HTPC market, because that price tag is just painfully high. AMD is trying to manufacture a new market, and I just can't see it.