• Puzzle_Sluts_4Ever@lemmy.world
    1 year ago

    There are a LOT of aspects to that video that are going to overshadow everything, but the segment on the “random” CPU review they watched (I think it was Jake talking about a recent Ryzen?) is one that more channels could learn from. https://www.youtube.com/watch?v=FGW3TPytTjc&t=1193s

    Over-summarizing, but GN point out that LMG built their benchmarking around trying to match the company’s public benchmarks and then checked with the company whether the numbers were okay (plus a weird comment about how not matching those numbers would have made LMG one of the only outlets without favorable results?). Yeah, there were a LOT of false specs and blah blah blah on top of that.

    I see other channels make the same mistake. They are more interested in validating their review process against the press release than in testing the product. That is not reviewing. That is regurgitating a press release.

    The actual way to review is to try to reproduce the claimed results independently. If you can’t, you reach out to your contact or that company’s press team. If you still can’t reconcile the numbers, you publish and explain in detail what is going on. And if you can’t even form a proper hypothesis for why your data is different? Then you aren’t qualified to review shit. You don’t have to be right (corrections are a thing), but you do have to be willing to take down your video if you are wrong (one of the bigger issues here).

    Because… marketing can and will lie.