I won't give Nvidia any easy way out. They do shady shit like releasing the GTX 970 with a gimped memory subsystem and flat out lying about the specifications of the card. A graphics card with 4GB of VRAM on the PCB can only actually use 3.5GB of it at full bandwidth, and as soon as it crosses that threshold the remaining 0.5GB operates at a snail's pace, causing stuttering and visual artifacting. There was a big stink about this. Yet guess which is the top selling video card from their lineup? You guessed it: the GTX 970 (again, their loyal user base bends over and lubes up with pride for the long Nvidia dick). Some time after that they disabled overclocking on laptop GPUs through a driver update, calling laptop overclocking a "glitch" that shouldn't have been available. Again there was a big stink about it, and shortly thereafter they released a new driver allowing overclocking on laptop GPUs and went back to calling it a "feature." They also refused to support DisplayPort 1.2a Adaptive-Sync, so customers who wanted an adaptive sync monitor were forced into paying the price premium for a G-Sync monitor instead of being able to use VESA's royalty-free standard, which AMD markets as FreeSync. This is merely a tiny fraction of recent history. They do a lot of fucked up shit.
My largest gripe against them far and away has to do with the flat-out anti-competitive piece of shit software they call GameWorks. That bullshit is aimed squarely at fracturing the hell out of the gaming industry by software-gimping AMD's graphics solutions, all the while not allowing AMD to optimize for it since it's proprietary and only game developers are given access. Nvidia pays, and richly so, to have game developers incorporate GameWorks into their games. Yet only a handful of GameWorks games have been released without being legitimate hot fucking messes. Recent game after recent game has come out playing like complete trash: Watch Dogs, Assassin's Creed Unity, Far Cry 4, Project CARS, Dying Light, The Witcher 3, Batman: Arkham Knight, etc. In my mind at least, it's no coincidence they were all GameWorks titles.
The direction Nvidia is steering the industry is one of forcing customers to buy a video card based on which games they want to play. Meanwhile you have AMD still playing Mr. Nice Guy, giving away free access to their code. This shit has to end, and as long as Nvidia's user base remains indifferent, or all too ready to bend over repeatedly to their will, nothing is going to change.
I actually think nVidia cleaned the 970 mess up as well as they could have, and this is the least of my concerns when it comes to their practices. What gets overlooked is the massive library of games being released on the three consoles, something that also appears set to continue with the Nintendo NX. So as far as development bias goes, I think the field is slightly more level than some are willing to admit, especially considering that demanding console games outsell demanding PC games considerably. To me this angle of the argument is a bit of a push, and it's not why I have issues with nVidia.
The competitive practices are one thing, but the industry practices are another. What bothers me starts back in the 3Dfx days, when 3Dfx had the opportunity to release their new version of Glide (but never did because of their own mistakes, like becoming their own board maker, which blew a massive hole in revenue and sales from the leading board makers in the industry). 3Dfx pissed everyone off with that move, except the first generation of consumers going into the Voodoo3 line. Well, nVidia, knowing that the new version of Glide could come out, and with so many game studios loving the Glide toolkits for development, rushed into bed with Microsoft to lock in Direct3D acceleration. Now, this was a good business move, but a bad industry move, as nVidia cut themselves out of the OpenGL OS world: Apple, Linux, BSD, Sun (still a big player at the time, who had to fork into their own solution with E3D). nVidia could have been the hero here and adopted OpenGL as a primary development option. Instead, nVidia squeezed 3Dfx out by shipping a very, very buggy D3D implementation (which took a few years to improve). That's where ATi started to sneak in: from being a middle-of-the-road chipmaker that gave consumers the only all-in-one VIVO option (the All-in-Wonder), they moved to supporting OpenGL primarily and D3D alongside it, all while playing nice with anyone still running a Voodoo2 or Banshee card.
nVidia was subsidized by Microsoft (which would be an illegal practice now), which is how they were able to sell Riva TNT2s for $200 (the high end of gaming) compared to $300 for 3Dfx's high end at the time. But people still bought 3Dfx because of the name and the sheer amount of Glide support. MS and nVidia didn't screw any single company; they screwed all of them. Matrox didn't care, though; they had the contracts with the military and many other government agencies. ATi picked up on this quickly and forced Microsoft to publish the D3D 7 standards to the whole industry at the same time. Of course, since nVidia contributed most of the standards, every other company was limited in how they could implement them. By the time the Radeon R300 came out, ATi was getting Apple money for being almost their exclusive GPU chipmaker (because of that OpenGL support at the time).
Well, IBM made Apple's CPUs at that time, and guess what, ATi and IBM started working together too. Now shit is getting good; this is where things heat up. IBM at this point in their history supported more open design standards, which ATi had little choice but to adopt as well. They were, as you said, giving their code away for the world to see/improve/use, while nVidia started locking everything down. nVidia then bought 3Dfx for almost NOTHING. 3Dfx wanted to release Glide as open source, which would have meant Glide being merged into the Open Graphics Library. If 3Dfx had said they didn't want to release Glide, nVidia wouldn't have touched them. nVidia knew their edge was built around D3D development, that their 2D performance was behind, and that their OpenGL performance was iffy at this point. If OpenGL had expanded to include the best 3D optimization breakthrough in history, they wouldn't have been quick enough to adopt it.
Xbox comes out. nVidia works even closer with MS. But you know what? "I think we can make our own console." Sounds a little bit like the Sony + Nintendo relationship from 1992? Yeah, MS cuts nVidia and goes with ATi.
Around the same time, nVidia starts making the best AMD northbridge chipsets. "Hey, we can make CPUs too." AMD starts making their own chipsets, and nVidia shifts focus to Intel chipsets with the nForce4. "Hey Intel, we can make the CPU too." Intel goes "oh hell no" and starts their own performance chipset team. Intel, Microsoft, and AMD all shared inside secrets with nVidia to help improve products. nVidia took and RARELY gave back. This is why Qualcomm is very hesitant to work with nVidia directly.
AMD is doing well, so they buy ATi. ATi could be the chipset company that nVidia used to be (since ATi showed promise that they could make a nice NB-SB chipset), and the two can save money by merging their graphics divisions. Well... let's start the rodeo... AMD then made the series of mistakes I mentioned a few posts earlier, and both companies hurt because of it. It's funny to think about, but ATi could just as easily have ridden out the storm and come through okay. If Apple hadn't dropped PPC for x86, I think IBM would have spun off one or more graphics divisions and made a deal with ATi. But AMD (the x86 manufacturer) wasn't ready, IMO, to piss off nVidia. Instead, they pissed off both nVidia and Intel, which brought those two back together a bit more.
So in the end, we're not dealing with the best technologies. We're dealing with a lot of closed technology that is barely standardized, and the efficiency of everything is significantly less than what it should be. It's NO SURPRISE that ARM became the fastest growing standard, because ARM is very open. AMD is also very open, but plays David behind the Goliath, so AMD's open standards get shit on 9 times out of 10 in favor of Intel's closed solutions.
nVidia: contribute BACK to VESA, contribute BACK to OpenGL (and no, not just your scrapped garbage that nobody will use, offered as a show of good will), adopt FreeSync and other open hardware standards instead of forcing closed standards across platforms, engines, and IDEs. Open up Glide. Stop threatening companies for releasing any form of graphics processor design, calling it an infringement of your GPU idea, and then manipulating them into paying licensing fees. Oh, and just because you claim, for good PR, that you take nobody to court doesn't mean industry insiders have kept quiet about the massive intimidation and blackballing. Basically, be a contributing member of the tech community in more ways than just releasing competitive products, and a lot of your haters will skeptically start switching back, including the next generation of consoles, more mobile devices, etc. If Microsoft can open up a bulk of their IPs (ones that were questionably obtained, like many of your IPs, nVidia), so can you!