Tbh, I don't think it's necessarily about funding, as counterintuitive as that might sound.
Game engine teams in big studios are usually up to a hundred people in permanent employment, most of them grunt workers, plus a few extremely skilled people working out the hard problems. They probably get paid more than academics, but the amount of time and manpower expended is about the same.
Usually, high R&D budgets go toward contracting lots of experts from different fields, paying their salaries, and providing them with expensive, maybe even experimental equipment. The guys on video game engine teams usually work on a bog-standard PC, are in-house, and the special equipment is usually a local data center used to pre-process game assets. It's nothing different from what a decent uni in the west has access to (I wish I could go to a western uni and have free access to a powerful data center, and be surrounded by smart people, sigh).
Strangely enough, the video game industry, despite being a billion dollar industry, is not a victim of patent trolling. There's a culture of sharing at least general techniques and information, if not source code (interesting post by John Carmack: https://slashdot.org/comments.pl?sid=151312&cid=12701745).
There are conferences and talks where people from the industry give presentations and such, just not code, I guess. The information IS shared, but between people in the industry (it's a relatively small world in AAA video game development studios, everybody knows each other), and it only sometimes trickles down to the public.
I suspect the real reason is that, as programming gets closer and closer to the hardware, it becomes less theoretical and more practical, less of a science and more of an art. As in, theoretical knowledge becomes a smaller factor than raw skill and expertise. In fact, a lot of theoretically performant algorithms that assume an abstract computer perform worse in practice than just looping through an array, because of cache efficiency.
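To make that concrete, here's a toy sketch in C (my own, not from any engine codebase; all the names and the 10M element count are arbitrary): it sums the same numbers twice, once through a contiguous array and once through a linked list whose nodes are deliberately linked in shuffled memory order to mimic a fragmented heap. On paper both traversals are O(n); on real hardware the array side wins big because sequential access lets the prefetcher stream cache lines, while the list stalls on a cache miss per node.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 10000000  /* ten million ints: the working set dwarfs any CPU cache */

typedef struct Node { int value; struct Node *next; } Node;

/* tiny xorshift PRNG; rand() is too weak on some platforms to shuffle 10M items */
static unsigned long long rng_state = 88172645463325252ULL;
static size_t xrand(void) {
    rng_state ^= rng_state << 13;
    rng_state ^= rng_state >> 7;
    rng_state ^= rng_state << 17;
    return (size_t)rng_state;
}

int main(void) {
    int *arr = malloc(N * sizeof *arr);
    Node *nodes = malloc(N * sizeof *nodes);
    size_t *perm = malloc(N * sizeof *perm);
    if (!arr || !nodes || !perm) return 1;

    for (size_t i = 0; i < N; i++) { arr[i] = (int)i; perm[i] = i; }

    /* Fisher-Yates shuffle: link the list in random memory order to mimic a
       fragmented heap. Nodes malloc'd back-to-back would sit almost
       contiguously and hide the effect. */
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = xrand() % (i + 1);
        size_t t = perm[i]; perm[i] = perm[j]; perm[j] = t;
    }
    for (size_t i = 0; i < N; i++) {
        nodes[perm[i]].value = (int)i;
        nodes[perm[i]].next = (i + 1 < N) ? &nodes[perm[i + 1]] : NULL;
    }

    clock_t t0 = clock();
    long long sum_arr = 0;
    for (size_t i = 0; i < N; i++)  /* sequential access: the prefetcher streams cache lines */
        sum_arr += arr[i];
    clock_t t1 = clock();

    long long sum_list = 0;
    for (Node *n = &nodes[perm[0]]; n; n = n->next)  /* pointer chasing: a likely cache miss per node */
        sum_list += n->value;
    clock_t t2 = clock();

    printf("array: %.3fs  list: %.3fs  (sums: %lld %lld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, sum_arr, sum_list);
    free(arr); free(nodes); free(perm);
    return 0;
}

Both loops do identical work in the abstract model (both sums come out to N(N-1)/2, as a sanity check). Compiled with -O2 on a typical desktop, I'd expect the list traversal to land several times to an order of magnitude slower, and that gap is exactly what big-O analysis can't see.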
It's an area of expertise that only becomes relevant when you're trying to squeeze maximum performance out of a tiny microcontroller, or trying to draw complicated graphics at 60fps on an average machine that's probably 5 years out of date (going by Steam hardware surveys and console lifespans). Most programmers operate in a realm where the difference between a runtime of 1ms and 5 minutes doesn't matter much: the program will only be run once, or it will only run on their own machines so they can just buy better hardware, or some other factor makes performance irrelevant.
Sucks for those of us interested in that kind of stuff, though. You're either in, or you're out.