Is CoT going to utilize this? https://blogs.msdn.microsoft.com/directx/2015/04/30/directx-12-multiadapter-lighting-up-dormant-silicon-and-making-it-work-for-you/
I know running well on low-end machines is a priority for MWM, and it seems like it could possibly help machines with lower-end graphics cards run the game better by leveraging any integrated graphics they may have.
There's a reason for the "seems like" and "possibly" in that last sentence--I don't even play a game developer on TV :).
FIGHT EVIL! (or go cause trouble so the Heroes have something to do.)
It's interesting stuff. I find myself wondering how far technology has moved on since that announcement in 2015. Would love to know what MWM has to say on this.
Also thinking that the first pic on that page looks like the world's most crowded episode of MST3K ever.
Spurn all ye kindle.
Makes me think of Nvidia's PhysX, and how you can dedicate a non-SLI card to it. This seems even better.
[color=red]PR Team, Forum Moderator, Live Response Team[/color]
My guess is that they won't make a specific effort to implement it, but rather wait until it becomes a staple part of UE itself; same for the equivalent feature in Vulkan on the OpenGL side. I just don't think there would be enough ROI if MWM does this themselves, compared to Epic doing it for UE at large and thus for all UE-based games.
That is a good point. My thinking was that since they had already begun using it with UE4 back in 2015, maybe there was already some framework there--or that it was even somewhat "native" to UE4. But if implementing it is work/time intensive, then a lot of work for a little performance boost wouldn't be worth it for MWM at this point.
I discovered Multiadapter because I have the opportunity to pick up a used laptop crazy cheap that I hoped my wife could use to game with me and eventually play CoT, but upon investigation the discrete card wasn't as good as I thought. Digging further, I found that the CPU's integrated graphics were almost as good as the discrete card! So I was looking into possibly using the integrated graphics when I came across Multiadapter.
*Off-topic question for those knowledgeable about machines: The laptop has a pretty good CPU and SSD, but it has a 940MX card. Looking into that card (I have to buy it quickly to get the awesome price--no, it's not hot :P) I found that proper "gamers" seem to have a special hatred in their hearts for this particular "potato" of a card. But casual/light players seemed to think it was anywhere from fine to awesome! So my questions are: how bad is the 940MX really, and will an i7-7500U + 940MX (with SSD) be decent for general MMORPG gaming, or is it as bad as the "gamers" say?
FIGHT EVIL! (or go cause trouble so the Heroes have something to do.)
Most of the time the people who consider themselves "super hardcore gamers" are looking for graphics hardware that can deliver the highest number of FPS (frames per second) while at the same time having all the various graphics settings maxed out. Basically they always want the "best of the best" at any given moment to measure their e-peens with.
In reality, though, most people don't really need the most bleeding-edge hardware in order to have an enjoyable experience. I don't have any direct working knowledge of the 940MX but I'll bet the only reason the "hardcore gamers" hate it is that it's probably somehow 5% or 10% less capable than some of the other similar cards related to it. Again, in reality the functional difference between the 940MX and its rivals is probably barely noticeable to average players and would likely only be significant if you put two machines literally side-by-side for comparison.
Most "bleeding edge" gamers tend to lose sight of how far graphics processing capability has advanced in the last several decades. Basically ANY graphics card (even the low end ones) made in the last few years would be considered effectively magical to anyone from say 15-20 years ago. All you can do is try to accept and appreciate the fact that even if you don't have the highest end card made in 2018 that pretty much anything made in the last few years is going to more than adequate for CoT and it's going to be miles better than anything anyone was using back in 2004 when CoH was first released.
CoH player from April 25, 2004 to November 30, 2012
[IMG=400x225]https://i.imgur.com/NHUthWM.jpeg[/IMG]
I would say more than decent, since up until last fall I was running a laptop with an i7-4700HQ + 765M in it and I thought it handled the general MMO scene just fine. Did a few searches but couldn't find anything directly stating it was a bad card.
I'm also pretty sure that their test laptop for minimum hardware is worse than that, so yours will be very well off AFAIK.
Thanks Lothic and Black. Good to hear!
I was all excited about getting it for her because her last laptop was an Acer with an i5-3317U, a GT 640M and an HDD, and it wasn't doing badly; it was just accumulating too much internal and external wear and tear. This one looked like a significant upgrade for an amazing price until I started researching and saw the GPU was particularly unloved by more intense gamers.
Ok, I'll stop hijacking my own thread for equipment advice :). Thanks again.
FIGHT EVIL! (or go cause trouble so the Heroes have something to do.)
I consider myself to be somewhat knowledgeable in this field. But I'm not an expert so take this as you may.
Multiadapter tech, much like multi-GPU tech, is highly dependent on the engine and the developer making it useful. Not every game is SLI/CrossFire compatible, for a reason. When you look at a small volunteer studio like MWM, there is absolutely no reason to bother unless they happen to have an SLI engineer from Nvidia on the team (or a CrossFire engineer from AMD). Look at a game that is known for pushing bleeding-edge technologies, Ashes of the Singularity. The reason they pushed DX12, multi-GPU and multiadapter tech so hard was that they got a ton of publicity for it. I also strongly suspect that Microsoft paid them a lot and gave them a great deal of assistance to develop the tech for its own publicity. They didn't do it because it was easy, they did it to sell games. (And it worked, I bought AoS on sale on Steam... haven't installed it yet...)
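To give a sense of where the developer work even starts: enumerating the adapters is the trivial part, and everything after that (splitting and synchronizing the actual rendering) is where the engineering effort goes. Here's a rough C++ sketch of that trivial first step, just standard DXGI enumeration; this is my own illustration, not anything pulled from AoS, UE4 or MWM:
[code]
// A minimal sketch (my own, not from the article): standard DXGI adapter
// enumeration, the first step of any DX12 multiadapter setup. It only lists
// what Windows can see (discrete card, integrated graphics, software
// rasterizer); it does not split any rendering work.
// Build on Windows with: cl /EHsc list_adapters.cpp dxgi.lib
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <cwchar>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) {
        std::fprintf(stderr, "Could not create a DXGI factory.\n");
        return 1;
    }

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // Integrated graphics typically reports (close to) zero dedicated
        // VRAM, which is a quick way to tell it apart from the discrete card.
        wprintf(L"Adapter %u: %s (%zu MB dedicated VRAM)\n",
                i, desc.Description,
                desc.DedicatedVideoMemory / (1024 * 1024));
    }
    return 0;
}
[/code]
On a laptop like the one mentioned above you'd see both the Intel iGPU and the 940MX show up in that list. Getting two entries in a list is easy; getting them to render the same frame together is the part studios rarely bother with.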
Another reason this isn't a good investment: not many people have multiple GPUs. GPUs are more expensive now than they have ever been, and that's referencing MSRP, not the street prices pushed up by the cryptocurrency craze. It's just too darn expensive to buy two GPUs; for the money you are better off getting a single better GPU. There's more than cost pushing back against multi-GPU, though: it doesn't scale well. 2x the GPUs might yield 1.5x the performance. It also has side effects that some gamers find worse than low frame rates: microstutter and software stability problems.

And now you're asking, why does this loon keep talking about multi-GPU and not multiadapter? Because it's effectively the same tech, but worse: it's very difficult to get working. To my knowledge, Ashes never released their multiadapter code to the public. Some reviewers got hold of it, but it was so temperamental and difficult to get working that it stayed unreleased. But what about that test in the article? That was a highly specific test; I'd equate it to something more like an animation render than a game render. If they really wanted to sell that tech, they would have grabbed a bunch of triple-A games from different studios and said "look, we go into Windows display settings, flip on DX12 multiadapter, and poof, instant +6 FPS."
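To put a rough number on that scaling point, here's a toy C++ calculation. The 0.33 "can't be split" fraction is an assumption I made up for illustration, not a measured figure from any real game:
[code]
// Toy illustration of why 2 GPUs often land closer to 1.5x than 2x.
// Amdahl-style estimate: "serial" is the fraction of each frame that can't
// be split across GPUs (sync, copies, driver overhead). The 0.33 value is
// an assumption picked for illustration, not a measurement.
#include <cstdio>

static double speedup(double serialFraction, int gpus) {
    return 1.0 / (serialFraction + (1.0 - serialFraction) / gpus);
}

int main() {
    const double serial = 0.33;
    for (int gpus = 1; gpus <= 4; ++gpus) {
        std::printf("%d GPU(s): %.2fx\n", gpus, speedup(serial, gpus));
    }
    // Prints roughly 1.00x, 1.50x, 1.81x, 2.01x: each extra GPU buys less,
    // and that's before microstutter and stability issues enter the picture.
    return 0;
}
[/code]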
Another consideration is that an onboard GPU shares die space and "heat space" with your CPU. The more you push your onboard adapter, the more heat your CPU has to deal with, and most laptop cooling solutions aren't meant for that kind of work. That means you'll see either the onboard GPU or the CPU being throttled, and you'll lose those extra frames one way or the other.
Finally you have to consider adoption. That article was written in 2015, and DX12 is still, 3 years later, not heavily adopted by studios outside of Microsoft's own and a couple of others. DX12 is so poorly adopted that it's not even a filter on Nvidia's GeForce page: scroll down to "browse by technology" and DX11 is there, but not DX12. https://www.geforce.com/games-applications
Personally, I don't think multiadapter is the future. The article pitched it as the future back in 2015 and it hasn't taken off; I'd say that is pretty definitive. I think the next generation is going to change the way we render games. There is a lot of rumor and speculation surrounding ray tracing, which could yield a far better image from the same amount of hardware. Rumor has it that the next generation of GPUs will be equipped with co-processors that can start to leverage this lighting methodology. A couple of years ago Imagination demoed a chip that demolished a 980 Ti while using a fraction of the power. If Nvidia were to incorporate that tech into a GPU, photorealism in triple-A games could be just a couple of years away.
https://www.youtube.com/watch?v=Fz6AUj2PY9c
[img]http://www.loon.org/assets/images/about-loons-crw_2232_1.jpg[/img] Loon out...
Second Chance: https://store.missingworldsmedia.com/CityOfTitans/SecondChance/
Dev Tracker: http://cityoftitans.com/forum/fixing-dev-digest
Dev Comments: https://cityoftitans.com/forum/dev-comments
Thanks for the info. I tend to only bother to "study up" on the latest GPU tech when I'm in the market to buy something new and it's just been a few years since then.
I've always been left with the general impression that pretty much at any point you can easily pay 2 or 3 times the cost of a "simple" solution to get maybe 10% better performance no matter what the "current technology" is. That's why I usually don't buy the absolute bleeding edge option at any given moment but the one that's like just one model below that level. I've been following that strategy for 25+ years now and it hasn't really let me down yet. ;)
CoH player from April 25, 2004 to November 30, 2012
[IMG=400x225]https://i.imgur.com/NHUthWM.jpeg[/IMG]
Additional reading
https://www.pcgamesn.com/amd-crossfire-vs-nvidia-sli
This is still bleeding-edge software, but it shows how powerful optimized ray tracing can be and gives a little more background on how RT works.
https://blenderartists.org/t/another-giant-step-towards-real-time-ray-tracing/1117438
It occurred to me after my post that the RT video I first posted suffers from one of the arguments I made against multiadapter: the TH video is several years old and the tech is still a couple of years out. I think that has more to do with Imagination not finding a buyer and generally being on the decline as a company (having just lost Apple as a customer); just poor business sense, or overvaluing their tech. Ideally they would have produced a high-power device with 10 of their 15W chips and sold that to an SFX company or an animation studio like Pixar. Or just sold the licensing, or been bought out by nV or AMD. In any event I think it's more a business issue than a tech issue, and there is still some momentum there that I don't see for multiadapter/multi-GPU.
Next gen GPUs should be interesting. It looks like we'll see new nV GPUs at the end of the month (maybe), and we'll see if the RT co-processor rumor was real or not. Also it seems that the reality of cryptocurrencies is sinking in and people are no longer buying up every last GPU.
Lothic, to comment on your point: a lot of the tech personalities seem to agree that buying something in the upper echelons is actually a good idea. They suggest buying a 1060 or higher, or going back a generation and buying a second-hand 970 or higher. The reason is that current gen low-tier GPUs are generally garbage for the money when compared to last gen's offerings, which on the second-hand market can be had for similar pricing. YMMV of course depending on the market in your area, and that opinion may vary in relevance from generation to generation. The GTX 720 (a low-tier card) was somewhat lauded when it was released for being pretty remarkable for the money, but that was of course 3 gens ago.
*flaps away*
Second Chance: https://store.missingworldsmedia.com/CityOfTitans/SecondChance/
Dev Tracker: http://cityoftitans.com/forum/fixing-dev-digest
Dev Comments: https://cityoftitans.com/forum/dev-comments
Well, here we are now with the RTX 2080 and ray tracing. Curious if this will make its way to City of Titans someday, or if it's already too late to implement? Wouldn't want to give the developers more to worry about at this stage, but it could be another shiny selling point for CoT if it's something that can be added in the future. As it stands right now I think only Battlefield V uses this.
That is entirely in Epic's hands.
Technical Director
Read enough Facebook and you have to make Sanity Checks. I guess FB is the Great Old One of the internet these days... - Beamrider
Exactly. Many of these "underlying tech" things are not really worth doing on a per-game basis; it's better to wait until there is full support in the engine they run on (UE4 in this case), so that adopting them isn't much more than ticking a checkbox (if even that) and some QA.
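For a sense of what that "checkbox" roughly amounts to once Epic ships a feature, here's an illustrative config sketch based on my recollection of a UE 4.22-era DefaultEngine.ini for hardware ray tracing. The exact section names and keys vary by engine version, so treat every line here as an assumption rather than a recipe:
[code]
; Illustrative only: roughly what enabling an engine-level feature looks like
; once it exists in UE4. Keys and sections vary by engine version.
[/Script/Engine.RendererSettings]
r.RayTracing=1
r.SkinCache.CompileShaders=1

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
; Ray tracing requires the DX12 RHI on Windows.
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12
[/code]
The point being: once the engine supports it, the per-game cost is mostly flipping settings like these and then QA, rather than writing renderer code.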