AI Processor

I wouldn't call better 'terrain analysis, line-of-sight sense, path finding' Artificial Intelligence or in any way revolutionary. I'd rather see a learning AI in computer games than better path finding.
 
That's the issue I'm having with this. Designing specific hardware to speed up these basic abilities is nice, and would let hundreds of units do these sorts of things a lot quicker, but in the end the utility is very limited, in fact too limited. As game AI advances, they'll need to update these chips more and more to accelerate the specific algorithms that become commonplace. This thing will either take off but require updating at a very fast pace, or it simply won't take off at all because it's too limited.
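For the curious, here's roughly what the path-finding workload under discussion looks like in software: a plain breadth-first search over a tile grid. The chip's actual algorithm isn't public, so this is only an illustrative sketch of the kind of per-unit computation that dedicated hardware would be running hundreds of times a frame.

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search over a 2D grid of cells.
    grid[y][x] == 0 means walkable, 1 means blocked.
    Returns a list of (x, y) cells from start to goal,
    or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # cell -> predecessor on shortest path
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < cols and 0 <= ny < rows
                    and grid[ny][nx] == 0 and (nx, ny) not in came_from):
                came_from[(nx, ny)] = cell
                frontier.append((nx, ny))
    return None  # frontier exhausted without reaching the goal
```

Real games typically use A* with a heuristic rather than raw BFS, but the memory-access pattern is the same, which is why it's a plausible target for acceleration.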
 
I wouldn't call better 'terrain analysis, line-of-sight sense, path finding' Artificial Intelligence or in any way revolutionary. I'd rather see a learning AI in computer games than better path finding.
Agreed, sort of. But it sure is AI, and it would be revolutionary if it could do those things consistently well and quickly. Some games are better than others, but you'll concede that common complaints about "in-game AI" (as opponents or allies) are:

"I killed his mate and was standing right next to him and he did nothing, so I shot him... what a crock"
"My squad spent so much time milling about the door that I got shot by the <insert bad guy here>... what a crock"
"The enemy just seem to line up and run at me, regardless of cover... what a crock"
"My guys don't notice when I run in front of them, so they shoot me and I spend more time reloading... what a crock"
"I turned up the AI slider and my machine slowed to a crawl... what a crock"

etc etc

If the card can alleviate these problems, it'll be a good thing, surely?
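As an aside, the "standing right next to him and he did nothing" complaint usually comes down to a line-of-sight test that the engine skimped on because it's expensive to run for every unit. A common software version walks Bresenham's line between the two cells and checks for obstacles; this is a sketch of that idea, not the chip's documented method:

```python
def has_line_of_sight(grid, a, b):
    """Walk Bresenham's line from cell a to cell b on a 2D grid.
    grid[y][x] == 1 is an obstacle; the endpoints themselves are
    not treated as blockers. Returns True if sight is clear."""
    (x0, y0), (x1, y1) = a, b
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while (x0, y0) != (x1, y1):
        # Block sight on any intermediate obstacle cell.
        if (x0, y0) != a and grid[y0][x0] == 1:
            return False
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return True
```

Run for every unit against every other unit, this is O(n^2) line walks per update, which is exactly the sort of brute-force load a game cuts corners on and a dedicated chip would not have to.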
 
I agree that it would be nice if this chip solved the problems you name. But as far as I can see, it won't make the computer opponent any smarter, only faster. So this chip does nothing I would call AI. A capable AI certainly needs features like path-finding and line-of-sight, but on their own these features don't make a computer opponent act intelligently.
 
I agree that it would be nice if this chip solved the problems you name. But as far as I can see, it won't make the computer opponent any smarter, only faster. So this chip does nothing I would call AI. A capable AI certainly needs features like path-finding and line-of-sight, but on their own these features don't make a computer opponent act intelligently.
All true.

But consider the benefit of having a consistently defined interface to AI functionality, even relatively low-level functionality like path-finding and LOS. If it were standard, it would relieve designers of implementing their own, letting them concentrate on richer behaviours, and it would provide a context for improvement, in much the same way that the various DirectX 3D revisions, in concert with constantly scaling chipset performance, have done with T&L, pixel shaders and so forth. That's not to say it won't be done badly, of course, or that it will necessarily be done in that way and have that effect.

It's a mixed bag and I'm in no position to gainsay what they will do or how it might be used, but as a general principle, the idea has its merits and is, I find, inherently interesting.

You say that the chip does nothing you would call AI, and I'd probably agree, although I would stress that 3D scene analysis is still the subject of academic AI research. Your comment that it would not make an opponent act intelligently is also conceded. The first 3D accelerators were pants in comparison to the speed and functionality currently available. The bad news is, I guess, that we're in for a bit of a wait. :D
 
Standardisation is of course always a good idea in software development, but I doubt that a dedicated chip is needed for these low-level functions. A chip for neural networks is something I would find much more useful.
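To make the neural-network suggestion concrete: the heavy lifting in a net is dense multiply-accumulate work, which is exactly what dedicated silicon does well. A minimal forward pass in plain Python (layer sizes and the sigmoid activation are arbitrary choices for illustration, not anything a real chip specifies):

```python
import math

def forward(inputs, w_hidden, w_out):
    """One forward pass of a tiny fully connected network:
    inputs -> sigmoid hidden layer -> sigmoid output layer.
    w_hidden and w_out are lists of weight rows, one row per
    neuron. Every output value is a weighted sum squashed
    through a sigmoid - pure multiply-accumulate work."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
              for row in w_hidden]
    return [sigmoid(sum(w * h for w, h in zip(row, hidden)))
            for row in w_out]
```

Each neuron is independent of its neighbours in the same layer, so a chip can evaluate a whole layer in parallel, which is why this workload maps to hardware far more naturally than branchy game logic does.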
 