Welcome to Battlezone Universe.
 

News:

Welcome to the BZU Archive dated December 24, 2009. Topics and posts are in read-only mode. Those with accounts will be able to log in and browse anything the account had been granted access to at the time. No permission changes will be made to grant access to particular content. If you have any questions, please reach out to squirrelof09/Rapazzini.

problems

Started by slasherc4, December 04, 2006, 05:59:33 PM


GSH

Quote
The hardware's only going to keep getting better.

P.T. Barnum was right.

Here's the real mantra you need to learn: it all sucks. Just in different ways.

-- GSH

Red Devil

  :-o   Dammit!  I replaced my 233MHz Pentium II with a 64-bit Athlon for nothing!   :x
What box???
What box???

OvermindDL1

Quote from: GSH on December 05, 2006, 03:04:38 PM
P.T. Barnum was right.

Here's the real mantra you need to learn: it all sucks. Just in different ways.

-- GSH
So right, just found out that the newer 64-bit systems have the opcode CMPXCHG16B, whereas my first-gen Athlon64 does not. >.>
Ah well, it is only necessary in multi-core/CPU builds anyway, so I'm just adding defines for different builds currently (I have 3 builds: a 32-bit one that always uses CMPXCHG8B, a 64-bit one that uses CMPXCHG16B, and a 64-bit one that does plain, non-atomic copies (single-threaded)).

Raven

Crap...
I'm saving for a laptop. So getting a card isn't exactly on now...  I'm keeping this home machine as it's Windows, but that laptop won't be able to play BZ2, I can assure you ;)

Well at least there's the /nodxt :)

llulla

I tested BZ2 1.3b3 on two laptops. The Dell with a Centrino and ATI X600 ranged from bad to very bad. The other, newer machine, an Aspire 5xxxx with a Turion processor and ATI X1300, performs quite well at 1024x768x32 with no limitations.
I think, Raven, there are a lot of economy laptops out there that are perfect for BZ2. You can use it as a tester  :-D for the laptop.
I haven't yet had a laptop with Nvidia graphics.

Avatar

Quote from: OvermindDL1 on December 06, 2006, 10:52:07 AM
So right, just found out that the newer 64-bit systems have the opcode CMPXCHG16B, whereas my first-gen Athlon64 does not. >.>
Ah well, it is only necessary in multi-core/CPU builds anyway, so I'm just adding defines for different builds currently (I have 3 builds: a 32-bit one that always uses CMPXCHG8B, a 64-bit one that uses CMPXCHG16B, and a 64-bit one that does plain, non-atomic copies (single-threaded)).

:-o

"Plain, non-atomic copies"???

Where do you have ATOMIC copies running?   

Does the Guv'ment know about you OM?    :evil:

-Av-

llulla

I also see the word BUILD associated with ATOMIC. Hmm... 8-)

GSH

My laptop (Gateway MX7515, Mobile AMD Athlon™ 64-bit 4000+ Processor, ATI X600, 1GB ram) seems to play BZ2 fine.

-- GSH

OvermindDL1

Atomic operations on a CPU are operations that are either done fast enough or hardware-locked so that only one CPU/core is accessing the resource at a time, so you are guaranteed not to have corrupt data due to multiple things operating on something at the same time. :)

Avatar

Uh-huh...   sure, buddy...  anything you say...

(he's over there, Officer...)

:roll:

-Av-

llulla

What do you think has more impact on 1.3 graphics performance, GSH: the amount of texture memory, or the number of pixel/vertex processors on the graphics card?
The reason I'm asking is the shape of the graphics-card market, which at the midrange offers combinations like good GPU/128MB or mediocre GPU/512MB. Allegedly, this keeps the price affordable. :-P
If 1.3 benefits mainly from the memory available for textures, even integrated graphics that can shave 512MB off the system RAM will work OK.

OvermindDL1

Modern video cards with their fancy pipelines and shaders and all that won't make a hide-nor-hair bit of difference to BZ2. :P
The only real thing on modern cards that could make a difference would be memory, and even then BZ2's needs are quite low in that regard.
But wait for GSH to answer, since he has actually seen the graphics engine.

GSH

BZ2 doesn't use any shaders at all. And, all the (dxt) textures shipped with 1.3pb3 add up to 27.8MB. On modern cards, they ought to occupy roughly the same amount of memory. The lower mipmaps may be rounded up to some minimum page size (I know a 4x4 mipmap got rounded up to 4KB on the XBox 1, but I don't know the details of consumer cards), which'd increase memory needed. On the other hand, not all the textures on disk are used in a single level, which reduces some memory needed. I'm going to assume those two roughly cancel each other out, but that's a total guess.

On laptops, most graphics cards that claim "128MB" don't have their own memory, but they steal it from main memory. This reduces the total amount of RAM available, and makes the CPU and graphics chip fight over memory bus. On most PCI/AGP/PCIE addin graphics cards, they have their own memory, which doesn't cause any of these faults. WinXP needs at least 256MB, but seems to get happier close to 512MB. BZ2 needs at least 128MB of main memory on top of that, more if you're ignoring unit limits. So, on a laptop with memory-stealing graphics chip, I'd recommend at least 768MB memory, 1GB is more comfortable.

I'd suspect that the biggest win from a modern video card is basically this: fill rate. That is, the ability to paint polygons quickly. BZ2 still makes your CPU do all the work of transforming and lighting polygons.

-- GSH

llulla

Yeeah! I have seen 1.3 work fine on any 1GB RAM unit, mainly Intel integrated-graphics desktops. It should be the same for laptops. The difference must be in the CPU, though.
My GF6600GT lets me go through Quake4 or HalfLife2 with no problems at all. It is fantastic, most of the time, with BZ2 1.3.
I experience a noticeable drop in FPS, down to a halt, when there are 30+ simultaneous explosions of plasma mines (Fleshstorm) accompanied by the explosions of swarm units caught in the minefields. My good old P4 630 3.0GHz/2MB probably isn't enough anymore.