I downloaded and installed the Far Cry Demo, which, to be quite honest, runs pretty sluggishly on my M6810 at 1024x768 with most of the settings on low.
Is this what I should expect from Far Cry and the R9600? Is the full game any better?
I also downloaded and installed the Painkiller Demo; it generally plays reasonably smoothly, but it tends to stutter, and I get occasional pauses during gameplay, which is spoiling an otherwise pleasant gaming experience.
I also purchased XIII and, thankfully, it runs as smooth as silk!
I was expecting a bit more horsepower out of the M6810, to be honest, though.
Anybody else got any gaming performance tips or experiences to share?
Tell it like it is...NOT how it should be
-
When I got my M6811, the first thing I did was get rid of the stock video drivers; they aren't meant for gaming. Try the newest Omega drivers.
Once I did that, my games played perfectly, well, the ones I had at the time. I haven't tried Doom 3 or Far Cry since I don't have my M6811 anymore, but I can tell you it was a very capable gaming machine. -
I have the M6809, and I can play FarCry smoothly, both before and after the Omega upgrade. Try changing the resolution to 800x600 and increasing the options. Make sure that all other apps are closed when you begin play. Upgrade the RAM. You might have a better experience with one or all of these options. Remember, some games, like Doom3, REQUIRE a minimum of 384MB of RAM. Also consider that XP basically usurps 128MB of RAM idling.
-
Hello again
I've actually done a full format and fresh install of XP Pro with SP1.
The current drivers loaded were the Omegas from 2 weeks ago (I forget the revision number).
I did try to run the Far Cry Demo at 800x600; however, when I changed to that resolution I could only see part of the screen, and it would not let me change back to 1024x768. The only way I managed to solve it was to uninstall and reinstall the game!
I'll attempt to change the resolution tonight and see how it plays. I'm actually hoping that the demo is just not fully optimised compared to the finished product.
Last night I overclocked the R9600 to 400/210 (approx) and game play was pretty much the same as at standard clock speeds. I certainly did not "feel" or see any difference (which I have noticed on desktop AGP cards that I have overclocked).
Other than Far Cry or Painkiller the machine is pretty good for gaming indeed, e.g. XIII & Max Payne 2, although this is hardly an extensive test.
Tell it like it is...NOT how it should be -
From what you are writing here, the full version will definitely show off the system's ability better than the demo does.
-
Doom 3 would crash on my HP at any resolution, but after I updated the drivers the problem went away, and it's been smooth sailing ever since.
-
I play Far Cry on my 6809 all the time. Works great. Not sluggish at all at 1024x768 high. I play it at 720x576 high when on battery and it runs fine as well. Of course, that's with the Mobility Radeon 9600 overclocked to 420/210.
-
Yeah. With a good overclock (in no way do I suggest or promote changing the OEM specs on any installed hardware, including overclocking, etc...blah, blah, blah (DISCLAIMER)), pretty much any current game is hot on this machine. In fact, even with a last-gen GPU and less VRAM, the eMachines notebook beats out a bunch of other systems at stock speeds, thanks in large part to the Athlon 64 chip and chipset.
-
I play Star Wars Galaxies on it with medium-quality graphics settings and things run great until I walk into a busy town and there are hundreds of people around yelling advertisements.
I overclocked and installed Omegas and it helps out tremendously in these situations.
Battlefield: Vietnam runs great too, as do a few other titles.
"If something goes wrong, just blame the guy who can't speak english."
--Homer Simpson -
I'm now clocked at 420/210 (ish) and Far Cry is still pretty awful really.
I can only assume that you guys are playing the full game and not the demo?
I noticed on the demo of Painkiller that EAX was enabled and turning that off has improved the gameplay.
I finished Max Payne 2 last night (1280x800) and it is time to start on XIII.
Tell it like it is...NOT how it should be -
The full game is a better gauge of this game's possibilities on this system, to be sure. Demos only give you the very basic components of the game...just enough to get you hooked. By the way, the full game install runs through FIVE CDs!
-
Funny...I've tried the following games:
Unreal Tournament 2004
Steam games (including Counter-Strike: Source)
Far Cry (full version)
Painkiller demo
Halo
Star Wars: Galaxies
and numerous other games on my STOCK M6805, with XP Pro. I haven't overclocked it, and I haven't upgraded any hardware. I run them all at native resolution when supported (1280x800) and they run fine. Now, I can't run them with all the graphical settings on high (like I can on my pair of desktops with 9600XTs in them) but they run quite well. -
I've had a few comments from people elsewhere regarding Far Cry with the M68xx series and most people do seem to agree it is a very sluggish game unless the settings are turned well down (e.g. 800x600 @ low details).
Is this a situation where the people here "think" it runs well by comparison to other, slower systems?
By "running well" I mean it is smoooooooth when turning around in all environments. I expect somewhere around 60-70fps to be a minimum for 3D gaming - I suspect Far Cry runs at less than 30fps on the M6810 that I own.
I also read somewhere about running Far Cry using OpenGL... is this possible?
I'm about to finish XIII so I'll need a new game to play
Tell it like it is...NOT how it should be -
It doesn't run at 60-70 FPS, but it's playable (by playable I mean 35+ FPS) at native resolution, with settings on medium. I did crank them all up (to see all the purdy shaders...oooooo) but that obviously ran like crap.
If a friend's got a copy, install it. Doesn't need the key til you go online. Give it a try. Worst case, you don't put it on there. -
Well, that seems more realistic then.
Not sure about the objective nature of this earlier quote, though:
"I play Far Cry on my 6809 all the time. Works great. Not sluggish at all at 1024x768 high. I play it at 720x576 high when on battery and it runs fine as well. Of course, that's with the Mobility Radeon 9600 overclocked to 420/210."
By my own preferences, 30+ fps is sluggish, so I'm presumably getting the expected performance out of the M6810 in Far Cry, then?
A mate of mine has Far Cry so I'll speak to him about borrowing it.
Tell it like it is...NOT how it should be -
Exactly what are you not sure of? That I play Far Cry on my M6809 all the time (or at least did; beat it, uninstalled), or that it runs fine (as far as I'm concerned) at 1024x768 high when others say it runs fine at 1280x800 (native res) med? With the overclock, why wouldn't it run fine with those settings?
If you want, I'll reinstall and run it with FRAPS to check the exact fps, but it still depends on the situation in-game (number of objects being rendered, etc.). -
"Exactly what are you not sure of?"
I was not sure how "objective" your advice was. By your own words now, "it runs fine (as far as I'm concerned)", I understand exactly where you are coming from: "subjective".
As I mentioned earlier, I generally consider 60-70fps to be a minimum for FPS shooters. So, as you can imagine, 30+fps is very sluggish by my expectations, as well as many others'.
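To put rough numbers on why that matters (just my own back-of-envelope sums, nothing measured on the M6810), the gap between frames roughly doubles when you drop from 60fps to 30fps:

# Back-of-envelope frame times (illustration only, not benchmark data)
for fps in (24, 30, 60, 72):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>2} fps -> {frame_time_ms:5.1f} ms between frames")
# 30 fps leaves ~33 ms between frames, 60 fps only ~17 ms, which is why
# fast turns and explosions feel noticeably smoother at the higher rate.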
If 30+fps is fine for you then that's cool [8D]
It would be interesting to know what sort of fps you are achieving, though.
Tell it like it is...NOT how it should be -
As a reference point, it is highly unlikely that your eyes will be able to recognize any additional speeds over about 33 FPS. Secondly, there are few notebooks that will give you that type of performance on a high end game.
-
That is certainly considered to be true when looking at a 2D image such as a TV presentation. However, and this is a big HOWEVER, it does not apply to a rotating 3D simulation where sudden movement is the norm and explosions galore need to be rendered.
What you suggest is a common misconception (a correct assumption applied to the wrong scenario): the faster the framerate in 3D, the "smoother" the gameplay will be. You can "feel" this as well as see it.
30 or 33fps just ain't cutting the mustard, unfortunately (in the quest for smooth gameplay).
Tell it like it is...NOT how it should be -
If you say so...
The only machines that will render images the way you describe are extreme high-end systems with at least an ATI MR 9700 or greater. Look at Sager, which offers an Athlon 64 3700+ with an ATI MR 9700 and 128MB VRAM. Also expect to pay almost twice what you will with the 68xx. -
Well, more like if YOU don't believe so...
I was talking about the general gaming performance of the M68xx series.
Now, I did a quick search on Yahoo and came up with this article from 1998!
"See" what you think? (Coincidentally, 72fps is the max fps of the standard version of Half-Life, although many modify that to 100fps in the console.)
30 vs. 60 Frames Per Second
A Technical Overview
by Joshua Walrath (1998)
Movies | TV & DVD | Computer Games | Human Eye | Conclusion
60 frames per second vs. 30 frames per second has been one of the most contested ideas around the web and in print for the last year. Today we will look to see who is right, who is wrong, and who is just plain confused.
Chipmaker (and now boardmaker) 3dfx has been evangelizing gaming at 60 fps since the Voodoo 2 was released. Many have looked down upon 3dfx for this due to the common misconception that humans cannot distinguish framerates over 30 fps, so what is the point of having visuals running at 60 fps? Misconception you say? Yes. In this article we will look behind the technology of games, computers, movies, and television and the physiology and neuro-ethology of the human visual system.
Movies
I have seen film students write in to columns about how anything over 24 fps is wasted. Why 24 fps? Movies in theaters run at 24 fps. They seem pretty smooth to me, so why would we need more? Well, let's take a look at movies from the eyes' perspective.
First off, you are sitting in a dark movie theater and the projector is flashing a really bright light on a highly reflective screen. What does this do? Have you ever had a doctor flash a bright light in your eye to look at your retina? Most of us have. What happens? A thing called "afterimage". When the doctor turns off the bright light, you see an afterimage of the light (and it is not real comfortable). Movie theaters do the same thing. The light reflected off the screen is much brighter than the theater surroundings. You get an afterimage of the screen after the frame is passed on, so the next frame change is not as noticeable.
Screen refresh is also a very important factor in this equation. Unlike a television or a computer monitor, the movie theater screen is refreshed all at once (the entire frame is instantly projected and not drawn line for line horizontally as in a TV or monitor). So every frame is projected in its entirety all at once. This then leads back to afterimage due to the large neurotransmitter release in the retina.
Perhaps the most important factor in the theater is the artifact known as "motion blur". Motion blur is the main reason why movies can be shown at 24 fps, thereby saving Hollywood money by not having to shoot any more film than necessary (a full feature film at 30 fps would be approximately 20% longer than a film shown at 24 fps, and that turns out to be a lot of money). What motion blur does is give the impression of more intervening frames between the two actual frames. If you stop a movie during a high-action scene with lots of movement, the frame that you will see will have a lot of blur, and any person or thing will be almost unrecognizable, with highly blurry detail. When it is played at full 24 fps, things again look good and sharp. The human eye is used to motion blur (more on that phenomenon later), so the movie looks fine and sharp.
TV, Video Tape, and DVD
TVs run at a refresh rate of 60 Hz. This is not bad for viewing due to the distance we usually sit from the TV, and the size of the phosphors on your average set and the distance between phosphors (between .39mm for a high-end one to .5mm and higher for cheaper models). This is actually quite big and fuzzy for most of us, but as long as we are not doing any kind of productivity work (such as word processing) and are just watching movies at least 6 feet from the TV, that is just fine.
Now TV transmissions, video tape, and DVD play at 30 fps. The increase from movies is due mostly to the environment that the TV is watched in. It is usually quite a bit brighter than in a movie theater, and most importantly a TV does not do a full screen refresh, rather each frame is drawn line by line horizontally by an electron gun hitting the phosphors in the screen. So basically each frame is drawn twice by the TV (60 refreshes per second, 30 frames per second). Now because the frame rate is 1/2 the refresh, transitions between frames go a lot smoother than if you had say a 72 Hz refresh and a movie playing at 30 fps. Don't ask me why, it is due to wave behavior, which is higher level physics, and I can't go into that without making this a 30 page paper. Needless to say, the physics behind this make video and DVD look very smooth.
Motion blur again is a very important part of making videos look seamless. With motion blur, those two refreshes per frame give the impression of two frames to our eyes. This makes a really well-encoded DVD look absolutely incredible. Another factor to consider is that neither movies nor videos dip in frame rate when it comes to complex scenes. With no frame rate drops, the action is again seamless.
Games on the Computer
This is the second toughest part of this article. TV and Movies are easy to understand, and the technology behind it is also easy to understand. Computers and the way games are projected to us is a lot more complex (the most complex is the actual physiology / neuro-ethology of the visual system).
First off, the hardware used for visualization (namely the monitor) is a very fine piece of equipment. It has a very small dot pitch (distance between phosphors) and the phosphors themselves are very fine, so we can get exquisite detail. We set the refresh rates at over 72 Hz for comfort (flicker free). This makes a very nice canvas to display information on, unfortunately because it is so fine it can greatly magnify flaws in the output of a video card. We will get into refresh in the section on the human eye.
Let us start with how a scene or frame is set up by the computer. Each frame is put together in the frame buffer of the video card and is then sent out through the RAMDAC to the monitor. That part is very easy, nothing complex there (except the actual setup of the frame). Now each frame is perfectly rendered and sent to the monitor. It looks good on the screen, but there is something missing when that action gets fast. So far, programmers have been unable to make motion blur in these scenes. When a game runs at 30 fps, you are getting 30 perfectly rendered scenes. This does not fool the eye one bit. There is no motion blur, so the transition from frame to frame is not as smooth as in movies. 3dfx put out a demo that runs half the screen at 30 fps, and the other half at 60 fps. There is a definite difference between the two scenes, with the 60 fps looking much better and smoother than the 30 fps.
The lack of motion blur with current rendering techniques is a huge setback for smooth playback. Even if you could put motion blur into games, it really is not a good idea whatsoever. We live in an analog world, and in doing so, we receive information continuously. We do not perceive the world through frames. In games, motion blur would cause the game to behave erratically. Take a game like Quake II: if motion blur were used, there would be problems calculating the exact position of an object, so it would be really tough to hit anything with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned. So we have perfectly drawn frames, and objects can always be calculated at set places in space. So how do you simulate motion blur in a video game? Easy: have games run at over 60 fps! Why? Read the section on the human eye.
Variations in frame rate also contribute to games looking jerky. In any game, there is an average frame rate. Rates can be as high as the refresh rate of your monitor (70+), or they can go down into the 20s and 30s. This can really affect the visual quality of the game, and in fast-moving ones can actually be detrimental to your gameplaying performance. One of the great ideas that came from the now-defunct Talisman project at Microsoft was the ability to lock frame rates (so the rate goes neither above nor below a certain framerate). In the next series of graphics cards, we may see this go into effect.
The Human Eye (and Visual Cortex)
Here is where things get a little interesting, and where we will see that humans can perceive up to 60+ fps.
Light is focused onto the retina of the eye by the lens. Light comes in a steady stream and not pulses (ok, so this is a little wrong, but we are not talking about the dual nature of light, where it acts as both a particle -photon- and a wave). Again, we live in an analog world, where information is continuously streamed to us. The retina interprets light in several ways with two types of cells. Rods and Cones make up the receiving cells for light. Intensity, color, and position (relative to where the cell is on the retina) is the information transmitted by the retina to the optic nerve, which then sends that info to the Visual Cortex for it to be translated to our conscious self (whoa, went from science to philosophy in one step!).
Rods are the simpler of the two cell types, as they really only interpret position and intensity. Rods are essentially color blind, and are referred to as transmitting in black and white. The black and white is not really true; rather, it is just the intensity of the light hitting the cell. Rods are also very fast due to the basic nature of intensity. The amount of neurotransmitter released is basically proportional to the amount of light that is stimulating the rod. The more light, the more neurotransmitter. Rods are also much more sensitive than cones. How is this proven? Microscopic examination of the retina shows that there is a much greater concentration of rods on the outer edges. A simple experiment that you can do yourself is to go out on a starry night and look at the stars out of your peripheral vision. Pick out a faint star from your periphery and then look at it directly. It should disappear, and when you again turn and look at it from the periphery, it will pop back into view.
Cones are the second cell type, and these are much more complex. There are three basic parts to them that absorb different wavelengths of light and release differing amounts of different neurotransmitters depending on the wavelength and intensity of that light. Basically there are three receptors in a cone that absorb red, green, and blue wavelengths of light. Each of these receptors release a different neurotransmitter for the color, with differing amounts of the neurotransmitter depending on the intensity of the wavelength. Purple is a combination of blue and red, so the red and blue receptors would release differing amounts of neurotransmitter, while the green wouldn't release any. This information then passes onto your visual cortex and we "see" purple. Cones are much more inefficient than rods due to their more complex nature. They also are a little slower to react to changes in light and are also not as sensitive as rods (see above experiment). Cones are what largely make up the center of the retina and fovea (focal point of the retina).
The optic nerve is the highway from which information is passed from the eye to the visual cortex in the brain. This nerve is just a pathway, and does no processing on its own. Its bandwidth is actually really huge, so a lot of information can be passed on. Nerve impulses also travel at over 200 mph to the brain, so it is nearly instantaneous for information to be received from the eye (since the optic nerve is only about 2 cm to 3 cm long).
The visual cortex is where all the information is put together. Humans only have so much room in the brain, so there are some tricks it uses to give us the most information possible in the smallest, most efficient structure. One of these tricks is the property of motion blur. We cannot get away from the phenomenon because it is so important to the way we perceive the world. In the visual cortex we can theorize the existence of what I call the motion blur filter. Because the eye can only receive so much information, and the visual cortex can only process so much of that, there needs to be a way to properly visualize the world. This is where it gets tough.
Take for example a fast-moving object. The faster it goes, the more it blurs (be it a snowflake or a train). Why does this happen? Let's take the example of a snowflake. At any time it has a fixed position in the universe, no matter what speed it goes at (unless it starts to get relativistic; then we go into some strange physics, but that is not applicable to what we are talking about). Let's say at 5 mph we see the snowflake in perfect detail as it falls to the ground. Now we hop into a car and go 55 mph. Can we see the detail of the snowflake? No, it is just a streak to us. Has the snowflake changed itself? Of course not. If we had a really fast camera with a fast shutter speed, it would see the snowflake in perfect detail. Now, due to the speed at which our eyes/visual cortex can process information, we cannot see the snowflake in detail. A bird such as an eagle would be able to see more detail and not so much of a streak, because it only has rods (it is color blind) and the distance from its eyes to its highly specialized visual cortex is 1/16th the distance of ours. This leads to more information being pumped into the visual cortex. So what would look like a streak to us would look like a fast-moving snowflake to the eagle.
If we didn't have the ability to produce motion blur, we would see the snowflake pop in and out of existence at high speeds. We would first see it one place, then it would disappear and pop into existence several feet beyond depending on the direction it is going. Is this a good thing? No, we would have a hard time figuring out the direction of the snowflake and have many problems with perceiving movement in three dimensional space. With motion blur we get the impression of continuity where our hardware cannot distinguish fine detail while the object is moving at high speeds.
Contrary to the belief that we cannot distinguish anything over 30 fps, we can actually see and recognize speeds up to 70+ fps. How can you test this? You can quickly do this with your monitor at home. Set the refresh rate to 60 Hz and stare at it for a while. You can actually see the refreshes, and it is very tiring to your eyes. Now, if we couldn't see more than 30 fps, why is it that flicker-free is considered to be 72 Hz (refreshes per second)? You can really tell if the refresh is below 72 by turning your head and looking at the screen through your peripheral vision. You can definitely see the screen refreshes then (due to rods being much more efficient and fast).
Conclusion
We as humans have a very advanced visual system. While some animals out there have sharper vision, there is usually something given up with it (for eagles there is color, for owls it is the inability to move the eye in its socket). We can see in millions of colors (women can see up to 30% more colors than men, so if a woman doesn't think your outfit matches, she is probably right, go change), we have highly movable eyes, and we can perceive up to and over 60 fps. We have the ability to focus as close as an inch, and as far as infinity, and the time it takes to change focus is faster than the fastest, most expensive auto-focusing camera out there. We have a field of view that encompasses almost 170 degrees of sight, and about 30 degrees of fine focus. We receive information constantly and are able to decode it very quickly.
So what is the answer to how many frames per second we should be looking for? Anything over 60 fps is adequate, and 72 fps is maximal (anything over that would be overkill). Framerates cannot drop from that 72 fps, though, or we will start to see a degradation in the smoothness of the game. Don't get me wrong, it is not bad to play a game at 30 fps, it is fine, but to get the illusion of reality you really need a frame rate of 72 fps. What this does is saturate the pipeline from your eyes to your visual cortex, just as reality does. As visual quality increases, it really becomes more important to keep frame rates high so we can get the most immersive feel possible. While we still may be several years away from photographic quality in 3D accelerators, it is important to keep the speed up there.
Looks like 3dfx isn't so full of it.
Joshua Walrath (1998)
B.S. Natural Sciences and Mathematics
B.S. Zoology and Physiology
Source:
http://www.joz3d.net/html/fps.html
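The bit in there about locking frame rates is easier to picture with a tiny sketch (hypothetical code of my own, not from the article or any real engine): the renderer finishes a frame and then waits out the rest of its frame budget, so the rate never climbs above the cap, although nothing stops it dropping below the cap when a scene gets heavy.

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def render_frame():
    pass  # stand-in for the game's actual rendering work

for _ in range(600):  # roughly ten seconds at the capped rate
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)  # sleep away the leftover budget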
So the result is that the M6800 is not really up to "my" desired standards for playing games such as Far Cry and Doom 3 (which I DL'd the demo of last night).
It is more than adequate for the last generation of games [8D]
Tell it like it is...NOT how it should be -
Thank goodness for "copy 'n' paste", considering the lack of debate that the above prompted.
Tell it like it is...NOT how it should be -
Sorry, just haven't been around in a while.
Big article, same story. Yes, 60FPS looks better than 30FPS in some circumstances. But the price vs. performance comparison comes in big time here. If you want to pay big bucks ($3000+), then you can always get a Dell XPS with the new ATI MR 9800. That system is big, ugly, hot, and, from what I am hearing, unstable. You could go with the aforementioned Sager, again, looking at about $2800 custom. You are looking at a $1300 laptop that comes really close performance-wise in many respects, but does not have the raw FPS-generating ability.
If you are willing to spend as much money on a computer as a car down-payment, no problem. I love my M6809. It runs every game I have thrown at it, including Rome Total War and Doom 3, and does it pretty darn well. While I am not playing it at Ultra levels, it is good enough for me.
In this case, you will need to make up your own mind. -
I think we might be talking at cross purposes here.
The M68xx series is a great value notebook with generally excellent gaming performance for the price.
The only point I'm really trying to verify is whether or not the M68xx can really cope with the newer generation of games like Far Cry or Doom 3 and play them at frame rates a desktop gamer would find acceptable.
When people report that the game runs well on their M68xx and it runs like a slug on mine, I have to try to establish whether there is a problem with my notebook.
I consider that I am being objective by saying 30fps is not acceptable to the vast majority of FPS gamers. True, the M68xx does "run" the game and it "looks" fine, but it is too slow for enjoyable (smooth) gameplay. It may well be fast enough for those who are not familiar with high frame rates, and I'm happy for you! [8D]
Example:
Around 1999 a colleague of mine asked me to recommend a good game for his new PC system. I suggested Half Life, so he went out and picked up a copy and installed it and was very impressed by the graphics etc.
I was somewhat confused by this response as he did not have enough PC knowledge to configure the game properly.
Anyway, one evening, several weeks later, he invited me round to show him how to play games online. As we booted up the PC he kept raving about how realistic the environments were and the level of details in the graphics.
We eventually loaded the game and entered the world of Half-Life in software mode at 320x240 or some god-awful resolution. When I configured the system to run on OpenGL at 800x600 he could not believe his eyes!
I accept the limitations of my M68xx and won't be considering any other options for the next year or so. As a result, I am back to playing Deus Ex, which I have ignored for some 3 years! There are lots of older-generation games that this notebook excels at, and those are the games that I will be playing on it.
Tell it like it is...NOT how it should be -
Read your other topic about the HD...if it is the same computer, then it is a problem with the machine.
I am very used to extreme frame rates. I have a BIY FX-53 system at home with 2GB RAM and an ATI Radeon 9800 XT. I am very satisfied with its performance thus far. And yes, there is some occasional stuttering on my 6809. But this is nothing to stop me from having a very enjoyable time playing my games away from home.
Funny part is that Far Cry does not run at 30FPS on the 6809, especially with some minor OCing on the VC. Unfortunately, I have not been able to give precise numbers, but it looks to be around 40-50FPS with decent settings. I have my chip OC'ed to 350/208, very modest indeed by this chip's standards (others have OC'ed to 420+/220+).
But I digress...as I originally said, methinks the problem lies with the hard drive that clicks. Put in a 7200RPM drive for max performance. -
By the way, I am sure you know that there is a frame rate limit for Doom3 no matter what computer you are using...I think it is in the 70s somewhere.
-
I forgot to say that if 70fps could be maintained at all times then that would certainly be smooth enough.
It amazes me that, given the quantum leap graphics cards have taken over the past 5 years or so, we are back to such a low-fps scenario. And that's with the high-end goodies too!
I'm sure a good many of these games could be made to look much better and run faster using older technology if more intelligent rendering was used.
However, that does not make any money for the likes of ATI or Nvidia, does it?
Tell it like it is...NOT how it should be -
I'll certainly try swapping the old and new HD to see if there is any difference in performance.
I noticed that during startup, when the desktop loads, the intro music is garbled/stuttering on my original drive but not on the new HD.
If I were getting 45FPS in Far Cry I'd be amazed; the frame rates are very sluggish (for me) as far as I can tell.
Tell it like it is...NOT how it should be -
Anti-aliasing kills frame rates on these new machines.
-
I'm afraid to say I've not been doing any gaming at all this past week or so while my M6810 is in "386 simulation mode" (using ed2k!) to download some old UK TV shows.
I've now been through 25 episodes of the BBC series "Survivors" from the mid-1970s [8D]
Tell it like it is...NOT how it should be -
Since my M6809 is much better than my 2-year-old tower, I've been using it for everything. Every game I've tried runs just fine. Haven't tried Doom 3, but HL2 plays fine. Average framerate in the stress test at 1280x800 is 50fps. Of course, out of some oddity I am able to overclock my card to 400/250 and ATITool runs artifact-free. I had it clocked at 375/215 for quite a while and just decided to see how high it goes. Go Best Buy warranty!
-
Yeah it sounds like HL2 is a good performer in general.
I'm looking forward to playing it, as HL was THE game that really got me into PC gaming.
The funny part is that when I first played it I was not too impressed and played endless Q2 online and Sin offline instead.
It was only when I got stuck in Sin that I went back to playing HL and never looked back!
Doom 3 (demo) for me looks good but is pretty slow overall at 800x600.
Tell it like it is...NOT how it should be -
One of the previous posters hit the nail on the head. You really should use it to play somewhat older games.
It is not realistic to expect the latest games to run well on a notebook.
I have high hopes for CS:S since it is so CPU dependent.
If you see high-end desktops struggling with Doom 3, don't even think about putting it on a $1300 laptop!
Learux
edit:quake into doom
-
I have played HL2 on my 6809, no problems at all.
Don't see what the big deal is; every game I throw at it plays wonderfully, including Far Cry.
Hmm.
I use mine as a gaming notebook.
HD
Proud 0wner of the Emachines 6809: AMD Athlon 64 Bit Processor @ 2.0 ghz/512 RAM/80 Gig HD/ATI Radeon 9600 OC'ed @ 432/222.75. DVD-RW/15.4 WideScreen -
I think the main problem is that what some people consider "playable" is very subjective.
I'm used to playing FPS games with framerates as smooth as silk - some people appear to be happy to play "slideshows".
I'm playing NOLF 2 at the moment (Contract Jack), and it has generally very decent framerates at max settings, but I'm now on the space station level and it has turned into a "slideshow".
-
Well, I don't like slideshows either. As I said, it plays everything I throw at it, and plays it smoothly. Granted, I don't have all the graphics specs on high, but it still plays EXTREMELY well.
I have my video card overclocked nicely, but I noticed you have your framerates set to 'maximum'? What if you lowered that just a notch? Something about the HD and system cache? I don't know, just a guess.
Are you using stock? I am, but I still have a great experience.
I know lagging and skipping when I see it...
HD
PS, I didn't buy a 1500 dollar machine to play Pac-man.
Proud 0wner of the Emachines 6809: AMD Athlon 64 Bit Processor @ 2.0 ghz/512 RAM/80 Gig HD/ATI Radeon 9600 OC'ed @ 432/222.75. DVD-RW/15.4 WideScreen -
Yeah, I had no problem with HL2 either (generally...), but it did have its moments with HD swapping etc. I think the problems I encountered were more to do with the actual game than anything else, though.
I've yet to play the full version of Far Cry which I have sitting in the case on DVD.
I know that the demos did not play well on my system, but that was when I had my defective HD, so I'm hoping it will run smoothly.
I overclock as well, at 420/220 generally.
Tell it like it is...NOT how it should be -
Well, I haven't played HL2... so to speak... but I play Counter-Strike: Source almost nightly. I had been playing it on my wife's AMD XP 2000 with 512 of PC2700, a 256 meg 9600 XT, and a Sound Blaster Audigy. I actually haven't noticed much improvement with my 6805, though. I have noticed some moments when the sound goes all crappy and I can't hear people talking in the game. Don't know if that has to do with lag or with my system. I have OC'd the vid card a little bit, to like 420/208 I think, something like that. I do want to get rid of this crappy 4200 rpm hard drive and put a decent 7200 in here. I mean, with these specs... to put a 4200 rpm drive in there, what were they thinking? After that I will up the RAM to most likely 1 gig. 2 gigs is just too much for my budget. That's my experience with the Source engine and my 6805.
-
Upgrading the hard drive will not increase gaming performance.
Yes, applications/levels will load faster, but in terms of FPS you will not see a difference.
Learux -
Are you sure about that?
I'm hearing that something about the system cache on the HD WILL make the lagging cease, though I don't know how or why really... I have a stock machine with an OC'ed VC and it works fine for me, and when I say fine I mean no skipping, no lagging, nothing.
HD
PS, it does take some time to load though...
Proud 0wner of the Emachines 6809: AMD Athlon 64 Bit Processor @ 2.0 ghz/512 RAM/80 Gig HD/ATI Radeon 9600 OC'ed @ 432/222.75. DVD-RW/15.4 WideScreen -
Yeah I had thought that a faster HD would improve gaming as well...for the same reasons.
-
It will improve gaming performance somewhat, but not in terms of FPS.
Notice a lot of hard drive swapping in Half-Life 2? Upgrade to a faster hard drive and the swapping will still be there; the time it takes will just be shorter.
Note that on my 2.4C with a 9800 Pro and 1 gig of PC3200 I still get hard drive swapping every so often (in Half-Life 2).
This is mainly a function of available RAM.
Learux -
Well I completed Contract Jack and finally got round to loading Far Cry.
This is the UK edition on one DVD [8D]
The gameplay is a WORLD of difference from my original experience playing the demo on my old HD.
I'm really enjoying this game so far - I'm playing at 1024x768 with most details on low but a few on medium quality (water & environment) and low sound quality.
Also running the graphics at 420/209.
If anybody has recommended settings for Far Cry, please let me know what you recommend (e.g. widescreen mode vs 1024x768).
I originally played with v1.0 and all was good, but last night I upgraded to v1.3 and I can't say I notice any difference yet.
Tell it like it is...NOT how it should be
General Gaming Performance of M68xx Series?
Discussion in 'eMachines' started by gtd2000, Sep 8, 2004.