I wanted to comment on the unusual experience I've had configuring RetroArch to run optimally on my machine.

Using the suggested settings in this guide ( … ows-guide/), along with "Maximum pre-rendered frames: use application setting," I get sub-optimal performance. With these settings, I get skippy video/audio, and every so often my frame rate plummets, maybe to 30, for just a second or so, then returns to normal. It's quite distracting and takes me out of the experience when it occurs. I've tried adjusting triple buffering, vsync, and threaded driver optimization in my graphics card's control panel, but I get the same results.

I've tried basically every combination of the above settings and only found one that resulted in no screen tearing, no input lag, no A/V jitters, and no horrendous frame rate drops:

- Maximum pre-rendered frames: use application setting
- Threaded driver optimization: auto or off
- Vsync: "adaptive." I also tried "on" and got nearly the same results, but I think adaptive was slightly better (NVIDIA advertises it as a more advanced form of vsync).

I'm assuming this is hardware-specific, but what would I need to upgrade? Does my CPU just suck? I thought a 2.2 GHz dual core would be enough to handle any game up to the late '90s…

So, does anyone have any ideas as to why I'm unable to use the suggested settings without running into problems, and why RetroArch is awesome when I use the second set of settings?

Do you get good performance with the first set of settings, but with hard sync off? Your second set of settings probably works well because by turning off RetroArch's vsync you've also disabled hard sync, and you're instead using your video card's triple-buffered adaptive vsync method.
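For reference, the two setups being compared map roughly onto these `retroarch.cfg` keys. This is a sketch, not a verified fix: the key names come from RetroArch's stock config file, and defaults or availability can vary by build and video driver.

```ini
# First setup (the guide's suggestion): RetroArch drives vsync and hard GPU sync.
video_vsync = "true"
video_hard_sync = "true"         # CPU waits for the GPU each frame to reduce input lag
video_hard_sync_frames = "0"     # 0 = strictest sync; most demanding on the hardware
video_threaded = "false"

# Second setup: RetroArch's vsync off, which also sidelines hard sync,
# leaving the driver's adaptive vsync + triple buffering in charge.
video_vsync = "false"
video_hard_sync = "false"
video_threaded = "false"
```

Hard GPU sync is the likely culprit for the frame-rate plunges on a weaker CPU, since it forces the emulator to stall until the GPU is fully caught up every frame.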