When anisotropic filtering was enabled, the D3D backend always forced it regardless of how the game chose to configure its texture filtering registers; this causes the same issues as "Force Filtering" without anisotropy, such as game UI elements no longer lining up correctly. Historically, OpenGL's anisotropy support has always worked "better" than D3D's because it did not seem to have this problem; unfortunately, OpenGL's anisotropy specification only gives GL_LINEAR-based filtering modes defined behavior, with only the mipmap setting being required to be considered. Some OpenGL implementations were implicitly disabling anisotropy when the min/mag filters were set to GL_NEAREST, but this behavior is not required by the spec, so it cannot be relied on.
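As a rough, hedged sketch of the direction of the fix (not the actual backend code; the helper and parameter names are made up), a D3D11 sampler could be promoted to anisotropic filtering only when the game itself requested linear min/mag filtering:

    #include <d3d11.h>

    // Sketch only: respect the game's texture filtering registers and apply
    // anisotropy only on top of linear filtering, where its behavior is well
    // defined. Mixed min/mag modes are glossed over for brevity.
    D3D11_SAMPLER_DESC MakeSamplerDesc(bool min_is_linear, bool mag_is_linear,
                                       bool anisotropy_requested)
    {
      D3D11_SAMPLER_DESC desc = {};
      if (anisotropy_requested && min_is_linear && mag_is_linear)
      {
        desc.Filter = D3D11_FILTER_ANISOTROPIC;
        desc.MaxAnisotropy = 16;
      }
      else
      {
        // Keep the mode the game asked for, so e.g. nearest-sampled UI
        // textures stay exactly aligned instead of being forced through
        // anisotropic filtering.
        desc.Filter = (min_is_linear && mag_is_linear)
                          ? D3D11_FILTER_MIN_MAG_MIP_LINEAR
                          : D3D11_FILTER_MIN_MAG_MIP_POINT;
      }
      return desc;
    }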
Fast depth is now more accurate than slow depth and should always be used.
The option will be kept in a different form as it is still used as a hack to fix some games.
Also, the slow depth code path will still be relied upon by cards that don't support GL_ARB_clip_control.
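For context, here is a minimal sketch of how a backend might detect that fallback case (assuming a GL 3.x context and loader headers are already in place; none of these identifiers are Dolphin's):

    #include <cstring>

    // Walk the extension list and look for a specific extension by name.
    static bool HasExtension(const char* name)
    {
      GLint count = 0;
      glGetIntegerv(GL_NUM_EXTENSIONS, &count);
      for (GLint i = 0; i < count; ++i)
      {
        const char* ext =
            reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, name) == 0)
          return true;
      }
      return false;
    }

    // Prefer the more accurate fast depth path; keep slow depth only as the
    // fallback for drivers without GL_ARB_clip_control.
    const bool use_fast_depth = HasExtension("GL_ARB_clip_control");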
This is only queried, there's no need to expose it for writing.
Even if it were written to, a data member shouldn't be part of
your public API unless it's part of a dumb object or trivial struct.
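In other words (a trivial, hypothetical example, not the class in question):

    // The member stays private and reads go through a const accessor, so the
    // data member itself never becomes part of the public API.
    class TextureConfig
    {
    public:
      int GetWidth() const { return m_width; }

    private:
      int m_width = 0;  // only ever queried from outside, never written
    };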
The dumb wxAUI stuff isn't fully implemented for GTK, so the wxAuiToolBar doesn't properly deduce the size it needs when it contains a
wxSearchCtrl object.
Force the manager to set its minimum size to something reasonable.
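Something along these lines (a sketch under assumptions; the pane lookup and padding are illustrative, not necessarily what this change does):

    #include <wx/aui/aui.h>
    #include <wx/aui/auibar.h>
    #include <wx/srchctrl.h>

    // Derive a reasonable minimum size from the search control's best size
    // and force it onto the toolbar's AUI pane, since GTK won't deduce it.
    void FixToolBarMinSize(wxAuiManager* manager, wxAuiToolBar* toolbar,
                           wxSearchCtrl* search_ctrl)
    {
      wxSize min_size = search_ctrl->GetBestSize();
      min_size.IncBy(0, 8);  // a little padding so the control isn't clipped
      manager->GetPane(toolbar).MinSize(min_size);
      manager->Update();
    }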
When there are no games to display in the game list, DolphinWX shows a
message instead. Clicking the message will perform an action. If the game
list truly is empty, the message and action are for opening a browse
dialog, but if the user has hidden some games, they are instead for
unhiding all games. However, the condition for checking which message to
display lacked some parts that are in the condition for checking which
action to use, so the two could differ in rare cases. This PR fixes
that by breaking the two conditions out into a single unified function.
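A minimal sketch of the unified-function idea (all names are hypothetical, not the actual DolphinWX identifiers): both the message and the click handler ask the same predicate, so they can no longer disagree.

    #include <cstddef>

    // One predicate shared by both call sites: games exist on disk, but every
    // one of them is currently hidden by the user's filters.
    static bool ShouldOfferUnhide(std::size_t total_games, std::size_t visible_games)
    {
      return total_games > 0 && visible_games == 0;
    }

    // Message: ShouldOfferUnhide(...) ? "click to unhide all games"
    //                                 : "click to browse for games"
    // Action:  ShouldOfferUnhide(...) ? UnhideAllGames() : OpenBrowseDialog();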
Dolphin has supported the recalibration shortcut (X+Y+Start) for quite a long while, so if someone's axes are terrible, they can easily
recalibrate.
Games even get the initial calibration upon boot (most of the time).
While changing over the GCAdapter code, I was testing to make sure the reset and calibration shortcuts still worked; it turns out they didn't work at
all.
Looking into the problem: we capture the combination properly, and we wait three seconds before we actually fire off the recalibration.
The problem is that for Nintendo's SDK to properly handle recalibration, we need to send back data telling it that it needs to recalibrate.
On hardware, this is done as part of the 64 bits of data the controller sends back to us.
While the combination is held on the controller, bit 61 of the return value is set; the Nintendo SDK catches this and immediately afterwards issues a CMD_ORIGIN
command in order to recalibrate the controller.
We were outright ignoring this bit, so the library wasn't ever recalibrating. I suspect that in the past the class itself used the calibration data
to offset the data, but somewhere along the line it got munged out of existence.
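A hedged sketch of what honoring that bit looks like (the constant and function names are illustrative, not Dolphin's actual ones):

    #include <cstdint>

    // Bit 61 of the 64-bit controller reply, as described above.
    constexpr std::uint64_t kGetOriginBit = std::uint64_t{1} << 61;

    // Build the reply the emulated controller hands back to the game. When
    // the X+Y+Start combination has been held long enough, set the bit so
    // Nintendo's SDK notices it and immediately issues CMD_ORIGIN to
    // recalibrate.
    std::uint64_t BuildControllerReply(std::uint64_t pad_state, bool combo_held)
    {
      std::uint64_t reply = pad_state;
      if (combo_held)
        reply |= kGetOriginBit;
      return reply;
    }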
The GameCube adapter handles this shortcut in a bit of a unique way, instead of sending the command and having the library support it.
Once the shortcut has been held for the required amount of time, the adapter reports back that the controller has been disconnected. Then, when you let go of
the combination, the adapter states that a new device has been connected to that port, and the recalibration happens because a new device is
"connected."
This fixes controller calibration for both emulated GC controllers and the Wii U GameCube Adapter.
We don't throttle by frames, we throttle by CoreTiming speed,
so looking at VI to calculate the speed was just wrong.
The new ini option is a float, with 1.0f meaning full speed.
In the GUI, percentage values are used.
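For reference, the mapping between the two representations is a straight multiply/divide (helper names are illustrative):

    #include <cmath>

    // The ini stores a plain float multiplier: 1.0f is full speed, 0.5f would
    // be half speed, and so on.
    int SpeedToGuiPercent(float speed_multiplier)
    {
      return static_cast<int>(std::lround(speed_multiplier * 100.0f));
    }

    float GuiPercentToSpeed(int percent)
    {
      return static_cast<float>(percent) / 100.0f;
    }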