If the app doesn't call SDL_Quit(), then our thread subsystem will not be cleaned up. Maybe I should try the constructor/destructor approach again, which might solve this implicitly.
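For example, a minimal sketch of what that could look like with GCC-style constructor/destructor attributes (hypothetical names, not the actual SDL code):

    /* Run automatically when the shared object is loaded/unloaded,
       so cleanup happens even if the app never calls SDL_Quit(). */
    __attribute__((constructor)) static void lib_init(void)
    {
        /* initialize the thread subsystem here */
    }

    __attribute__((destructor)) static void lib_quit(void)
    {
        /* clean up the thread subsystem here */
    }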
@Capehill Btw, in SDL1 there was the SDL_VideoModeOK() function, which tested whether a specific video mode worked. I didn't see it in SDL2, so perhaps it no longer exists?
@All Did I understand right that if I want to use SDL2's scaling, I should create a Renderer in any case? Right now this is what I do:
-- SDL_Init(SDL_INIT_EVERYTHING)
-- SDL_CreateWindow with SDL_WINDOW_FULLSCREEN_DESKTOP (so it opens a 1920x1080 screen for me), but with a window of 640x480 size
-- screen = SDL_GetWindowSurface(_Window);
And then I use this screen just like it was used in SDL1 (as surfaces are still supported in SDL2).
When I need to flip frames I just call SDL_UpdateWindowSurface(_Window);
The question is: how can I scale my 640x480 window to the whole 1920x1080 with SDL2 in this case? Should I create a renderer for that? Or can I do it with some function even with surfaces?
I didn't see a function similar to SDL_VideoModeOK either.
Scaling seems to exist for surfaces (https://wiki.libsdl.org/SDL2/SDL_BlitScaled), but if you want to accelerate it, you should create a renderer and a texture, and copy the surface to the texture each frame.
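A minimal sketch of that renderer path, reusing the names from your post (note: a surface obtained from SDL_GetWindowSurface() shouldn't be mixed with the renderer API on the same window, so screen here would be your own 640x480 ARGB8888 surface from SDL_CreateRGBSurface(), and error checking is omitted):

    /* setup once: renderer plus a streaming texture matching the surface size */
    SDL_Renderer *renderer = SDL_CreateRenderer(_Window, -1, 0);
    SDL_Texture *texture = SDL_CreateTexture(renderer,
        SDL_PIXELFORMAT_ARGB8888, SDL_TEXTUREACCESS_STREAMING, 640, 480);

    /* each frame, instead of SDL_UpdateWindowSurface(): */
    SDL_UpdateTexture(texture, NULL, screen->pixels, screen->pitch);
    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer, texture, NULL, NULL); /* NULL dest rect = stretch to whole window */
    SDL_RenderPresent(renderer);

If you want aspect-ratio-preserving scaling with letterboxing instead of a full stretch, SDL_RenderSetLogicalSize(renderer, 640, 480) does that for you.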
SDL_WINDOW_FULLSCREEN_DESKTOP is not working properly (it was never supported so far), but I started to test it yesterday. It needs some debugging.
@Capehill For me, even a slow method is enough for now, just to see how it will look. The point is that I'm just doing a hack&slash replacement of SDL1 with SDL2 in EUAE, and while I now have everything rendering on screen, I want to make the usual 320x240 windows scale to 1920x1080. For that I open the window with the fullscreen-desktop flag, and it already opens a 1920x1080 screen for me, but still with the same small window.
If you're interested in a brief look at the SDL2 changes, here they are:
There I just create a desktop-kind window, so it opens at 1920x1080, but inside it there's still the same small UAE window (520x732 or something), which I want to scale to the full 1920x1080. Just to see how it will look, and whether it's worth adding SDL2 support to EUAE at all. Of course the switch to SDL2 is needed in itself, but what I mostly worry about is scaling to fullscreen (and later I may add some better scaling algorithms, etc., once it works with SDL2 scaling).
- Fix joystick GUID generation and update internal gamecontroller database with new GUIDs. Remove related RC1 workaround. Please use the SDL 2.26.1 RC2 (or newer) library when providing new controller mappings.
- Support SDL_WINDOW_FULLSCREEN_DESKTOP flag. It's implemented as a custom screen (instead of WB).
- Refactor library init/quit routines: initialize thread subsystem using constructor.
- Refactor shared library management: open common libraries in constructor and close them in destructor.
- Require ogles2.library minimum version 2 (instead of 0).
Is it correct that AHI supports surround (7.1) only in 32-bit mode? So if an application requests 8 channels in 16 bits, should that fall back to stereo? And if an application asks for 3-7 channels, should that also fall back to stereo?
First of all, I am not an AHI expert. Sniffing through the AHI device source code is still on my todo list.
But this is how I understand AHI so far:
Yes, AHI expects 32-bit samples in a 7.1 mode.
According to AHI.h: #define AHIST_L7_1 (0x00c3000aUL) /* 7.1, 32 bit signed (8xLONG) */
It's the format of the application's output buffer itself that determines how many independent output channels are presented to the driver.
After that, AHI does a 1:1 mapping of the channels in your data buffer.
If you present a stereo buffer to AHI when a 7.1 audio mode is selected for your application, AHI will do no channel upmixing for you. It simply presents a stereo buffer to the driver, and my driver will play back only on front left and right. The other channels will be muted.
If you present a 7.1 buffer when a 7.1 audio mode is selected, then all 8 channels will be played. The order of the samples in the buffer determines which speaker each channel is played on. If your application has, for example, just 5.1 data, then do the correct channel mapping in your 7.1 buffer and leave the other two channels empty, as in the sketch below.
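For illustration, a sketch of such a mapping, assuming interleaved 8xLONG frames in FL, FR, FC, LFE, BL, BR, SL, SR order (the order AHI actually expects should be checked against AHI.h; names and the int32 type are as in the AmigaOS headers):

    /* Copy 5.1 frames (6 samples) into 7.1 frames (8 samples),
       muting the two side channels the source doesn't have. */
    static void map_51_to_71(int32 *out, const int32 *in, int frames)
    {
        for (int i = 0; i < frames; i++) {
            out[0] = in[0]; /* front left   */
            out[1] = in[1]; /* front right  */
            out[2] = in[2]; /* front center */
            out[3] = in[3]; /* LFE          */
            out[4] = in[4]; /* back left    */
            out[5] = in[5]; /* back right   */
            out[6] = 0;     /* side left: empty  */
            out[7] = 0;     /* side right: empty */
            out += 8;
            in  += 6;
        }
    }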
HIFI modes will always present 32-bit mixing buffers to the driver. When you present a 16-bit stereo buffer to any HIFI mode, AHI will upmix the format to 32-bit and present a 32-bit stereo mixing buffer to the driver.
Unfortunately, AHI doesn't know a 16-bit 7.1 buffer format. So in the case of multichannel 16-bit sound, I would do an int16->int32 conversion instead of clipping to stereo. It will not cost much processing power: *output_channel = (int32)*input_channel * 65536;
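Spelled out as a loop, that could look like this (just a sketch, names made up):

    /* Widen interleaved 16-bit samples to AHI's 32-bit format.
       Multiplying by 65536 moves the value into the upper 16 bits,
       which preserves the original volume. */
    static void convert_s16_to_s32(int32 *out, const int16 *in, int count)
    {
        for (int i = 0; i < count; i++) {
            out[i] = (int32)in[i] * 65536;
        }
    }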
It seems that SDL should do the format conversions automatically when needed, so an application can use almost any format.
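For example, when the device is opened with allowed_changes set to 0, SDL2 converts internally (a sketch; my_audio_callback is hypothetical):

    /* Ask for 16-bit stereo; with allowed_changes == 0 SDL guarantees
       the obtained spec matches the request and converts to the driver's
       native format (e.g. 32-bit) behind the application's back. */
    SDL_AudioSpec want, have;
    SDL_zero(want);
    want.freq = 48000;
    want.format = AUDIO_S16SYS;
    want.channels = 2;
    want.samples = 1024;
    want.callback = my_audio_callback;

    SDL_AudioDeviceID dev = SDL_OpenAudioDevice(NULL, 0, &want, &have, 0);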
@geennaam
I tried to test 32-bit audio a little, and it works nicely in stereo mode. But in 7.1 mode there is some kind of extra noise on FL/FR. I don't have surround speakers, so I cannot verify 7.1 mode properly.
The noise that you are referring to sounds like a looping sample click. Basically, the end of the sample doesn't fit the start of the sample. I don't hear any other noise.
Surround works. Only the channel mapping is off.
testsurround32 output -> soundcard channel
"Front center" -> Rear left "Low Frequency Effect" -> Rear Right "Back Left" -> Side left "Back right" -> Side Right "Side Left" -> Center "Side right" -> LFE