I have some interesting theoretical questions. Over the years I have run into strange issues in AmigaOS apps, where adding a call to some function (even a stub) inside a code block can change the execution and the results of that block.
And sometimes having a print versus not having a print causes different results from the executed code.
I have met this issue 3-4 times over the last years in different projects, and I don't remember if I was ever able to fix it all and understand what was going on.
All I understand is that once we have another call in the code, we have a) a bigger text code segment and b) the code slowed down by a millisecond or so, which can "fix" a race condition issue or something.
Having prints inside makes everything work. Commenting the prints out makes everything not work. As far as I remember it doesn't matter much if it is printf() or some other placeholder function; the important bit is to call "something", and probably not just from the Amiga libs, but from newlib/etc. too.
Does anyone have any clue about this?
This can be a sign of the optimizer doing something funny. Try to compile this file with -O0, or, if the compiler supports it, add __attribute__((optimize("O0"))) to the affected function. If that fixes the problem you can either leave that function unoptimized, or you can try to make the problematic variables "volatile".
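A minimal sketch of those two suggestions, assuming GCC (the attribute is GCC-specific, and the function and variable names here are made up for illustration):

    #include <stdio.h>

    static volatile int flag = 0;      /* volatile: every access really hits memory */

    __attribute__((optimize("O0")))    /* compile only this function without optimization */
    static void set_flag(void)
    {
        flag = 1;
    }

    int main(void)
    {
        set_flag();
        printf("flag = %d\n", flag);   /* prints "flag = 1" */
        return 0;
    }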
I tried to replace the printf() with just IDOS->Delay(1); and that also fixed the problem.
It all looks like some race condition or something. Or maybe it's about the way the function pointers are gathered?
What is more interesting is that this is about exactly the "whole" function: I can put the call to another function anywhere in it to make the whole logic of the function work. Here is that piece of code:
    static SDL_bool InitShaders()
    {
        int i;

        /* Check for shader support */
        shaders_supported = SDL_FALSE;
        if (SDL_GL_ExtensionSupported("GL_ARB_shader_objects") &&
            SDL_GL_ExtensionSupported("GL_ARB_shading_language_100") &&
            SDL_GL_ExtensionSupported("GL_ARB_vertex_shader") &&
            SDL_GL_ExtensionSupported("GL_ARB_fragment_shader")) {
            /* ... GL entry point loading elided from the post; on success it
               sets shaders_supported = SDL_TRUE, as in the stock testshader.c ... */
        }
        if (!shaders_supported) {
            //printf("shaders support false!\n");
            return SDL_FALSE;
        }

        IDOS->Delay(1);

        /* Compile all the shaders */
        for (i = 0; i < NUM_SHADERS; ++i) {
            if (!CompileShaderProgram(&shaders[i])) {
                //printf("but compile shaders fail!\n");
                SDL_LogError(SDL_LOG_CATEGORY_APPLICATION, "Unable to compile shader!\n");
                return SDL_FALSE;
            }
        }

        /* We're done! */
        return SDL_TRUE;
    }
See, I can even put the Delay(1) before compiling all the shaders, and then it all starts to work..
I'm nowhere near anything you can do in AmigaOS compiling/writing/coding, so I don't think I can add anything valuable to this thread, but I'll still try.
This (and what you wrote in ptitseb's thread) sounds like a race condition, probably limited to our platform, maybe even our gfx driver/ogles2 (or SDL2?).
Since you tried with both a printf and a delay, it looks like it always needs some sort of delay (after the GL stuff) to make the code work, even if it's just a millisecond(?)
Since we have some sort of workaround already (the delay), could you maybe create a short example and ask Hans/Daytona?
@Raziel The same happened to me some time back with some other stuff where, if I remember right, GL wasn't involved, but I can't be sure. What is for sure is that this is not the first time.
But I also think it's some kind of race condition where something is just not fast enough.
It may also just be that your magic function (Delay() or printf()) triggers some linker autoinit stuff which otherwise doesn't happen.
What happens if you put the Delay() or printf() in some place in the .c file which never gets called? Or inside some check which is never true (but make sure it's not something the compiler optimizes away completely), like "if (SysBase->SysFlags == 0xABCD)".
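To make that concrete, a minimal sketch of such a never-true-in-practice check, assuming the usual AmigaOS headers (ExecBase does have a SysFlags field; 0xABCD is just an arbitrary value it should never hold, and the helper name is made up):

    #include <stdio.h>
    #include <proto/exec.h>

    void keep_printf_linked(void)   /* hypothetical helper */
    {
        /* The condition is only decidable at run time, so the compiler
           must keep the printf() call (and whatever the linker pulls in
           for it) in the binary, even though it should never execute. */
        if (SysBase->SysFlags == 0xABCD) {
            printf("this should never print\n");
        }
    }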
Interesting: if I add this instead of the Delay or printf:

    for (i = 0; i < 100000000; ++i) {
    }

Then, while I clearly see the pause visually, it still doesn't make it work. So it is not the "pause" per se that fixes it.
I also tried to use rand(); - no luck.
What is more fun, if I use a small printf() like printf("a"); it also doesn't fix the issue, but the more "a"s I put in, the better it works.
And I 100% had the same kind of issue somewhere back in the past. Exactly this kind, where a "long" print fixes the issue but small ones don't. If I remember right I fixed it only by restructuring the source or something, but still never understood the root cause.
@kas1e Aha, so you had to introduce a small delay before compiling the shaders; I was completely off the mark. Is this the program in question? https://fossies.org/linux/SDL2/test/testshader.c In this case it would be good to know at which point CompileShaderProgram returns FALSE.
If only I knew :) Once I start putting in _any_ printf, things start to work.. I may try to comment out some parts, but then the code will be shorter and may start to work without pointing us at the issue.
@BSzili Just tried to comment out the two "return SDL_FALSE;" lines in InitShaders() => no luck. Also tried to rebuild it with GCC 8.4.0 and with 11.2.0 => the same result.
At the moment I'm trying to shorten the test case as much as possible.
EDIT: Don't know if I got any further, or am just looping in the same circle, but it seems the issue is actually coming from the CompileShaderProgram() function in that test case. At least, if I add printf()s there, it also works.
But I still can't get how. Maybe some auto-paging of memory regions is enabled/disabled by default..
Remember that while printf() looks simple in the source code, there's a lot going on under the hood, involving the console (which decides where the printed text goes), the graphics library (which renders the font characters into pixels), and the graphics card driver (which gets the pixels to the screen).
Somewhere in all of that there's almost certainly something that has to Wait(), which gives other tasks a chance to run. So printf() doesn't just take some time to execute, but it also allows other tasks to run. Your for() loop takes time, but most likely doesn't cause a task switch (assuming the delay isn't so long that it's preempted). Calling Delay() of course also allows other tasks to run, without all the other complexity of printf().
I'm not a 3D graphics guru, so I have no idea what those graphics calls are doing. But if it involves another task that's running asynchronously, then letting that task run while you're delaying may give it a chance to complete some necessary work that doesn't get done in time otherwise.
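A small sketch of that distinction, using the OS4 interface style seen in the thread (the tick count and loop bound are arbitrary):

    #include <proto/dos.h>

    void busy_pause(void)              /* burns CPU time on the current task */
    {
        volatile long i;               /* volatile so the empty loop isn't optimized away */
        for (i = 0; i < 100000000; ++i) {
        }
    }

    void yielding_pause(void)          /* sleeps, so the scheduler runs other tasks */
    {
        IDOS->Delay(1);                /* dos.library Delay(): 1 tick = 1/50 second */
    }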
@kas1e Yeah, it's probably not a compiler issue, but likely related to something going on inside the opengl library. Did you try adding a global variable for CompileShaderProgram? Just set it to __LINE__ before every return, and print it in InitShaders. This way you'll have a better idea of at which point it fails.
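A minimal self-contained sketch of that technique (all names here are made up; compile_step() just stands in for one of CompileShaderProgram's failing returns):

    #include <stdio.h>

    static int fail_line = 0;          /* records which return bailed out */

    static int compile_step(int ok)
    {
        if (!ok) {
            fail_line = __LINE__;      /* remember the exact failing line */
            return 0;
        }
        return 1;
    }

    int main(void)
    {
        if (!compile_step(0)) {
            printf("failed at line %d\n", fail_line);
        }
        return 0;
    }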
To me this does not look or "smell" like a race condition or anything timing related at all. Add an always-failing runtime check (to make sure it can't be optimized away) around the printf call that usually makes the thing work and see what happens. If it still works even though the actual printf() call never gets executed, then you know it's not a timing/race condition problem.
- printf("blablabla");
+ if ((int)SysBase == 0x1234) printf("blablabla");

    if ((int)SysBase == 0x1234) printf("blablabla");
    // IDOS->Delay(1);
    // printf("aa");

    /* Compile all the shaders */
    for (i = 0; i < NUM_SHADERS; ++i) {
        if (!CompileShaderProgram(&shaders[i])) {
            SDL_LogError(SDL_LOG_CATEGORY_APPLICATION, "Unable to compile shader!\n");
            return SDL_FALSE;
        }
    }
IIRC in AROS I once had a funny situation in some test program which opened a window and would quit when you pressed some key. In the shell you typed in the command and pressed the RETURN key; the window opened quickly, the RETURN key release happened after that, and so the event went to the window/test program, which then quit immediately.