ChangeScreenBuffer() - can't reliably close program without OS locking-up
Does anyone have any advice on the order in which things should be destroyed when quitting a program? Having added ChangeScreenBuffer() code (instead of blitting), there seems to be a roughly 40% chance that the OS locks up completely when my program quits.

As far as the documentation indicates, I should be able to create & destroy ScreenBuffers without any problem... presumably as long as ChangeScreenBuffer() has finished doing its magic?

I've tried listening for both Safe & Disp messages (before anything else can happen - such as quitting), but that didn't seem to help.

It may be relevant that it seems to crash 100% of the time if I don't have a bogus WaitTOF() before destroying the screen buffers & their associated message ports.
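
For reference, here is a minimal sketch of the kind of set-up involved (AmigaOS 4 style; names like screen, sbuf, safePort, dispPort and back are placeholders, with no error handling; this is not the actual program):

    /* Allocate reply ports for the Safe/Disp messages. */
    struct MsgPort *safePort = (struct MsgPort *)IExec->AllocSysObjectTags(ASOT_PORT, TAG_END);
    struct MsgPort *dispPort = (struct MsgPort *)IExec->AllocSysObjectTags(ASOT_PORT, TAG_END);

    /* One buffer wraps the screen's own bitmap, one is a copy. */
    struct ScreenBuffer *sbuf[2];
    sbuf[0] = IIntuition->AllocScreenBuffer(screen, NULL, SB_SCREEN_BITMAP);
    sbuf[1] = IIntuition->AllocScreenBuffer(screen, NULL, SB_COPY_BITMAP);

    /* Route each buffer's Safe and Disp messages to our ports. */
    for (int i = 0; i < 2; i++)
    {
        sbuf[i]->sb_DBufInfo->dbi_SafeMessage.mn_ReplyPort = safePort;
        sbuf[i]->sb_DBufInfo->dbi_DispMessage.mn_ReplyPort = dispPort;
    }

    /* To swap: render into sbuf[back]->sb_BitMap, then... */
    while (!IIntuition->ChangeScreenBuffer(screen, sbuf[back]))
        IGraphics->WaitTOF();          /* retry if Intuition was busy */
    IExec->WaitPort(dispPort);         /* new buffer is now on display */
    while (IExec->GetMsg(dispPort)) ;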

Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@thomas
Thanks for the example, but unfortunately it doesn't tell me anything I didn't know. So either I screwed-up somewhere (bug), or else my more-complex code breaks some other restrictions not obvious in that example...

For example, is it OK to use double-buffering on a screen with an (e.g. background) window already open on it? I would presume so, since it is Intuition which offers ChangeScreenBuffer().

And is it OK to continue to draw to the window, close the window, etc. after destroying the screen buffers?

Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@chris

I may be wrong as it's a long time since I worked with a truly double-buffered screen, but I don't think you can use windows on a double-buffered screen at all, let alone after you have disposed of the screen buffers.

That is, not use them for rendering; you can open a window (usually backdrop/borderless) to capture input.

All rendering should be via the screen rastport.
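
A minimal sketch of that pattern (the tag list and drawing calls are just illustrative; assumes the usual OS4 interfaces are available):

    /* Borderless backdrop window for input only; all drawing goes to
       the screen's own RastPort. */
    struct Window *win = IIntuition->OpenWindowTags(NULL,
        WA_CustomScreen, screen,
        WA_Backdrop,     TRUE,
        WA_Borderless,   TRUE,
        WA_Activate,     TRUE,
        WA_IDCMP,        IDCMP_RAWKEY | IDCMP_MOUSEBUTTONS,
        TAG_END);

    /* Input events arrive at win->UserPort; rendering bypasses the
       window entirely: */
    IGraphics->SetAPen(&screen->RastPort, 1);
    IGraphics->RectFill(&screen->RastPort, 0, 0, 99, 99);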




Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@broadblues
While the autodocs do not mention windows specifically, they DO say that ChangeScreenBuffer() "Performs double (or multiple) buffering on an Intuition screen in an Intuition-cooperative manner". I'll see if I can quiz anyone else on what exactly that means.

Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
I have created a post on Hyperion's forum, in the hope more OS devs (and other clever people) might see it:
http://forum.hyperion-entertainment.biz/viewtopic.php?f=26&t=1100

It also documents my discovery that it (sometimes) crashes at the point of closing the window.

Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
I know why it crashes, because it happened to me too on AmigaOS 4.0: it depends on the order of closing the window and freeing the screen buffers. The required order is different in AmigaOS 3.x and 4.x.

Screen buffers also have another bug: when you push the screen to back, your program sometimes freezes in an infinite loop. I suggest using ChangeVPBitMap() instead.

Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:

RNS-AMiGA-Club wrote:
Screen buffers also have another bug: when you push the screen to back, your program sometimes freezes in an infinite loop. I suggest using ChangeVPBitMap() instead.


No, don't use ChangeVPBitMap(). You shouldn't be dropping down to messing with viewports. If there is a bug in the OS, then let Hyperion know via the forum.

Composite3DDemo uses ChangeScreenBuffer(), and doesn't have any problem with either pushing the screen to back, or quitting. The source code is available for anyone who is interested.

Hans

Join Kea Campus' Amiga Corner and support Amiga content creation
https://keasigmadelta.com/ - see more of my work
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:

ChrisH wrote:
Does anyone have any advice on the order in which things should be destroyed when quitting a program? Having added ChangeScreenBuffer() code (instead of blitting), there seems to be a roughly 40% chance that the OS locks up completely when my program quits.

As far as the documentation indicates, I should be able to create & destroy ScreenBuffers without any problem... presumably as long as ChangeScreenBuffer() has finished doing its magic?


Obviously you have to close any windows that are open on that screen first, before closing the screen. Next, clear out and free the message ports. After this is done, deallocate the screen buffer objects, then close the screen, and then free the screen bitmaps.

If you let AllocScreenBuffer() allocate the extra bitmaps, then you need to pay special attention to the rules regarding which bitmaps get automatically freed, and which ones you need to do manually. From memory, FreeScreenBuffer() will automatically free any buffer bitmap that is not currently shown (i.e., not the current front buffer); the currently-installed one is left for CloseScreen() to deal with. This requires carefully keeping track of which buffer is in use. If you don't want to do that, then allocate everything manually.
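
In code, that order would look roughly like this (a sketch with placeholder names, not the demo's actual code):

    /* Teardown order: window, message ports, screen buffers, screen,
       then any manually allocated bitmaps. */
    IIntuition->CloseWindow(window);

    while (IExec->GetMsg(safePort)) ;           /* drain pending messages */
    IExec->FreeSysObject(ASOT_PORT, safePort);
    while (IExec->GetMsg(dispPort)) ;
    IExec->FreeSysObject(ASOT_PORT, dispPort);

    IIntuition->FreeScreenBuffer(screen, sbuf[1]);
    IIntuition->FreeScreenBuffer(screen, sbuf[0]);
    IIntuition->CloseScreen(screen);
    IGraphics->FreeBitMap(myBitMap);            /* only if allocated manually */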


Quote:
I've tried listening for both Safe & Disp messages (before anything else can happen - such as quitting), but that didn't seem to help.

It may be relevant that it seems to crash 100% of the time, if I don't have a bogus WaitTOF() before destroying the screen buffers & their associated message ports.


I know that you said that you already tried waiting for messages, but that does sound like there could still be messages waiting in the safe & disp message queues.

Here's the message port deallocation code that Composite3DDemo uses:
// Clear out any pending messages and deallocate the message ports
if (safeToRenderMsgPort)
{
    while (IExec->GetMsg(safeToRenderMsgPort))
    {
        ;
    }
    IExec->FreeSysObject(ASOT_PORT, safeToRenderMsgPort);
    safeToRenderMsgPort = NULL;
}
if (safeToChangeMsgPort)
{
    while (IExec->GetMsg(safeToChangeMsgPort))
    {
        ;
    }
    IExec->FreeSysObject(ASOT_PORT, safeToChangeMsgPort);
    safeToChangeMsgPort = NULL;
}


Hans

Join Kea Campus' Amiga Corner and support Amiga content creation
https://keasigmadelta.com/ - see more of my work
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@RNS-AMiGA-Club
Thanks for your hints. While they do NOT explain my problem, they WERE close enough to point me in the right direction. As your solution still seems to have problems, possibly you misunderstood your own problem, so my findings might help you too. (I guess you use it for your 2D graphics library that was announced?)

@Hans
I have found the EXACT circumstances which cause (or avoid) the crash on exit. First, here is the shut-down sequence which WAS causing the *intermittent* crashes (a code sketch follows the list):

1. ChangeScreenBuffer() had been called, so I wait until it sends a message to the (previously allocated) DispMessage port. (I guess it may be sufficient to only wait on the SafeMessage port, but I have been playing it safe.)

2. WaitBlit(), to ensure nothing is blitting to a buffer.

3. FreeScreenBuffer() on both buffers. Then DeleteMsgPort() on any ports that were allocated for the buffers.

4. Inside a Forbid()/Permit() pair I ensure the window's userport has no messages & then close the window. If you don't use the OS4-only StripIntuiMessages(), then message handling may be a little fiddly to get 100% right. If it crashes, then it crashes at the point I call CloseWindow().

5. CloseScreen()
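
As a code sketch (same placeholder names as earlier in the thread; not the real code):

    /* 1. Wait for the last ChangeScreenBuffer() to complete. */
    IExec->WaitPort(dispPort);
    while (IExec->GetMsg(dispPort)) ;

    /* 2. Make sure nothing is still blitting into a buffer. */
    IGraphics->WaitBlit();

    /* 3. Free both buffers, then their message ports. */
    IIntuition->FreeScreenBuffer(screen, sbuf[0]);
    IIntuition->FreeScreenBuffer(screen, sbuf[1]);
    IExec->DeleteMsgPort(safePort);
    IExec->DeleteMsgPort(dispPort);

    /* 4. Strip pending messages and close the window.
       (StripIntuiMessages() usage assumed from the classic
       CloseWindowSafely() example.) */
    IExec->Forbid();
    IIntuition->StripIntuiMessages(window->UserPort, window);
    IExec->Permit();
    IIntuition->CloseWindow(window);

    /* 5. */
    IIntuition->CloseScreen(screen);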


Now, when it DID crash, it turns out that the screen buffer bitmap that was being used by the screen was NOT the one that the screen was allocated with originally (aka the SB_SCREEN_BITMAP buffer), but rather one that had been specially allocated for double-buffering (aka the SB_COPY_BITMAP buffer). To me this looks like an OS bug. My suspicion is that when I close the window, Intuition tries to write to the screen's bitmap... but it writes to the screen buffer's CURRENT bitmap, rather than the screen's ORIGINAL bitmap. And when these buffers are different, this means that it writes to the buffer bitmap that has been WRONGLY destroyed by FreeScreenBuffer(), rather than the original bitmap that WRONGLY still exists. EDIT: Removed wrong & unnecessary statement (and also added "WRONGLY" twice for clarification).

If it is an OS bug, then we have the puzzle why it did not affect your own 3D boing-ball demo.


So what I did to fix the problem was add the following extra step before the aforementioned shut-down sequence:

0. Check which buffer the screen is using, and if it is not using the one it originally had, then call ChangeScreenBuffer(). (You obviously must not do this until the last ChangeScreenBuffer() has finished, unless your code can handle that situation.)
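
As a sketch, assuming the code tracks which buffer is installed via a placeholder currentBuf index:

    /* 0. If the screen is not showing its original bitmap, swap back
          to it before shutting down. */
    if (currentBuf != 0)   /* sbuf[0] is the SB_SCREEN_BITMAP buffer */
    {
        while (!IIntuition->ChangeScreenBuffer(screen, sbuf[0]))
            IGraphics->WaitTOF();
        IExec->WaitPort(dispPort);       /* wait until the swap completes */
        while (IExec->GetMsg(dispPort)) ;
        currentBuf = 0;
    }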


Edited by ChrisH on 2012/6/5 10:39:49
Edited by ChrisH on 2012/6/5 10:42:42
Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:

ChrisH wrote:
@RNS-AMiGA-Club
Thanks for your hints. While they do NOT explain my problem, they WERE close enough to point me in the right direction. As your solution still seems to have problems, possibly you misunderstood your own problem, so my findings might help you too. (I guess you use it for your 2D graphics library that was announced?)

@Hans
I have found the EXACT circumstances which cause (or avoid) the crash on exit. First, here is the shut-down sequence which WAS causing the *intermittent* crashes:

1. ChangeScreenBuffer() had been called, so I wait until it sends a message to the (previously allocated) DispMessage port. (I guess it may be sufficient to only wait on the SafeMessage port, but I have been playing it safe.)

2. WaitBlit(), to ensure nothing is blitting to a buffer.

3. FreeScreenBuffer() on both buffers. Then DeleteMsgPort() on any ports that were allocated for the buffers.

4. Inside a Forbid()/Permit() pair I ensure the window's userport has no messages & then close the window. If you don't use the OS4-only StripIntuiMessages(), then message handling may be a little fiddly to get 100% right. If it crashes, then it crashes at the point I call CloseWindow().


5. CloseScreen()


There is a possible problem in steps 3 & 4 above. You should be closing the window before you start freeing the screen-buffers. As I said, unless you're manually allocating all bitmaps, FreeScreenBuffer() will automatically free the bitmap that is currently not the front buffer. If the screen's original bitmap is the back-buffer, then that is the one that will be deallocated. I expect that CloseWindow() won't like this.

Try swapping the order of steps 3 and 4. Also, when setting up double-buffering, open the window after you have set up the additional screen buffers.
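
In other words, a sketch of the recommended order (same placeholder names as before):

    IIntuition->CloseWindow(window);                 /* old step 4 first */
    IIntuition->FreeScreenBuffer(screen, sbuf[0]);   /* then old step 3 */
    IIntuition->FreeScreenBuffer(screen, sbuf[1]);
    IExec->DeleteMsgPort(safePort);
    IExec->DeleteMsgPort(dispPort);
    IIntuition->CloseScreen(screen);                 /* step 5 */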


Quote:
Now, when it DID crash, it turns out that the screen buffer bitmap that was being used by the screen was NOT the one that the screen was allocated with originally (aka the SB_SCREEN_BITMAP buffer), but rather one that had been specially allocated for double-buffering (aka the SB_COPY_BITMAP buffer). To me this looks like an OS bug. My suspicion is that when I close the window, Intuition tries to write to the screen's bitmap... but it writes to the screen buffer's CURRENT bitmap, rather than the screen's ORIGINAL bitmap. And when these buffers are different, this means that it writes to the buffer bitmap that has been destroyed by FreeScreenBuffer(), rather than the original bitmap that still exists (as FreeScreenBuffer() will not destroy the bitmap of a SB_SCREEN_BITMAP buffer).

If it is an OS bug, then we have the puzzle why it did not affect your own 3D boing-ball demo.


Try the change that I mentioned above. If it still crashes, one difference may be that Composite3DDemo allocates its own bitmap for all screen buffers (see the SA_BitMap tag for OpenScreen()). You'll find the relevant code in Composite3DDemo.cpp. For reference, the safe and render message ports are allocated and freed in the C3DContext class' constructor and destructor (C3D/C3DContext.cpp), and the window opening and closing occurs in the Demo class' constructor and destructor (Demo.cpp).


Quote:
So what I did to fix the problem was add the following extra step before the aforementioned shut-down sequence:

0. Check which buffer the screen is using, and if it is not using the one it originally had, then call ChangeScreenBuffer(). (You obviously must not do this until the last ChangeScreenBuffer() has finished, unless your code can handle that situation.)


The documentation is pretty clear that this should not be necessary.

Hans

Join Kea Campus' Amiga Corner and support Amiga content creation
https://keasigmadelta.com/ - see more of my work
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@ChrisH

One more thing, you said:
Quote:
(as FreeScreenBuffer() will not destroy the bitmap of a SB_SCREEN_BITMAP buffer)


That is not what the autodocs say. From the autodocs:
Quote:

The SB_SCREEN_BITMAP flag instructs AllocScreenBuffer() to provide
a ScreenBuffer referring to the screen's actual bitmap. When
you are done changing screen buffers, you must FreeScreenBuffer()
the currently-installed ScreenBuffer before you close the screen.
Intuition will recognize when FreeScreenBuffer() is called for
the currently-installed ScreenBuffer, and will know to free the
supporting structures but not the BitMap. CloseScreen() will
take care of that.


So, FreeScreenBuffer() will destroy the bitmap of an SB_SCREEN_BITMAP buffer if it is not the front buffer. It is CloseScreen()'s responsibility to free the bitmap that is the current front-buffer (and hence, is attached to the screen).

Hans

Join Kea Campus' Amiga Corner and support Amiga content creation
https://keasigmadelta.com/ - see more of my work
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@Hans
After sleeping on this, I realised exactly what argument we would probably have, because you would be seeing this from a different point of view to me. I still think this is an OS bug, so I'm afraid you will have to follow a bit more explanation from me...

ALSO, one of my final statements in my earlier post was wrong (but also entirely unnecessary), so I have removed it. I guess this also contributed to you misunderstanding me. (And I also added two "WRONGLY" words, to clarify my original post.)

Quote:
There is a possible problem in the highlighted bit above. You should be closing the window before you start freeing the screen-buffers. ...

Try swapping the order of steps 3 and 4. Also, when setting up double-buffering, open the window after you have set up the additional screen buffers.

What I am doing is NOT a mistake, but entirely intentional! Because what I want to allow is that the user of my graphics system can enable & disable double-buffering whenever they want. This means that I (may) enable double-buffering AFTER the window has been opened, and similarly I (may) draw into the Intuition window after disabling the double-buffering.

While I can change the order of closing the window & freeing the buffer, I strongly suspect that I will still see crashes in other situations, such as drawing into the window after double-buffering has been disabled.

Quote:
FreeScreenBuffer() will automatically free the bitmap that is currently not the front buffer. If the screen's original bitmap is the back-buffer, then that is the one that will be deallocated.

Quote:
So, FreeScreenBuffer will destroy the bitmap of an SB_SCREEN_BITMAP if it is not the front buffer. It is CloseScreen()'s responsibility to free the bitmap that is the current front-buffer (and hence, is attached to the screen).

That is exactly what I *expected* to happen, but the evidence from my previous post is that this does *not* happen! Let me restate (in a different way) the sequence of events which causes my crash:

1. Open screen, open window & add double-buffering (please do not argue about the ordering of these, as that is irrelevant to the point I am making).

2. ChangeScreenBuffer() is called an *odd* number of times. i.e. 1, 3, 5, etc. For simplicity, let us imagine that it is only called once.

3. We now have the situation that the screen buffer allocated using SB_SCREEN_BITMAP is not visible anymore, and instead the buffer allocated using SB_COPY_BITMAP is visible. So drawing into the window will modify the SB_COPY_BITMAP buffer.

4. Now we decide to disable double-buffering (switching back to single buffering), so we call FreeScreenBuffer() on BOTH screen buffers. According to the documentation this should destroy the bitmap associated with the SB_SCREEN_BITMAP buffer, since it is not visible. HOWEVER, what *appears* to happen instead is that the bitmap associated with the SB_COPY_BITMAP buffer is destroyed, even though it is still visible.

5. Destroying the window (or probably just drawing into the window) will cause Intuition to try to update the screen bitmap which *was* associated with the (now destroyed) SB_COPY_BITMAP buffer. Unfortunately this bitmap (seems to) have been destroyed, so the OS crashes.


Note that the OS does NOT crash if step 2 has called ChangeScreenBuffer() an *even* number of times. This seems like pretty concrete proof.

And from what RNS-AMiGA-Club said, AmigaOS3 probably behaved how we expected, but I imagine that AmigaOS4 totally rewrote the ChangeScreenBuffer() stuff (perhaps around the time they added screen-dragging or Compositing support?), and probably the bug appeared at that point. It would be EASY to (wrongly) think that SB_COPY_BITMAP buffers should always have their bitmaps destroyed (and SB_SCREEN_BITMAP never destroyed), if you didn't think VERY carefully about it.

And since the common usage of double-buffering is to use it until the screen is destroyed (such as your 3D demo I imagine), those programs would not crash the OS, because they would destroy the screen immediately after destroying the buffers. Presumably CloseScreen() suffers from the same logic flaw, and so destroys the bitmap that was associated with the SB_SCREEN_BITMAP buffer.

P.S. I will create a BugZilla report if you agree with me it is a bug.


Edited by ChrisH on 2012/6/5 10:57:32
Edited by ChrisH on 2012/6/5 11:19:43
Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:

3. We now have the situation that the screen buffer allocated using SB_SCREEN_BITMAP is not visible anymore, and instead the buffer allocated using SB_COPY_BITMAP is visible. So drawing into the window will modify the SB_COPY_BITMAP buffer.


Have you verified that by checking the window's layer's bitmap pointer and comparing it to your other bitmaps, to see which one it points to?
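
Something like this would do it (field names per the Intuition and layers headers; sbuf[] are the ScreenBuffers from AllocScreenBuffer()):

    /* Check which bitmap the window's layer actually renders to. */
    struct BitMap *bm = window->WLayer->rp->BitMap;

    if (bm == sbuf[0]->sb_BitMap)
        IDOS->Printf("window renders to the SB_SCREEN_BITMAP buffer\n");
    else if (bm == sbuf[1]->sb_BitMap)
        IDOS->Printf("window renders to the SB_COPY_BITMAP buffer\n");
    else
        IDOS->Printf("window renders to some other bitmap\n");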

I suspect it is more likely still referring to the original bitmap that was set up with the screen, which may be why it crashes later when you close it, having destroyed that bitmap.

Quote:

It would be EASY to (wrongly) think that SB_COPY_BITMAP buffers should always have their bitmaps destroyed (and SB_SCREEN_BITMAP never destroyed), if you didn't think VERY carefully about it.


If that were happening, it would likely crash when you closed the screen after an odd number of swaps. I tested my old double-buffered app and there were no issues with it. It opens the window before setting up the buffers, but it closes it before destroying them. Which is what you should do. (I know you want to do something else, but I don't think you can.)



Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:
Have you verified that by checking the window's layer's bitmap pointer and comparing it to your other bitmaps, to see which one it points to?

Nope. At this point I am just trying to verify that what I am doing makes sense. Chasing down exactly why it is failing can be left until after that.

Quote:
If that were happening, it would likely crash when you closed the screen after an odd number of swaps.

WHY? If CloseScreen() was implemented using the same faulty assumption, then it would destroy the bitmap (buffer) that had not been destroyed by FreeScreenBuffer(). i.e. It would destroy the bitmap that was associated with the SB_SCREEN_BITMAP buffer.

Quote:
It opens the window before setting up the buffers, but it closes it before destroying them. Which is what you should do. (I know you want to do something else, but I don't think you can.)

By working around the OS bug I have identified, my programs now work very nicely. I think this also proves that I am (basically) right.

Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
@all
I thought it would be useful to see how AmigaOS*3* behaves. Having done a quick test, it appears to have a similar (not identical) problem, which implies this is not an OS bug, but rather an OS limitation.

Specifically, if I enable & then disable double-buffering, I get a nasty crash if I draw to the (unbuffered) window when I have used an *odd* number of ChangeScreenBuffer() calls. With an *even* number it works fine.

So NO need for me to file a bug report in BugZilla, I think.

@broadblues Quote:
I suspect it is more likely still referring to the original bitmap that was set up with the screen, which may be why it crashes later when you close it, having destroyed that bitmap.

This now looks like a more plausible explanation.

Author of the PortablE programming language.
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:

ChrisH wrote:
@Hans
After sleeping on this, I realised exactly what argument we would probably have, because you would be seeing this from a different point of view to me. I still think this is an OS bug, so I'm afraid you will have to follow a bit more explanation from me...


I'm seeing this from the point-of-view of helping another developer figure out screen double-buffering without having to resort to experimentation like I did.


Quote:
Quote:
There is a possible problem in steps 3 & 4 above. You should be closing the window before you start freeing the screen-buffers. ...

Try swapping the order of steps 3 and 4. Also, when setting up double-buffering, open the window after you have set up the additional screen buffers.

What I am doing is NOT a mistake, but entirely intentional! Because what I want to allow is that the user of my graphics system can enable & disable double-buffering whenever they want. This means that I (may) enable double-buffering AFTER the window has been opened, and similarly I (may) draw into the Intuition window after disabling the double-buffering.


Okay, so you are intentionally doing something in the "wrong" order. You're asking for trouble doing things this way. Screen double-buffering was never designed for you to be able to enable/disable it at will like you are trying to do.

EDIT: On reflection, deleting buffers while still using the screen might be okay; the big problem is rendering via the window. It's still not intended usage though.

You're also supposed to render directly to a double-buffered screen's bitmap, and not via windows. The windows are clueless as to which buffer should be rendered to. So, the window should only be used to capture input events; don't use it for rendering.

If you want to disable double-buffering at will, then you should either reopen the screen with the new settings (what most people do), or simply render straight to the front-buffer and stop swapping buffers.
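
Rendering straight to the buffer bitmaps rather than via the window looks roughly like this (a sketch; backBuf is a placeholder for whichever ScreenBuffer is currently off-screen):

    /* Draw into the hidden buffer's bitmap via a private RastPort. */
    struct RastPort rport;
    IGraphics->InitRastPort(&rport);
    rport.BitMap = backBuf->sb_BitMap;

    IGraphics->SetAPen(&rport, 1);
    IGraphics->RectFill(&rport, 0, 0, 99, 99);

    /* Then flip it to the front. */
    while (!IIntuition->ChangeScreenBuffer(screen, backBuf))
        IGraphics->WaitTOF();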


Quote:
3. We now have the situation that the screen buffer allocated using SB_SCREEN_BITMAP is not visible anymore, and instead the buffer allocated using SB_COPY_BITMAP is visible. So drawing into the window will modify the SB_COPY_BITMAP buffer.


Actually, it's probably still rendering to the SB_SCREEN_BITMAP. The Window structure has a pointer to a rast-port, which in turn has a pointer directly to the target bitmap. I highly doubt that ChangeScreenBuffer() will go through all of the screen's windows and update the rast-ports; that's not what it's designed to do. It doesn't actually know which buffer you want to render to next, so it doesn't know what to change it to.

Since FreeScreenBuffer() is probably correctly deleting the back-buffer's bitmap, which happens to be the SB_SCREEN_BITMAP one, your window now has a bad pointer.


Quote:
P.S. I will create a BugZilla report if you agree with me it is a bug.


You'll need to check whether it really is rendering to SB_COPY_BITMAP as you suspect, or if it's doing what I think that it is. If you can confirm that it isn't doing what the documentation says that it should, then please submit the bug report.

Hans


P.S. You might want to consider filing an enhancement request against the SDK for ambiguities in the documentation. Your assumptions about how a window can be used on a double-buffered screen are reasonable, even if they are incorrect. The autodocs should clarify this, and it should also be explained in the new documentation wiki.

Join Kea Campus' Amiga Corner and support Amiga content creation
https://keasigmadelta.com/ - see more of my work
Re: ChangeScreenBuffer() - can't reliably close program without OS locking-up
Quote:

broadblues wrote:
I suspect it is more likely still referring to the original bitmap that was set up with the screen, which may be why it crashes later when you close it, having destroyed that bitmap.


I didn't notice this comment before I wrote my reply. Yes, I think that you've hit the nail on the head with this, and that's exactly what's happening.

I also think that the autodocs could use a little clarification. Like Chris, I figured out screen double-buffering by trial and error. The only reason why I didn't have as much trouble as ChrisH is because what I was trying to achieve was much simpler.

Hans

Join Kea Campus' Amiga Corner and support Amiga content creation
https://keasigmadelta.com/ - see more of my work