That's probably the clearest explanation of what needs to be done. Hopefully AMD will release the Radeon 500+ GPU specifications as promised over the next 6 months or so. Their AtomBIOS looks like it will allow 2D support for pretty much every new graphics card they produce from now on. The 3D drivers, on the other hand, will probably be more work.
Out of curiosity, could part of the MESA GPU driver API be used? Obviously the underlying hardware APIs are different, but the MESA->driver API should be usable as-is. Or is there something that they do that's tailored to the Linux way of doing things?
Out of curiosity, could part of the MESA GPU driver API be used? Obviously the underlying hardware APIs are different, but the MESA->driver API should be usable as-is. Or is there something that they do that's tailored to the Linux way of doing things?
Even OSMesa uses the MESA->driver API. However, implementing a hardware driver means re-implementing a good part of the Mesa internals, since it overrides the default implementations of the Mesa pipeline. For example, you can use the software T&L module (which is what OSMesa does), but you would need to re-implement it for your specific card if you want to write a driver with hardware T&L support.
Just have a look at the sheer amount of code in the drivers subdirectory of the Mesa tree, and even there you only see part of it; the rest is in the DRM, a Linux kernel module that comes in a generic part as well as a hardware-specific part.
Just for giggles I ran "wc" on it: the common part plus the Radeon part in the Mesa tree is about 30k lines of code. I think people tend to underestimate the complexity of this.
Seriously, if you do want to contact me write me a mail. You're more likely to get a reply then.
That doesn't work either. See my post. It's not only the drivers. For the same reason you cannot use ATI's Windows driver on Linux, and vice versa.
Yes, but with the drivers available you can at least look at all the chipset initialisation that currently has to be reverse-engineered. On the Linux platform, for example, the situation has changed now that there are official ATI/NVIDIA drivers; in the past, all attempts to write accelerated OpenGL drivers (like in Mesa) produced only SLOW implementations. We also need the chipset documentation, and as you know very well, we don't have it. Maybe when, and if, ATI releases its open-source driver (it is planned, but when?).
Yes I've looked at the driver code, both in MESA & in DRI, and it's sizeable. There's also the Gallium3D driver system which is supposed to be cross-platform. I have no idea how advanced that is, but it looks promising. In particular they claim that it will make drivers smaller and simpler; that sounds like a good thing.
Hans
EDIT: Actually, Gallium3D looks set to become the official HW driver subsystem of MESA. I've had a look at its architecture, and it looks like you'll only have to write the OS abstraction layer once, plus a GPU-specific DRM module; the core driver itself is platform-independent. Having a common OS abstraction module should reduce the coding burden somewhat.
As I tried to point out, it's not about the hardware itself. I do have documentation on the R100 and R200 and could write a Mesa driver myself given enough time, including hardware T&L, but that is not the point.
@Hans
Yes Gallium3D looks quite good, but they are still quite a bit away from a working system AFAIK. It's something to keep a very close eye on, though.
So, to "summarize", no actual work has been done on the new graphics system with a proper 3D implementation? I mean, except for specifications? I had hoped this enhancement would be somewhat more advanced in the current OS4 betas, as it was first mentioned a long time ago (last year?). There surely have been other priorities in addition to the classic version.
So, to "summarize", no actual work has been done on the new graphics system with a proper 3D implementation? I mean, except for specifications?
I didn't say that. As a matter of fact, I have done some work already. It's just not going as fast as I hoped because of other tasks.
Quote:
I had hoped this enhancement would be somewhat more advanced in the current OS4 betas, as it was first mentioned a long time ago (last year?). There surely have been other priorities in addition to the classic version.
We are not talking about "enhancements" but about total replacement. What use is it to build a concrete skyscraper on top of a wooden shack? OpenGL rather belongs at the foundation of a graphics system, not on top of it, so that the graphics system can make use of OpenGL, not the other way around.
So, to "summarize", no actual work has been done on the new graphics system with a proper 3D implementation? I mean, except for specifications?
I didn't say that.
That's how I understood it. Sorry if I was wrong. And nice to hear that something has been done already! Quote:
As a matter of fact, I have done some work already. It's just not going as fast as I hoped because of other tasks.
Quote:
I had hoped this enhancement would be somewhat more advanced in the current OS4 betas, as it was first mentioned a long time ago (last year?). There surely have been other priorities in addition to the classic version.
We are not talking about "enhancements" but about total replacement.
That was not very well worded on my side. By "enhancement" I meant an enhancement to the OS in general (a completely new graphics system), not "only" the 3D part. Sorry. Quote:
What use is it to build a concrete skyscraper on top of a wooden shack? OpenGL rather belongs at the foundation of a graphics system, not on top of it, so that the graphics system can make use of OpenGL, not the other way around.
You're pretty much right about OpenGL and MiniGL. MiniGL is simply a cut-down implementation of OpenGL: it supports some of the OpenGL API, but not all of it. If a program written for OpenGL only uses features that MiniGL has, it can run via MiniGL without problems. We're going to have to live with MiniGL for now.
We'll try! I ran the MiniGL demo Stars of Nukleus and the graphics look really good, but the sound is kind of crappy. Is the sound part of the demo's own code and nothing to do with MiniGL?
Quote: ...and nice to hear that something is done already!
He didn't say that either. You just can't seem to keep out of trouble with that whole jumping to conclusions problem. [just kidding]
Actually he did, see post #29
Quote:
@MichaelMerkel Quote:
Quote: So, to "summarize", no actual work has been done on the new graphics system with a proper 3D implementation? I mean, except for specifications?
I didn't say that. As a matter of fact, I have done some work already. It's just not going as fast as I hoped because of other tasks.
You're pretty much right about OpenGL and MiniGL. MiniGL is simply a cut-down implementation of OpenGL: it supports some of the OpenGL API, but not all of it. If a program written for OpenGL only uses features that MiniGL has, it can run via MiniGL without problems. We're going to have to live with MiniGL for now.
We'll try! I ran the MiniGL demo Stars of Nukleus and the graphics look really good, but the sound is kind of crappy. Is the sound part of the demo's own code and nothing to do with MiniGL?
That particular demo sounds fine most of the time on my machine.
Sound has nothing to do with MiniGL. However, Warp3D's driver system has to pull some tricks, including hardware locking. The hardware locking can cause trouble with other parts of the system, for example, by preventing the audio thread from passing data to the sound card fast enough, resulting in audio stuttering. I managed to reduce it somewhat, but I can't eliminate it. Actually, if a programmer uses the automatic locking setting in MiniGL rather than the smart-lock, no audio stuttering should occur, but the 3D graphics performance will drop like a rock (it locks and unlocks the 3D hardware for every primitive that it draws).
My one piece of advice is to have Interrupt=yes set in your monitor's tooltypes, as it reduces this issue. Unfortunately, some people's graphics cards don't work in that mode.
Unless someone has some fancy little trick that I haven't thought of, this problem is not going to disappear until Rogue and the OS4 dev team have the new graphics system done.