What I'm dreaming of is a tool that could load a set of .h files, let you select some functions from them, and then generate a corresponding XML file for idltool. This would greatly ease the conversion of Unix libraries to Amiga shared libraries.
@broadblues The PDF says "The file exec_sg/src/exec/testlib.c implements an example OS 4 library which exports several interfaces." HOWEVER, I can't find the "testlib.c" file. Can you help?
You can generate skeleton projects using the files from SDK:includes/interfaces/*.xml: pass one of them to idltool (with the -a option IIRC, but I can't check right now) and it will give you a set of files ready to be compiled into the library (even the makefile is there, IIRC).
Of course this only generates the skeleton, not the real library. The advantage of IDLTool is that you only have to implement the bodies of the library functions; everything else is already done.
See also http://www.libcpu.org/ for translating assembly to LLVM Bitcode. The AROS team has already got a bounty going for an LLVM backend to produce machine language for AROS x86 and other backends are sure to follow. Perhaps somebody would be willing to produce an LLVM backend for OS 4 on the PPC based on the PPC Linux backend on the LLVM archives.
As far as being universal, we'll need some universal runtime libraries as well. This brings us back to the subject.
Would it, by the way, be possible for IDLTool to generate extern directives for each function so that an automated build script could link code to the library skeleton instead of creating an empty one?
Samurai_Crow wrote: See also http://www.libcpu.org/ for translating assembly to LLVM Bitcode. The AROS team has already got a bounty going for an LLVM backend to produce machine language for AROS x86 and other backends are sure to follow. Perhaps somebody would be willing to produce an LLVM backend for OS 4 on the PPC based on the PPC Linux backend on the LLVM archives.
It would definitely be useful to have an LLVM backend for AmigaOS 4.x.
Quote:
Would it, by the way, be possible for IDLTool to generate extern directives for each function so that an automated build script could link code to the library skeleton instead of creating an empty one?
IIRC, this is already possible. You can ask it not to create the empty function stubs. This is kind of essential, since you don't want it to wipe your existing functions when you use IDLTool to add extra functions or another interface.
I have hardly ever used IDLTool myself (not in the habit of creating new libraries), so you'll have to look up the details in the documentation provided in the SDK.
@Deniil The main reasons I can see for using LLVM:
1. To use its C compiler as a replacement for GCC, if it produces better executables. (In general this is not the case yet, but it may be soon.)
2. If you want to write a compiled language that will easily work on more than one CPU, without using C as an intermediate. (Quite a bit of work, but some people enjoy that kind of thing.)
3. If you want to write a JIT that is not limited to one CPU.
4. If you want a "compile once, run anywhere" platform. This requires having a (functionally identical) run-time for every OS that you want to support, which is no small feat.
5. If you don't like GPL 3 code dependencies, you can switch to LLVM's UIUC Public License. (Not a big deal for most of us but maybe a big deal to Hyperion.)
6. If you're developing on a Classic Amiga and nobody will apply your patches to the mainline to make the 68k backend of GCC better, you can switch to LLVM where patches are welcome.
Hmm, I've just been playing around with creating a .library from a .a using Joerg's old example code.
It took me about an hour for a small library (including the odd schoolboy error, and locating changes that weren't explicitly mentioned in the docs). Actually, I haven't tested it, so I don't know for sure that it worked.
A lot of the process ought to be automatable. Something like idltool could generate most of the files (most of them are modified versions of files created by idltool anyway). You could skip generating the cruddy XML file and instead rip function definitions straight out of the C headers, which could then be used directly in the library and in libauto-style stub code.
Well, OK, ripping the function defs out is the key part, and the one bit I'm not entirely sure how to do.
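Ripping simple prototypes can be sketched with a regex. This is a hypothetical minimal version: it only handles single-line prototypes, assumes a space before the function name, and ignores function pointers, macros and multi-line declarations, so a real tool would need a proper C parser:

```python
import re

# Matches simple one-line prototypes like "int Foo(char *name, long mode);".
# Limitations: no function pointers, no comments, no multi-line declarations,
# and a space is assumed between the return type and the function name.
PROTO = re.compile(
    r'^\s*([A-Za-z_][\w\s\*]*?)\s+'   # return type
    r'([A-Za-z_]\w*)\s*'              # function name
    r'\(([^)]*)\)\s*;',               # argument list
    re.MULTILINE)

def rip_prototypes(header_text):
    """Return a list of (return_type, name, [args]) tuples."""
    protos = []
    for rtype, name, args in PROTO.findall(header_text):
        arglist = [a.strip() for a in args.split(',')] if args.strip() else []
        protos.append((rtype.strip(), name, arglist))
    return protos
```

Something along these lines would already cover the plain, well-behaved headers that most library ports start from.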
Whilst working on that problem, a manual function table in a sane format (i.e. not XML; I'm thinking more along the lines of .FD) could easily be used to create all the files. A bunch of template code with the functions injected is something that could be written in ARexx.
A manual function table, I suppose, has the benefit of ensuring that new functions are actually added at the end rather than somewhere in the middle, although I'm not sure whether order matters for OS4-style interfaces (it obviously does for 68k libraries).
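A minimal parser for such a table could look like the sketch below. The directive and comment conventions (##..., *...) follow the classic .FD files, but this is illustrative only, not a full .FD reader; in particular it ignores the register specifications:

```python
def parse_fd(text):
    """Return an ordered list of (name, [args]) from a minimal .FD-style table.

    Directives (##...) and comments (*...) are skipped. The ordering of the
    result follows the file, which is exactly the property a 68k-style jump
    table needs: position implies the function's offset.
    """
    funcs = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('*') or line.startswith('##'):
            continue
        name, _, rest = line.partition('(')
        args = rest.split(')')[0]          # first (...) group; registers ignored
        funcs.append((name.strip(), [a for a in args.split(',') if a]))
    return funcs
```

The ordered list this returns could then be fed into whatever template-expansion step (ARexx or otherwise) generates the actual source files.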
As you said, doing this for a simple library is not a big deal, but try it with a bigger library and you're in for a tedious job. Converting a C definition to the XML format is a real pain and error prone. Being able to automate the writing of the XML file would be a great advance. I don't understand why you want to invent yet another format: we already have this IDL, so why change again? All it needs is a tool on top of it to help porters. And as for your criticism of XML not being a "sane format": come on, it's not 1990 anymore, it's 2010, and despite not being an all-XML fanatic I really think it's rightly used here. It's just that the format forbids a straight copy/paste from the C includes, but if you want to be language independent you don't have a choice.
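The C-prototype-to-XML step the thread keeps coming back to could be sketched like this. Note that the element and attribute names below are placeholders only; the real schema should be copied from the files in SDK:includes/interfaces, since I haven't checked idltool's actual DTD, and the argument splitting is deliberately simplistic (a "type name" pair, optionally with a leading *):

```python
def proto_to_xml(rtype, name, args):
    """Emit an illustrative XML fragment for one function prototype.

    The <method>/<arg> element names are NOT the real idltool schema;
    they only show the shape of the conversion a porting tool would do.
    """
    parts = ['<method name="%s" result="%s">' % (name, rtype)]
    for a in args:
        atype, _, aname = a.strip().rpartition(' ')   # split "char *name"
        if aname.startswith('*'):                     # move pointer to the type
            atype, aname = atype + ' *', aname.lstrip('*')
        parts.append('  <arg name="%s" type="%s"/>' % (aname, atype))
    parts.append('</method>')
    return '\n'.join(parts)
```

Chained after a prototype ripper, this is essentially the whole "C header to IDL" pipeline being discussed, minus the schema details.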