"A download manager written using Reaction that accepts, on a port, requests for new downloads and is in charge of bringing them down from the network to a local file area (selected by the application, or from the default, by the user). It also manages bandwidth used in total up to a preference set limit, and can resume (if resume is supported). If not started when an application needs it, the application can start it but must check on startup to see if the message port is active."
It's just about downloading files for browsers like OWB or NetSurf, not about keeping software up to date.
As for the sequence:
OWB starts up, you click on a link to download with a download manager, it checks to see if the CDM is started (public message port active), and if not, calls for CDM to be started.
Quote: OWB starts up, you click on a link to download with a download manager, it checks to see if the CDM is started (public message port active), and if not, calls for CDM to be started.
Wouldn't it be better if the CDM did that itself? I.e. when a new instance is started, it checks for its own message port and passes any args on to an existing copy. (That's how AWeb handles multiple startups anyway.)
broadblues wrote: Wouldn't it be better if the CDM did that itself? I.e. when a new instance is started, it checks for its own message port and passes any args on to an existing copy. (That's how AWeb handles multiple startups anyway.)
It would of course save "all" (both) of the calling apps from implementing that logic, but at the cost of having to load the full CDM executable from disk every time you click on some download link, just to make it send the download details to the already running instance and die.
It wouldn't be my choice, but I don't know if DaveP has already thought of something neat ... e.g. some sort of lightweight watchdog daemon could handle all the calls from the apps and only start the full executable the first time (and maybe shut it down when idle).
Quote: It would of course save "all" (both) of the calling apps from implementing that logic, but at the cost of having to load the full CDM executable from disk every time you click on some download link, just to make it send the download details to the already running instance and die.
Hmm, we're talking about a small utility here though, not a major app. So simplicity of API would outweigh the startup load issue?
"Small utility" is in no way derogatory; I'm imagining a lean, efficient implementation. If it followed the AWeb model I mention above, the check for multiple instances would occur before any external libraries were reopened.
Quote: It wouldn't be my choice, but I don't know if DaveP has already thought of something neat ... e.g. some sort of lightweight watchdog daemon could handle all the calls from the apps and only start the full executable the first time (and maybe shut it down when idle).
Well, the idea would be that you call something like DMConnect() and under the covers this checks for the message port; if it is not active, it starts the application up (which itself will check, in case there is a race condition). This reduces the overhead: CDM is only active when it is needed, and you are right, the main daemon should shut down when the reference count is zero, which means a reference count must be maintained.
Quote:
Hmm, we're talking about a small utility here though, not a major app. So simplicity of API would outweigh the startup load issue?
"Small utility" is in no way derogatory; I'm imagining a lean, efficient implementation. If it followed the AWeb model I mention above, the check for multiple instances would occur before any external libraries were reopened.
I intend for it to be the smallest, most lightweight tool possible. In fact, the actual daemon and downloaders will be the first things available, so that people can just use it and develop against it. The user interface could just be a separate module. Ideally we'd have a common "message pop-up" notification system in OS4 that could act as the reporting tool when the UI was not active.
Quote:
Oh. Well.. that sort of tramples all over my plans for the next version of pftp. Is it worth me continuing now?
Why not collaborate? If you are already doing this tool, then I will quite happily move on to something else.
Well, my plan for the next version of PFTP was to implement it as an FTP/SFTP/HTTP download manager that also had an FTP/SFTP client bolted on. It was going to be a more-or-less total rewrite.
Unfortunately, I've just not had the time to sit and start work on it yet... so I would feel bad if I put you off doing your project, and then for some reason never completed my plans.
I also have a bunch of other things I want to do (BlackWidow rewrite, a game, work on SDL, and other things).
So really, I guess I dunno what to do.
Anonymous
Re: Assigned project: Graphical Common Download Manager
Well, the source is completely open, so you could use whatever is contributed to get a head start on your own utility?
I was intending to create a different subtask implementation for each protocol, which just happens to know how to download using that given protocol. So the whole thing is modular.
We could even just bounce ideas off each other, and whoever gets furthest fastest has the other one help them? I really don't mind at all.
Later on I want to get to harder things, like porting a JVM onto AmigaOS 4.1 (maybe completing one of the ones out there) but I'm going to need to take my time building up my confidence again with Amiga programming with small stuff. It has been a long time...
Tell you what, if you don't mind the download manager having a full FTP/SFTP client attached, we can work together.
If you want, I can create an account for you on my private SVN, or if you prefer something public, I recommend Google Code; but as long as it's SVN-based, I'm happy.
Feel free to get started; I've decided to work on my next game for a while.
For the download manager to work properly with secure sites, it should look like it's part of the program that initiates the download. Otherwise you get the problem that OWB currently has: see here.
This is important when deciding what API to use, because simply being given a URL to download is no use when that URL redirects unauthenticated users to a log in page.
broadblues wrote: @nbache Hmm, we're talking about a small utility here though, not a major app. So simplicity of API would outweigh the startup load issue?
Yes, I agree that it might be on the brink of not being worth the design effort in this case, but still ... I'd personally try to avoid all that disk thrashing. Anyway, it seems DaveP already has a neater design in mind, so this point is moot.
Quote:
BTW, which two apps constitute "both"?
Erm ... hehe ... I may have misread the "browsers like OWB or NetSurf" above.
Quote: For the download manager to work properly with secure sites, it should look like it's part of the program that initiates the download. Otherwise you get the problem that OWB currently has: see here.
This is important when deciding what API to use, because simply being given a URL to download is no use when that URL redirects unauthenticated users to a log in page.
Agreed. It may be better if CDM doesn't do any downloading itself, but instead only acts as a manager for the downloads.
The API would then simply be something like:
id = CDMDownloadAdd("http://blahbahblah/download.zip");
CDMUpdate(id, CDM_BytesCompleted, 25, TAG_DONE);
CDMCompleted(id);
This would need two-way communication back to the calling application for aborting downloads. Either a poll:
event = CDMGetEvents();
Or (better) a hook function which gets called when something interesting happens:
CDMRegisterHook(struct Hook *);
Chris wrote: Agreed. It may be better if CDM doesn't do any downloading itself, but instead only acts as a manager for the downloads.
I'm not sure if that's a good idea. One of the reasons that I use a download manager for Firefox is that I sometimes (too often) get downloads that get cut off, but Firefox marks them as having completed successfully. When that happens, I have to restart the download from scratch, even if over 2 GiB were downloaded successfully. If the download manager didn't do its own downloading, then it would probably have the same issues.
I think that it would make sense if the download manager took care of the download too. However, to the remote server it should look like the browser is doing the downloading. The remote server shouldn't see a third-party tool such as wget, but the browser. If the server asks for session cookies, they should be available; and if the server sends a redirect to the log in page, this should be sent back to the browser to handle.
Looking at the server logs of my website, OWB seems to use wget for file downloads, which won't work with files that require authentication.