Export pages to XML
I'd like to know how to export part of our wiki (i.e. a page, its subpages, attachments, etc.) to XML.
I want to export a subsection of the wiki, not all of it. At the moment this encompasses about 30 pages, plus up to 50 graphics and other attachments.
I've seen how to export info using GET, but none of the options suits what I want to do.
Has anyone else done this, or does anyone know how to do it?
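If your wiki is MediaWiki, one route is the Special:Export page, which returns the requested pages as XML in a single response. Here's a rough sketch of building such a request; the base URL and page titles are hypothetical placeholders, so adjust them for your install:

```python
# Sketch, assuming the wiki is MediaWiki: build a request for the
# Special:Export page, which returns the selected pages as XML.
# The base URL and page titles below are hypothetical placeholders.

def build_export_request(base_url, titles, include_templates=True):
    """Return (url, form_data) for a POST to Special:Export."""
    form = {
        "pages": "\n".join(titles),  # one page title per line
        "curonly": "1",              # latest revision of each page only
    }
    if include_templates:
        form["templates"] = "1"      # also export templates the pages use
    return f"{base_url}/Special:Export", form

url, data = build_export_request(
    "https://wiki.example.com/index.php",
    ["MySection", "MySection/Subpage1", "MySection/Subpage2"],
)
# POST `data` to `url` (e.g. with urllib.request) and save the
# response body as pages.xml.
```

One caveat: as far as I know the XML export covers page text only, so the ~50 images and attachments would have to be downloaded separately.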
If you are asking to create an offline copy of selected content (given the request to include attachments and images), would something like the HTTrack website copier be one solution? Any other product of the same type would do; I chose that one because I'd used it before.
It should allow you to select what to export into a flat, folder-based web container, albeit in HTML format, which may be close to your desired solution.
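For what it's worth, here's a sketch of driving HTTrack from a script to mirror just one section; the URL and filter patterns are hypothetical placeholders, and it assumes httrack is on your PATH:

```python
# Sketch: mirror one wiki section with HTTrack. The URL and filter
# patterns ("+pattern" limits the crawl scope) are hypothetical.
import shutil
import subprocess

def httrack_cmd(section_url, out_dir, filters):
    # -O sets HTTrack's output directory; filters restrict which
    # URLs the mirror is allowed to follow.
    return ["httrack", section_url, "-O", out_dir, *filters]

cmd = httrack_cmd(
    "https://wiki.example.com/wiki/MySection",
    "./wiki-mirror",
    ["+wiki.example.com/wiki/MySection*", "+wiki.example.com/images/*"],
)
if shutil.which("httrack"):  # only run if HTTrack is installed
    subprocess.run(cmd, check=True)
```

The "+images" filter is the bit that should pull in the attachments alongside the pages.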
Steve, thanks, I'll investigate this and let you know. Cheers,