Exporting Pages

Single wiki pages can be exported to different formats by adding an appropriate “do” parameter to the URL. The following export options are currently available:
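For example (the host and page ID below are placeholders), requesting the export_xhtml action used in the spider example further down returns the rendered page as XHTML:

http://example.com/doku.php?id=wiki:syntax&do=export_xhtml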

You can also specify the do parameter as an HTTP header called X-DOKUWIKI-DO. This may be useful for creating a static dump with a website spider.
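As a minimal sketch (untested; the host and page ID are placeholders), the same export can be requested with curl by sending the header instead of the query parameter:

curl -H "X-DOKUWIKI-DO: export_xhtml" "http://example.com/doku.php?id=wiki:syntax"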

For exporting to other formats, refer to the Exporting Discussion page.

Is there a way to export only the table of contents? For the newsflash on our homepage (www.brooklynroadrunners.org) this would suffice, referring to the full news story. The programming has already been done; all that is missing is a way to bring it up. – Geert

Export multiple pages to HTML

For exporting multiple pages or whole namespaces, have a look at the offline-doku script by Pavel Shevaev.

Unfortunately, offline-doku does not handle plugin content correctly. Does anyone know a solution for that?
Pavel's script requires PHP 4.3 or newer (it uses file_get_contents()). Those who don't want to upgrade can change line ~46
from
$tokens = $parser->parse(file_get_contents($file));
to
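// read the file with fopen()/fread(), since file_get_contents() needs PHP 4.3+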
$fp = fopen($file, "rb");
$buffer = fread($fp, filesize($file));
fclose($fp);
$tokens = $parser->parse($buffer);

Here's an example command line using Pavuk for exporting all pages:

pavuk -dont_leave_site -noRobots -index_name "index.html" -httpad "+X_DOKUWIKI_DO: export_xhtml" -cookie_file cookies.txt -cookie_send -skip_rpattern "(.*\?do=(diff|revisions|backlink|index|export_.*))|feed\.php.*" -tr_chr_chr "?&*:" _ -post_update -fnrules F "*" "%h/%d/%b%E" http://wiki.splitbrain.org

Simply change the URL at the end of the command. Also, this command handles ACL restrictions using a cookie file. Copy the “cookies.txt” file from your web browser's profile to allow Pavuk to log in using your credentials.
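If Pavuk is not available, a rough wget equivalent might look like the following (a sketch, not tested against this wiki; --reject-regex requires wget 1.14 or newer, and the export header is sent as described above):

wget --mirror --convert-links --page-requisites --header="X-DOKUWIKI-DO: export_xhtml" --load-cookies=cookies.txt --reject-regex='.*do=(diff|revisions|backlink|index|export_.*).*|.*feed\.php.*' http://wiki.splitbrain.org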

→ see also related page exporting_website