Anyone can make a backup of any site by retrieving the system/export.json from the site.
Pages from an export file can be restored one at a time by dropping the export file on empty space in the destination wiki's web page. See Backup, Keep Safe
The export.json file differs from the format in which pages are routinely saved. The wiki-server implementation prefers to keep pages in separate files within a pages/ directory.
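For illustration, an export is a single JSON object keyed by page slug, with one page object per key. The stand-in file below uses invented slugs and fields; a real export holds complete page objects. The slugs, listed with jq, are what become the file names under pages/:

```shell
# Build a tiny stand-in export.json (slugs and fields are invented
# for illustration; a real export holds full page objects).
cat > export.json <<'EOF'
{"welcome-visitors":{"title":"Welcome Visitors","story":[]},
 "about":{"title":"About","story":[]}}
EOF

# List the page slugs, one per line. jq's keys filter sorts them.
jq -r 'keys[]' export.json
# prints:
# about
# welcome-visitors
```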
An administrator with ssh access to a server can reconstruct the pages/ directory from an export file using the JSON utility 'jq'. stackoverflow
cat export.json |
jq -c -r 'keys[] as $slug|"\($slug)\n\(.[$slug])"' |
while read -r slug ; do
  read -r item
  printf "%s" "$item" | jq . > "pages/$slug"
done
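The pipeline can be checked end to end on a throwaway export. The file and slug below are invented for illustration; note that the pages/ directory must exist before the loop writes into it:

```shell
# The pages/ directory must exist before the loop writes into it.
mkdir -p pages

# A minimal stand-in export.json (invented content).
cat > export.json <<'EOF'
{"about":{"title":"About","story":[]}}
EOF

# jq emits slug/JSON line pairs; the while loop consumes them
# two at a time and pretty-prints each page into its own file.
cat export.json | jq -c -r 'keys[] as $slug|"\($slug)\n\(.[$slug])"' |
while read -r slug ; do
  read -r item
  printf "%s" "$item" | jq . > "pages/$slug"
done

jq -r .title pages/about   # prints: About
```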
Here is a slower (it rereads export.json once per page) but slightly simpler version of this script.
for slug in `cat export.json | jq -r 'keys[]'` ; do
  cat export.json | jq '.["'$slug'"]' > pages/$slug
done
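Splicing $slug into the jq program with shell quotes is fragile for slugs containing quote or bracket characters. jq's --arg option passes the slug in as a jq variable instead, avoiding the quoting gymnastics. A sketch of the same loop with that change (the export content here is invented):

```shell
# Stand-in export (invented content).
cat > export.json <<'EOF'
{"about":{"title":"About","story":[]}}
EOF
mkdir -p pages

for slug in $(jq -r 'keys[]' export.json) ; do
  # --arg binds $slug inside the jq program, so the slug is never
  # spliced into the program text by the shell.
  jq --arg slug "$slug" '.[$slug]' export.json > "pages/$slug"
done
```

Slugs containing whitespace would still be split by the for loop; the while-read pipeline above handles those better.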