Scrape any online MediaWiki-powered wiki (like Wikipedia) to your local filesystem
MWoffliner is a tool for making a local offline HTML snapshot of any online MediaWiki instance. It goes through all online articles (or a selection, if specified) and creates the corresponding ZIM file. It has mainly been tested against Wikimedia projects like Wikipedia and Wiktionary, but it should also work with any recent MediaWiki.
Read CONTRIBUTING.md to learn more about MWoffliner development.

User help is available in the form of a FAQ.
Run `mwoffliner --help` to get the full list of possible options (a minimal invocation sketch appears after the installation steps below).
MWoffliner needs a few build packages (`libjpeg-dev`, `libglu1`, `autoconf`, `automake`, `gcc` on…) and an online MediaWiki with its API available; a package-install sketch follows below.
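How these packages get installed depends on your platform. A minimal sketch, assuming a Debian/Ubuntu system with `apt-get` (package names may differ on other distributions), and assuming NodeJS/npm are already present for the steps below:

```bash
# Assumption: Debian/Ubuntu. Installs the build packages listed above.
sudo apt-get update
sudo apt-get install -y libjpeg-dev libglu1 autoconf automake gcc
```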
To install the latest released MWoffliner version from the NPM repo (use `-g` to install globally, not only in the current folder):

```bash
npm i -g mwoffliner
```
> [!WARNING]
> Note that you might need to run this command with `sudo`, depending on how your npm / OS is configured. npm permission checking can be a bit annoying for a newcomer. Please read the documentation carefully if you hit problems: https://docs.npmjs.com/cli/v7/using-npm/scripts#user
Then you can run the scraper:

```bash
mwoffliner --help
```
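For a concrete starting point, here is a minimal sketch of an actual scrape. The two options mirror the `mwUrl` and `adminEmail` parameters of the library example further down; the wiki URL and contact email are placeholders to replace with your own:

```bash
# Minimal scrape sketch: point MWoffliner at a MediaWiki instance and
# provide a contact email so the wiki's sysadmins can reach you.
mwoffliner --mwUrl="https://es.wikipedia.org" --adminEmail="user@example.com"
```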
To use MWoffliner with an S3 cache, you should provide an S3 URL like this:

```bash
--optimisationCacheUrl="https://wasabisys.com/?bucketName=my-bucket&keyId=my-key-id&secretAccessKey=my-sac"
```
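The flag is simply appended to a regular invocation. A sketch combining it with the minimal run above (bucket name, key id and secret are placeholders):

```bash
# Assumption: a Wasabi-style S3 endpoint; substitute your own bucket and credentials.
mwoffliner --mwUrl="https://es.wikipedia.org" --adminEmail="user@example.com" \
  --optimisationCacheUrl="https://wasabisys.com/?bucketName=my-bucket&keyId=my-key-id&secretAccessKey=my-sac"
```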
If you’ve retrieved the mwoffliner source code (e.g. with a git clone of our repo), you can then install and run it locally (including with your local modifications):

```bash
npm i
npm run mwoffliner -- --help
```
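Arguments after the `--` separator are passed straight through to the scraper, so any option from `--help` works the same way as with the globally installed binary. For example, reusing the minimal run from above:

```bash
# Run the local checkout against a wiki; everything after -- goes to mwoffliner itself.
npm run mwoffliner -- --mwUrl="https://es.wikipedia.org" --adminEmail="user@example.com"
```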
Detailed contribution documentation and guidelines are available.
MWoffliner also provides an API and can therefore be used as a NodeJS library. Here is a stub example that could go in your `index.mjs` file:
```javascript
import * as mwoffliner from 'mwoffliner';

const parameters = {
  mwUrl: "https://es.wikipedia.org",
  adminEmail: "[email protected]",
  verbose: true,
  format: "nopic",
  articleList: "./articleList"
};
mwoffliner.execute(parameters); // returns a Promise
```
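Since `execute()` returns a Promise, remember to handle rejections (e.g. with `.catch()` or `await` inside a try/catch). Assuming the stub is saved as `index.mjs` and `mwoffliner` has been installed in the project (`npm i mwoffliner`), it can be run with:

```bash
node ./index.mjs
```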
Complementary information about MWoffliner: