About 8 GB bzip2-compressed for the English Wikipedia, if you download the dump that contains only the current revisions of article-space pages. It's much bigger if you want the one with every historical revision.
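One nice property of the bz2 format is that you can process the dump as a stream, without ever decompressing the whole thing to disk. A minimal sketch in Python (the filename follows the standard dump naming, e.g. enwiki-latest-pages-articles.xml.bz2, but is an assumption here):

```python
import bz2

def iter_dump_lines(path):
    """Yield decoded text lines from a .bz2 dump, reading incrementally.

    bz2.open decompresses on the fly, so memory use stays small even
    for a multi-gigabyte archive.
    """
    with bz2.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            yield line

def count_pages(path):
    """Rough page count: scan for <page> opening tags line by line."""
    return sum(1 for line in iter_dump_lines(path) if "<page>" in line)
```

For real parsing you'd feed the stream into an incremental XML parser instead of matching strings, but the line-by-line scan shows the streaming idea.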
Forking does not lead to spooning. This sounds like it will quickly become an impenetrable rat's nest. Better to fight it out on the one true source, Wikipedia.
The Internet is already made of "rat's nests" like these. DNS records, BGP route tables, cross-domain embedding: all of these pull together information from lots of inconsistent sources to render a page in your browser, and it works pretty well.
The problem Wikipedia is having at the moment is that some people have taken over sections of it as their own "turf". And there will always be a holy war between the completionists and the deletionists, etc. With a federated wiki, you get to pick what kind of info you want to see, and who gets to curate it.
Anyone but Apple.