A dedicated repository of collections to practice with and use in MongoDB.
Name | Size | Data type | How to import |
---|---|---|---|
| 610 KB | zip → dump folder | mongorestore |
| 3.1 MB | JSON | mongoimport |
| 731 KB | zip → JSON files | mongoimport |
| 92 KB | JSON | mongoimport |
| 35 KB | JSON | mongoimport |
| 454 KB | JSON | mongoimport |
| 2.8 KB | JSON | mongoimport |
| 329 KB | JSON | mongoimport |
| 2.3 MB | JSON | mongoimport |
| 666 KB | JSON | mongoimport |
| 470 KB | JSON | mongoimport |
| 525 KB | JSON | mongoimport |
Name | Size | Data type | How to import |
---|---|---|---|
| 21 MB | zip → gzipped dump | mongorestore --gzip |
| 24 MB | JSON | mongoimport |
| 75 MB | JSON | mongoimport |
| 85 MB | zip → dump folder | mongorestore |
| 232 MB | JSON | mongoimport |
| 55 MB | RAR (misleadingly named .zip) → dump folder | mongorestore |
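As a sketch of the two import paths shown in the tables, the rule of thumb is: JSON files go through `mongoimport`, extracted dump folders go through `mongorestore`. The snippet below only prints the command it would run, so it is safe to try anywhere; `books.json`, `dump/`, and the `test`/`demo` database and collection names are placeholders, not files from this repository, and a local `mongod` on the default port is assumed when you actually run the printed commands.

```shell
# Print the import command matching a dataset's format.
# All names here are placeholders for illustration only.
import_dataset() {
  target="$1"
  case "$target" in
    *.json)
      # JSON files are loaded with mongoimport
      echo "mongoimport --db test --collection demo --file $target" ;;
    *)
      # extracted dump folders are loaded with mongorestore
      echo "mongorestore $target" ;;
  esac
}

import_dataset books.json   # prints the mongoimport command
import_dataset dump/        # prints the mongorestore command
```

For the gzipped dump in the table above, you would add `--gzip` to the `mongorestore` invocation.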
Use the provided `import.sh` script to insert the "small" and the "bigger" datasets. You can see the help and the options with `import.sh --help`.
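A typical invocation might look like the following; only `--help` and `--small` are documented here, and the guard keeps the snippet harmless if the script is not in the current directory:

```shell
# Run import.sh only if it exists here; --small imports just the
# smallest dataset (per the README), which is handy for live demos.
if [ -x ./import.sh ]; then
  ./import.sh --small
else
  echo "import.sh not found; clone the repository first"
fi
```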
Options:

- Docker support: starts a MongoDB instance in Docker for you automatically.
- `--small`: only inserts the smallest dataset, for a quick data import (handy for live demos).

Requirements:

- Docker, if you use the Docker option
- MongoDB (`mongoimport`, `mongorestore`)
- `unzip`
- `unrar` (for the Enron dataset)
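Before importing, it can help to check that the required tools listed above are on your `PATH`. This is a generic sketch, not part of the repository's script:

```shell
# Report which of the required tools are installed.
for tool in mongoimport mongorestore unzip unrar docker; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: missing"
  fi
done
```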