A simple Graviton client that helps with importing and exporting data.
Install:
composer require graviton/import-export
Get help:
./vendor/bin/graviton-import-export help
This will show you all available commands.
Load from dir:
./vendor/bin/graviton-import-export graviton:import http://localhost:8000 ./test/fixtures
Besides loading data via the HTTP interface, there are core commands available that allow you to load data into the database backend (MongoDB) of a Graviton instance.
You can import a set of existing files via the graviton:core:import command:
./vendor/bin/graviton-import-export graviton:core:import ./test/data
The core commands' file format differs slightly from the normal import format, as certain class types need to be preserved. Thus, it's best to insert data into MongoDB first and then export it into the necessary format. This can be done using the export command:
./vendor/bin/graviton-import-export graviton:core:export ./test/dump-dir
This will dump all the data in the default database. The graviton:core:export command has more options; refer to the --help output for details.
Additionally, there is a purge command that allows you to easily purge (meaning delete!) all collections inside a MongoDB database. You need to pass 'yes' as the only parameter to confirm that you're sure about the action.
./vendor/bin/graviton-import-export graviton:core:purge yes
The files to be loaded must contain YAML with additional YAML frontmatter (yo dawg...).
The frontmatter defines the target path a file is to be loaded to.
---
target: /core/app/test
---
{ "id": "test", "name": { "en": "Test" }, "showInMenu": true, "order": 100 }
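The way such a fixture is split can be sketched in Python (a simplified, hypothetical parser for illustration only; the real client is a PHP tool and handles full YAML):

```python
import json

# The example fixture from above: YAML frontmatter carrying the
# target path, followed by a JSON payload.
FIXTURE = """---
target: /core/app/test
---
{ "id": "test", "name": { "en": "Test" }, "showInMenu": true, "order": 100 }
"""

def parse_fixture(text):
    """Split a fixture into its frontmatter target and its JSON payload."""
    # The frontmatter is delimited by two '---' lines at the top of the file.
    _, frontmatter, body = text.split("---", 2)
    target = None
    for line in frontmatter.strip().splitlines():
        key, _, value = line.partition(":")
        if key.strip() == "target":
            target = value.strip()
    return target, json.loads(body)

target, payload = parse_fixture(FIXTURE)
print(target)         # /core/app/test
print(payload["id"])  # test
```

Conceptually, the importer then loads each payload to its target path on the given host.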
Run with Docker:
docker build -t graviton/import-export .
docker run --rm -ti -v `pwd`:/data graviton/import-export
Run phar build:
composer build
Deploy:
cf push <name-of-host>
Or use deploy-scripts to deploy in an automated blue/green fashion.
TODO:
- implement importer
- implement exporter
- build phar
- deploy phar
- automate phar deployment
- document phar usage