Storage Management
==================

.. note::

   Please consult the detailed usage in the help of each command
   (use the ``-h`` or ``--help`` argument to display the manual).

Backend.AI abstracts shared network storages into per-user slices called
**"virtual folders"** (aka **"vfolders"**), which can be shared between
users and user group members.

Creating vfolders and managing them
-----------------------------------

The command-line interface provides a set of subcommands under
``backend.ai vfolder`` to manage vfolders and the files inside them.

To list the vfolders you can access, including your own and those shared
by other users:

.. code-block:: console

   $ backend.ai vfolder list

To create a virtual folder named "mydata1":

.. code-block:: console

   $ backend.ai vfolder create mydata1 mynas

The second argument ``mynas`` is the name of a storage host. To list the
storage hosts that you are allowed to use:

.. code-block:: console

   $ backend.ai vfolder list-hosts

To delete the vfolder completely:

.. code-block:: console

   $ backend.ai vfolder delete mydata1

File transfers and management
-----------------------------

To upload a file from the current working directory into the vfolder:

.. code-block:: console

   $ backend.ai vfolder upload mydata1 ./bigdata.csv

To download a file from the vfolder into the current working directory:

.. code-block:: console

   $ backend.ai vfolder download mydata1 ./bigresult.txt

To list the files in a specific path inside the vfolder:

.. code-block:: console

   $ backend.ai vfolder ls mydata1 .

To delete files in the vfolder:

.. code-block:: console

   $ backend.ai vfolder rm mydata1 ./bigdata.csv

.. warning::

   All file uploads and downloads overwrite existing files, and all file
   operations are irreversible.

Running sessions with storages
------------------------------

The following command spawns a Python session in which the virtual
folder "mydata1" is mounted. (The execution options are omitted in this
example.)
Then, it downloads the ``./bigresult.txt`` file (generated by your code)
from the "mydata1" virtual folder.

.. code-block:: console

   $ backend.ai vfolder upload mydata1 ./bigdata.csv
   $ backend.ai run --rm -m mydata1 python:3.6-ubuntu18.04 ...
   $ backend.ai vfolder download mydata1 ./bigresult.txt

In your code, you may access the virtual folder via
``/home/work/mydata1`` (the default current working directory is
``/home/work``) just like a normal directory. If you want to mount a
vfolder at another path, prefix the vfolder path with ``/``. By reusing
the same vfolder in subsequent sessions, you can keep intermediate data
in the storage instead of downloading the results and uploading them
again as inputs for the next session.

Creating default files for kernels
----------------------------------

Backend.AI has a feature called a "dotfile", which is created in every
kernel the user spawns. As the name suggests, a dotfile's path should
start with ``.``. The following command creates a dotfile named
``.aws/config`` with permission ``755``. This file will be created under
``/home/work`` every time the user spawns a Backend.AI kernel.

.. code-block:: console

   $ backend.ai dotfile create .aws/config < ~/.aws/config
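As a complement to the session workflow above, here is a minimal sketch
of code that could run *inside* the session, reading the uploaded
``bigdata.csv`` from the mounted vfolder and writing ``bigresult.txt``
back into it so that it persists in the storage. The file names and the
``summarize`` helper are hypothetical illustrations, not part of the
Backend.AI API; the only assumption taken from this section is that the
vfolder "mydata1" appears at ``/home/work/mydata1``.

.. code-block:: python

   import csv
   from pathlib import Path

   def summarize(vfolder: Path, src: str = "bigdata.csv",
                 dst: str = "bigresult.txt") -> Path:
       """Count the rows of ``src`` in the mounted vfolder and write
       the count to ``dst``, so the result survives the session."""
       with open(vfolder / src, newline="") as f:
           row_count = sum(1 for _ in csv.reader(f))
       result = vfolder / dst
       result.write_text(f"rows: {row_count}\n")
       return result

   if __name__ == "__main__":
       # Inside a session the default working directory is /home/work,
       # so the mounted vfolder is reachable as /home/work/mydata1.
       summarize(Path("/home/work/mydata1"))

After the session exits, the generated file can be fetched with
``backend.ai vfolder download mydata1 ./bigresult.txt`` as shown above.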