This tutorial targets film production with an accsyn BYOS workspace; it is not applicable to Cloud workspaces.
This tutorial describes how to set up API-driven sync of file assets between the main premises (hq) and a remote site - a common scenario in film post production for distributed organisations:
The main storage at the main premises (hq) holds the source production assets.
Each project has a fixed, templated folder structure, enabling automation.
Work is going to be performed on a selected subset (e.g. shots) at the remote office (site).
An operator (employee) should be able to push (download) the required file assets to the site from within their in-house production tool.
Each project should have its own queue, so transfers can be prioritised on a per-project basis.
The operator should get an email notification when all assets are synced.
Note: Made-up example data is provided in [brackets] throughout this tutorial.
An active BYOS workspace or an ongoing BYOS trial. Example workspace code/API domain for this tutorial: [acmepost].
At least one active storage and one active site server license.
An elevated admin accsyn account for setting up the site [admin@acmepost.com].
An employee account for the operator initiating the sync [coordinator@acmepost.com].
A valid API key for the operator; see the Python API documentation for instructions on how to create one.
Hint: find your workspace API code identifier @ https://accsyn.io/admin/settings > "General" tab.
We are not going into great detail on how to set up remote sites with accsyn; this is well covered in the site documentation:
Create the site @ https://accsyn.io/admin/sites/new that will represent the remote office in accsyn [london].
Install a server on the site @ https://accsyn.io/admin/servers/new, acting as the endpoint for all p2p ASC file transfers between the main hq site and the office.
Edit the accsyn storage volume so it is served at the site by the new server; provide a custom volume path in case the storage is not mounted at the same path at the site as it is at hq.
Test that you can transfer files to the site by logging in to the accsyn Desktop app and transferring a file from hq to the site.
You are now all set to start using the API to push files.
Note: You can skip this step if you do not need project queues - all sync jobs will then run in the default (Medium prio) queue.
Create a queue for the project @ https://accsyn.io/admin/queues/new [proj001]
Repeat this for all projects you wish to enable sync for.
Queues can also be created using the API, suitable for integrating accsyn into an internal project creation tool:
queue = session.create("job", {"type":2,"code":"proj001"})
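Expanding on that one-liner, here is a minimal sketch of a batch version for an internal project creation tool. The payload shape follows the one-liner above; the accsyn_api.Session usage shown in the comments is an assumption based on the accsyn Python API documentation - verify it there before relying on it:

```python
# Sketch: create one accsyn queue per project, re-using the
# {"type": 2, "code": ...} payload from the one-liner above.
def queue_payload(project_code):
    return {"type": 2, "code": project_code}

def create_project_queues(session, project_codes):
    # 'session' is expected to be an accsyn_api.Session (or compatible object)
    return [session.create("job", queue_payload(code)) for code in project_codes]

# Live usage (requires API credentials, see the environment variable setup
# later in this tutorial):
#   import accsyn_api
#   session = accsyn_api.Session()
#   create_project_queues(session, ["proj001", "proj002"])
```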
In this tutorial, we assume you have your project data stored on a share on a NAS, and that this share is mapped/served as the volume "projects" in accsyn. To verify your storage configuration, head over to https://accsyn.io/admin/volumes.
Further on, we assume projects are located directly within the root of that volume, and that the file assets to be synced for each shot are located in a scans folder like this:
projects/
    proj001/
        sc0010/
            sh0010/
                scans/
                published/
                ..
            sh0020/
                scans/
                published/
                ..
        ..
    proj002/
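Given this template, the relative scans paths to be synced can be enumerated locally. A short sketch (the glob depth matches the project/sequence/shot layout assumed above - adapt it to your own structure):

```python
import pathlib

def list_scan_folders(projects_root):
    """Yield shot 'scans' folders relative to the projects volume root,
    following the projects/<project>/<sequence>/<shot>/scans template."""
    root = pathlib.Path(projects_root)
    for scans in sorted(root.glob("*/*/*/scans")):
        yield scans.relative_to(root).as_posix()
```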
As an alternative, to provide more granular project permissions and enhanced security, you can set up a volume for each project. This way you can control which project(s) each operator can work with.
In this tutorial, we assume you have an in-house project management tool such as ftrack or Autodesk Flow, where you have projects and shots.
We are not going to cover how to implement this task, as it varies a lot depending on the tool, but here is a rough guideline:
Write a plugin within the tool that operators can run on one or more selected shots.
Have the plugin resolve the relative path(s) for each selected shot, and provide them as a list [proj001/sc0010/sh0010/scans] together with the site the user selects.
Have the plugin call the download sync script below, or - if it is a Python-based tool compatible with the accsyn Python API - implement the sync script below directly within your tool/plugin.
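As an illustration of the path-resolving step, a hypothetical helper could look like this. The function name and the folder template are assumptions carried over from earlier in this tutorial, not part of any tool's API:

```python
def resolve_scan_paths(project, shots):
    """Map (sequence, shot) pairs to relative 'scans' paths, e.g.
    ('sc0010', 'sh0010') under proj001 -> 'proj001/sc0010/sh0010/scans'."""
    return ["{}/{}/{}/scans".format(project, seq, shot) for seq, shot in shots]
```

The plugin would pass the resulting list, together with the site the user selected, on to the sync script.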
This is the script that does the heavy lifting. It is either implemented directly as a module within the tool above, by extracting the 'site_sync' function, or executed as a standalone CLI tool in the shell - suitable if the in-house tool is not written in Python or cannot use the accsyn Python API directly.
Prepare the local script environment so the API can authenticate with the accsyn backend. We recommend setting the ACCSYN_API_USER and ACCSYN_API_KEY environment variables locally on the machine, or storing the key in a hidden local file. Hard-coding the API key in your Python script and then pushing it to GitHub is a security hazard and is strongly discouraged.
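A small sketch of reading those environment variables before creating the session; the fail-early behaviour is a suggestion, not something the API requires:

```python
import os

def load_credentials():
    """Read accsyn API credentials from the environment, failing early if unset."""
    creds = {name: os.environ.get(name, "")
             for name in ("ACCSYN_API_USER", "ACCSYN_API_KEY")}
    missing = [name for name, value in creds.items() if not value]
    if missing:
        raise RuntimeError("Missing environment variable(s): " + ", ".join(missing))
    return creds
```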
Here is a working example CLI implementation of the sync script, which takes the list of shot assets as arguments and downloads the files to the site using the fast and secure ASC protocol (code is available on GitHub, see resources below):
Breakdown of the script:
Command line arguments are read and passed on to the site_sync function.
The accsyn API session object is created.
The share, queue and site entities are loaded from accsyn and verified.
The list of tasks is built, with paths mirrored on the destination end.
The script submits and re-uses one sync job per project and direction (download/upload) each day. This strikes a good balance between the number of jobs within a project queue and the number of tasks (files) within each job, making them easy to manage at a later stage.
Note: this is just a sample script - modify as you see fit within your production environment.
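The one-job-per-project, per-direction and per-day convention can be captured in a small naming helper. The job code format below is an assumption for illustration, not an accsyn requirement - a job sharing the same code within a day can then be looked up and have tasks appended to it instead of submitting a new job:

```python
import datetime

def sync_job_code(project, direction, day=None):
    """Build a daily job code, e.g. 'proj001-download-20240201', so the
    sync script can re-use one job per project and direction each day."""
    day = day or datetime.date.today()
    return "{}-{}-{}".format(project, direction, day.strftime("%Y%m%d"))
```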
View and download the source code on GitHub.
accsyn Python API support main page.
accsyn site management documentation.