Note: This tutorial is for on-prem BYOS deployments of accsyn. Please reach out to support if you need corresponding functionality with your cloud-hosted workspace.
This tutorial goes through how to set up accsyn for automated delivery of work material to a remote vendor, and ingest of the produced asset(s) back into a media production pipeline.
Collecting and sending a work file package manually is time-consuming and error-prone.
The receiving end has to manually receive and unpack the material, which can be a challenge when the parties are in different timezones.
When the work has been done, the vendor has no way of telling whether their result aligns with your internal asset specifications.
The uploaded assets must be manually unpacked, transferred onto the production storage, validated and then ingested into the production pipeline (e.g. a Flow / ftrack publish).
In short, providing an automated outsourcing pipeline saves a vast amount of time and leaves your staff free to do creative work instead of file wrangling - invaluable under tight deadlines and when vendors are in different timezones.
accsyn can easily be set up to provide a turnkey solution, facilitating a fully automated and streamlined outsourcing pipeline.
Python API support facilitates programmable file transfers in workflows.
Affordable, with no hidden costs or limitations - batteries included.
Transfers go at maximum speed using ASC - the accsyn accelerated file transfer protocol.
Full encryption during transfer (AES128/256), and no passwords need to be set and sent to users.
No server listening 24/7 - accsyn only starts a temporary, firewalled server briefly during client connection establishment.
No need to share/expose any on-prem data; have your scripts carefully collect and send exactly what the remote vendor needs to perform their work.
Provides central monitoring, with advanced queue/bandwidth management, that is mobile phone/tablet friendly.
Never-give-up transfers - they resume where they left off when the connection is re-established, with no need for manual intervention.
This tutorial will cover:
Starting your trial accsyn workspace.
Setting up a server on-prem (BYOS).
Setting up a user server and mapping the storage volume at the vendor.
The concepts of API-driven file transfers.
Building an outsourcing script that collects and delivers a work package to the vendor.
Configuring a publishing pipeline that validates and ingests the finished assets produced by the vendor.
NOTE: Made-up example data is provided in bold [brackets] throughout this tutorial.
The schematics visualise two different scenarios:
Standard setup - expose production storage; In this setup you install accsyn and expose your main production area, with no need to further sync assets to and from this server.
TPN-hardened setup; For scenarios where production servers cannot be exposed to the Internet through a port forward, the storage server and the accsyn "HQ" (main) server are installed in a DMZ, locally or in the cloud, with additional asset pre/post sync to an internal site server on the hardened internal production network.
In this tutorial, we assume the following:
The company and workspace name is "Acme VFX" (API code: acmevfx, API workspace endpoint: https://acmevfx.accsyn.com).
A Linux storage server configured with an accsyn volume named "proj", with the project "Something Completely Different" located at /mnt/proj/scd.
An active workspace admin user (or employee user with access to the proj volume): "pipeline@acmevfx.co.uk". Script API sessions will run as this user.
The project SCD has a shot with a cleanup task, with conformed source plates in the folder "scd_0010_0010_src" located at /mnt/proj/scd/0010/0010/src/plates.
ftrack is used as the project management tool, holding all information about colourspace, delivery format, handles and so on. This could be any database, such as Excel sheets, Autodesk Flow or even text/markup files on disk.
The remote vendor "Compers Inc" has a Windows file server where work files should be downloaded and worked on, on the share "clients" on "server", in the subfolder AcmeVFX.
At the remote vendor, the coordinator user email is "emma@compersinc.com".
The asset produced by the vendor should be transferred back to /mnt/proj/_VENDORS/CompersInc/FROM_CLIENT.
The first step is to create your own accsyn workspace in trial mode. The whole process is described in detail here:
Open https://accsyn.io/trial in your web browser; you will be asked to sign up for your personal accsyn account (email address identifier).
Create a BYOS Workspace
The BYOS setup wizard will be initiated; it will guide you through the process of setting up and configuring your on-prem server and storage - "Storage server" and "Storage" in the schematics above. [Storage volume configured at /mnt/proj]
Conclude the BYOS setup wizard; you now have a fully working accsyn workspace where you can start sending deliveries and sharing folders (FTP replacement) - all backed by the high-speed ASC transfer protocol.
A home share is a shared folder on your storage dedicated to a user, where they can upload and download material outside the context of a shared folder or collection - like the home folder on Linux or the user's Documents folder on Windows.
The home share provides more fluent collaboration with the vendor, giving you the option to manually send additional material to them, and giving them a location to upload complementary assets. Think of it like an FTP account folder, outside the context of projects.
Automatic home share creation is turned off by default for all users, but can be manually initiated on invite. To turn it on:
Log on to https://accsyn.io/admin/settings as an admin and go to the Storage tab.
Enable "Auto create homes".
By default, accsyn creates home shares in the folder "accsyn-homes" at the root of the default volume [proj]. Edit "Homes directory" and change it to "_VENDORS" - accsyn will now create home shares at /mnt/proj/_VENDORS.
We also want a subfolder structure, with default ACLs applied, when a home share is created. Edit "Share config (advanced)" and enter this for "Home folder template":
This ensures that the user can only read (download) from the TO_CLIENT folder and upload to the FROM_CLIENT folder, but cannot see any other file content beneath their home share root - additional files and folders are hidden from them.
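As a sketch, using the example vendor and paths assumed earlier in this tutorial, the resulting home share layout on the storage would look like:

```
/mnt/proj/_VENDORS/CompersInc/
    TO_CLIENT/      (vendor can read/download only)
    FROM_CLIENT/    (vendor can upload/write)
```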
NOTE: You can omit the home share mechanism altogether, just delivering content from your storage and publishing back to paths defined by your publish hooks.
Next, we need to have the vendor set up an accsyn "user server" - an instance of the accsyn file transfer client running unattended 24/7. This enables us to push material to the vendor without them needing to manually choose where to download and store the files. It also enables the vendor to expose their local storage, so you can pull missing material as needed.
Start by inviting the remote coordinator:
Log on as an administrator (or an employee with full access to volume [proj]) to https://accsyn.io/admin/users in your browser and click INVITE USER.
Enter the email address of the vendor [emma@compersinc.com] and choose the standard user role.
The user will get an invitation email, with instructions on how to activate their personal accsyn account.
Send these vendor instructions on how to install a user server:
Ask them to register the personal accsyn account and then log on to https://accsyn.io/hosts.
Go to "Daemon instances" and click INSTALL DAEMON.
Allocate a server machine, or workstation computer, that is powered on 24/7, has full access to the local storage and can act as the file transfer endpoint [Windows server].
Download the installer and install the accsyn daemon, entering the code presented in the web browser when asked. Remember to configure which system user the accsyn daemon process runs as; this user needs full read and write access to the mapped share folder [\\server\clients\AcmeVFX]. There is no need to enter a "Delivery download path" when asked.
The daemon instance should appear shortly under "Daemon instances". Under "Workspace user servers", click the NEW button.
Select your workspace [Acme VFX] and click "Install accsyn user server for workspace 'Acme VFX'".
After a minute or so, the user server for your workspace should appear beneath the host.
More detailed information on how to set up a host/user server can be found here.
As a final step, we want to define the path at the vendor where all deliveries should be stored - with a mirrored (preserved) folder structure. Send these additional instructions to the vendor to set up a mapped share:
Shut down the accsyn daemon service on the Windows server.
Edit system environment variables and set "ACCSYN_PROJ_PATH" = "\\server\clients\AcmeVFX".
Start the accsyn daemon again.
Launch the accsyn desktop app on the same server machine and log on as the vendor [emma@compersinc.com].
Open Settings dialog by clicking the user account button in upper right corner.
Go to Share Mappings.
There should be an entry named "proj"; check the "Read" and "Write" boxes. This enables Acme VFX operators to push files to, and pull files from, the local mapped storage folder at \\server\clients\AcmeVFX.
Note: The "PROJ" element in the environment variable definition must match the API code attribute of the default accsyn volume.
We now have a working sync setup, where assets can be transferred back and forth using high-speed file transfers. If a home share was configured, the vendor can already upload files to your storage - and you can put files in the TO_CLIENT folder for them to download.
At this point you can start using accsyn File Sharing to create a shared folder on the project [/mnt/proj/scd] and use ACLs to grant access to work folders. In this tutorial, we will focus on API-driven automated transfers, controlled from within the project management tooling [ftrack].
Create the vendor user in ftrack [emma@compersinc.com].
Create a task on shot 0010 named cleanup and assign the vendor to it.
Craft an ftrack action script named "Outsource"; for detailed information, see the ftrack Actions documentation.
Design the script to operate on one or more selected tasks, having it produce a list of file asset paths, relative to the proj volume, that should be sent to the remote vendor.
Generate a work description (notes) and technical metadata (colourspace, frame range, handles etc.) to a file named scd_0010_0010_cleanup_description.txt.
We are not going into detail on how to implement the ftrack logic of this script; it is fairly simple to derive from the official ftrack Python API documentation.
With the list of relative file paths at hand, use the accsyn Python API to create a daily sync job that pushes the files to the vendor. For detailed information about the accsyn Python API:
Boilerplate code:
# Install the accsyn Python API:
#   pip install accsyn-python-api

# (Inside ftrack action script)
# <code to collect/generate outsourcing file assets>

from datetime import datetime

import accsyn_api

# Assume the list of file paths calculated from ftrack is generated and stored in the variable source_files:
source_files = ["scd/0010/0010/src/plates/scd_0010_0010_src", "scd/0010/0010/cleanup/scd_0010_0010_cleanup_description.txt"]

# Create the accsyn session object.
# It requires the following environment variables set:
#   ACCSYN_WORKSPACE=acmevfx
#   ACCSYN_API_USER=pipeline@acmevfx.co.uk
#   ACCSYN_API_KEY=...
# Note: Create a new API key @ https://accsyn.io/developer
session = accsyn_api.Session()

# Assume the vendor name has already been looked up from the task assignee, and the project name has been evaluated
vendor_name = "CompersInc"
project_name = "scd"

# Generate the sync job name; optimal is to have one per project, vendor and day.
job_name = f"Outsource - download - {project_name} - {vendor_name} - {datetime.now().strftime('%y%m%d')}"

# Build accsyn sync transfer tasks
tasks = []
for path_rel in source_files:
    tasks.append(dict(
        source=f"volume=proj/{path_rel}",
        destination="emma@compersinc.com"
    ))

# Locate today's daily sync transfer, if any:
result = session.find(f'Transfer where name="{job_name}"')
job = result[0] if result else None

# Create, or append to, the existing job
if not job:
    job = session.create("Transfer", dict(
        name=job_name,  # Name the job so it can be found and appended to later
        tasks=tasks,
        mirror_paths=True
    ))
else:
    # Append tasks to the existing sync job; this will resume/retry a finished job and retry (resend) existing duplicate tasks
    tasks = session.create("Task", tasks, entityid=job['id'], allow_duplicates=True)

# (Optional) End the script with feedback to the ftrack web UI session that the outsource transfer was dispatched, providing job['name']/job['id'] for reference
The transfer job will be queued with the rest of the jobs.
Note: if multiple jobs are active (from different projects, or from a previous day), operators can use the powerful accsyn queue mechanism to prioritise between sync jobs. Even within a sync job, tasks can be prioritised, enabling transfer of certain shots before others.
We now have the pipeline ready to send material; the final part is the receiving endpoint, enabling the vendor to publish produced assets back into the production workflow.
It is a variant of the standard accsyn desktop app upload mechanism, designed to pre-validate material before uploading. In addition, it supports defining where to upload the assets (without necessarily sharing the folder and granting ACL access), plus time reporting, note taking, status and guidelines in-app.
The publish mechanism is driven by two scripts:
Pre-publish script; Receives the path to a JSON file where the folders dropped by the remote user are provided as-is, with the directory tree and metadata such as size and date. The pre-publish script should provide per-asset feedback to the user in case there are any inconsistencies, or the upload destination path if validated. It also defines which fields the user should input (time report, notes and status are supported) and additional optional outsourcing guidelines.
Publish script; This script is run after the upload has finished, with the path to the same data generated by the pre-publish script for reference. The publish script is designed to provide additional processing after the files have landed; this is typically proxy (JPEG/MP4/MOV) generation and publishing to project management tools [ftrack], including the metadata (notes, time report and status) provided by the remote vendor.
Usually, this is implemented as a single script, as the logic is very much the same, separated by a command line argument like "--pre" or similar.
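To make the single-script pattern concrete, here is a minimal hypothetical skeleton. The JSON field names, the destination path and the validation rule are illustrative assumptions only - see the accsyn publish hook documentation for the actual payload format:

```python
import argparse
import json
import sys

def pre_publish(data):
    """Validate dropped folders and assign upload destinations (made-up field names)."""
    results = []
    for folder in data.get("folders", []):
        name = folder["name"]
        if not name.startswith("scd_"):
            # Per-asset feedback on inconsistencies
            results.append({"name": name, "error": "Unknown project prefix"})
        else:
            # Validated; provide the upload destination path
            results.append({
                "name": name,
                "destination": f"/mnt/proj/_VENDORS/CompersInc/FROM_CLIENT/{name}",
            })
    # Also declare which fields the vendor should fill in before upload
    return {"assets": results, "fields": ["notes", "time_report", "status"]}

def publish(data):
    """Post-upload processing: proxy generation, ftrack publish etc."""
    for asset in data.get("assets", []):
        if "destination" in asset:
            print(f"Ingesting {asset['name']} from {asset['destination']}")

def main(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument("--pre", action="store_true", help="Run pre-publish validation")
    parser.add_argument("json_path", help="Path to the JSON data provided by accsyn")
    args = parser.parse_args(argv)
    with open(args.json_path) as f:
        data = json.load(f)
    if args.pre:
        print(json.dumps(pre_publish(data)))
    else:
        publish(data)

# When invoked as the accsyn hook: main(sys.argv[1:])
```

Keeping both phases in one file, dispatched on "--pre", keeps the validation rules and the ingest logic in sync as the naming conventions evolve.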
We recommend designing a pipeline where the remote vendor publishes one folder per task, named after the task. Example:
scd_0010_0010_cleanup_v001
scd_0010_0020_rotoscoping_v003
..
And then let each folder contain all the file sequences, geometry, documents etc that constitute the final result.
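Assuming the naming convention above - {project}_{sequence}_{shot}_{task}_v{version} - the pre-publish script can validate incoming folder names with a regular expression. The exact pattern here is an assumption for this sketch:

```python
import re

# Hypothetical pattern for the per-task publish folder convention,
# e.g. "scd_0010_0010_cleanup_v001"
FOLDER_PATTERN = re.compile(
    r"^(?P<project>[a-z0-9]+)_(?P<sequence>\d{4})_(?P<shot>\d{4})"
    r"_(?P<task>[a-z]+)_v(?P<version>\d{3})$"
)

def parse_publish_folder(name):
    """Return the parsed name components, or None if the name does not validate."""
    match = FOLDER_PATTERN.match(name)
    return match.groupdict() if match else None
```

A None result can be turned directly into the per-asset error feedback shown to the vendor, while the parsed components (project, shot, task, version) drive the task lookup in the project management tool.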
We are not going into great detail on how to design the publish script; for detailed information on how to create the scripts, with examples, check out:
The documentation provides links to our GitHub, where template publish scripts can be downloaded. The scripts contain clear pointers on where to integrate with a production management tool like ftrack, for validating asset names against tasks and creating a version upload with a reviewable plus the notes/time report.
accsyn provides a versatile, API-enabled file I/O platform that is easy to integrate into an existing production workflow, with proper user interfaces both for desktop and web. This tutorial only shows a small subset of what accsyn can provide in terms of media production automation.
Learn how Filmgate Films use accsyn within their VFX pipeline for conform, data wrangling, outsourcing and rendering.
Download the accsyn Python API source code from GitHub.