Tutorial - Automated production outsourcing

Rev 1.005, 2019.10.28

How to read this tutorial


  • The tutorial does not cover all features of Accsyn, just a selected set.

  • Entries in Bold are made-up data for this tutorial only; substitute your own data as appropriate.

  • Labels in Italic correspond to the name of an input where you enter data.

  • Text in [ BRACKETS ] denotes a button within the user interface, while LINK denotes a link - for example a navigation link.

  • Python scripts from this tutorial are available at GitHub:

Summary


This tutorial targets:

  • Companies that seek to integrate Accsyn with their production tracking software using the Python programming language, so that relevant files are sent to outsourcing partners and results can be sent back into the correct folders.


Wishing to:

  • Avoid manually putting files on FTP/cloud storage or similar for outsourcing partners to do their job.

  • Have file transfers run at maximum speed, even if partners are located on the other side of the globe.

  • Be able to trace past transfers using a mobile-friendly interface, and send additional missing files or share further folders.

  • Enable outsourcing companies to send back material, and have a script run when files arrive that ingests them into the production management system.


This tutorial is a complete walkthrough covering:

  • Installing Accsyn from scratch.

  • Preparing Accsyn with an API user.

  • Using the API to make a dataset available to a remote freelancer through a share.

  • Sending a starter source package to the freelancer using the API.

  • Having the freelancer send back results using the API.

  • Setting up a script (hook) to run at the server when a freelancer package arrives.

Scenario

  • In this tutorial, your fictional company is named "GeoScan" and the admin username (E-mail) is "andrea@geoscan.org".

  • Your company outsources data analysis to several subcontractors.

  • You have a Windows server "alpha" connected to the disk "production" at your main headquarters, accessible at path "\\alpha\production".

  • You have the possibility (typically through a third-party API) to write programs that talk to your production management software; these tools are installed and accessible on the server "alpha".


Your task is to have large packages of material sent smoothly to a remote subcontractor "jennifer@gda.com" when a task is assigned in your production management system, for easy sync and pickup of work project files. You also want to implement a "button" that the freelancer can press to send results back into the correct folders at headquarters, with a post-publish step that updates your internal project tracking tool.

IMPORTANT NOTE: The terms and configuration steps outlined below are described in detail in our Accsyn Admin Manual; for instructions on how to use the Accsyn desktop app, have a look at the Accsyn User Manual. Always consult the manuals before you reach out to support@accsyn.com.

Installing Accsyn


The following guide is a short summary of the installation process described in detail here: Accsyn Admin Manual

  • Register your domain @ https://customer.accsyn.com

  • Follow the guide to initialise domain.

  • Install server; when the guide instructs you to install the Accsyn daemon, download and install it on your current file transfer server [alpha].

  • Network; the guide will ask you to configure your firewall - add NAT port forwards 45190-45210 (TCP) to the file transfer server "alpha". Note that the Accsyn daemon DOES NOT listen on any ports 24/7; it only starts a listening process during file transfer initialisation, software-firewalled to accept incoming connections from the remote client's WAN IP only.

  • Root share; browse to where the volume/disk "production" is mounted on the server [\\alpha\production].

  • Finish installation.


By now you have a fully working file transfer solution that can be used by external users to receive and send back large file packages at high speed (i.e. an accelerated, encrypted FTP server replacement). We will now continue by having file transfers submitted from Python scripts, enabling an automated workflow.

Configuring Accsyn


API User

This step is optional - you can run the Python API as your own user - but we recommend you create a separate user for this purpose:

  1. Pick an existing E-mail account or create a new one on your E-mail server [workflow@geoscan.org].

  2. Log on to your domain [https://geoscan.accsyn.com] using the admin account you registered with above.

  3. In the transfer window, hit the [ INVITE USER ] button, or go to ADMIN>User and click [ INVITE USER ] in the lower right corner.

  4. Invite the user to Accsyn as an Employee [workflow@geoscan.org].

  5. Log in to their E-mail and activate the account - choose a password.

  6. Log in to Accsyn with the new account and click the account name (E-mail) in the upper right corner.

  7. On "API key", hit [ SHOW ] and record the API KEY string presented.


Installing API

The Accsyn Python API has the same role as the web admin interface - it only controls Accsyn and cannot perform actual file transfers. A file transfer happens peer-to-peer between your daemon (server) and a remote client (desktop app or background daemon). This means you can choose any machine that supports Python for running your scripts. In this tutorial, to keep things simple, we have chosen the file server "alpha" for this purpose:

  1. Open a terminal on the server.

  2. Download and install the Accsyn Python API: "pip install accsyn-python-api". If you cannot use pip, simply download and unzip [https://<your domain>.accsyn.com/app/accsyn-python-api.zip] into your Python site-packages folder.

  3. Set the following environment variables (if you are running your scripts as a background launch daemon on Mac, make sure they are set in the .plist; see the Accsyn Admin Manual for detailed instructions on setting environment variables). Enter the API KEY you recorded in the previous step:

ACCSYN_DOMAIN=geoscan

ACCSYN_API_USER=workflow@geoscan.org

ACCSYN_API_KEY=...

Note: You can skip these environment variables and instead enter them when creating the API session later.


Using the API

In your Python code, to use the Accsyn API, prepare a session instance that will be used throughout the rest of your script to access Accsyn:


import accsyn_api


session = accsyn_api.Session()
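
If you skipped setting the environment variables, the credentials can instead be passed inline - a sketch, assuming the Session constructor accepts keyword arguments named as below (consult the Accsyn Python API documentation for the exact names):

import accsyn_api

session = accsyn_api.Session(
    domain="geoscan",
    api_user="workflow@geoscan.org",
    api_key="..."  # The API KEY recorded earlier
)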

Note: For detailed information about how to install, set up and use the Python API, refer to: Accsyn Python API Documentation


You are now all set up to start using the Accsyn API within your Python scripts.


Inviting the subcontractor, sharing initial data and sending package


Within your organization, when a subcontractor is assigned to a data analysis task, a Python script is triggered that shares the dataset folder and sends a starter file package so they can start working.


Prerequisites

Let's assume the following:

  • We have a subcontractor identified by their E-mail, in this tutorial "jennifer@gda.com".

  • Jennifer should start working on the final dataset "DATASET_001_1203" in project "proj002" @ folder \\alpha\production\proj002\DATASET_001_1203.

  • The name of the share for subcontractors on this project is "proj002-outsourcing". (The reason we add the -outsourcing suffix is that you might want other types of shares, for example "proj002-delivery" and so on.)

  • The data residing in subfolder source @ DATASET_001_1203 should be shared read-only.

  • A text file describing the task is assumed to be available @ subfolder TO_VENDORS\analysis_task_description.txt; this folder is shared with read permissions.

  • The subcontractor should upload the result back to FROM_VENDORS\jennifer@gda.com, shared with both read & write permissions.

  • Our script is triggered when the management team assigns Jennifer to the task in the project management tool. How this is implemented is not described in this tutorial, but it is usually done through an event listener running in the background, or a script that polls a folder structure/custom database/Google Sheet or similar - see the sketch below.
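
As an illustration, a minimal polling trigger could look like this - poll_assignments() and handle_assignment() are hypothetical placeholders for your tracker integration:

import time


def poll_assignments():
    # Hypothetical placeholder: return newly assigned outsourcing tasks
    # from your production management system.
    return []


def handle_assignment(assignment):
    # Hypothetical placeholder: invite the user, create the share/ACLs and
    # send the starter package, as shown in the sections that follow.
    pass


while True:
    for assignment in poll_assignments():
        handle_assignment(assignment)
    time.sleep(60)  # Check once a minute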


Inviting the user to Accsyn

If the freelancer has not used Accsyn at the "geoscan" domain before, they have to be invited:

u = session.find_one("user WHERE code=jennifer@gda.com")

if u is None:
    u = session.create("user", {"code": "jennifer@gda.com", "clearance": "user"})

Note: We assign 'user' clearance, meaning that she will only have access to her default home share (@<default share>/accsyn/<E-mail>/ folder) and any additional shares she is granted access to. Assign 'employee' clearance to grant read & write access to the entire projects share.


Creating/loading the share

A share in Accsyn is similar to an FTP account on an FTP server - a directory on a root share/volume that one or more specific users have access to through ACLs. We put the share at the root of the project folder; this way we can easily share further datasets later using the same work area, with the same freelancer or other freelancers.

Note: Even if the share resides in the project root, it does NOT mean that the user has access to the entire project. Access is granted later by defining ACLs on subfolders within the share.


The share "proj002-outsourcing" might already exist, so we attempt to first load it and then create if it does not exist:

share = session.find_one("share WHERE code=proj002-outsourcing")

if share is None:
    share = session.create("share", {"code": "proj002-outsourcing", "parent": "production", "path": "proj002"})

A dict containing the share data is returned; we use the 'id' key when referencing the share from now on. Note how share paths are relative to the "production" root share location - no need to give an absolute path.


Next, we need to give the subcontractor read access to the dataset, the task information and the upload folder. For each, we check if an ACL exists and create it otherwise:

p_dataset="DATASET_001_1203"

acl_source = session.find_one("acl WHERE (ident=user:{0} AND target=share:{1} AND path={2})".format(u['id'], share['id'], p_dataset))

if acl_source is None: acl_source = session.create("acl",{"ident":"user:{0}".format(u['id']),"target":"share:{0}".format(share['id']),"path":p_dataset,"read":True,"write":False})


p_taskref="TO_VENDORS"

acl_ref = session.find_one("acl WHERE (ident=user:{0} AND target=share:{1} AND path={2})".format(u['id'], share['id'], p_taskref))

if acl_ref is None: acl_ref = session.create("acl",{"ident":"user:{0}".format(u['id']),"target":"share:{0}".format(share['id']),"path":p_taskref,"read":True,"write":False})




Finally, we give read & write access to the "FROM_VENDORS/jennifer@gda.com" work folder:

p_work="FROM_VENDORS\jennifer@gda.com"

acl_work = session.find_one("acl WHERE (ident=user:{0} AND target=share:{1} AND path={2})".format(u['id'], share['id'], p_work))

if acl_work is None: acl_work = session.create("acl",{"ident":"user:{0}".format(u['id']),"target":"share:{0}".format(share['id']),"path":p_work,"read":True,"write":True})
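
Since the find-or-create pattern above repeats for each folder, you may want to factor it into a small helper; the sketch below only rearranges the calls already shown:

def ensure_acl(session, user_id, share_id, path, write=False):
    # Load the ACL if it already exists, otherwise create it.
    acl = session.find_one("acl WHERE (ident=user:{0} AND target=share:{1} AND path={2})".format(user_id, share_id, path))
    if acl is None:
        acl = session.create("acl", {"ident": "user:{0}".format(user_id), "target": "share:{0}".format(share_id), "path": path, "read": True, "write": write})
    return acl


acl_work = ensure_acl(session, u['id'], share['id'], "FROM_VENDORS\\jennifer@gda.com", write=True)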


From this point on, the subcontractor can download the dataset and task information using the Accsyn desktop app (or a browser, for files <5GB). She can also upload material back to the "FROM_VENDORS\jennifer@gda.com" folder.


(Optional) Configure the subcontractor to automatically receive and store files with folder structure preserved

To further enhance the workflow, Accsyn supports setting up the receiving end so all files land with the same folder structure as on the server - a local share mapping:

  1. Have the subcontractor point out a local folder where material should land, in this tutorial "E:\WORK\GEOSCAN\Production".

  2. Have them set system environment variable "ACCSYN_PRODUCTION_PATH=E:\WORK\GEOSCAN\Production".

  3. Install the Accsyn daemon to have the app run 24/7 in the background, without requiring users to log on and keep the desktop app running.

From now on, packages sent to the subcontractor will immediately start saving files to, given the example in this tutorial: "E:\WORK\GEOSCAN\Production\proj002\DATASET_001_1203".


Sending the dataset to the subcontractor

In the next step we collect all files needed for the subcontractor to do their work. This enables the subcontractor to immediately receive the material, without needing to ask for it or start a manual download.

Compiling and sending the file package:

job = session.create("job", {
    "code": "proj002_DATASET_001_1203_outsource_jennifer@gda.com",
    "tasks": {
        "0": {
            "source": "share=proj002-outsourcing/DATASET_001_1203",
            "dest": "jennifer@gda.com:proj002\\DATASET_001_1203"
        },
        "1": {
            "source": "share=proj002-outsourcing/TO_VENDORS",
            "dest": "jennifer@gda.com:TO_VENDORS"
        }
    }
})

Notice how we also supply a relative path at the receiver end; this way files get delivered with the same folder structure, and we can easily remap paths when project files arrive back. (This will be ignored if the subcontractor set up the optional local share mapping above.)
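
If your trigger script needs to monitor the delivery, the job can be polled through the same query interface. A minimal sketch - the 'status' field name and its values are assumptions here, consult the Accsyn Python API documentation for the actual ones:

import time

while True:
    job = session.find_one("job WHERE id={0}".format(job['id']))
    if job.get('status') not in ('waiting', 'running'):  # Assumed status values
        break
    time.sleep(30)  # Check every 30 seconds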

The freelancer, Jennifer, will receive an E-mail that a package is pending and can download it using the Accsyn desktop app. We assume she downloads it to the following local work path:

E:\WORK\GEOSCAN\Production

Resulting in the following files:

E:\WORK\GEOSCAN\Production\proj002\DATASET_001_1203

E:\WORK\GEOSCAN\Production\proj002\TO_VENDORS


Having the freelancer send back results


Summary

The standard way of doing this is to have the subcontractor launch the Accsyn desktop app and upload results back to share "proj002-outsourcing", folder "FROM_VENDORS/jennifer@gda.com". In this tutorial we are going to be a bit more advanced, allowing the subcontractor to build a Python script that collects the data for upload and submits the transfer job at the push of a button within the app.


Hint: If you do not wish to do Python scripting at the client end, you can instead utilise the Accsyn Publish workflow - it allows users to publish results back into the correct directories at production servers, with workflow integrations.


We assume the subcontractor is familiar with the Accsyn Python API and has set the proper environment variables - ACCSYN_DOMAIN, ACCSYN_API_USER and ACCSYN_API_KEY (these can also be provided inline when creating the Accsyn API session); a minimal sanity check is sketched below.
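
Before creating the session, the script can fail early if any variable is missing - a small sketch using only the variable names above:

import os

for var in ("ACCSYN_DOMAIN", "ACCSYN_API_USER", "ACCSYN_API_KEY"):
    if not os.environ.get(var):
        raise SystemExit("Missing required environment variable: {0}".format(var))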


The subcontractor's task is to write a small Python script that collects the work done and sends it back to GeoScan, followed by a post-process call at the server that publishes the result to the production management software.


Compile and send the job

First, we detect and store all dependencies in a list, together with the path prefix of the project directory. How this is done in the analysis app is left out:

project_dir="E:\\WORK\\GEOSCAN\\Production\\proj002"

share_name="proj002-outsourcing"

task_name="DATASET_001_1203"

files=[

"FROM_VENDORS\\jennifer@gda.com\\cleaned_set.dat", # An asset

"FROM_VENDORS\\jennifer@gda.com\\proj002_DATASET_001_1203.analyze", # The project file

"FROM_VENDORS\\jennifer@gda.com\\output\\proj002_DATASET_001_1203_exported.dat", # The output

]

Note: if we were to send files back into other share folders, such as DATASET_001_1203/, the job submit below would fail, as she does not have write access to those folders.
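
If the analysis application cannot enumerate its own dependencies, a fallback - a sketch, not part of the tutorial workflow - is to collect everything under the vendor work folder:

import os

work_root = os.path.join(project_dir, "FROM_VENDORS", "jennifer@gda.com")

files = []
for dirpath, dirnames, filenames in os.walk(work_root):
    for filename in filenames:
        # Store paths relative to the project directory, as the job submit below expects.
        files.append(os.path.relpath(os.path.join(dirpath, filename), project_dir))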


Now we have all we need to send back the files:

import accsyn_api

session = accsyn_api.Session()


tasks = {}
for n, p in enumerate(files):
    tasks[str(n)] = {
        "source": "%s\\%s" % (project_dir, p),
        "destination": "geoscan:share=%s:%s" % (share_name, p)
    }


job = session.create("job", {
    "code": "%s_output" % task_name,
    "tasks": tasks
})


The job will automatically start if Jennifer has the Accsyn desktop client running and enabled.


Creating the post-process script at the server for ingestion

As a last step, we want the production management system updated with the results - changing the status of the task. To achieve this, we first configure a hook in Accsyn:

  1. Log in to the Accsyn web app as admin [https://geoscan.accsyn.com].

  2. Go to ADMIN>SETTINGS>Hooks.

  3. Scroll down to "hook-job-post-done-server" and enter the full path to the script on the platform the server is running on, for example:

\\alpha\production\workflow\accsyn_job_done.py ${PATH_JSON_INPUT}


The Accsyn server (for the default share, "production" in this case) will now run this script, with the path to the job JSON data as argument, when all files have arrived.

accsyn_job_done.py:

#!/usr/local/bin/python3

# Accsyn hook example (Python 2/3 compatible) for post-processing a subcontractor upload

import sys
import json


def generic_print(s):
    # Print that works in both Python 2 and 3
    sys.stdout.write("%s\n" % s)


def publish(user=None, comment=None, filename=None):
    ''' Do something with the data uploaded by the user. '''
    generic_print("User {0} published '{1}', comment: {2}".format(user, filename, comment))


if __name__ == '__main__':

    p_input = sys.argv[1]
    with open(p_input, "r") as f:
        data = json.load(f)
    generic_print("Publish hook incoming data from user %s: %s" % (data['user_hr'], json.dumps(data, indent=3)))

    # Find the output amongst the files
    for p in [d["source"]["path_abs"] for d in data["tasks"].values()]:
        if p.lower().endswith("_exported.dat"):
            publish(user=data["source_hr"].split(":")[1], comment="Subcontractor publish", filename=p)


Note: How you write the "publish" function depends on which project management system you have; we leave that to you.
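
As an illustration only - the endpoint URL, payload fields and status value below are hypothetical - a "publish" implementation could call a REST endpoint of your tracker:

import json

try:
    from urllib.request import Request, urlopen  # Python 3
except ImportError:
    from urllib2 import Request, urlopen  # Python 2


def publish(user=None, comment=None, filename=None):
    ''' Hypothetical: flag the task for review in the production tracker. '''
    payload = json.dumps({
        "task": "DATASET_001_1203",
        "status": "pending_review",  # Hypothetical status value
        "user": user,
        "comment": comment,
        "filename": filename,
    }).encode("utf-8")
    request = Request("https://tracker.geoscan.org/api/tasks/update",  # Hypothetical endpoint
                      data=payload,
                      headers={"Content-Type": "application/json"})
    urlopen(request)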


Wrapping it all up

These scriptlets are just rough guidelines - you can implement this in any way you want. Our goal with this tutorial has been to show how much more powerful an API-scriptable file delivery system can be, compared to a traditional FTP/Dropbox/WeTransfer solution combined with the manual effort of keeping a remote subcontractor in sync with your workflow.