Tutorial - Publish workflow
How to read this tutorial
The tutorial does not cover all features of accsyn, just a selected set.
Entries in Bold are made-up data for this tutorial only; substitute your own data as appropriate.
Labels in Italic correspond to the name of an input where you enter data.
Text in [ BRACKETS ] denotes a button within the user interface, while LINK points out a link - for example a navigation link.
The Python scripts from this tutorial are available on GitHub (links are given in the relevant sections below).
YouTube video explaining the workflow: https://www.youtube.com/watch?v=k4i90AhYUns
Scenario
In this tutorial, your fictional company "InterPost" uses external resources for image retouching services - subcontractors take one or more images and perform retouching on each of them.
Your on-prem data resides on network disk "vol", mounted on Linux server "alpha" @ "/net/vol".
Task data for subcontractors is located in one directory per subcontractor: "/net/vol/_TO_VENDORS/<E-mail>/".
The server is able to run Python scripts located in folder "/net/vol/_SCRIPTS".
When testing publish, we use the test e-mail address "test@interpost.com", and the result of the task should end up in directory: "/net/vol/proj/task001/data/proj_task001_v001".
(Optional) For files that cannot be recognised, offer to upload them into directory: "/net/vol/_FROM_VENDORS/<E-mail>/<YYYYMMDD>/".
Your task is to set up access to a folder where subcontractors grab work as assigned and perform their tasks. When they are done, the subcontractor should be able to publish results back directly into the correct folder structure, with metadata saved in your production database and relevant notifications sent.
Installing accsyn
The following guide is a short summary of the installation process described in detail here: accsyn Admin Manual
Register your domain @ https://signup.accsyn.com
Follow the guide to initialise workspace [interpost]
Install server; when the guide instructs you to install the accsyn daemon, download and install it on your current file transfer server [alpha].
Network; the guide will ask you to configure your firewall - add NAT port forwards 45190-45210 (tcp) to file transfer server "alpha". Note that the accsyn daemon DOES NOT listen on any ports 24/7; it only starts a listening process during file transfer initialisation, software-firewalled to accept incoming connections from the remote client's WAN IP only.
Root share; browse to where the disk is mounted on the server [/net/vol].
Finish installation.
By now you have a fully working file transfer solution that can be used by external users to receive and send back large file packages at high speed (i.e. an accelerated, encrypted FTP server replacement). We are now going to continue by enabling file transfers submitted from Python scripts, to achieve an automated workflow.
Configuring accsyn
Share material to test user
Note: This can be done using the accsyn Python API in an automated way upon task assignment in your production database systems. For hints on how to do this, please check this tutorial: Tutorial - Automated Production Outsourcing.
Log on to your domain [https://interpost.accsyn.com], using the admin account you registered with above.
Go to ADMIN>Shares and click [ CREATE SHARE ].
Browse to directory "/net/vol/" and select "_TO_VENDORS", click [ NEXT ].
Pick a name for your share, "TO_VENDORS" and then click [ CREATE ].
You will be directed to the sharing dialog; hit the [ SHARE DIRECTORY ] button.
Create the folder "test@interpost.com" (you are beneath share "TO_VENDORS", full path will be "/net/vol/_TO_VENDORS/test@interpost.com").
Select the newly created folder and click [ NEXT ].
The user is not known to accsyn yet, so choose the "Invite new" option and enter their E-mail [test@interpost.com].
Make sure only Read is checked; we do not want them to upload here. For that, they will be required to use the Publish flow.
Click [ GRANT ACCESS ] when you are done.
The user will now get an invite, be instructed on how to install the desktop application, and only be able to see share "TO_VENDORS" and folder "test@interpost.com" within their accsyn. You can now copy work task source data into this folder to make it available to the subcontractor.
Write pre-publish hook
We are now going to write the first of two hooks, the "job-pre-publish-server" hook as it is named in accsyn. This is where you can validate the filenames the remote user has supplied and return appropriate feedback.
accsyn writes the JSON data to a temporary file and then runs your configured script, supplying the path to this data file. It also provides a path to where you should write the result back in JSON format. This mechanism is called a "hook" in accsyn.
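Conceptually, a server-side hook is just a script that reads the input JSON file and writes a result JSON file. Below is a minimal sketch of that mechanism, under the assumptions configured later in this tutorial (the two paths arrive as command-line arguments); the demo result it writes is made up for illustration:

```python
import json
import sys


def run_hook(path_input, path_output):
    """Read publish data from path_input, write a result to path_output."""
    with open(path_input) as f:
        data = json.load(f)  # the publish data, as shown below

    # Your validation / metadata logic goes here; as a demo we just
    # echo back the number of top-level files in the publish.
    result = {"file_count": len(data.get("files", []))}

    with open(path_output, "w") as f:
        json.dump(result, f)


if __name__ == "__main__":
    # accsyn supplies the two paths as command-line arguments
    run_hook(sys.argv[1], sys.argv[2])
```

The real scripts referenced below follow this same read-validate-write pattern.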
In this tutorial, during the test, the user will drag-n-drop three files/directories resulting in the following publish data (this would be the data sent if using Python API for publish):
{
    "hook": "job-pre-publish-server",
    "user": "5d87825f045d0352d33435eb",
    "user_hr": "test@interpost.com",
    "files": [
        {
            "id": "96bc4b44-384b-497d-a119-3f07307627b6",
            "filename": "proj_task001_v001",
            "is_dir": true,
            "size": 200000,
            "files": [
                {
                    "filename": "image.0001.tif",
                    "size": 100000
                },
                {
                    "filename": "image.0002.tif",
                    "size": 100000
                }
            ]
        },
        {
            "id": "44f29351-e870-4f9e-b329-a43759ed35c0",
            "filename": "proj_task1_v001_preview.mov",
            "size": 1000
        },
        {
            "id": "1ff91107-d4d2-4de5-9d87-2b3ed1198e1c",
            "filename": "proj_task001_v001_assets",
            "size": 20000,
            "files": [
                {
                    "filename": "proj_task001_projectfile.xml",
                    "size": 20000
                }
            ]
        }
    ],
    "size": 221000
}
The pre-publish Python script needs to:
Check that each file follows your naming convention - that it can be recognised and is not empty.
Check that all files required for the publish are present.
Return guidelines and specify what additional metadata needs to be entered by the user.
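As an illustration of the first two checks, a naming-convention validation could be sketched as below. The regular expression and the error messages are made up for this tutorial (based on the "proj_task001_v001" naming used here); adapt them to your own conventions:

```python
import re

# Hypothetical convention for this tutorial: <project>_<task>_v<version>,
# e.g. "proj_task001_v001".
NAMING_PATTERN = re.compile(r"^(?P<project>\w+)_(?P<task>task\d+)_v(?P<version>\d{3})")


def validate_file(entry):
    """Return an error string for a publish file entry, or None if valid.

    `entry` is one item from the "files" list in the publish data.
    """
    if entry.get("size", 0) == 0:
        return "File '%s' is empty!" % entry["filename"]
    if not NAMING_PATTERN.match(entry["filename"]):
        return ("File '%s' does not follow the naming convention "
                "<project>_<task>_v<version>!" % entry["filename"])
    return None
```

Errors collected this way can be returned to the user as feedback before the upload starts.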
Using a Python-capable text editor such as IDLE or Sublime Text, copy this script from our GitHub:
https://raw.githubusercontent.com/accsyn/publish-workflow/main/pre_publish.py
The script should be quite self-explanatory if you are familiar with Python. Here follow some explanations:
The "guidelines" are shown to the user before they submit the publish; here you can help the user by giving examples of your naming conventions.
By returning "comment":True, you require the user to enter comment metadata associated with the publish. We will show later how this metadata can be extracted and stored. Remove this, or set it to False, if you do not require this input.
By returning "time_report":True, you require the user to enter the amount of time spent on the task. Remove this, or set it to False, if you do not require this input.
By returning "statuses":[..], you require the user to choose a status for the publish. Remove this, or set it to None, if you do not require this input.
You can also, already at this stage, update the task status / create an initial version in your production database, to prevent a duplicate publish from the same user or someone else.
The "ident" field can be customised by you and is not used by accsyn; here you can, for example, store the ID of the project, task and/or version.
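Putting the options above together, the data your pre-publish script writes back could look like the following. The field names ("guidelines", "comment", "time_report", "statuses", "ident") are the ones described above; all values are sample data for this tutorial, so check the GitHub script for the exact structure it returns:

```python
import json

# Sample pre-publish hook result; the values below are made-up tutorial data.
result = {
    "guidelines": "Name your delivery <project>_<task>_v<version>, "
                  "e.g. proj_task001_v001.",
    "comment": True,                      # require a publish comment
    "time_report": True,                  # require time spent on task
    "statuses": ["For review", "Final"],  # require the user to pick one
    "ident": "proj/task001/v001",         # free-form, not used by accsyn
}

# The hook serialises this to the output path supplied by accsyn:
print(json.dumps(result, indent=4))
```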
Finally, save the script to "/net/vol/_SCRIPTS/accsyn/pre_publish.py".
Change permission on the script to be executable [chmod 755 /net/vol/_SCRIPTS/accsyn/pre_publish.py].
Configure pre-publish hook
Now we need to tell accsyn where to find our pre-publish hook script:
Log on as admin, head over to ADMIN>SETTINGS>Hooks and enable the "hook-job-pre-publish-server" hook.
Beneath Linux path, enter the following: "/net/vol/_SCRIPTS/accsyn/pre_publish.py ${PATH_JSON_INPUT} ${PATH_JSON_OUTPUT}".
Click [ SAVE ] to save the settings.
Note: Make sure Python 3 is installed on the server and that the python3 executable is on the PATH, for the script to function.
Your accsyn is now ready to accept publishes and have them uploaded into the correct location.
Write publish hook
We will now write the publish hook script, which gets executed after the files and directories have been uploaded to your server. This is of course optional and can be omitted if you do not wish to save user input (metadata) or do any other workflow integrations.
Note: These two publish scripts can be combined into one, by for example adding a --pre command line argument or similar.
The data that comes in is identical to the data arriving at your previous pre-publish script, except that the user input/metadata has been appended:
https://raw.githubusercontent.com/accsyn/publish-workflow/main/publish.py
Save the script to "/net/vol/_SCRIPTS/accsyn/publish.py".
Change permission on the script to be executable [chmod 755 /net/vol/_SCRIPTS/accsyn/publish.py].
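The core of such a publish hook could be sketched as below. Note that the key names used for the appended metadata ("comment", "time_report", "status") are assumptions for this illustration; check the publish.py script on GitHub for the exact structure accsyn delivers:

```python
import json


def handle_publish(path_input):
    """Sketch: read the uploaded publish data and collect its metadata.

    The metadata key names below are illustrative assumptions; see the
    publish.py script on GitHub for the real structure.
    """
    with open(path_input) as f:
        data = json.load(f)

    metadata = {
        "user": data.get("user_hr"),
        "comment": data.get("comment"),          # assumed key
        "time_report": data.get("time_report"),  # assumed key
        "status": data.get("status"),            # assumed key
        "ident": data.get("ident"),
    }

    # Here you would write `metadata` to your production database and
    # send notifications; we simply return it for illustration.
    return metadata
```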
Configure publish hook
Now, finally, we need to tell accsyn where to find our post-publish hook script:
Log on as admin, head over to ADMIN>SETTINGS>Hooks and enable the "hook-job-publish-server" hook.
Beneath Linux path, enter the following: "/net/vol/_SCRIPTS/accsyn/publish.py ${PATH_JSON_INPUT} ${PATH_JSON_OUTPUT}".
Click [ SAVE ] to save the settings.
Windows
This tutorial has described how to set up publish hooks on a Linux/macOS-compatible system; the following applies to a Windows server-based deployment:
Install Python3 on Windows.
Edit the PATH system environment variable and append the path to where Python was installed [C:\Users\..\AppData\Local\Programs\Python\Python39]
Configure the pre-publish hook with the python executable and UNC paths to the scripts [python \\SERVER\share\scripts\pre_publish.py ${PATH_JSON_INPUT} ${PATH_JSON_OUTPUT}]. Repeat the process for the publish script.
Click [ SAVE ] to save the settings.
Finalising
Finally, we test it:
Activate the test account "test@interpost.com".
On a remote machine, install the accsyn desktop app and log on as this user.
Create a folder with test images, a preview and some assets.
Drag-n-drop the files onto accsyn and choose Publish.
If you run into internal errors - usually due to missing exec permissions on the scripts on Linux (chmod 755 the scripts) - find clues @ ADMIN>Audits>Job.
Your accsyn is now all set up for integrating your subcontractors into your workflow in a safe, fast and user-friendly manner!
Troubleshooting
Any errors that occur are displayed in the Job audit logs; log on to your workspace as an admin and go to Admin>Audit>Job.
Log messages are also output in the accsyn daemon server log:
Linux: /var/log/accsyn/accsyn-daemon.log.txt
Windows: C:\Programdata\accsyn\log\accsyn-daemon.log.txt
Mac: /var/log/accsyn/accsyn-daemon.log.txt