
Overview of Plugin and Purpose

Integrating with Phrase TMS provides seamless content exchange between third-party systems and Phrase, which is critical for reducing manual labor, minimizing errors, and improving efficiency in translation processes. This integration is particularly beneficial for businesses with frequent updates and high-volume translation needs, enabling continuous localization and immediate availability of translated content in target languages. This guide covers the use cases of a plugin, along with the best practices and paths for building it. The resources below let developers quickly map the needs of their plugin to the requirements for building it, and should ultimately fast-track plugin development.

Concepts

Userflow

This section covers the overall user flow from project creation to exporting.
  1. Select content for localization
    • In the third-party tool, we select the content we want to localize.
  2. In TMS, create a new project
    • We want to create a project, which is a high-level container that groups related work, to store jobs. The project typically defines the scope, context, and configuration.
  3. Create job in TMS
    • We create a job by uploading content to TMS.
    • A job is the actual piece of work that is translated. Jobs live inside projects, get assigned to translators, and are tracked for completion.
  4. Run localization on job
    • Vendors / providers are assigned to jobs to perform a workflow.
  5. Monitor the progress of the job from the plugin
    • Jobs can be represented as several job parts (segments/parts). These parts can be used to track the progress of a job.
  6. Export completed job to third party
    • Once jobs reach a completed state, they can be exported. To export the job we need to begin an asynchronous download of the completed job. We then monitor the status of the download until it is ready. Once ready we then download the completed file.
  7. Mark job as delivered
    • We then need to update the status of the job to indicate that it has been delivered.
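The steps above can be sketched as a single orchestration routine. This is a hypothetical sketch: the client object and its methods (`create_project`, `create_job`, `job_status`, `download_completed`, `set_job_status`) are illustrative stand-ins for whatever HTTP wrapper the plugin uses, not a real SDK surface.

```python
import time

def localize(client, content, poll_interval=0.01):
    """Run one piece of selected content through steps 2-7 of the user flow."""
    project = client.create_project()                  # step 2: create project
    job = client.create_job(project["uid"], content)   # step 3: create job
    # step 4 happens inside TMS (providers run the workflow)
    while client.job_status(job["uid"]) != "COMPLETED":  # step 5: monitor
        time.sleep(poll_interval)
    translated = client.download_completed(job["uid"])   # step 6: export
    client.set_job_status(job["uid"], "DELIVERED")       # step 7: mark delivered
    return translated
```

A production version would add timeouts and error handling around every call; those are covered later in this guide.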

Job Lifecycle

This section covers the lifecycle of a job from inception to delivery.
  1. Creation and initialization
    • File is uploaded to TMS (pushed via a connector, API call, manual upload, or automation), and the job is created.
    • Import settings, analysis, and pre-translation are applied here.
    • Status:
      • NEW: if job exists and file is imported successfully.
      • JOB_NOT_READY: if import fails or there is an error on update.
  2. Assignment and acceptance
    • Job is assigned to provider / vendor - which gets accepted or declined.
    • Status:
      • NEW: If still unaccepted.
      • ACCEPTED: If vendor has accepted job.
      • DECLINED: If job has been refused.
  3. In progress
    • Once accepted, the job moves through the workflow steps configured in the project template.
    • Source updates:
      • If continuous localization is active, the same job can be updated continuously. The job must not be in an error state. Each update overwrites the previous version.
  4. Completion and delivery
    • Once the workflow is completed, the job moves into COMPLETED status.
    • The plugin then sets the DELIVERED status (or another valid status) to confirm the job has been exported.
  5. Exception outcomes
    • Not all jobs finish successfully.
    • CANCELLED: Work is aborted.
    • REJECTED: The client / reviewer has rejected the output; this often requires re-work.
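A plugin typically needs to branch on these statuses. A minimal sketch follows; the status names come from the lifecycle above, but grouping them into terminal success and terminal failure is a plugin-side convention, not a TMS API field:

```python
# Plugin-side classification of job statuses from the lifecycle above.
SUCCESS_TERMINAL = {"COMPLETED", "DELIVERED"}
FAILURE_TERMINAL = {"CANCELLED", "REJECTED", "DECLINED"}

def classify(status: str) -> str:
    """Map a TMS job status to the plugin's own coarse state."""
    if status in SUCCESS_TERMINAL:
        return "done"
    if status in FAILURE_TERMINAL:
        return "failed"
    return "active"  # NEW, ACCEPTED, in-progress workflow steps, ...
```

This makes exception outcomes explicit: anything in `FAILURE_TERMINAL` should stop polling and surface an error to the user.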

Objects and Data Models

This section includes key data models for interacting with the Phrase API and key objects that can be found in the SDK or that should be included within the plugin.
Model and JSON example:
  • Project template: { "id": "pt-1", "uid": "pt-uid-1", "templateName": "Default Template", "sourceLang": "en", "targetLangs": ["de", "fr"], "dateCreated": "2025-01-01T10:00:00Z" }
  • Project: { "id": "proj-1", "templateUid": "pt-uid-1", "name": "Website Translation", "sourceLanguage": "en", "status": "NEW", "createdAt": "2025-01-02T10:00:00Z" }
  • Job: { "uid": "job-1", "projectUid": "proj-1", "status": "NEW", "sourceLang": "en", "targetLang": "de", "dateCreated": "2025-01-02T11:00:00Z" }
  • AsyncRequest: { "id": "req-1", "jobUid": "job-1", "action": "PRE_ANALYSE", "dateCreated": "2025-01-02T11:05:00Z" }
  • AsyncResponse: { "requestId": "req-1", "dateCreated": "2025-01-02T11:06:00Z", "errorCode": null, "errorDesc": null, "acceptedSegmentsCount": 120 }
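The models above can be mirrored as lightweight types inside the plugin. A minimal Python sketch follows; field names are taken from the JSON examples above, and the real API schemas include more fields than shown here:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProjectTemplate:
    id: str
    uid: str
    templateName: str
    sourceLang: str
    targetLangs: List[str] = field(default_factory=list)
    dateCreated: str = ""

@dataclass
class Job:
    uid: str
    projectUid: str
    status: str
    sourceLang: str
    targetLang: str
    dateCreated: str

@dataclass
class AsyncResponse:
    requestId: str
    dateCreated: str
    errorCode: Optional[str]   # null until the operation fails
    errorDesc: Optional[str]
    acceptedSegmentsCount: int
```

Because the field names mirror the JSON keys, responses can be hydrated directly, e.g. `Job(**json.loads(body))`.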

File Types

This section provides a table of the supported file formats. For further information see the documentation.
  • XLIFF (1.2 and 2.0): XML Localisation Interchange File Format, commonly used for exchanging translation data between different tools.
    Example: <?xml version="1.0" encoding="UTF-8"?><xliff xmlns="urn:oasis:names:tc:xliff:document:2.0" version="2.0"><file id="f1" original="short.txt"><unit id="u1"><segment><source xml:lang="en">Welcome to UltraWidget.</source><target xml:lang="de">Willkommen bei UltraWidget</target></segment></unit><unit id="u2"><segment><source xml:lang="en">Contact support at support@example.com.</source><target xml:lang="de">Kontaktieren Sie den Support unter support@example.com.</target></segment></unit></file></xliff>
  • JSON: JavaScript Object Notation, used for structured data interchange.
    Example: { "welcome": "Welcome to UltraWidget.", "contact": "Contact support at support@example.com." }
  • XML: Extensible Markup Language, offering versatile import options.
    Example: <?xml version="1.0" encoding="UTF-8"?><root><welcome>Welcome to UltraWidget.</welcome><contact>Contact support at support@example.com.</contact></root>
  • Markdown: Lightweight markup language with plain text formatting syntax.
    Example: # Welcome Welcome to UltraWidget. Contact support at support@example.com.
  • Plain text: Unformatted text files.
    Example: Welcome to UltraWidget. Contact support at support@example.com.

Plugin Catalogue

This section covers the high-level use cases of the plugin: canonical, one-off, live-content, and continuous localization workflows, along with their required behaviors and the must/should/can capabilities of these plugin variants.

Canonical Plugin

Step | Required behaviour | Expected outcome
Setup and authentication | Configure Phrase credentials and environment; verify authorization | Validated connection with clear errors
Locale mapping | Map external locales to Phrase locales; validate coverage | Deterministic locale resolution
Content selection | Select one or more items to translate | Stable identifiers and content snapshot
Submit for translation | Create projects and jobs; group submissions | Idempotent, grouped submission
Monitor and synchronize | Track job progress asynchronously | Accurate status visibility
Retrieve translations | Import completed translations to correct targets | Safe placement with retries
Corrections and updates | Handle corrections and source changes | Version continuity preserved
Cancel and decline | Propagate terminal states safely | No partial or stale updates
Preview (when applicable) | Provide in-context preview for draft workflows | Predictable preview behavior
Troubleshooting | Surface correlation IDs and logs | End-to-end traceability

Plugin Use Case Catalogue

This section covers the different scopes of the plugin. The catalogue describes a small set of standardized plugin types that capture the typical requirements of plugins and maps them to an expected set of features and non-functional requirements. It helps teams decide:
  • A sensible scope for their project.
  • What is required vs optional for production.
  • Reasonable acceptance criteria to define a production-ready plugin.
Use case | Typical systems | Core characteristic | Primary complexity
One-Off Translations | Design tools, document tools, marketing assets, file managers, static campaign builders | Snapshot-in, translation-out | Minimal versioning
Updating Live Content | CMS, headless CMS, knowledge bases, documentation platforms, PIM, DAM metadata | Draft-to-publish workflow | Draft vs published integrity, source changes
Developer-Driven Continuous Localization | VCS, strings repositories, CI/CD, microservices, API feeds | Automated, high-change pipelines | Parallel jobs, idempotency, staleness

Success Criteria and Business Requirements

This section defines what a successful plugin looks like - against functional and non-functional guarantees.

Prerequisites for fast development

This section covers the information / data needed by the clients to effectively start building the plugin.
  • Phrase TMS base URL.
  • Phrase authentication credentials for testing, and the chosen authentication method.
  • Delivery definition and definition of done.
  • Expected behavior of the plugin - see plugin catalogue.
  • Encoding and decoding conventions.
  • Supported file types.

Testable success criteria

This covers a range of success criteria that can be used to assess whether developers have successfully applied this guide.
  • (Time to) first successful round trip
  • Functional minimum
    • Create project, create job, monitor job status, download, update status - all working end to end
    • Locale mapping and validation
  • Operational minimum
    • Retry/backoff for transient errors
    • 429 handling
    • Timeout with safe retry mechanics
    • Strong observability and logging
  • Security minimum
    • Supported authentication method
    • Tokens stored securely
    • Least privilege scopes
    • Audit trail of actions

How to build

This section provides a step-by-step guide for building the plugin, broken into two parts: a fast track and a production-ready track. Together they form a logical path to a production-ready plugin. Development can be aligned to the plugin catalogue options and the lifecycle of the plugin. The guide provides context and clear instructions for how the plugin integrates with the Phrase API and how it should behave.

Fast Track Path

This section covers the fast track path to build an MVP plugin. It covers the basic features and workflow of a canonical plugin, letting developers rapidly build a base on top of which the additional features and concepts for hardening the plugin can be added. It is designed to be completed in a short space of time, without much confusion or assistance, while providing a strong foundation for understanding the plugin's key concepts.

Authentication

First, we must be able to authenticate with Phrase TMS to access the API. There is only one authentication method; please see the authentication documentation for further details.
  • Users must first generate an API token in the User Profile / Access Tokens tab on the Phrase Platform Settings page. This token must be stored securely, ideally within a secrets manager / vault. The API token and JWT should not be logged or committed to source control.
  • This API token must then be exchanged for a JWT, using the Phrase platform OAuth token endpoint.
    • Endpoint
      • POST /oauth/token
      • Body (application/x-www-form-urlencoded):
        • grant_type: urn:ietf:params:oauth:grant-type:token-exchange
        • subject_token: API-TOKEN
      • Response:
        • Expected response: 200
            {
              "access_token":"GENERATED-JWT",
              "issued_token_type":"urn:ietf:params:oauth:token-type:access_token",
              "token_type":"Bearer",
              "expires_in":14399
            }
          
          • Error response: 400 - includes error code enum and human-readable error description. We should handle the error codes correctly and relay the relevant information to the user.
          {
            "error": "enum<string>",
            "error_description": "<string>"
          }
          
    • See documentation for more details.
  • This access token should be saved so that we can use it in later calls, passing it in the HTTP Authorization header in the following format:
    Authorization: Bearer GENERATED-JWT. We should cache the access token and its expiry, and automatically refresh the token when it is near expiry and on any 401 (Unauthorized) response.
  • When the JWT expires, we must exchange the API token again for a fresh token; the expires_in value tells us how long is left.

Pseudocode example

  • Method exchangeApiTokenForAccessToken:
    • Makes the API call to exchange the API token for the access token
  • Method getAccessToken
    • Checks the cached access token exists and is valid
    • Returns the cached token if valid; otherwise it fetches, caches, and returns a new access token.
  • Example API call
    • Example of an API call that receives a 401 response, after which we refresh the token and retry once.
CACHE = { token: null, expiresAt: 0 }

Function: exchangeApiTokenForAccessToken
{accessToken, expires_in} = POST /oauth/token
--header 'Content-Type: application/x-www-form-urlencoded'
--data 'grant_type=urn:ietf:params:oauth:grant-type:token-exchange'
--data 'subject_token=API-TOKEN'
# Expect 200; on 400 return the error to the user
return {accessToken, expires_in}

Function: getAccessToken
now = time.now()
if CACHE.token != null and now < CACHE.expiresAt:
    return CACHE.token
{accessToken, expires_in} = exchangeApiTokenForAccessToken()
CACHE.token = accessToken
CACHE.expiresAt = now + expires_in
return CACHE.token

# Example request with 401 handling
Function: authRequest(method, url, **kwargs)
token = getAccessToken()
kwargs.headers = kwargs.headers or {}
kwargs.headers["Authorization"] = "Bearer " + token
resp = HTTP(method, url, **kwargs)
if resp.status == 401:
    # token might have expired or been revoked -> refresh & retry once
    {accessToken, expires_in} = exchangeApiTokenForAccessToken()
    CACHE.token = accessToken
    CACHE.expiresAt = time.now() + expires_in
    kwargs.headers["Authorization"] = "Bearer " + CACHE.token
    resp = HTTP(method, url, **kwargs)  # single retry
return resp
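The caching logic above can be written as runnable Python. In this sketch the `exchange` callable is injected so the cache can be tested without the network; a real implementation would perform the POST /oauth/token exchange and return the token and expiry. The 60-second refresh margin is an assumption, not an API requirement:

```python
import time

class TokenCache:
    """Caches an access token and refreshes it shortly before expiry."""

    def __init__(self, exchange, margin=60):
        self._exchange = exchange    # () -> (access_token, expires_in_seconds)
        self._margin = margin        # refresh this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def get(self):
        now = time.time()
        if self._token is None or now >= self._expires_at - self._margin:
            token, expires_in = self._exchange()
            self._token = token
            self._expires_at = now + expires_in
        return self._token

    def invalidate(self):
        """Call on a 401 so the next get() fetches a fresh token."""
        self._token = None
```

A request wrapper would call `cache.invalidate()` on a 401 and retry the request once with `cache.get()`.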

Import Content

We must create a project before we begin importing content. A project is a container for related files and jobs, and typically defines the scope, context, and configuration of the work.
  • Create a project template
    • We create a project template for projects to standardize the workflow
    • Log into TMS and create a project template
    • Link: TMS page
  • Call the API to retrieve the project templates
    • We call this endpoint to retrieve the list of project templates to select the appropriate template for creating a project. We can pass in filters to select the relevant template.
      • Endpoint:
        • GET /api2/v1/projectTemplates
        • Query parameters
          • clientId<Integer>
          • clientName<string>
          • ownerUid<string>
          • domainName<string>
        • Response: Expect 200 response with pagination of project templates
            {
              "totalElements": 123,
              "totalPages": 123,
              "pageSize": 123,
              "pageNumber": 123,
              "numberOfElements": 123,
              "content": [
                {
                  "templateName": "<string>",
                  "sourceLang": "<string>",
                  "targetLangs": [
                    "<string>"
                  ],
                  "id": "<string>",
                  "uid": "<string>",
                  "owner": {
                    "firstName": "<string>",
                    "lastName": "<string>",
                    "userName": "<string>",
                    "email": "<string>",
                    "role": "SYS_ADMIN",
                    "id": "<string>",
                    "uid": "<string>"
                  },
                  "createdBy": {
                    "firstName": "<string>",
                    "lastName": "<string>",
                    "userName": "<string>",
                    "email": "<string>",
                    "role": "SYS_ADMIN",
                    "id": "<string>",
                    "uid": "<string>"
                  },
                  "dateCreated": "2023-11-07T05:31:56Z",
                  "domain": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "subDomain": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "costCenter": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "businessUnit": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "projectWorkflowSettings": {
                    "propagateTranslationsToLowerWfDuringUpdateSource": true
                  },
                  "workflowSteps": [
                    {
                      "name": "<string>",
                      "id": "<string>",
                      "uid": "<string>",
                      "order": 123,
                      "lqaEnabled": true
                    }
                  ],
                  "note": "<string>",
                  "client": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  }
                }
              ],
              "sort": {
                "orders": [
                  {
                    "direction": "ASC",
                    "property": "<string>"
                  }
                ]
              }
            }
          
    • Documentation
  • Create a project from a template
    • This endpoint is used to create a project - we pass the templateUid as a path parameter to create the project from a template. We must provide the name variable in the body. The endpoint then returns a ProjectDto, an object representing the project.
      • Endpoint:
      • POST /api2/v2/projects/applyTemplate/{templateUid}
      • Path: templateUid <String>
      • Body: REQUIRED name<string>
          {
            "name": "<string>",
            "sourceLang": "<string>",
            "targetLangs": [
              "<string>"
            ],
            "workflowSteps": [
              {
                "id": "<string>"
              }
            ],
            "dateDue": "2023-11-07T05:31:56Z",
            "note": "<string>",
            "client": {
              "id": "<string>"
            },
            "businessUnit": {
              "id": "<string>"
            },
            "domain": {
              "id": "<string>"
            },
            "subDomain": {
              "id": "<string>"
            },
            "costCenter": {
              "id": "<string>"
            }
          }
        
      • Response: Expect 201 response with ProjectDto
          {
            "uid": "<string>",
            "internalId": 123,
            "id": "<string>",
            "name": "<string>",
            "dateCreated": "2023-11-07T05:31:56Z",
            "domain": {
              "id": "<string>",
              "uid": "<string>",
              "name": "<string>"
            },
            "subDomain": {
              "id": "<string>",
              "uid": "<string>",
              "name": "<string>"
            },
            "owner": {
              "firstName": "<string>",
              "lastName": "<string>",
              "userName": "<string>",
              "email": "<string>",
              "role": "SYS_ADMIN",
              "id": "<string>",
              "uid": "<string>"
            },
            "sourceLang": "<string>",
            "targetLangs": [
              "<string>"
            ],
            "references": [
              {
                "id": "<string>",
                "uid": "<string>",
                "filename": "<string>",
                "note": "<string>",
                "dateCreated": "2023-11-07T05:31:56Z",
                "createdBy": {
                  "firstName": "<string>",
                  "lastName": "<string>",
                  "userName": "<string>",
                  "email": "<string>",
                  "role": "SYS_ADMIN",
                  "id": "<string>",
                  "uid": "<string>"
                }
              }
            ],
            "mtSettingsPerLanguageList": [
              {
                "targetLang": "<string>",
                "machineTranslateSettings": {
                  "id": "<string>",
                  "uid": "<string>",
                  "name": "<string>",
                  "type": "<string>"
                }
              }
            ],
            "status": "NEW",
            "userRole": "<string>"
          }
        
      • Documentation
  • Create job
    • API call to create a job where the source file can be in the message body or downloaded from the connector.
    • Endpoint:
      • POST /api2/v1/projects/{projectUid}/jobs
      • Path: projectUid <String>
      • Body: source file (optional)
      • Response: Expect 201 response with Jobs[] and asyncRequest object
            {
              "unsupportedFiles": [
                "<string>"
              ],
              "jobs": [
                {
                  "uid": "<string>",
                  "status": "NEW",
                  "providers": [
                    {
                      "type": "<string>",
                      "id": "<string>",
                      "uid": "<string>"
                    }
                  ],
                  "targetLang": "<string>",
                  "workflowLevel": 123,
                  "workflowStep": {
                    "name": "<string>",
                    "id": "<string>",
                    "uid": "<string>",
                    "order": 123,
                    "lqaEnabled": true
                  },
                  "filename": "<string>",
                  "dateDue": "2023-11-07T05:31:56Z",
                  "dateCreated": "2023-11-07T05:31:56Z",
                  "updateSourceDate": "2023-11-07T05:31:56Z",
                  "imported": true,
                  "jobAssignedEmailTemplate": {},
                  "notificationIntervalInMinutes": 123,
                  "continuous": true,
                  "sourceFileUid": "<string>"
                }
              ],
              "asyncRequest": {
                "id": "<string>",
                "dateCreated": "2023-11-07T05:31:56Z",
                "action": "PRE_ANALYSE"
              }
            }
        
      • Documentation

Pseudocode example

This pseudocode example shows how the API calls are made to create a project and then a job.
#Call API for project templates
projectTemplates[] = GET /api2/v1/projectTemplates 
					-H "Authorization: Bearer accessToken"
					# Expect 200 response, handle 4xx and 5xx errors

if projectTemplates.isEmpty()
	throw new NotFoundException

# Find projectTemplateUid
projectTemplateUid = projectTemplates.findSuitableTemplate().getUid()

#Create project
projectDto = POST /projects/applyTemplate/{templateUid}
			-H "Authorization: Bearer accessToken"
			-d { name, sourceLanguage, targetLanguages }
			# Expect 201 response, handle 4xx and 5xx errors
			
projectUid = projectDto.getUid()

#Create job
jobDto = POST /api2/v1/projects/{projectUid}/jobs
		-H "Authorization: Bearer accessToken"
		-d {content}
		# Expect 201 response, handle 4xx and 5xx errors
		
# Persist jobUid
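The `findSuitableTemplate()` step above is left abstract. One plausible implementation picks the first template matching the desired language pair; the field names follow the GET /api2/v1/projectTemplates response shown earlier, but the selection rule itself is an assumption a plugin should adapt:

```python
def find_suitable_template(templates, source_lang, target_langs):
    """Return the uid of the first template covering the requested languages.

    `templates` is the `content` array from GET /api2/v1/projectTemplates.
    """
    for t in templates:
        if t.get("sourceLang") == source_lang and \
           set(target_langs) <= set(t.get("targetLangs", [])):
            return t["uid"]
    raise LookupError(f"no template for {source_lang} -> {target_langs}")
```

Raising when nothing matches mirrors the `NotFoundException` branch in the pseudocode: failing loudly is preferable to silently creating a project with the wrong workflow.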

Monitor Job progress

  • List job parts status
  • Call this endpoint to check the status of each job part; it returns the status of every part of the job. There are other options for monitoring job progress, but for simplicity the fast track uses job parts.
  • For the fast track implementation we should implement a simple polling system that sleeps in between calls.
  • Endpoint:
    • GET /api2/v1/projects/{projectUid}/jobs/{jobUid}/parts
    • Path:
      • projectUid<string>
      • jobUid<string>
    • Response: Expect 200 with PageDto of JobParts
        {
            "sort": null,
            "pageNumber": 0,
            "content": [
                {
                    "remoteFile": {
                        "encodedFolder": "<string>",
                        "humanReadableFileName": "<string>",
                        "encodedFileName": "<string>",
                        "humanReadableFolder": "<string>"
                    },
                    "importStatus": {
                        "status": "<string>",
                        "errorMessage": "<string>"
                    },
                    "workflowStep": {
                        "name": "<string>",
                        "id": "<string>",
                        "uid": "<string>",
                        "order": 123,
                        "lqaEnabled": true
                    },
                    "imported": true,
                    "dateCreated": "2023-11-07T05:31:56Z",
                    "serverTaskId": "<string>",
                    "dateDue": "2023-11-07T05:31:56Z",
                    "targetLang": "<string>",
                    "continuous": true,
                    "originalFileDirectory": "<string>",
                    "innerId": "<string>",
                    "owner": {
                        "userName": "<string>",
                        "uid": "<string>",
                        "id": "<string>",
                        "firstName": "<string>",
                        "lastName": "<string>",
                        "role": "SYS_ADMIN",
                        "email": "<string>"
                    },
                    "uid": "<string>",
                    "status": "<string>",
                    "filename": "<string>",
                    "split": true,
                    "sourceFileUid": "<string>",
                    "providers": []
                }
            ],
            "numberOfElements": 123,
            "totalElements": 123,
            "pageSize": 123,
            "totalPages": 123
        }
      

Pseudocode example

This pseudocode example shows how the API calls are made to monitor the status of a job, handling both success and failure cases.
while (time < timeout)
	jobParts[] = GET /jobs/{jobUid}/parts
				-H "Authorization: Bearer accessToken"
				# Expect 200, else attempt retry
	if all jobParts in [COMPLETED]
		return job completed
	elif any jobPart in [FAILED, REJECTED, CANCELLED]
		return job failed
	sleep exponential_backoff_with_jitter_time
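The polling loop above can be written as runnable Python. Here `fetch_parts` is injected (a real implementation would call GET .../jobs/{jobUid}/parts and return the `content` array), and the backoff parameters are illustrative defaults, not values mandated by the API:

```python
import random
import time

TERMINAL_FAIL = {"FAILED", "REJECTED", "CANCELLED"}

def wait_for_job(fetch_parts, timeout=600, base=1.0, cap=60.0, sleep=time.sleep):
    """Poll job parts until all complete, any fails, or the timeout expires."""
    deadline = time.monotonic() + timeout
    attempt = 0
    while time.monotonic() < deadline:
        statuses = [part["status"] for part in fetch_parts()]
        if statuses and all(s == "COMPLETED" for s in statuses):
            return "completed"
        if any(s in TERMINAL_FAIL for s in statuses):
            return "failed"
        # exponential backoff with full jitter, capped at `cap` seconds
        sleep(random.uniform(0, min(cap, base * 2 ** attempt)))
        attempt += 1
    raise TimeoutError("job did not finish before timeout")
```

Injecting `sleep` keeps the loop testable; the jittered backoff also doubles as polite handling of transient 429/5xx responses when combined with a retrying HTTP layer.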

Exporting

After confirming the job has finished processing, we download the file and then update the job status following export.
The download must be performed asynchronously: we first initiate the download, then check when the operation is complete, and finally download the file.
  • Initiate async download of a target file.
    • Given the project and job UIDs, we initiate the download of a target file. This endpoint returns an async request, which we can use to monitor the status of the operation and then download the file given the async request id.
      • We must persist the asyncRequestId as this is used to perform the download operation.
      • See the Recipe for asynchronous requests for more information.
  • Alternatively, we can pass an optional callbackUrl in the body, which will receive a notification when the operation is completed.
  • Endpoint:
    • PUT /api2/v3/projects/{projectUid}/jobs/{jobUid}/targetFile
    • Path:
    • projectUid<String>
    • jobUid<String>
    • Body: (optional)
        {
          "callbackUrl": "<string>"
        } 
      
    • Response: Expect 202 response with AsyncRequest object
          {
            "asyncRequest": {
              "id": "<string>",
              "createdBy": {
                "firstName": "<string>",
                "lastName": "<string>",
                "userName": "<string>",
                "email": "<string>",
                "role": "SYS_ADMIN",
                "id": "<string>",
                "uid": "<string>"
              },
              "dateCreated": "2023-11-07T05:31:56Z",
              "action": "PRE_ANALYSE",
              "asyncResponse": {
                "dateCreated": "2023-11-07T05:31:56Z",
                "errorCode": "<string>",
                "errorDesc": "<string>",
                "errorDetails": [
                  {
                    "code": "<string>",
                    "args": {},
                    "message": "<string>"
                  }
                ],
                "warnings": [
                  {
                    "code": "<string>",
                    "args": {},
                    "message": "<string>"
                  }
                ]
              },
              "parent": "<unknown>",
              "project": {
                "uid": "<string>",
                "name": "<string>"
              }
            },
            "reference": {
              "uid": "<string>",
              "type": "JOB"
            }
          }
      
    • Documentation
  • Download the target file based on an async request.
    • After we confirm the previous operation is complete - we then can download the target file.
    • This call will return the target file with translations.
    • NOTE: the asyncRequestID can only be used once.
    • Endpoint:
      • GET /api2/v2/projects/{projectUid}/jobs/{jobUid}/downloadTargetFile/{asyncRequestId}
      • Path:
        • projectUid<string>
        • jobUid<string>
        • asyncRequestId<string>
      • Query parameters: format: ORIGINAL or PDF
      • Response: Expect 200 response with application/octet-stream
      • Documentation
  • Edit job status
    • We call this endpoint to mark the job as delivered to indicate that the job has been successfully retrieved.
    • Endpoint
      • POST /api2/v1/projects/{projectUid}/jobs/{jobUid}/setStatus
      • Path:
        • projectUid<string>
        • jobUid<string>
      • Body: requestedStatus: the new job status; notifyOwner: a flag to notify the job owner.
            {
                "requestedStatus": "NEW",
                "notifyOwner": true,
                "propagateStatus": true
            }
        
      • Response: Expect 204 response
      • Documentation

Pseudocode example

This pseudocode example shows how the API calls are made to initiate the async download of a target file, monitor its status, and download the target file once it’s ready.
# Initiate async download
asyncRequest = PUT /api2/v3/projects/{projectUid}/jobs/{jobUid}/targetFile
				-H "Authorization: Bearer accessToken"
				# Expect 202 response, handle 4xx, 5xx responses accordingly

# Monitor asyncRequest status
while (time < timeout)
	asyncRequest = GET /api2/v1/async/{asyncRequestId}
				-H "Authorization: Bearer accessToken"
	if asyncRequest.asyncResponse == null
		sleep exponential_backoff_with_jitter_time
		continue
	if asyncRequest.asyncResponse.errorDetails != null
		throw exception
	break  # operation complete

# Download async target file
content = GET /api2/v2/projects/{projectUid}/jobs/{jobUid}/downloadTargetFile/{asyncRequestId}
-H "Authorization: Bearer accessToken"
# Expect 200 response, handle 4xx, 5xx responses accordingly

persist(content)
mapTo3rdPartySystem()

# Update job status
POST /api2/v1/projects/{projectUid}/jobs/{jobUid}/setStatus
-H "Authorization: Bearer accessToken"
-d {"requestedStatus": "DELIVERED"}
# Expect 204 response, handle 4xx, 5xx responses accordingly
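The asyncRequest monitoring step can be sketched in runnable Python. `get_async_request` is injected (a real implementation would call GET /api2/v1/async/{asyncRequestId}); it should return the asyncRequest object, whose asyncResponse field stays null until the operation finishes. Checking errorCode as the failure signal is an assumption consistent with the AsyncResponse model shown earlier:

```python
import random
import time

def wait_for_async(get_async_request, timeout=300, base=1.0, cap=30.0,
                   sleep=time.sleep):
    """Poll an asyncRequest until its asyncResponse appears or timeout expires."""
    deadline = time.monotonic() + timeout
    attempt = 0
    while time.monotonic() < deadline:
        response = get_async_request().get("asyncResponse")
        if response is not None:
            if response.get("errorCode"):
                raise RuntimeError(
                    f"async request failed: {response.get('errorDesc')}")
            return response  # operation complete; safe to download the file
        sleep(random.uniform(0, min(cap, base * 2 ** attempt)))
        attempt += 1
    raise TimeoutError("async request did not complete before timeout")
```

Once this returns, the plugin performs the one-shot downloadTargetFile call; remember that the asyncRequestId can only be used once.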

Fast Track checklist success criteria

  • A user can select content in a third party system.
  • The plugin can create a project in TMS, using a template.
  • The plugin can upload source files and create a job within a project.
  • The plugin can monitor the job status to check for when it is complete.
  • The plugin can download the translated file.
  • The plugin can commit the localized content into the third party system.
  • The plugin is able to mark the job as DELIVERED status.

Golden path (Production ready)

This section covers the golden path to build a production-ready plugin. It builds upon the MVP plugin with additional features that expand its functionality, and explains the different requirements based on the plugin catalogue. It also explores how to make the plugin more robust so that it becomes production ready; developers should be able to use the acceptance criteria to evaluate the quality of the plugin. The key differences in this track are: enabling another method for importing and exporting content to Phrase (manually importing from TMS and pushing content back to the plugin); exposing optional features that can improve the plugin's feature set, and showing how they fit into the user flow; and covering more non-functional requirements to ensure the plugin is production ready.

Building The Golden Path Plugin

Authentication

Firstly, like in the fast track, we must be able to authenticate with Phrase TMS to access the API. There is only one method of authentication; please see the authentication documentation for further details.
  • Users must first generate an API token in the User Profile/Access Tokens tab in Phrase Platform Settings page.
  • Using the generated API token it must then be exchanged for a JWT, using the Phrase platform OAuth token endpoint.
  • Endpoint
    • POST /oauth/token
    • Query parameters:
      • grant_type:urn:ietf:params:oauth:grant-type:token-exchange
      • subject_token : API-TOKEN
    • Response:
      • Expected response: 200
        {
          "access_token":"GENERATED-JWT",
          "issued_token_type":"urn:ietf:params:oauth:token-type:access_token",
          "token_type":"Bearer",
          "expires_in":14399 
        }
        
      • Error response: 400 - includes error code enum and human readable error description. We should handle the error codes correctly and relay the information to the user.
        {
          "error": "enum<string>",
          "error_description": "<string>"
        }
        
    • See documentation for more details.
  • This access token should be saved so that we can use it in later calls - passing it in the HTTP Authorization header in the following format:
    Authorization: Bearer GENERATED-JWT. We should cache the access token and its expiry, and automatically refresh the token when it is near expiry and on any 401 (Unauthorized) response.
  • When the JWT expires, we must exchange the API token again for a fresh one; the expires_in field tells us how long the current token remains valid.
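The caching and refresh behaviour above can be sketched as follows. This is a minimal illustration, not a prescribed implementation: `fetch_token` is a caller-supplied function (hypothetical here) that performs the POST /oauth/token exchange and returns the parsed JSON response shown earlier.

```python
import time

class TokenCache:
    """Caches the exchanged JWT and refreshes it shortly before expiry."""

    REFRESH_MARGIN_SECONDS = 60  # refresh slightly before actual expiry

    def __init__(self, fetch_token):
        self._fetch_token = fetch_token
        self._access_token = None
        self._expires_at = 0.0

    def get(self):
        # Re-exchange when we have no token or it is near expiry.
        if (self._access_token is None
                or time.time() >= self._expires_at - self.REFRESH_MARGIN_SECONDS):
            response = self._fetch_token()
            self._access_token = response["access_token"]
            self._expires_at = time.time() + response["expires_in"]
        return self._access_token

    def invalidate(self):
        """Call on a 401 so the next get() forces a fresh exchange."""
        self._access_token = None
```

On any 401 response, call `invalidate()` and retry the request once with the freshly exchanged token.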

Create a Project

Like the fast track, we must create a project before we begin importing content. A project is a container for related files and jobs; to begin importing content and running jobs, we must create one.
  • Create a project template
    • We create a project template for projects to standardise the workflow
    • This also allows us to apply a template when creating a project and reduces the amount of fields that we need to determine when creating a project from scratch
    • Log into TMS and create a project template
    • Link: TMS page
  • Call the API to retrieve the project templates
    • We call this endpoint to retrieve the list of project templates to select the appropriate template. We can also pass in specific filters in order to find the correct template.
    • We need to persist the TemplateUid of the selected template.
    • Endpoint:
      • GET /api2/v1/projectTemplates
      • Query parameters
        • clientId<Integer>
        • clientName<string>
        • ownerUid<string>
        • domainName<string>
      • Response: Expect 200 response with pagination of project templates
            {
              "totalElements": 123,
              "totalPages": 123,
              "pageSize": 123,
              "pageNumber": 123,
              "numberOfElements": 123,
              "content": [
                {
                  "templateName": "<string>",
                  "sourceLang": "<string>",
                  "targetLangs": [
                    "<string>"
                  ],
                  "id": "<string>",
                  "uid": "<string>",
                  "owner": {
                    "firstName": "<string>",
                    "lastName": "<string>",
                    "userName": "<string>",
                    "email": "<string>",
                    "role": "SYS_ADMIN",
                    "id": "<string>",
                    "uid": "<string>"
                  },
                  "createdBy": {
                    "firstName": "<string>",
                    "lastName": "<string>",
                    "userName": "<string>",
                    "email": "<string>",
                    "role": "SYS_ADMIN",
                    "id": "<string>",
                    "uid": "<string>"
                  },
                  "dateCreated": "2023-11-07T05:31:56Z",
                  "domain": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "subDomain": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "costCenter": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "businessUnit": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  },
                  "projectWorkflowSettings": {
                    "propagateTranslationsToLowerWfDuringUpdateSource": true
                  },
                  "workflowSteps": [
                    {
                      "name": "<string>",
                      "id": "<string>",
                      "uid": "<string>",
                      "order": 123,
                      "lqaEnabled": true
                    }
                  ],
                  "note": "<string>",
                  "client": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>"
                  }
                }
              ],
              "sort": {
                "orders": [
                  {
                    "direction": "ASC",
                    "property": "<string>"
                  }
                ]
              }
            }
        
      • Documentation
  • Create a project from a template
    • This endpoint is used to create a project - we pass the templateUid in as a path parameter to create the project from a template. We must provide the name field in the body. This endpoint then returns the ProjectDto, an object representing the project.
    • Validate the returned ProjectDto to confirm it is as expected.
    • We need to persist the created ProjectUid.
    • Endpoint:
      • POST /api2/v2/projects/applyTemplate/{templateUid}
      • Path: templateUid <String>
      • Body: (Required) name<string>
            {
              "name": "<string>",
              "sourceLang": "<string>",
              "targetLangs": [
                "<string>"
              ],
              "workflowSteps": [
                {
                  "id": "<string>"
                }
              ],
              "dateDue": "2023-11-07T05:31:56Z",
              "note": "<string>",
              "client": {
                "id": "<string>"
              },
              "businessUnit": {
                "id": "<string>"
              },
              "domain": {
                "id": "<string>"
              },
              "subDomain": {
                "id": "<string>"
              },
              "costCenter": {
                "id": "<string>"
              }
            }
        
      • Response: Expect 201 response with ProjectDto
            {
              "uid": "<string>",
              "internalId": 123,
              "id": "<string>",
              "name": "<string>",
              "dateCreated": "2023-11-07T05:31:56Z",
              "domain": {
                "id": "<string>",
                "uid": "<string>",
                "name": "<string>"
              },
              "subDomain": {
                "id": "<string>",
                "uid": "<string>",
                "name": "<string>"
              },
              "owner": {
                "firstName": "<string>",
                "lastName": "<string>",
                "userName": "<string>",
                "email": "<string>",
                "role": "SYS_ADMIN",
                "id": "<string>",
                "uid": "<string>"
              },
              "sourceLang": "<string>",
              "targetLangs": [
                "<string>"
              ],
              "references": [
                {
                  "id": "<string>",
                  "uid": "<string>",
                  "filename": "<string>",
                  "note": "<string>",
                  "dateCreated": "2023-11-07T05:31:56Z",
                  "createdBy": {
                    "firstName": "<string>",
                    "lastName": "<string>",
                    "userName": "<string>",
                    "email": "<string>",
                    "role": "SYS_ADMIN",
                    "id": "<string>",
                    "uid": "<string>"
                  }
                }
              ],
              "mtSettingsPerLanguageList": [
                {
                  "targetLang": "<string>",
                  "machineTranslateSettings": {
                    "id": "<string>",
                    "uid": "<string>",
                    "name": "<string>",
                    "type": "<string>"
                  }
                }
              ],
              "status": "NEW",
              "userRole": "<string>"
            }
        
    • Documentation
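As a sketch of this step, the following function applies the template, validates the returned ProjectDto, and returns the projectUid to persist. The `http_post` helper is a hypothetical abstraction over your HTTP client, assumed to return `(status_code, parsed_json_body)`.

```python
def create_project_from_template(http_post, template_uid, name, access_token):
    """Creates a project from a template and returns the projectUid to persist."""
    status, body = http_post(
        f"/api2/v2/projects/applyTemplate/{template_uid}",
        headers={"Authorization": f"Bearer {access_token}"},
        body={"name": name},  # name is the only required body field
    )
    if status != 201:
        raise RuntimeError(f"project creation failed with HTTP {status}")
    # Validate the returned ProjectDto before persisting its uid.
    if not body.get("uid"):
        raise RuntimeError("ProjectDto missing uid")
    return body["uid"]
```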

Upload Preview Package (Optional)

During the flow we create a preview package before we create the job. When we create a preview package we receive an id that we can pass during job creation. See the recipe for Live preview to see how to implement this feature.

Import Content

For importing content in the golden path we want to support two methods: calling the API, and exposing endpoints that list content so it can be pulled and downloaded from TMS. Calling the API: This is the same as found in the fast track path.
  • Create job
    • This is an API call to create a job within a project. We must provide the projectUid in the path; the source file can be sent in the message body or downloaded from a connector.
    • This call returns the Jobs object and an asyncRequest object.
      • See monitoring job progress and the recipe for asynchronous request for how to handle the asyncResponse.
    • From the Jobs object we want to persist the (job)uid – this would allow us to perform further actions on the job.
    • NOTE: We can pass in a callback URL to receive a notification on the job status. This is different to the webhooks that can be set up globally.
    • NOTE: If using the live preview feature, we want to include the file id here too.
      • jobPreviewPackageFileUidRef in the Memsource header.
      • See the recipe for live preview for further information.
    • NOTE: If using the continuous updating feature we want to include the flag.
      • See the recipe for continuous updates for further information.
    • Endpoint:
      • POST /api2/v1/projects/{projectUid}/jobs
      • Path: projectUid <String>
      • Body: source file (optional)
      • Response: Expect 201 response with Jobs[] and asyncRequest object
            {
              "unsupportedFiles": [
                "<string>"
              ],
              "jobs": [
                {
                  "uid": "<string>",
                  "status": "NEW",
                  "providers": [
                    {
                      "type": "<string>",
                      "id": "<string>",
                      "uid": "<string>"
                    }
                  ],
                  "targetLang": "<string>",
                  "workflowLevel": 123,
                  "workflowStep": {
                    "name": "<string>",
                    "id": "<string>",
                    "uid": "<string>",
                    "order": 123,
                    "lqaEnabled": true
                  },
                  "filename": "<string>",
                  "dateDue": "2023-11-07T05:31:56Z",
                  "dateCreated": "2023-11-07T05:31:56Z",
                  "updateSourceDate": "2023-11-07T05:31:56Z",
                  "imported": true,
                  "jobAssignedEmailTemplate": {},
                  "notificationIntervalInMinutes": 123,
                  "continuous": true,
                  "sourceFileUid": "<string>"
                }
              ],
              "asyncRequest": {
                "id": "<string>",
                "dateCreated": "2023-11-07T05:31:56Z",
                "action": "PRE_ANALYSE"
              }
            }
        
    • Documentation
Listing: An alternative method for importing content is to expose it to Phrase TMS so it can be imported - a pull method driven from TMS. This method requires an endpoint that lists the available content and a GET endpoint that encodes a selected item and returns it to Phrase TMS. This flow works in two parts - list and download.
  • List: Firstly, the list endpoint - a GET endpoint that should list all content that can be exported to TMS and formatted in any relevant directories, given the correct credentials.
    This endpoint should handle multiple error cases and respond with the correct response codes.
  • Download: The download endpoint - a GET endpoint given the id of a specific content should trigger the downloading process of the content. The download process should encode the content to a suitable format for uploading to TMS and return this content within the response body.
    This endpoint should handle multiple error cases and respond with the correct response codes.
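A minimal sketch of the two handlers, framework-agnostic for illustration: each returns `(status_code, payload)`, and the `store` mapping and credential check are assumptions standing in for your content repository and authentication.

```python
import base64

def handle_list(store, credentials_ok):
    """List endpoint: returns the content items that TMS can import.

    `store` is a hypothetical mapping of content id -> (name, raw bytes).
    """
    if not credentials_ok:
        return 401, {"error": "invalid credentials"}
    items = [{"id": cid, "name": name} for cid, (name, _) in store.items()]
    return 200, {"items": items}

def handle_download(store, credentials_ok, content_id):
    """Download endpoint: encodes one content item for upload to TMS."""
    if not credentials_ok:
        return 401, {"error": "invalid credentials"}
    if content_id not in store:
        return 404, {"error": "unknown content id"}
    name, data = store[content_id]
    # Encode the raw bytes so they travel safely in a JSON response body.
    return 200, {"name": name, "content": base64.b64encode(data).decode("ascii")}
```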

Optional: Assign Providers and Analysis

Once a job has been created we are able to assign providers and create analysis on these jobs. Using the jobUid we can use specific endpoints to trigger these actions. See the recipes for assigning providers and for analysis for how to implement.

Monitor Job Progress

For monitoring the job's progress there are three main methods - listing the job parts status, webhooks, and async requests. In the fast track we introduced listing the job part status - however, we recommend using webhooks to receive notifications about job progress or using async requests to check the status, and only using job parts as a fallback.
Method        | Where set up            | Pros                        | Cons / When not to use
Job parts GET | Plugin → TMS            | Simple, no infra required   | Polling, risk of rate limits
Webhooks      | TMS Settings or via API | Push-based, scalable        | Needs public URL & retry logic
AsyncRequest  | Per async API call      | Built-in status & callbacks | Must manage async IDs / polling
Creating and Subscribing to a Webhook: We can create a global, account-wide webhook to subscribe to, which allows us to receive push notifications on job lifecycle events. These include part completed, job completed, async request finished, etc. Webhooks are the recommended production mechanism as they are cheaper than polling; however, we should still implement a fallback in case of missed webhooks, which can be done by polling the async request or the list job parts endpoint. Webhooks consist of two parts: creating and subscribing, and handling the webhook. For further details see the documentation. Creating and Subscribing
  • From the Settings page on TMS, scroll down to the Integrations section and click on Webhooks.
  • From here we can add a new Webhook.
  • Provide a name, URL and security token if required.
    Specify a value to be included in either the x-memsource-token header or the Authorization header; this lets the user choose their preferred method for authenticating the webhook sender.
  • Select events.
  • Click Preview schema to view the webhook response when running.
  • Click Create webhook.
  • The webhook is added to the list on the Webhooks page.
  • We can then subscribe to this webhook and monitor it for any job status change events.
Handling
  • Authenticate / verify the incoming request (token / header).
  • Store the webhook event; to prevent duplicate deliveries, store it against a unique constraint.
  • Respond with a 200 (ACK) response after persisting the event - this confirms to Phrase that the event was received.
  • Then process the event - ideally by adding the webhook event to an async processing queue.
We should add an eventual-correctness fallback that begins polling after a specified amount of time. This can be done by maintaining a list of recently touched jobs with pending asyncRequests and polling them after a specified time frame, e.g. 5 minutes.

Pseudocode example

function handleWebHook(Event event)
  verifyWebhook(event)
  storeEvent(event)
  eventMessage = createEventMessage(event)
  asyncProcessingQueue.push(eventMessage)
  return HttpStatus.200
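The same handling steps can be sketched more concretely in Python. The header name comes from the webhook setup above; the event `id` field and the in-memory set (standing in for a database unique constraint) are assumptions for illustration.

```python
import queue

class WebhookHandler:
    """Verify, store (deduplicated), ACK, then process asynchronously."""

    def __init__(self, expected_token):
        self._expected_token = expected_token
        self._seen_event_ids = set()   # stands in for a DB unique constraint
        self.processing_queue = queue.Queue()

    def handle(self, headers, event):
        # 1. Verify the sender before trusting the payload.
        if headers.get("x-memsource-token") != self._expected_token:
            return 401
        # 2. Store with a uniqueness check so duplicate deliveries are dropped.
        event_id = event["id"]
        if event_id in self._seen_event_ids:
            return 200  # already persisted: ACK again, do nothing
        self._seen_event_ids.add(event_id)
        # 3. Queue for async processing, then ACK with 200.
        self.processing_queue.put(event)
        return 200
```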
For additional hardening of the webhook we should monitor:
  • 4xx / 5xx rates
  • Duplicate event rates
  • Rate of using fallback for finding event updates.
  • (If you are using async processing of webhook events) queue depth and worker error rates
Asynchronous Request: See the recipe for asynchronous requests to see how to implement this.

Exporting

Much like the fast track path - once the job has been completed we want to download the file. In the fast track we introduced pulling the new content from TMS via the Phrase API. However, we can also add an endpoint that lets TMS push content to the plugin; this is normally used alongside the listing content feature, as an alternative to pulling the new content. Pull Content From TMS
  • As found in the fast track section.
  • Initiate async download of target file
    • Given the project and job uids we initiate the download of a target file - this endpoint returns an async request which we can use to monitor the status of the operation, and then download the file given the async request id.
      • We must persist the asyncRequestID as this is used to perform the download operation.
      • See Recipe for asynchronous requests for more information.
  • Alternatively, we can pass an optional callbackUrl in the body, which will receive a notification when the operation is completed.
  • Endpoint:
    • PUT /api2/v3/projects/{projectUid}/jobs/{jobUid}/targetFile
      • Path:
        • projectUid<String>
        • jobUid<String>
      • Body: (optional)
        {
           "callbackUrl":
        }
        
    • Response: Expect 202 response with AsyncRequest object
      {
        "asyncRequest": {
          "id": "<string>",
          "createdBy": {
            "firstName": "<string>",
            "lastName": "<string>",
            "userName": "<string>",
            "email": "<string>",
            "role": "SYS_ADMIN",
            "id": "<string>",
            "uid": "<string>"
          },
          "dateCreated": "2023-11-07T05:31:56Z",
          "action": "PRE_ANALYSE",
          "asyncResponse": {
            "dateCreated": "2023-11-07T05:31:56Z",
            "errorCode": "<string>",
            "errorDesc": "<string>",
            "errorDetails": [
              {
                "code": "<string>",
                "args": {},
                "message": "<string>"
              }
            ],
            "warnings": [
              {
                "code": "<string>",
                "args": {},
                "message": "<string>"
              }
            ]
          },
          "parent": "<unknown>",
          "project": {
            "uid": "<string>",
            "name": "<string>"
          }
        },
        "reference": {
          "uid": "<string>",
          "type": "JOB"
        }
      }
      
  • Download the target file based on an async request
    • After we confirm the previous operation is complete - we then can download the target file.
    • This call will return the target file with translations.
    • NOTE: the asyncRequestID can only be used once.
    • Endpoint:
      • GET /api2/v2/projects/{projectUid}/jobs/{jobUid}/downloadTargetFile/{asyncRequestId}
      • Path:
        • projectUid<string>
        • jobUid<string>
        • asyncRequestId<string>
      • Query parameters: format: ORIGINAL or PDF
      • Response: Expect 200 response with application/octet-stream
      • Documentation
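The pull flow above (initiate, wait, download) can be sketched end to end. The `http` client with `put_json`/`get_json`/`get_bytes` helpers is a hypothetical abstraction returning `(status_code, body)`; real code should sleep with exponential backoff between polls, which is omitted here for brevity.

```python
def export_job(http, project_uid, job_uid, max_polls=30):
    """Pulls a completed job's target file and returns its bytes."""
    # 1. Initiate the asynchronous download of the target file.
    status, body = http.put_json(
        f"/api2/v3/projects/{project_uid}/jobs/{job_uid}/targetFile", {})
    if status != 202:
        raise RuntimeError(f"initiate download failed: HTTP {status}")
    async_id = body["asyncRequest"]["id"]  # persist: needed for the download

    # 2. Wait until the asyncRequest completes (asyncResponse becomes non-null).
    for _ in range(max_polls):
        _, async_request = http.get_json(f"/api2/v1/async/{async_id}")
        async_response = async_request.get("asyncResponse")
        if async_response is not None:
            if async_response.get("errorDetails"):
                raise RuntimeError("async download failed")
            break
    else:
        raise TimeoutError("async request did not complete")

    # 3. Download the completed file. Note: the asyncRequestId is single-use.
    status, content = http.get_bytes(
        f"/api2/v2/projects/{project_uid}/jobs/{job_uid}"
        f"/downloadTargetFile/{async_id}")
    if status != 200:
        raise RuntimeError(f"download failed: HTTP {status}")
    return content
```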
Push Content From TMS
  • This method requires us to expose an endpoint that TMS can call to push completed content into the plugin.
  • This should be a POST endpoint which can be called with the correct identifiers and content.
  • The logic of this endpoint should also handle the exporting process in the third party system - including a decoding step if one is required.
Edit Job Status
Finally, we need to update the job status to mark the job as delivered in Phrase TMS.
  • We call this endpoint to mark the job as delivered to indicate that the job has been successfully retrieved.
    • Endpoint
      • POST /api2/v1/projects/{projectUid}/jobs/{jobUid}/setStatus
      • Path:
        • projectUid<string>
        • jobUid<string>
      • Body: requestedStatus: the new job status. notifyOwner: a flag controlling whether the owner is notified.
            {
                "requestedStatus": "NEW",
                "notifyOwner": true,
                "propagateStatus": true
            }
        
      • Response: Expect 204 response
      • Documentation

Plugin Variants

Feature                        | Canonical | Updating live content | Continuous Localisation
OAuth + refresh                | Must      | Must                  | Must
Create project from template   | Must      | Must                  | Must
Upload job preview package     | Should    | Must                  | Can
Create jobs (project-scoped)   | Must      | Must                  | Must
asyncRequest handling          | Must      | Must                  | Must
Webhooks                       | Must      | Must                  | Must
Idempotent updateSource        | Should    | Can                   | Must
Provider assignment & analyses | Should    | Can                   | Can
Advanced observability         | Must      | Should                | Must

Canonical Plugin (general-purpose)

The golden path covers the canonical plugin - a general-purpose plugin. It covers the minimum requirements of what is expected of the plugin for general use across multiple content types and workflows.
This plugin variant corresponds to the one-off translations in the plugin catalogue.

Updating Live Content Plugin

This plugin variant supports live content that is subject to continual source changes. The plugin must be able to detect changes to source content and update the job accordingly, as well as support the in-editor visual preview. On top of the canonical plugin, the key features it must support are continuous updates due to source changes and live preview. See the recipes for continuous updates and live preview.

Continuous Localisation

This plugin variant supports developer-driven continuous localization, common for automated, high-change pipelines. The key features it must support are idempotency and the ability to handle frequent updates. See the recipe for idempotency to see how to implement it.

Error Handling

This section covers some details that should be added to improve the error handling of the plugin.
  • Retry mechanism for transient HTTP errors (429, 5xx).
    • These responses should be retried with exponential backoff and jitter.
    • Phrase API has a request limit of 6000 requests per minute for logged-in users, for further information about API rate limits see the documentation.
  • 4xx validation errors should be relayed to the end user with actionable messages.
  • 5xx errors should be logged with the relevant error codes and messages. These cases should be handled gracefully and the user should be informed of the error.
  • Add idempotency to avoid duplicate actions.
  • Add alerting on repeat failures.
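The retry behaviour for transient errors can be sketched as a small wrapper. `call()` is a hypothetical function performing one HTTP request and returning `(status_code, body)`; 429 and 5xx responses are retried with exponential backoff plus jitter, while other statuses are returned immediately so 4xx validation errors can be relayed to the user.

```python
import random
import time

def with_retries(call, max_attempts=5, base_delay=0.5, sleep=time.sleep):
    """Retries transient HTTP failures (429 and 5xx) with backoff and jitter."""
    for attempt in range(max_attempts):
        status, body = call()
        if status != 429 and status < 500:
            return status, body  # success or non-retryable 4xx
        if attempt < max_attempts - 1:
            # Exponential backoff with 50-100% jitter to avoid thundering herds.
            delay = base_delay * (2 ** attempt) * (0.5 + random.random() / 2)
            sleep(delay)
    return status, body  # retries exhausted: surface the last response
```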

Documentation and Testing Requirements

This section covers requirements in documentation and testing to make the plugin production ready.
We need to have strong requirements here to ensure safeguards for product stability and trust in the plugin.

Documentation

The documentation of the golden path plugin should be clear and verifiable.
It should cover details such as:
  • An overview of the architecture.
  • How it has implemented the canonical lifecycle.
  • Error handling.
  • Observability.
  • Security methods.
  • Data models.
  • Versioning & Compatibility Policy.

Testing

We have testing requirements to ensure deterministic lifecycle behavior, operational resilience, and production stability. These can be covered in the form of unit, integration, and end-to-end tests. Testing must cover:
  • Happy path.
    • Single file, single locale.
    • Large file or multiple target languages.
    • Life cycle transition tests.
    • Key user flows like creating a project, creating a job, monitoring progress, etc.
  • Error handling.
    • HTTP errors.
    • Validation errors.
    • Internal errors.
  • Security.
  • Failure cases.
    • Authentication issues.
    • Download issues.
    • Idempotency tests if applicable.
  • Common pitfall cases
    • 401/403: invalid token / permissions.
    • 429: rate limit hit.
  • Webhook failure cases.

Logging and Observability Requirements

The plugin must implement structured logging that is enough to support incident investigation, lifecycle tracking, and performance management. Each log must include the following, where applicable.
  • Timestamps.
  • Request id.
  • Correlation id.
  • Project uid.
  • Job uid.
  • Plugin instance id.
  • Endpoint details, HTTP method, response status, duration.
  • Phrase action id.
Expose and monitor key metrics so that we can track the plugin's performance and build dashboards accordingly. This also allows us to define key alerts. Some key metrics include:
  • Job creation success rate.
  • Average job creation latency.
  • Export / import durations.
  • 5xx error rates.
We should be able to define SLO requirements of the plugin.
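A structured log record carrying the fields listed above might look like the following sketch. The field names and the static instance id are illustrative assumptions, not a mandated schema.

```python
import datetime
import json

def log_record(request_id, correlation_id, project_uid, job_uid,
               endpoint, method, status, duration_ms, action_id=None):
    """Builds one JSON log line with the lifecycle-tracking fields."""
    record = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "requestId": request_id,
        "correlationId": correlation_id,
        "projectUid": project_uid,
        "jobUid": job_uid,
        "pluginInstanceId": "plugin-01",  # assumption: configured per instance
        "endpoint": endpoint,
        "httpMethod": method,
        "responseStatus": status,
        "durationMs": duration_ms,
        "phraseActionId": action_id,
    }
    return json.dumps(record)
```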

Recipes / Additional Features Guide

This section includes guidance on adding specific features to the plugin, as well as key information on hardening it. It covers when we need these features, how they change the logic flow, and examples of implementation and expected outcomes.

Live Preview

A commonly requested feature is live preview - an HTML-based visual representation of localized content that is accessible within the editor and navigable by segments. It provides additional context, and because its content updates as segments are localized, it feels live. This feature can be achieved in two ways - via an HTML file or a preview reference package.

HTML File for Localization

  • When creating a job in TMS, you have to specify the file format that you want to localize. If you choose HTML and upload HTML for localization, it will be rendered as a live preview in the editor. This is not necessarily possible for all CMSs as they often organize content into fields that they represent in complicated data structures which would be challenging to convert to HTML and then back.

Upload a Preview Reference Package When Creating a Job

Since using an HTML file is not always possible, we can instead upload a separate preview package.
  • Create a ZIP file with assets (images, CSS, JS) and a single HTML which will be rendered by the editor.
    • The HTML needs to have the assets linked as relative links.
    • Absolute links (https://google.com) and assets linked in this way will be preserved but may not work correctly. This is intended to work around the issue of assets being unavailable via the Internet or requiring authentication for access.
  • Call the following endpoint to upload the preview package. We need to pass the projectUid as a path parameter and the preview package file as the body contents.
  • The endpoint returns the uid of the uploaded preview package.
  • Endpoint:
    • POST /api2/v1/projects/{projectUid}/jobPreviewPackage
    • Path: projectUid <String>
    • Body: Required: file<file>
    • Response: Expect 201 Response
          {
            "uid": "<string>",
            "fileUid": "<string>",
            "fileName": "<string>",
            "organizationRef": {
              "uid": "<string>",
              "name": "<string>"
            },
            "projectUidRef": {
              "uid": "<string>"
            },
            "taskId": "<string>",
            "jobUidRef": {
              "uid": "<string>"
            },
            "createdByRef": {
              "firstName": "<string>",
              "lastName": "<string>",
              "userName": "<string>",
              "email": "<string>",
              "role": "SYS_ADMIN",
              "id": "<string>",
              "uid": "<string>"
            },
            "dateCreated": "2023-11-07T05:31:56Z"
          }
      
    • Documentation
  • When creating the job using the Create Job endpoint (reference), add the id from the previous step as a parameter jobPreviewPackageFileUidRef in the Memsource header. Once the job is created, it should have a live preview created and visible in the editor.
We need to ensure that we are failing gracefully (report errors to the user) if the preview package is malformed.
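Building the ZIP described above can be sketched with the standard library. The `preview.html` file name and the `assets` mapping of relative path to bytes are illustrative assumptions; the key constraint from the text is that the HTML references its assets via relative links.

```python
import io
import zipfile

def build_preview_package(html, assets):
    """Builds an in-memory preview ZIP: one HTML file plus its assets.

    `assets` maps relative paths (as referenced in the HTML) to raw bytes.
    """
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as package:
        package.writestr("preview.html", html)
        for relative_path, data in assets.items():
            package.writestr(relative_path, data)
    return buffer.getvalue()
```

The returned bytes can then be sent as the body of the jobPreviewPackage upload call.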

Asynchronous Request

The Phrase API has support for asynchronous APIs. Some requests will respond with an asyncRequest object: {"asyncRequest": {"id": "<string>", "dateCreated": "2023-11-07T05:31:56Z", "action": ""}}. Using the asynchronous APIs should always be preferred to their synchronous counterparts. When calling synchronous APIs there is a chance of receiving timeout responses when processing large batches of files, or even a single large file; synchronous APIs should ideally only be used for small files and small-scale integrations. After calling an asynchronous API there are two ways of checking the status and retrieving the data - polling and callbacks.

Polling

After calling an asynchronous API, an immediate response is received that includes the request identifier. Use this identifier to check the status of the request by calling the following endpoint and inspecting the asyncResponse field - this field will be null until the request completes or errors.
  • Endpoint
    • GET /api2/v1/async/{asyncRequestId}
    • Path: asyncRequestId <int>
    • Response:
      {
        "id": "<string>",
        "createdBy": {
          "firstName": "<string>",
          "lastName": "<string>",
          "userName": "<string>",
          "email": "<string>",
          "role": "SYS_ADMIN",
          "id": "<string>",
          "uid": "<string>"
        },
        "dateCreated": "2023-11-07T05:31:56Z",
        "action": "PRE_ANALYSE",
        "asyncResponse": {
          "dateCreated": "2023-11-07T05:31:56Z",
          "errorCode": "<string>",
          "errorDesc": "<string>",
          "errorDetails": [
            {
              "code": "<string>",
              "args": {},
              "message": "<string>"
            }
          ],
          "warnings": [
            {
              "code": "<string>",
              "args": {},
              "message": "<string>"
            }
          ],
          "acceptedSegmentsCount": 123
        },
        "parent": "<unknown>",
        "project": {
          "uid": "<string>",
          "name": "<string>"
        }
      }
      
    • Documentation
Because of this, we want to set up a polling system that polls the endpoint until the request is complete - when retrying this endpoint we must use exponential backoff and jitter, and we must abide by the Phrase API limits found in the documentation. Pseudocode example This pseudocode example shows how we can implement the polling logic.
#Polling asyncRequest

#Create job (example async API)
{jobDto, asyncRequest} = POST /api2/v1/projects/{projectUid}/jobs
		-H "Authorization: Bearer accessToken"
		-d {content}
		# Expect 201 response, handle 4xx and 5xx errors

#Poll async status
while (time < timeout)
	asyncRequest = GET /api2/v1/async/{asyncRequestId}
	if asyncRequest.asyncResponse is null
		sleep exponential_backoff_with_jitter_time
		continue
	if asyncRequest.asyncResponse.errorDetails not null
		throw exception
	return asyncRequest.asyncResponse
throw timeout exception
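The same polling loop can be written concretely in Python. `get_async_request()` is a hypothetical helper that performs GET /api2/v1/async/{asyncRequestId} and returns the parsed AsyncRequest JSON; the backoff doubles per attempt, jittered and capped, to respect the API rate limits.

```python
import random
import time

def poll_async_request(get_async_request, timeout_seconds=300,
                       base_delay=1.0, max_delay=30.0, sleep=time.sleep):
    """Polls until asyncResponse is set; raises on errors or timeout."""
    deadline = time.monotonic() + timeout_seconds
    attempt = 0
    while time.monotonic() < deadline:
        async_request = get_async_request()
        async_response = async_request.get("asyncResponse")
        if async_response is not None:
            if async_response.get("errorDetails"):
                raise RuntimeError(
                    f"async request failed: {async_response['errorDetails']}")
            return async_response
        # Exponential backoff, capped, with 50-100% jitter.
        delay = min(max_delay, base_delay * (2 ** attempt))
        sleep(delay * (0.5 + random.random() / 2))
        attempt += 1
    raise TimeoutError("async request did not complete before timeout")
```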

Callback

In response to the drawbacks of the polling approach, callbacks are supported in all asynchronous APIs. When making an asynchronous request, specify a URL (in the callbackUrl parameter) that is called after the work initiated by the asynchronous request is complete. Callbacks are delivered via HTTP POST calls, with the data passed in the body encoded as JSON. The JSON object always contains:
  • Information about the asynchronous request (the same as when calling getAsyncRequest).
  • Detailed information about the result of the action, such as a full analysis or job details.
{
  "asyncRequest": {
    ...
  },
  "analyse": {
    ...
  }
}
If the callback URL is not accessible, the request is repeated after 2, 4, 8, 16, and 30 minutes until 10 retries have failed. The callback URL must respond with the 200 OK HTTP status code to be considered successful.

Pseudocode example

#Callback method
#Create job (example async API)
jobDto = POST /api2/v1/projects/{projectUid}/jobs
		-H "Authorization: Bearer accessToken"
		-d {content, callbackUrl}
		# Expect 201 response, handle 4xx and 5xx errors


#Callback endpoint
@PostMethod
callbackMethod(String bodyJson){
	asyncResponse = parse(bodyJson, AsyncResponse.class)
	handle(asyncResponse)
	return 200
}
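The same idea in runnable Python, using only the standard library's http.server. The payload shape follows the JSON shown above; `handle_async_callback` is an illustrative helper name, not part of any Phrase SDK.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_async_callback(body_json: str) -> int:
    """Parse a Phrase callback body and return the HTTP status to respond with.

    Phrase retries the callback unless it receives 200 OK, so respond 200
    even when the async request itself reports an error, and route the
    error to your own error handling instead.
    """
    payload = json.loads(body_json)
    async_response = payload.get("asyncRequest", {}).get("asyncResponse") or {}
    if async_response.get("errorDetails"):
        # Record the failure (printing shown here as a stand-in for logging).
        print("Async request failed:", async_response["errorDetails"])
    return 200

class CallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read exactly Content-Length bytes of the POSTed JSON body.
        length = int(self.headers.get("Content-Length", 0))
        status = handle_async_callback(self.rfile.read(length).decode("utf-8"))
        self.send_response(status)
        self.end_headers()
```

To serve it: `HTTPServer(("", 8080), CallbackHandler).serve_forever()`, with the matching public URL passed as callbackUrl when creating the job.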

Assign Providers

This feature assigns linguists or MT providers from a template to the job. To do so, call the following endpoint with a template UID and project UID; this assigns the providers defined in the template to the project.
  • Endpoint:
    • POST /api2/v1/projects/{projectUid}/applyTemplate/{templateUid}/assignProviders
    • Path:
      • templateUid <String>
      • projectUid <String>
    • Response: Expect 200 response
      {
        "jobs": [
          {
            "uid": "<string>",
            "status": "NEW",
            "providers": [
              {
                "type": "<string>",
                "id": "<string>",
                "uid": "<string>"
              }
            ],
            "targetLang": "<string>",
            "workflowLevel": 123,
            "workflowStep": {
              "name": "<string>",
              "id": "<string>",
              "uid": "<string>",
              "order": 123,
              "lqaEnabled": true
            },
            "filename": "<string>",
            "dateDue": "2023-11-07T05:31:56Z",
            "dateCreated": "2023-11-07T05:31:56Z",
            "updateSourceDate": "2023-11-07T05:31:56Z",
            "imported": true,
            "jobAssignedEmailTemplate": {},
            "notificationIntervalInMinutes": 123,
            "continuous": true,
            "sourceFileUid": "<string>"
          }
        ]
      }
      
  • Documentation
Alternatively, we can assign the provider to specific jobs instead of the overall project - see documentation.
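As a sketch of how the call could be constructed (using Python's urllib rather than any official SDK; the base URL and token are placeholders):

```python
import urllib.request

def build_assign_providers_request(base_url, access_token,
                                   project_uid, template_uid):
    """Build the POST request that assigns providers from a template
    to a project. Returns a urllib.request.Request; the caller sends it
    with urllib.request.urlopen and should expect a 200 response.
    """
    url = (f"{base_url}/api2/v1/projects/{project_uid}"
           f"/applyTemplate/{template_uid}/assignProviders")
    return urllib.request.Request(
        url,
        method="POST",
        headers={"Authorization": f"Bearer {access_token}"},
    )
```

Separating request construction from sending makes the URL and header logic easy to unit-test without network access.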

Analysis

We can initiate analysis on a job to perform word counts and other forms of analysis. Call the following endpoint with the job UIDs and analysis options to trigger the analysis; see the relevant documentation for the full range of options.
  • Endpoint:
    • POST /api2/v1/analyses/byProviders
    • Body:
    • Required: jobs[]
      {
        "jobs": [
          {
            "uid": "<string>"
          }
        ],
        "type": "PreAnalyse",
        "includeFuzzyRepetitions": true,
        "separateFuzzyRepetitions": true,
        "includeConfirmedSegments": true,
        "includeNumbers": true,
        "includeLockedSegments": true,
        "countSourceUnits": true,
        "includeTransMemory": true,
        "includeNonTranslatables": true,
        "includeMachineTranslationMatches": true,
        "transMemoryPostEditing": true,
        "nonTranslatablePostEditing": true,
        "machineTranslatePostEditing": true,
        "name": "<string>",
        "netRateScheme": {
          "id": "<string>"
        },
        "compareWorkflowLevel": 8,
        "useProjectAnalysisSettings": true,
        "callbackUrl": "<string>"
      }
      
    • Response: Expect 200 response
      {
        "analyses": [
          {
            "asyncRequest": {
              "id": "<string>",
              "createdBy": {
                "firstName": "<string>",
                "lastName": "<string>",
                "userName": "<string>",
                "email": "<string>",
                "role": "SYS_ADMIN",
                "id": "<string>",
                "uid": "<string>"
              },
              "dateCreated": "2023-11-07T05:31:56Z",
              "action": "PRE_ANALYSE",
              "asyncResponse": {
                "dateCreated": "2023-11-07T05:31:56Z",
                "errorCode": "<string>",
                "errorDesc": "<string>",
                "errorDetails": [
                  {
                    "code": "<string>",
                    "args": {},
                    "message": "<string>"
                  }
                ],
                "warnings": [
                  {
                    "code": "<string>",
                    "args": {},
                    "message": "<string>"
                  }
                ],
                "acceptedSegmentsCount": 123
              },
              "parent": "<unknown>",
              "project": {
                "uid": "<string>",
                "name": "<string>"
              }
            },
            "analyse": {}
          }
        ]
      }
      
    • Documentation
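A minimal sketch of building the request body: only `jobs` is required, and the other analysis options shown above can be merged in by the caller. The helper name is illustrative.

```python
import json

def build_analysis_payload(job_uids, callback_url=None, name=None):
    """Build a minimal body for POST /api2/v1/analyses/byProviders.

    `jobs` is the only required field; callbackUrl lets Phrase notify us
    when the (asynchronous) analysis completes instead of polling.
    """
    payload = {"jobs": [{"uid": uid} for uid in job_uids]}
    if callback_url:
        payload["callbackUrl"] = callback_url
    if name:
        payload["name"] = name
    return json.dumps(payload)
```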

Notifications

We can add logic to notify assigned users about a new job. Call the following endpoint to trigger notifications for the given jobs using a specific email template.
  • Endpoint:
    • POST /api2/v1/projects/{projectUid}/jobs/notifyAssigned
    • Path parameters: projectUid <String>
    • Body:
      {
        "jobs": [
          {
            "uid": "<string>"
          }
        ],
        "emailTemplate": {
          "id": "<string>"
        },
        "cc": [
          "<string>"
        ],
        "bcc": [
          "<string>"
        ]
      }
      
    • Response: Expect 204 in the successful case
    • Documentation

Continuous Updates

This feature automatically detects content changes, sends the changed content for translation, and syncs it back - this occurs in small segments instead of larger batches.
This works by setting up a system that detects what has changed and when. The changed content is then pushed to the API as grouped jobs or, if attached to a continuous project, as continuous updates.
Automatic delivery is performed after the translation process is complete - the files are pushed back via webhooks or APIs.

Why we want this

Companies may release frequently across several languages and have large, continuously updating content bases. Continuous localization enables faster go-to-market, less PM overhead, and better scalability.

How to build

  • Create logic to monitor content for changes.
  • Call the following Phrase endpoint. We need to provide the projectUid in the path and the file name in the Content-Disposition header. The job file goes directly in the body, and we must also provide metadata - the jobUids reference and any optional settings. Note: see the endpoint documentation for job restrictions and additional settings.
  • Endpoint
    • POST /projects/{projectUid}/jobs/source
    • Headers: Content-Disposition <String>
    • Path: projectUid <String>
    • Body: file<object>
      • Metadata
        {
          "jobs": [
            {
                "uid": "jobIn1stWfStepAndNonFinalStatusUid"
            }
          ],
          "preTranslate": false,
          "allowAutomaticPostAnalysis": false,
          "callbackUrl": "https://my-shiny-service.com/consumeCallback"
        }
        
    • Response: Expect 200
      {
        "asyncRequest": {
          "id": "<string>",
          "dateCreated": "2023-11-07T05:31:56Z",
          "action": "PRE_ANALYSE"
        },
        "jobs": [
          {
            "uid": "<string>",
            "status": "NEW",
            "targetLang": "<string>",
            "filename": "<string>",
            "workflowLevel": 123,
            "workflowStep": {
              "name": "<string>",
              "id": "<string>",
              "uid": "<string>",
              "order": 123,
              "lqaEnabled": true
            }
          }
        ]
      }
      
    • Documentation
  • We can view the status of the job in multiple ways.
    • Callback URL - we can set a callback URL in the metadata and listen for the callback.
    • Asynchronous request - the response includes an asynchronous request which we can monitor using the API to check if the job is complete.
      • See the recipe for handling asynchronous requests.
    • Webhook - we can set up a webhook to receive updates and listen for the completed job.
      • See the golden path, monitoring job progress, for more information.
  • After the job is completed, we want to download the new file.
  • Listen to the TMS webhook for job completion; when the job is complete, download the target file and commit it back to the third-party system.
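The change-monitoring step above can be sketched with content hashing: re-send a document only when its hash differs from the last version pushed. The class and method names are illustrative, and in production the hash map would live in durable storage.

```python
import hashlib

class ChangeDetector:
    """Track content hashes so only changed documents are re-sent
    for translation."""

    def __init__(self):
        self._hashes = {}  # document id -> last-seen content hash

    def changed(self, doc_id: str, content: bytes) -> bool:
        """Return True (and remember the new hash) if the content differs
        from what was last pushed for this document."""
        digest = hashlib.sha256(content).hexdigest()
        if self._hashes.get(doc_id) == digest:
            return False
        self._hashes[doc_id] = digest
        return True
```

On each sync cycle, call `changed()` for every monitored document and POST only the True ones to /projects/{projectUid}/jobs/source.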

Reporting

  • To help us provide reporting on the impact and value that your plugin provides our customers, please consider adding the following. When using the plugin to call the TMS endpoint POST /api2/v1/projects/{projectUid}/jobs to create a job, you can append a sourceData object to the JSON in the Memsource header:
{
  "targetLangs": [
    "de"
  ],
  "sourceData": {
    "clientType": "MY_PLUGIN",
    "clientVersion": "1.2.4",
    "hostVersion": "4.5.2"
  }
}
clientType identifies the plugin type, clientVersion holds the version of your plugin, and hostVersion holds the version of the third-party system. Only clientType is required; clientVersion and hostVersion are optional and should only be used if relevant. The value used in clientType needs to be registered with Phrase - please reach out to us to register your plugin as a recognized client.
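A small sketch of assembling that metadata, reusing the MY_PLUGIN placeholder from the example above; the helper name is illustrative.

```python
import json

def build_job_create_metadata(target_langs, client_type,
                              client_version=None, host_version=None):
    """Build the Memsource-header JSON for job creation, including the
    sourceData block used for plugin reporting. Optional fields are
    omitted entirely rather than sent empty."""
    source_data = {"clientType": client_type}
    if client_version:
        source_data["clientVersion"] = client_version
    if host_version:
        source_data["hostVersion"] = host_version
    return json.dumps({"targetLangs": target_langs,
                       "sourceData": source_data})
```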

Idempotency

We should add idempotent behavior to the plugin for all create and delivery operations, especially if retry operations have been implemented. This helps prevent duplicate actions and inconsistent statuses, which could make the plugin unreliable. Key operations to make idempotent are:
  • Creating a project
  • Uploading a file
  • Creating a job
  • Exporting a job
  • Updating a job status
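One common way to get this behavior is an idempotency-key store: derive a stable key from the operation's inputs and run the operation only if that key has not been seen. This is a minimal sketch; in production the mapping would live in durable storage, not an in-memory dict.

```python
class IdempotencyStore:
    """Remember which operations have already run so retries become no-ops."""

    def __init__(self):
        self._results = {}  # idempotency key -> result of the first run

    def run_once(self, key: str, operation):
        """Run `operation` only if `key` has not been seen; otherwise
        return the stored result of the first run."""
        if key not in self._results:
            self._results[key] = operation()
        return self._results[key]
```

For example, a job-creation key could combine the project UID and a hash of the uploaded content, so a retried upload returns the already-created job instead of creating a duplicate.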

Additional Resources

This section covers additional content that supports the developer's ability to create the plugin.
  • Postman collection. A Postman collection has been created with all the relevant requests to the Phrase API, including headers and example payloads. Developers can use it as a sandbox for validation, testing, and learning about the API, and as a reference when developing the plugin.

Reference Appendix

Glossary

  • Project - Projects hold the main components of a translation project (jobs, translation memories, and term bases). Files must be assigned to a project before translation.
  • Project Template - A project template is used in project creation. It maintains repeating project criteria for regular clients, projects shared with vendors, or projects created via the submitter portal. Templates streamline submission and reduce errors.
  • Job - A job represents a file for translation into specified target language(s). Translating a single file into two target languages creates two jobs: Job 1 is the translation into English; Job 2 is the translation into Italian.
  • Workflow - Workflow represents the document's life cycle in translation; a sequence of steps leading to the desired output. Workflow steps have a hierarchical order. Editing a segment in one step propagates it to higher steps.
  • XLIFF - XLIFF stands for XML Localization Interchange File Format. It is a standard XML format understood by localization providers and preferred for data exchange in the translation industry.
  • Locale - Locale combines language and geographic region (usually a country) where terminology adapts for specific content and design. Source is the original language; Target is the translated language.
  • Machine translation - Machine translation (MT) is software that translates content without human intervention. It provides quick translations but lacks the quality of human translation.
  • Markup language - Markup language includes web languages (HTML, XML, XHTML) that specify formatting, layout, and style using <tags>. For example, <b>this phrase is bold</b>. Translators translate the content inside the tags.
  • Segment(ation) - Segmentation splits translations into smaller, relevant chunks. It improves translator efficiency, enriches translation memory, and enhances localization of longer texts.
  • Translation memory - Translation memory (TM) is a database storing previous translations; it suggests translations for future projects to save time.

Endpoint Glossary

Action - Method and endpoint - Documentation
  • End-user authentication - POST /api2/v1/auth/login - Link
  • List project templates - GET /api2/v1/projectTemplates - Link
  • Get project template details - GET /api2/v1/projectTemplates/ - Link
  • Create project from template - POST /api2/v2/projects/applyTemplate/ - Link
  • Create job - POST /api2/v1/projects//jobs - Link
  • Assign providers from template - POST /api2/v1/projects//applyTemplate//assignProviders - Link
  • Notify assigned users - POST /api2/v1/projects//jobs/notifyAssigned - Link
  • List job parts status - GET /api2/v1/projects//jobs//parts - N/A
  • Initiate async download - PUT /api2/v3/projects//jobs/targetFile - Link
  • Download target file - GET /api2/v2/projects//jobs//downloadTargetFile/ - Link
  • Edit job status - POST /api2/v1/projects//jobs//setStatus - Link
  • Upload a preview reference package when creating a job - POST /api2/v1/projects//jobPreviewPackage - Link
  • Get async request status - GET /api2/v1/async/ - Link
  • Assign analysis - POST /api2/v1/analyses/byProviders - Link
  • Update source content - POST /projects//jobs/source - Link
  • Create project from scratch - POST /web/api2/v3/projects - Link
  • Create webhook - POST /web/api2/v2/webhooks - Link
  • Get job details - GET /web/api2/v1/projects//jobs/ - Link
  • Get job workflow step - GET /web/api2/v1/projects//jobs//workflowStep - Link. Note: this endpoint only returns which workflow steps the project contains; the workflowLevel property in the response indicates whether a step is the 1st, 2nd, 3rd, … (the biggest is the last one).
  • List projects - GET /web/api2/v1/projects - Link
  • Get project details - GET /web/api2/v1/projects/ - Link

Error Codes

  • 401 Unauthorised - Authentication failed: the request lacked the necessary authentication tokens or the provided token is invalid. Solution: include the API token correctly in the Authorization header. If using OAuth, verify the token has not expired and the OAuth flow is correctly implemented.
  • 403 Forbidden - The authenticated user lacks permission to access the requested resource. Solution: check user permissions and roles in Phrase TMS. Ensure API calls originate from a user with adequate privileges.
  • 404 Not Found - The requested resource does not exist. Solution: verify the endpoint URL and resource identifiers (e.g., project ID, job ID) are correct.
  • 429 Too Many Requests - The API request rate limit has been exceeded. Solution: implement exponential backoff in retry logic. Ensure requests conform to rate limits by sending smaller batches spaced appropriately.
  • 500 Internal Server Error - An unexpected server error occurred. Solution: this is often temporary; implement retry logic with exponential backoff. If persistent, contact Phrase support with request details.
  • 400 Bad Request - The request was invalid or cannot be served. Solution: verify request parameters, headers, and body conform to API specifications. Common issues include missing required parameters or incorrect types.
  • 503 Service Unavailable - The server is unavailable due to overload or maintenance. Solution: implement retry logic with exponential backoff. Check the Phrase status page or contact support for service availability.