Posts tagged processes

LiveCycle ES2: XMLForm.exe terminated abnormally with error code {3}

Issue

If you are using LiveCycle to process PDF documents you may encounter problems displaying/converting forms or PDF documents, accompanied by exceptions similar to the following in the server log:

ProcessResour W com.adobe.service.ProcessResource doProcessExitCleanup BMC024: Service XMLFormService: Process ProcessResource@f1f45(name=XMLForm.exe,pid=0) terminated abnormally with error code {3}

XMLFormAgentW E com.adobe.livecycle.formsservice.logging.FormsLogger logMessage ALC-OUT-002-004: Unable to find service: XMLFormService, error: Connection to failed service.

---

Read the complete post at David's Blog.

Process ID value comes up as “-1”

- Ameeth Palla, Technical Account Manager @ Adobe

Issue: Recently I worked on an issue for a customer who reported a problem with the Process ID value becoming “-1”.

Scenario: Customer had a process where they were naming the output file using a combination of “Process ID + time-stamp”. They reported that the value of the output filename was coming up as “-1+time-stamp” for some files.

Troubleshooting: Upon reviewing their process, the design was as follows: a long-lived process takes an input from a watched-folder endpoint and checks whether the incoming file is a TIFF, MS Word document, or XML data file. If the input was any format other than XML, the file would be converted to PDF and the output file named “Process-ID+timestamp” and stored in a directory. If the input file was an XML data file, it would be sent to a sub-process, merged with a form template, flattened, and then named “Process-ID+timestamp” and stored in the same output directory. By running some tests, we could see that the file name would come out as “-1+timestamp” only when the input file was an XML data file.

---

Read the complete post at Adobe LiveCycle Blog.

Short Lived or Long Lived, that is the question.

Chris Trubiani

I’ve seen on a few occasions that there is some confusion about the differences between short lived and long lived processes and when it’s appropriate to use one process type vs the other.  In this post I’ll talk about how the two work, some of the differences between them, and a bit about when I choose to use one or the other.  In the end, you, as the developer, will know your application and be in the best position to decide which is right for you.  Hopefully, this post will arm you with enough knowledge to make the best informed decision you can.

So what is the difference between the two?  A generic description you’ve probably already heard (or inferred from the names) is that short lived processes are for doing things that will complete quickly (or short running processes) and long lived processes are for doing things that will take a long time to complete (or long running processes).  However, there’s a lot more to it than just that.

Note: This post assumes the user has a license for Process Management.  Without this, use of long lived processes is restricted by the EULA.

Invocation Modes

The first thing to talk about is how these two process types are usually invoked.

Short Lived:  By far the most common way to invoke short lived processes is synchronously.  In this case the caller invokes the process and waits until a response or output is returned from the process.  However, it is also possible to asynchronously invoke a short lived process.  You may have seen this termed “fire and forget” in developer tooling like Workbench.  In this case the caller invokes the process and then moves on without waiting for a response.

Long Lived:  Long lived processes have only one method of invocation, and that is asynchronously.  This is one differentiating factor: if you need a synchronous invocation of a process then you must use a short lived process.
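To make the two invocation styles concrete, here is a minimal sketch using the LiveCycle Java client SDK invocation API (ServiceClientFactory / InvocationRequest).  The endpoint, credentials, process names and variable names are placeholders you would replace with your own; the point to notice is the boolean passed to createInvocationRequest, which selects synchronous vs asynchronous invocation.

import java.util.HashMap;
import java.util.Map;
import java.util.Properties;
import com.adobe.idp.dsc.InvocationRequest;
import com.adobe.idp.dsc.InvocationResponse;
import com.adobe.idp.dsc.clientsdk.ServiceClientFactory;
import com.adobe.idp.dsc.clientsdk.ServiceClientFactoryProperties;

public class InvokeProcessSketch {
    public static void main(String[] args) throws Exception {
        // Connection settings for the LiveCycle server (all values are placeholders).
        Properties props = new Properties();
        props.setProperty(ServiceClientFactoryProperties.DSC_TRANSPORT_PROTOCOL,
                ServiceClientFactoryProperties.DSC_EJB_PROTOCOL);
        props.setProperty(ServiceClientFactoryProperties.DSC_DEFAULT_EJB_ENDPOINT, "jnp://localhost:1099");
        props.setProperty(ServiceClientFactoryProperties.DSC_SERVER_TYPE, "JBoss");
        props.setProperty(ServiceClientFactoryProperties.DSC_CREDENTIAL_USERNAME, "administrator");
        props.setProperty(ServiceClientFactoryProperties.DSC_CREDENTIAL_PASSWORD, "password");
        ServiceClientFactory factory = ServiceClientFactory.createInstance(props);

        Map<String, Object> params = new HashMap<String, Object>();
        params.put("inputValue", "some input");            // hypothetical process input variable

        // Synchronous invocation of a short lived process: the final 'true' means the call
        // blocks in this thread until the process finishes and returns its output.
        InvocationRequest syncReq =
                factory.createInvocationRequest("MyApp/MyShortLivedProcess", "invoke", params, true);
        InvocationResponse syncResp = factory.getServiceClient().invoke(syncReq);
        Object result = syncResp.getOutputParameter("outputValue");   // hypothetical output variable

        // Asynchronous ("fire and forget") invocation: the final 'false' returns immediately;
        // long lived processes can only be invoked this way.
        InvocationRequest asyncReq =
                factory.createInvocationRequest("MyApp/MyLongLivedProcess", "invoke", params, false);
        factory.getServiceClient().invoke(asyncReq);
    }
}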

Threading Model

The discussion of how things look at thread level actually depends on the invocation mode being used (synchronous vs asynchronous) and not strictly on short vs long lived processes.  However, since the vast majority of short lived processes are synchronously invoked and all long lived processes are asynchronously invoked you can almost draw a 1-1 relationship between thread behavior and process type.

In synchronously invoked processes what you will see is that the execution of the process will occur completely within the same thread as the invocation occurred.  This leads to a couple of interesting notes:

  1. There is more than one thread pool involved.  Since the thread where process execution occurs depends on the thread in which the invocation occurred, there are multiple possible pools from which the thread can come.  For example, if you invoke the process via a SOAP request, you’ll be using a thread for process execution from the application server’s web thread pool.  However, if you were to invoke the process using Workbench, the process execution would occur in a thread from the application server’s work manager thread pool.
  2. For the entire lifecycle of the invocation the thread being used is never released back into the pool.  What does this mean?  It means that everything from the work done prior to invocation, through the invocation itself and the process execution, to whatever occurs after the invocation returns all happens in the same thread, and that thread is not available to be used for anything else.  If you follow the general guideline that short lived processes should only be used for processing that doesn’t take much time, this isn’t likely to cause you any problems.  However, there is no technical limitation stopping you from using a short lived process for something that actually takes a long time to process.  For example, you might have a short lived process that takes 4 hours to finish.  In that case the thread you are using is unavailable for other work for that whole time.  As you might imagine, if you throw enough load at the system doing this, it would not be very difficult to exhaust the entire thread pool being used, and effectively “bring down” a portion of the application server.

For asynchronous invocations the process execution does not occur within the thread where the invocation occurs.  The process execution occurs inside another separate thread (one from the workmanager thread pool in ES2).  On top of that any asynchronous steps that are hit during execution of the process will result in the thread being released back into the pool for use.  This happens because the invocation of the asynchronous step’s operation is done asynchronously as well.

As you can see, while synchronous invocations have the advantage of not incurring the overhead of switching threads, asynchronous invocations provide a better capacity for sharing system resources across the entire load on the system, and there are times when this is preferable to simply choosing the more efficient route.

Note: A common misconception is that each step in a long lived process runs in an asynchronous manner.  This has not been the case since version 8 where synchronous branches are used as the default for long lived processes.  This means that the execution of the process will occur in a synchronous fashion, within the same thread, until an asynchronous step is hit.  After the asynchronous step is done the process execution will again continue in a synchronous fashion until the next asynchronous step is hit.

Transactional Behavior and Database Use

The next thing to look at is the transactional differences between short lived and long lived processes and also their use of the database. 

Short lived process execution occurs entirely within one transaction.  This means that if anything goes wrong during execution of the process the entire transaction and all the steps that occurred previously will be rolled back.  Because of this, “retrying” a short lived process literally means invoking it again from scratch.

Note: There are further options for the transactional behavior of short lived processes that can be configured.  For example, whether to allow the process execution to occur within a parent transaction, or whether to force the creation of a new transaction to encompass the process execution.  I won’t go into these details as it could be a topic for a post all on its own, but will instead focus on the general behavior.

There is no default use of the database by short lived processes.  So unless there is a step in the process that explicitly writes to the database then nothing is written into it.

Long lived processes will have 1 transaction created and committed/rolled back per step in the process.  Unlike with short lived processes, when something goes wrong in a long lived process it is only the current step that gets rolled back and not the entire process.  When this occurs the step is then marked as stalled in the workflow engine, allowing you to go back and retry the step at a later date should you so desire.

This is made possible because long lived processes track the values of their process variables at each step in the process within the database.  This generally occurs within a database table named tb_pt_process_name, where process_name is the actual name of the process.  This means that there is additional overhead associated with long lived processes, coming from their need to maintain state inside the database.

Because of the transaction behavior of these two process types, there may be times when it is more desirable to use a long lived process even though it may not be strictly required.  Imagine a situation where a process loops over a dataset that is passed in as input and performs some operations on each item in the set.  Let’s say for each item the operations performed require 30 seconds of processing time, and the number of items in the dataset can be between 1 and 10000 items (numbers may be exaggerated to illustrate the point :) ).  In that case the running time of this process will be between 30 seconds and 83.33 hours.  Normally the default transaction timeout value for an application server is around 5 minutes.  You can change the timeout value for the short lived process specifically, but in this case setting it to 84 hours wouldn’t be something I’d recommend.  Here it would be better to use a long lived process, where you don’t have to worry about transaction timeout values.

So which should you choose?

Should you choose to design your process as a short lived process or long lived process?  In the end, it’s up to you, but when I’m faced with this decision here’s what I do:

I always design a process as short lived, unless I have to make it long lived.

Here are some examples of times when I’d decide I have to make it long lived:

  • The process contains an asynchronous step (Wait and User services are the two that come to mind out of the box).
  • When the process’s execution time is significant and you want to avoid using up threads in the thread pool used by the invoker.
  • When the process’s execution time is completely variable and unpredictable and you want to avoid setting an extremely high transaction timeout value.
  • When you want the granularity to stall and retry on a step-by-step basis in the process.
  • When you want a data trail of the process variables to be kept in the database for tracking or reporting (this brings up another question of whether to use the data we store for reporting or to design your own reporting mechanism, which is outside the scope of this post.  In general, my recommendation would be not to use our data for your reporting; it likely isn’t structured in a way you’ll find suitable).

Hopefully the information presented here will be helpful and provide you with the necessary background to make your decisions on process design and whether to use a short lived or long lived process.

——-
Original article at http://blogs.adobe.com/livecycle/2011/05/short-lived-or-long-lived.html.

Short-lived process and the utilization of native processes

LiveCycle uses native processes for Forms- and Output-related operations; you can find more on this specific topic here.

This blog post will go into detail about how short-lived processes utilize these native processes and how you can optimize this.

To demonstrate this, I am using the following process.

It has a setValue operation, then a renderPDFForm operation and it ends with an executeScript.

The executeScript has the following contents:

System.out.println("After the render operation");

In this process, renderPDFForm is using the native process XMLForm. If there is no running XMLForm process when you invoke this process, it will start a new one. You can also kill the existing processes and then invoke the process, and you will see that the process is started again.

By default you will have a maximum of 4 XMLForm processes running on your system, and these are allocated via a pooling mechanism.

To get more information on this allocation/deallocation process, apply the following setting to your JVM:

-Dcom.adobe.bmc.tracePoolerActivity=true

When you now invoke the process, you will see something like this in the log:

[com.adobe.service.ResourcePooler] *****http-0.0.0.0-8080-2 trying to obtain lock on com.adobe.service.ResourcePooler@1cf0bb for resource allocation
[com.adobe.service.ResourcePooler] *****http-0.0.0.0-8080-2 allocated ProcessResource@f6b087(name=XMLForm.exe,pid=1392), Pool: {ProcessResource@f6b087(name=XMLForm.exe,pid=1392)=true}, initializing=false, poolsize=1, max=4
[STDOUT] After the render operation
[com.adobe.service.ResourcePooler] *****http-0.0.0.0-8080-2 deallocated ProcessResource@f6b087(name=XMLForm.exe,pid=1392), Pool: {ProcessResource@f6b087(name=XMLForm.exe,pid=1392)=false}, initializing=false, poolsize=1, max=4

So in normal English this would be:
1. Looking for an available XMLForm process
2. Allocating process with pid=1392 for this short-lived process
3. The message from the executeScript
4. Deallocating the process, and returning it back to the pool

At first sight you would say this is working as expected, and it is.

But the thing to notice is that the deallocation takes place AFTER the executeScript.

In a short-lived process the allocation of the native process is maintained for the whole transaction; this means the native process is only returned to the pool when the short-lived process has completed.

Of course, in our example a simple executeScript will not extend the duration of the lock by much, but imagine you send out an email and afterwards execute a web service call.

Having these steps in your process will keep the lock held longer than it needs to be.

What is the impact?

If you are locking the native process longer than needed, you can end up reaching the maximum number of native processes in use. When that happens, any subsequent operations that use the native process will have to wait until a native process is returned to the pool.

In that case what you will see is that processes take longer to complete, but you won’t see an increase in CPU usage on the machine.

This locking effect on native processes can also lead to a deadlock.  Consider the following process:

Here the process actually uses the XMLForm native process twice: once to render a form, and then, conditionally, an XMLForm native process may be used indirectly to carry out a flatten operation on the form.  What can happen in this instance is that a lock is first obtained during the render; then, when the flatten operation is performed indirectly by Assembler, it will attempt to find another, different XMLForm native process that is not already locked.  When multiple instances of this process run concurrently, all available native processes can be locked in the first step, with none available to carry out the second step, and a deadlock will result.  When this happens the deadlock is eventually broken when the transaction times out.

What is the solution?

What we want to achieve is to return the native process to the pool as soon as possible, for example as soon as the render operation has completed.
The way to do this is to create a wrapper process around renderPDFForm; this wrapper process contains only one step and has the transaction setting “REQUIRES NEW”.

This wrapper process is now called from the main process, instead of calling renderPDFForm directly.

If we use this subprocess in our example, then you will see the following output in the logfile:

[com.adobe.service.ResourcePooler] *****http-0.0.0.0-8080-2 trying to obtain lock on com.adobe.service.ResourcePooler@1cf0bb for resource allocation
[com.adobe.service.ResourcePooler] *****http-0.0.0.0-8080-2 allocated ProcessResource@f6b087(name=XMLForm.exe,pid=1392), Pool: {ProcessResource@f6b087(name=XMLForm.exe,pid=1392)=true}, initializing=false, poolsize=1, max=4
[com.adobe.service.ResourcePooler] *****http-0.0.0.0-8080-2 deallocated ProcessResource@f6b087(name=XMLForm.exe,pid=1392), Pool: {ProcessResource@f6b087(name=XMLForm.exe,pid=1392)=false}, initializing=false, poolsize=1, max=4
[STDOUT] After the render operation

As you can now see in the output, the native process is returned to the pool BEFORE the next step starts.
The wrapper process here was created for renderPDFForm, but the same approach also applies to the other operations of the Forms service and Output service.

——-
Original article at http://blogs.adobe.com/livecycle/2011/05/short-lived-process-and-the-utilization-of-native-processes.html.

Access custom Microsoft Office properties using LiveCycle services

Samartha Vashishtha

Marcel van Espen, over at the Dr Flex and Dr LiveCycle blog, explains how you can create a LiveCycle process to access custom Office properties. His blog post also includes a useful example.

“Within LiveCycle Workbench ES, one of the services in the common category that you can use is ‘Export XMP’. This service will extract all the available metadata from a PDF document. If you have converted a MS-Office document to a PDF document, you will be surprised what metadata is also converted. All these properties now become accessible.”

Read the complete post here.

——-
Original article at http://blogs.adobe.com/ADEPhelp/2011/04/access-custom-microsoft-office-properties-using-livecycle-services.html.

Hot off the press! Adding tabs to Adobe LiveCycle Workspace ES2

In LiveCycle Workspace ES2 (version 9.0.0.2), tabs are available for you to start new processes, view tasks that are assigned to you, and track tasks and processes. What if you wanted to add your own tab to enhance it? Check out a new article by Nithiyanandam Dharmadass that describes how to add navigation tabs to Workspace here.

——-
Original article at http://blogs.adobe.com/ADEPhelp/2011/06/hot-off-the-press-adding-tabs-to-adobe-livecycle-workspace-es2.html.

Configuring email notifications for the Managed Review & Approval Solution Accelerator

- Gilbert Yu

The Adobe Managed Review & Approval Solution Accelerator 9.5 is a wonderful solution for automating reviews of documents in your organization. One of the handy features of the solution is email updates for automated reviews. For example, emails are automatically sent for these scenarios:

  • When a reviewer completes a review or review stage.
  • When an approver approves a document.
  • When reviewers or approvers are added to or removed from a review.
  • When a review or review stage completes.

This behavior is necessary for organizations that have regulated review and approval workflows. However, in non-regulated environments, it may be a distraction to users because of the number of emails that can be sent in reviews involving a significant number of people.

Alexandra Phillips has provided an article describing how to configure the emails that are sent using the Solution Template provided with the Managed Review & Approval Solution Accelerator. Check out the article here.

——-
Original article at http://blogs.adobe.com/ADEPhelp/2011/07/configuring-email-notifications-for-the-managed-review-approval-solution-accelerator.html.

Workspace Common Variables

Would you like your Workspace users to be able to see all process instances or tasks related to a particular customer name that they participated in? Would you like to refine the number of items displayed in your To Do or Tracking lists based on a particular value?

ADEP Process Management 10.0 has introduced a new feature called Common Variables which will allow you to do this. The overall purpose of this feature is to enhance the Workspace UI experience by allowing users to see and filter on information that is common across all processes.

In three easy steps, you can be using this new feature.

  1. Create the variables in a system application in Workbench.
  2. Set them in your process using a Set Value service.
  3. View them in Workspace, as values in a Search Template or as column headings.

Note: This feature is intended to be used with simple variable types only such as string, boolean, date, date-time, int, long and short. Only simple types will display in Workspace.

1) Create your Variables in Workbench

You will define your common variables in a new system application called Process Manager (common-variables).

  1. Do a File->Get Application in Workbench to download the process.
  2. Check out the CommonVariables process.
  3. Add  a variable in the usual way.
  4. Save the process and Deploy the application.

In the screenshot below, I have added a string variable called custName with a Title of “Customer Name.” The title is what users will see in Workspace.

Note that by default, the variable is marked “Searchable” and “Visible in UI”. Searchable allows you to add the variable to a search template and Visible in UI allows the variable to be exposed as a column heading. Also, since this is a common variable, the ability to mark it as Input/Output/Required has been removed.

2) Set Common Variables in your Process in Workbench

Now that you have created your variable, it will be available to all processes. You just need to set it in whichever process you want to use it in. The contents of these variables at runtime will be unique per process instance, just like any other process variable.

To set your common variable, you will use the XPath builder in the Set Value service. In the screenshot below, you will note that in addition to the Process Data node there is now a Common Data node to display common variables.

In my example, I have a field in my form named OrderedByCompany. The Workspace user will fill in this field before submitting the form. I will then take this value out of my form data (I picked this “Expression” from the Process Data node in the XPath builder) and map it into my common variable called custName (I picked this “Location” from the Common Data node).

When the process is run, these values will be set in the same way as normal process variables.
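As a rough illustration of the mapping (the actual XPath strings are whatever the XPath builder generates when you pick the nodes, so the paths below are purely hypothetical placeholders), the Set Value entry pairs a Location taken from the Common Data node with an Expression taken from the Process Data node:

Location (picked from the Common Data node):  /common_data/@custName
Expression (picked from the Process Data node):  /process_data/formData/object/data/xdp/datasets/data/form1/OrderedByCompany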

3) Viewing your Common Variables in Workspace

Now that your process has been run and the common variable has been set, there are multiple ways that you can see the common variable values in Workspace. They will appear as column headings in To Do and Tracking and as values returned in a Search.

Column Headings – To Do

Another new feature that was added in ADEP Process Management 10.0 is the ability to set your column preferences directly on the page you are working on. This applies to To Do and Tracking. You will use this feature to see the value of your common variables in Workspace.

Navigate to To Do. Select your queue, and make sure that you are in List View. Select the Manage Column Headings button in the top-right. Select your common variable from the list (mine was Customer Name) and select OK.

Your common variable will now appear as a column….with the values that were set when the process was run. Note that my queue in the screenshot below is set to “Show All” processes…I don’t have to select a specific process to see common variables. Since they appear across all processes, this may help to refine your To Do list.

Below, you will see that Maple Trust appears three times, once for the PurchaseOrder process and twice for the MortgageApplicationStart process.

If I filter on “Maple”, my list below will show three tasks from different processes that have the Maple Trust customer name in common. This has filtered the other 6 tasks from my list.  (Note that clearing the filter brings back the other 6 tasks.)

Column Headings – Tracking

The ability to see common variables also applies to Tracking. This may provide more context when looking for a particular process instance. You can filter on the variables in the same way that I detailed above in the To Do page.

Values From a Search

Common variables may also be used in a Search Template. Using my example, this will allow a user to search for a particular customer name across different processes and will show them the tasks that they have participated in.

The ADEP administrator will create a Search Template that contains the common variable. When they create the template, the common variables will appear automatically under the Process Variables section. (For regular process variables you first have to select the process where they are defined. Common variables do not require this extra step.)

In my example, my Search Template asks the user to provide a Customer Name.

I entered Maple Trust and all tasks with Customer name Maple Trust are returned.  In my case there were two processes that used this common variable and search results were returned for both processes.

Common Variables Best Practices

  • When you are ready to move your Process Manager (common-variables) application from development to production via an LCA, make sure you don’t change the application name and process name. The common variables feature depends on this application name/process name being Process Manager (common-variables)/CommonVariables.
  • Do not create complex variable types (lists, maps etc) for use as common variables. Although Workbench allows you to do this, the feature was intended to be used with simple types only. Only simple types will display in Workspace.
  • Limit the number of common variables in your system to reduce possible performance implications.
  • It is not recommended to version the Process Manager (common-variables) process.
  • Only delete a common variable from the CommonVariables process if you are sure it is not referenced in any process. As with regular process variables, referencing a deleted variable will cause your process instance to stall.  If you are unsure whether it is referenced, a better practice would be to uncheck the “Visible in UI” and “Searchable” properties on the variable. This means the variable will no longer display in Workspace or be available for Searching.
  • Default values can be used in common variables.  Unless you override it in a SetValue service, the default value will be used.
  • The CommonVariables process is a system process. Use it only to define your variables. Do not add any steps to it, invoke it or reference it as a sub-process.

Additional Information

  • Record & Playback is currently not supported for common variables.

——-
Original article at http://blogs.adobe.com/ADEP/2011/08/workspace-common-variables.html.

How to Invoke LiveCycle Forms from an Existing XDP Form in Acrobat/Reader or Browser Plugin and Get Another Form Rendered Back to the Client

Lately I had a request from a customer of mine who wanted to modify existing XDP forms (i.e. change a label or a field value) on the fly without going into LiveCycle Designer (i.e. a procedure that would imply the cost of involving the dev department).

His idea was to have a Form A in which he could specify the changes he wants in Form B, submit Form A to an LC orchestration that would apply the changes and render Form B back in Acrobat/Reader or even the browser plugin.

Here I am only covering the call to LC and the rendering back to the client.

Note: I am using LiveCycle ES2 SP2 (9.0.0.2) running on JBoss.

So we have Form A that could look like this:

Note: I will explain the submit URL later on.

As we can see, we have a few fields that would mean something to Form B, and as an end user we will open Form A in Acrobat/Reader or even the browser plugin to enter the values we want to see in Form B.

We need to go into LC Workbench to create an orchestration which will render Form B:

We can see I used the RenderPDFForm service operation and created a variable (type document) for the input form, and to make things simple I put Form B in the application structure:


The key element for the invocation of the orchestration from Form A is to rely on the REST protocol (http post):

We can find the right URL for the call by selecting the Default startpoint properties:

Since I am going to run the test on the same machine where LiveCycle is running, the URL looks like this:

http://localhost:8080/rest/services/test/renderForm:1.0

Note: “test” is the name of my application, “renderForm” is my process (orchestration), and 1.0 is its version.

This is the URL I put in the submit button in Form A (see first screenshot).

In order to make the call successful, we need to create variables to match the fields in Form A: Name, FormContent and MainParagraph.

Of course, in the scenario where you want to modify Form B with Form A fields values, you would need those variables to apply the desired changes.

Note: by matching, I mean the variables’ names and types (most of the time it would be string, but you can have lists as well).

Here I am only rendering Form B without any changes, so I did not bother adding more activities to my orchestration that would utilise those variables.
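For completeness, the same REST endpoint can also be invoked outside of a PDF submit button. Below is a minimal Java sketch of an equivalent HTTP POST; it assumes the process accepts the three fields as string inputs posted as form data and that basic authentication with LiveCycle credentials is acceptable (credentials and field values are placeholders):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class RestInvokeSketch {
    public static void main(String[] args) throws Exception {
        // Endpoint as shown in Workbench: application "test", process "renderForm", version 1.0.
        URL url = new URL("http://localhost:8080/rest/services/test/renderForm:1.0");

        // Parameter names must match the process input variables (Name, FormContent, MainParagraph).
        String body = "Name=John&FormContent=Hello&MainParagraph=Some+text";

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // LiveCycle credentials sent as HTTP basic auth instead of the interactive login prompt.
        String auth = Base64.getEncoder()
                .encodeToString("administrator:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // On success the response body is the rendered Form B (a PDF stream).
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}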

Once the orchestration and the forms have been saved and the application deployed on the server, all we need to do is open Form A in the client of our choice; here I used Internet Explorer so we can see the URL at the top.

I click the button “open form via REST”, the login request pops up, and I use my LC credentials to log in:

Once logged in, Form B appears in the same window:

Note: When using Reader or Acrobat, it will open a new window for Form B.

There is no need to Reader Extend Form A to make this work; hence it works in standalone Reader and in the browser plugin.

——-
Original article at http://blogs.adobe.com/ADEP/2011/08/invoke_forms_from_xdp.html.

