Oracle IPM 10g and Imaging 11g Content Migrations

One of the things we’ve always done a lot of here at ImageSource is migrations; they’re definitely one of our core competencies. For a little more information on our approach to migrations, you can review my earlier post here. Lately the migrations have been more focused because most of them involve moving content out of the Oracle IPM 10g or Oracle Imaging 11g products. Oracle IPM 10g has reached end-of-life, and Oracle Imaging 11g was the terminal release for the product, so essentially the product line is dead. We worked with the IPM 10g product for many years, so we have a wealth of knowledge on its ins and outs. IPM was a feature-rich but aging product stack, and it was in need of a bit of an overhaul. However, when Oracle rewrote the product as Imaging 11g, a number of key and important features didn’t make the cut.

Because of everything I’ve mentioned, businesses running on these particular Oracle ECM platforms have had to make decisions about their long-term ECM vision or roadmap. I have worked with a number of clients on technology evaluations and the like to help determine their roadmap, but that’s a blog post for another time. One of the key pieces of any ECM roadmap for a company making these solution changes is the migration of content out of the Oracle IPM 10g or Imaging 11g systems being replaced. Luckily, we have the tools and knowledge to make these migrations as straightforward as possible: ILINX Export and ILINX Import.

There are a number of options with ILINX Export, but in short, we use it to export all content and metadata out of a source system so it can be migrated into whatever destination system is necessary. By default, ILINX Export retrieves the content from the source system in exactly the same format it was in when it was added to the original system. By exporting the content in its native format, a customer can always keep a copy of the original data, and any data manipulation or file conversions can be done downstream. ILINX Export does have the ability to convert files to PDF, but I generally do image conversion when importing the content into the destination system. Utilizing our knowledge of the Oracle products, we have plenty of options when extracting content from them. For example:

  • Only migrate certain applications.
  • Only migrate content created after or before a certain date.
  • Only migrate the content that falls within certain criteria: for example, a specific business unit, a set of document types, or virtually any criteria that can be identified from the content metadata.
  • Split the content up so content that meets certain criteria goes to one destination and content meeting other criteria goes elsewhere.
  • Retain the IPM or Imaging annotations. These can be flattened into the documents, but I only recommend that in certain instances. If the client is migrating to ILINX, we can migrate the annotations as an overlay into the new ILINX system.
  • There are many options for the format of the data when it is exported from the source system. ILINX Export can output the metadata to text or XML files with complete control over the format, delimiter, field order, layout, and size of those files. That flexibility allows for the creation of input files in a format that works for just about any destination system.
  • The metadata can also be written directly to SQL to support long-term storage or manipulation if necessary (see the sketch after this list).
  • Scheduling the export to run during off-hours to keep load off the servers while clients are using the old system.
  • Detailed auditing of the entire process to help with reporting, compliance, and troubleshooting.
  • Many more.
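
As a concrete illustration of that SQL option, here is a minimal sketch of what a staging table for exported metadata might look like. The table and column names are hypothetical examples, not an actual ILINX Export schema:

CREATE TABLE dbo.ExportStaging (
    DocumentId   varchar(50)  NOT NULL, -- source system document ID
    SourceApp    varchar(100) NOT NULL, -- originating IPM/Imaging application
    DocType      varchar(100) NULL,     -- document type, handy for routing or splitting
    CreatedDate  datetime     NULL,     -- supports date-range filtering
    ExportedPath varchar(260) NULL,     -- where the native-format file was written
    MetadataXml  xml          NULL      -- full metadata payload for downstream use
)

From a table like this, input files for just about any destination system can be generated with simple queries.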

Once all the requirements surrounding the migration have been defined and execution has started, one of the next steps is importing that content and metadata into the destination ECM system. I’ll cover those next steps, along with more on the export portion of a migration, in a follow-on post. If you have any questions about migrating content out of a system or about ILINX Export, reach out to us for a demo or discussion.

Transferring ILINX Release Configurations When Upgrading

Starting with ILINX Capture v6, the Release configurations are stored within the ILINX database. In ILINX Capture v5x, the ILINX Release configurations were stored in XML files on disk, and ILINX Capture called ILINX Release using a SendAndReceiveReply IXM. Storing the settings within the ILINX database is very useful for a number of reasons: Release settings are part of the batch profile, allowing for simpler migrations between environments; Release is much easier to configure; all configurations are in the database; and so on. However, this change can create some extra work when upgrading from ILINX Capture 5x to ILINX Capture 6x. Because of the different architecture, ILINX Release needs to be completely reconfigured for the existing batch profiles. Fortunately, the Release settings XML itself doesn’t change between versions, so there is a shortcut that can be taken. After you have upgraded ILINX Capture to v6, you’ll notice a new IXM in the palette:

[ILINX Release IXM icon]

The existing ILINX workflow will likely have a SendAndReceiveReply IXM on the map that the 5x version of ILINX Capture used to call ILINX Release. Most likely, it would look like this:

[SendAndReceiveReply IXM on the workflow map]

To configure ILINX Release for ILINX Capture 6x, the SendAndReceiveReply IXM will need to be removed from the map and a Release IXM dragged onto the workflow map in its place. Once the new Release IXM is on the map, it will need to be configured. This is where the shortcut comes in. Instead of manually entering the correct URLs, mapping the metadata values, and configuring any other settings, do this:
Configure and save Release with some placeholder settings. I normally leave the settings at their defaults and enter only the bare minimum:

  • Job Name
  • User Name
  • Password
  • Batch Profile
  • Release Directory

Once the ILINX Release configuration is saved and the workflow map is published, there will be a new entry in the ILINX Capture database’s CaptureWorkflowAppSettings table. The CaptureWorkflowAppSettings.SettingsXML column is where the Release configuration is stored. Now it’s time to update the SettingsXML column with the XML from the ILINX Release 5x job settings file. The Release job should be on the ILINX Release 5.x server at c:\ProgramData\ImageSource\ILINX\Release\Settings\Jobs. The only caveat here is to be sure to place single quotes around the XML content; if the XML itself contains any single quotes, double them ('') so the SQL string literal stays valid. Here is what the SQL update statement would look like:

update [ILINX CAPTURE DATABASE].[dbo].[CaptureWorkflowAppSettings]
set SettingsXml = 'COPY AND PASTE ALL TEXT FROM 5.4 OR PRIOR RELEASE JOB SETTINGS FILE HERE'
where settingsID = 'APPROPRIATE ID HERE'
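
To find the appropriate settingsID, a simple query against the same table can help. This is a minimal sketch against the v6 schema described above, so verify the column names in your environment:

select settingsID, SettingsXml
from [ILINX CAPTURE DATABASE].[dbo].[CaptureWorkflowAppSettings]

The row whose SettingsXml contains the placeholder values you just saved is the one to update.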

Following this procedure can save some time when upgrading an ILINX Capture 5x system that has a lot of batch profiles, since much of the upgrade effort can go into the ILINX Release configuration. If I were upgrading a system with only a few batch profiles, I would probably just reconfigure them; with a lot of batch profiles, I would go through the above steps to save time.

John Linehan
Sr. Systems Engineer
ImageSource, Inc.

Failover Cluster Troubleshooting

There’s nothing quite like logging in to a customer’s system first thing Monday morning only to be greeted with this:

[Cluster report screenshot]

I discovered this when I wasn’t able to log into the customer’s ILINX Capture implementation. The logged error (failure to locate the SQL Server) led me to look at the SQL Server’s configuration, where I confirmed that its service was not running on either node of the cluster. The error I got when trying to start the service (a clustered resource could not be activated) led me to check on the clustered resources themselves.

Implementing SQL FILESTREAM Part II

Last month I wrote about enabling SQL FILESTREAM with ILINX Content Store. After discussing this with a few people, I think I should share some more information and reiterate a couple points.

For Existing Applications:
As I mentioned before, the decision to enable FILESTREAM should be made during the planning phase. If you perform this process on an application with a lot of content, it can be a very time-consuming endeavor with a big performance impact on the server. Also, after the move from BLOB to FILESTREAM, you could be left with a fragmented database. The BLOB-to-FILESTREAM process can definitely be done on an existing system; just be sure to plan accordingly and allow for sufficient time.
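
As a side note, you can check how fragmented the table is after the conversion with the standard SQL Server DMV. This is a minimal sketch using the same example names as below (an ILINX_CS database and a Sample Application table); adjust for your environment:

use ILINX_CS

-- Report fragmentation for all indexes on the application table.
select index_id, avg_fragmentation_in_percent, page_count
from sys.dm_db_index_physical_stats(
    DB_ID('ILINX_CS'), OBJECT_ID('[dbo].[Sample Application]'), NULL, NULL, 'LIMITED')

-- If fragmentation is high, rebuilding the indexes will resolve it:
alter index all on [dbo].[Sample Application] rebuild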

After step #10 of my previous blog post (all the data is copied and you have deleted the BLOB column), you will notice that the database file size hasn’t decreased. This is remedied easily enough by executing a DBCC CLEANTABLE command, which reclaims the space from the dropped variable-length column (the third argument is the batch size: the number of rows processed per transaction). For example, if your database is named ILINX_CS and your application is named Sample Application, the query to do this is:

DBCC CLEANTABLE ('ILINX_CS','[dbo].[Sample Application]',10000)

Storing content outside of SQL Server for ILINX Content Store using SQL FILESTREAM

By design, ILINX Content Store stores documents within the SQL database as BLOBs. There are many advantages to this design (security, performance, etc.) but sometimes there is a reason to store the documents outside of the SQL database. SQL Server has a method to do this called FILESTREAM. FILESTREAM integrates SQL Server with the NTFS file system by storing varbinary(max) data outside of the SQL database. FILESTREAM uses the NT system cache for caching file data: this helps reduce any effect that FILESTREAM data might have on Database Engine performance. The SQL Server buffer pool is not used; therefore, this memory is available for query processing.
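
For reference, here is a minimal sketch of what enabling FILESTREAM looks like on the SQL Server side. The database, filegroup, path, and table names are hypothetical examples for illustration, not the actual ILINX Content Store schema:

-- Enable FILESTREAM for T-SQL and Win32 streaming access at the instance
-- level (FILESTREAM must also be enabled in SQL Server Configuration Manager).
exec sp_configure 'filestream access level', 2
reconfigure

-- Add a FILESTREAM filegroup and container to an example database.
alter database ExampleDB add filegroup fsGroup contains filestream
alter database ExampleDB add file
    (name = 'fsData', filename = 'D:\FilestreamData')
    to filegroup fsGroup

-- A FILESTREAM column is varbinary(max) with the FILESTREAM attribute;
-- the table also needs a uniqueidentifier ROWGUIDCOL with a unique constraint.
create table dbo.ExampleDocs (
    DocId   uniqueidentifier rowguidcol not null unique default newid(),
    Content varbinary(max) filestream null
)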

One of the main reasons to implement FILESTREAM is document size: if your documents are generally larger than 1 MB, storing them outside the database can have a performance advantage. If these are TIFF documents, that 1 MB threshold applies on a per-page basis, due to how ILINX Content Store stores TIFF documents. By design, ILINX Content Store splits multipage TIFFs into single pages so users can perform actions on individual pages of a document: things like reordering pages, deleting a single page, or rotation.

Indexing Tables in Kofax-Based Environments

We recently had a customer who needed to migrate off of an aging and highly customized Capture/indexing/workflow one-off solution. At the center of many of their form types in this system was a repeatable field collection object that functioned much like a .NET DataTable control: values could be added horizontally to the current “row”, and at the end of it you could hit enter and a new “row” would be added. As you moved through, you also had the ability to validate the line item as a whole. In other words, nothing too out-of-the-ordinary.

Unfortunately, this stood out as a red flag for both me and my coworker when we first saw it, since we were migrating the client to Kofax Capture. There’s nothing inherently wrong with Kofax’s flagship product; in fact, it is an excellent tool for getting content where it needs to be, often in record time. One thing it doesn’t do well out-of-the-box, however, is table fields. Defining one looks normal enough, but when you actually get the chance to index them, each column ends up being a standard index field. Needless to say, turning the table 90 degrees counter-clockwise and forcing keyers to manually delimit values is not an ideal experience, especially when 99% of your form is tables that need to be indexed.

Registering DLLs in COM with WiX for creating an MSI installation package for a Kofax Custom Panel

I was working on a project recently for a customer that was upgrading their Kofax versions and making some enhancements to a custom Kofax panel that we had written for them some time ago. Like any good developer, I migrated the code for the custom panel to the latest version of Visual Studio I had (in this case, Visual Studio 2012). I had finished development and was discussing installation when the customer requested an MSI package to install the custom panel. Unbeknownst to me, Visual Studio 2012 had dropped support for the easy, drag-and-drop, built-in setup and deployment project for creating MSIs.

In doing some research, I found many developers had migrated to using the open source WiX product to create MSI packages (http://wix.codeplex.com/). One can download WiX and integrate it directly into Visual Studio. Everything was fairly straightforward following their tutorials except for one snag: in order to get the custom Kofax panel to install correctly, I had to register the custom DLLs as COM components, not in the GAC. After a lot of head scratching, I finally figured out that I could use Heat (one of the WiX tools) to generate the registry entries for the DLLs to include in my WiX setup project. You can find out more about Heat here: http://wixtoolset.org/documentation/manual/v3/overview/heat.html. After the file was generated, I was able to take the Heat output and include it in my WiX install project to register the necessary DLLs.
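
For anyone hitting the same snag, the Heat invocation looks roughly like the following. This is a sketch with a hypothetical DLL name; check the Heat documentation linked above for the exact flags your WiX version supports:

heat.exe file CustomKofaxPanel.dll -ag -template fragment -out CustomKofaxPanel.wxs

Here, “file” tells Heat to harvest a single file (including its COM registration data), -ag auto-generates the component GUIDs, and -out names the WiX fragment file that gets added to the setup project.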