[Figure: Web Services API diagram]

We all know the speed of change in the world of technology, but the capture industry has seen more than its share of it in the past three years.  This is an industry that dates to the mid-1980s, and change had been moderate over those years.  For the first twenty of those years, the big changes were moving from hardware-enabled image-processing boards to imaging toolkits to all-inclusive batch capture applications.  Just as capture was becoming a mainstream enterprise application, many capture vendors began to be acquired by larger ECM (Enterprise Content Management) players (EMC/Documentum buying Captiva, IBM/FileNet buying Datacap, Lexmark/Perceptive buying Brainware, etc.), but this did little to improve the elements of capture technology.

A big shift started just three years ago, as pioneering capture visionaries understood that the technology changes we all faced could be applied to capture software.  For the first time, thin client capture seemed feasible, and using the cloud as a repository was not just a possibility but, increasingly, the best choice for many users.

Thinking about these recent changes opens up a new world of possibilities.  For instance, if an organization were starting today to look at a capture solution, what would it consider essential:

Thin client vs thick client:  For definitions, thick clients require a large software footprint on a PC.  The software is often loaded from a CD or downloaded from a server on the local area network, and conflicts between Windows components and the installed software mean IT staff are often needed to resolve them.  By comparison, thin clients let users simply point to a web address and access the same application hosted on a network server, either in the cloud or on-premises behind the company’s firewall.  Any browser will do, meaning capture is no longer restricted to Windows PCs; for the first time, Linux workstations as well as Apple Macs have the same capabilities as Windows machines.  There used to be trade-offs here, but it seems odd to me that anyone would start today with a thick client solution.  Past considerations centered on bandwidth, throughput, speed, security, integrity, user acceptance, and more, but from what we see (and what our customers are telling us), thin clients have won this battle.  In fact, 2012 seemed to be the turning point when acceptance became universal.  Even browser-based scanning, which seemed a dream only a few years ago due to the large size of images, is now a smart option for users, as is remote data validation (which has opened that task to large labor pools around the world).  The best systems today are totally thin, including the image-intensive review screens and system administration.

Cloud:  As noted above, the cloud has moved from a mere possibility to, increasingly, the best repository choice for many users.  A capture solution chosen today should deploy equally well hosted in the cloud or on-premises behind the company’s firewall.

Feature set:  Whereas earlier capture solutions scanned, improved image quality, read barcodes, and possibly did OCR (Optical Character Recognition), that is simply no longer enough.  The standard solution today should have all of that plus OCR for dozens of languages, robust internal workflow, impressive content-based document identification and separation using only one or two samples and no templates, easy mouse-driven image grouping and re-classification, external database interfaces, standard scripting for custom user needs, email and fax integration, and much more.  Since many enterprise users have multiple ECM systems installed, capture support for the new industry standard CMIS (Content Management Interoperability Services) is a must: all modern ECMs support this universal connectivity standard (a sketch of that connectivity appears below).  Lastly, security is essential, since confidential documents are often captured, so a three-tier architecture (web, application, and data tiers) with multiple firewalls is a requirement for government, financial services, insurance, and most other users.
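To make the CMIS requirement concrete, here is a minimal sketch of how a capture system might release a scanned document into any CMIS-compliant ECM using the open-source Apache Chemistry OpenCMIS client library.  The endpoint URL, credentials, repository ID, and file name are hypothetical placeholders; the session setup and createDocument call are standard OpenCMIS usage.

```java
import java.io.ByteArrayInputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

import org.apache.chemistry.opencmis.client.api.Folder;
import org.apache.chemistry.opencmis.client.api.Session;
import org.apache.chemistry.opencmis.client.api.SessionFactory;
import org.apache.chemistry.opencmis.client.runtime.SessionFactoryImpl;
import org.apache.chemistry.opencmis.commons.PropertyIds;
import org.apache.chemistry.opencmis.commons.SessionParameter;
import org.apache.chemistry.opencmis.commons.data.ContentStream;
import org.apache.chemistry.opencmis.commons.enums.BindingType;
import org.apache.chemistry.opencmis.commons.enums.VersioningState;

public class CmisRelease {
    public static void main(String[] args) throws Exception {
        // Connection parameters -- URL, credentials, and repository ID
        // are hypothetical and would come from the capture system's config.
        Map<String, String> params = new HashMap<>();
        params.put(SessionParameter.ATOMPUB_URL, "https://ecm.example.com/cmis/atom");
        params.put(SessionParameter.BINDING_TYPE, BindingType.ATOMPUB.value());
        params.put(SessionParameter.USER, "capture-service");
        params.put(SessionParameter.PASSWORD, "secret");
        params.put(SessionParameter.REPOSITORY_ID, "main");

        SessionFactory factory = SessionFactoryImpl.newInstance();
        Session session = factory.createSession(params);

        // Release a scanned invoice into the repository's root folder.
        Folder root = session.getRootFolder();
        Map<String, Object> props = new HashMap<>();
        props.put(PropertyIds.OBJECT_TYPE_ID, "cmis:document");
        props.put(PropertyIds.NAME, "invoice-0001.tif");

        byte[] bytes = Files.readAllBytes(Paths.get("invoice-0001.tif"));
        ContentStream content = session.getObjectFactory().createContentStream(
                "invoice-0001.tif", bytes.length, "image/tiff",
                new ByteArrayInputStream(bytes));

        root.createDocument(props, content, VersioningState.MAJOR);
    }
}
```

Because every CMIS-compliant repository accepts the same calls, release code like this should work largely unchanged whether the back end is Documentum, FileNet, Alfresco, or SharePoint.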

Business models are changing too:  Legacy products charge a high initial license fee plus annual maintenance, which makes systems hard to cost-justify in less than two to three years, if not longer.  The new model charges no initial license fee, just an annual subscription to use the software and receive updates and support.  This new approach means customers can justify very valuable software in months, not years.  It also places extra pressure on vendors to support and enhance their software, since at each renewal the customer has a choice: stay, or move to another capture solution.  Clearly this is good for users.  The price of legacy systems is generally based on images processed, and often on the number of data fields recognized.  This has always been a cause for concern, since users often have only a vague idea, if any, of the volumes they will process in the next twelve months.  If it is a totally new deployment, volumes could rise dramatically and users would need to seek additional funding from management.  The better approach is to charge a nominal fee based on the number of servers and then let users process all they can.  Budgeting is simpler and far easier to explain to user management.  A rough worked comparison of the two models follows.
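To put rough numbers on the cost-justification argument, the sketch below compares five years of cumulative cost under the two models.  All figures are hypothetical and purely illustrative (a $100,000 legacy license with 20% annual maintenance versus a $30,000 annual subscription), not quotes from any vendor.

```java
public class CostComparison {
    public static void main(String[] args) {
        // Hypothetical figures for illustration only.
        double licenseFee = 100_000;        // legacy: one-time license
        double maintenanceRate = 0.20;      // legacy: 20% of license per year
        double subscription = 30_000;       // new model: flat annual fee

        System.out.println("Year  Legacy(cum)  Subscription(cum)");
        for (int year = 1; year <= 5; year++) {
            double legacy = licenseFee + licenseFee * maintenanceRate * year;
            double subs = subscription * year;
            System.out.printf("%4d  %,11.0f  %,17.0f%n", year, legacy, subs);
        }
        // Under these assumptions, five years of the legacy system costs
        // $200,000 versus $150,000 for the subscription -- and the
        // subscription buyer could have walked away at any renewal.
    }
}
```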

Web service APIs:  A recent addition to a requirements list would be the ability to interface directly with a capture system from within the line-of-business application.  For instance, a financial system could send an image transparently to a capture system and learn that it is an invoice, along with the specific data elements that users need.  In this way, user training is reduced and separate processing steps are eliminated.  This tight integration will likely become an industry standard in the future, but it is already available in some systems.  The sketch below shows what such a call might look like.
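As a sketch of this pattern, the snippet below posts a scanned image to a hypothetical capture web service and reads back the classification and extracted fields.  The endpoint URL, the /api/classify path, and the JSON field names are invented for illustration; no specific vendor's API is implied.  It uses Java 11's built-in HTTP client.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class CaptureApiCall {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // POST the scanned image to a hypothetical capture endpoint.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://capture.example.com/api/classify"))
                .header("Content-Type", "image/tiff")
                .header("Authorization", "Bearer <api-token>")
                .POST(HttpRequest.BodyPublishers.ofFile(Path.of("invoice-0001.tif")))
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // A response might look like:
        // {"docType":"invoice","fields":{"invoiceNumber":"4711",
        //  "amount":"1250.00","vendor":"Acme Corp"}}
        System.out.println(response.body());
    }
}
```

The line-of-business application never shows the user a separate capture screen; the classification and data extraction happen behind a single call.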

Where does this lead?  To user empowerment.