    A very dear Singaporean friend of mine taught me Wayne Gretzky’s saying, "A good hockey player plays where the puck is. A great hockey player plays where the puck is going to be." This is great food for thought in electronic content management (ECM).

    Indeed, nowadays, chief information officers (CIOs) need to advise their chief executive officers (CEOs) on what to do with their ECM procedures and policies, and their task is not easy at all. They are players in a very fast hockey game in which the uniforms are not standard, so it is hard to tell who is playing for them and who is playing against them. The crowd, meanwhile, is very noisy, making it hard for CIOs to make out what is being shouted or whom to support. It is quite difficult to see where the puck is now, let alone where it is going to be. There are, however, a number of trends CIOs can watch that may help them forecast the game.

    Security is clearly one of the most important issues; I dare say it is more important than tools for quickly finding documents or records. What matters are designs that ensure accountability for decisions concerning those documents or records or, if you will, a secure tracking history of who knew about them and what followed from that knowledge. This is important from the organizational perspective, but the information technology (IT) manager must also be able to assure legal auditors that those documents, records and their history logs have not been tampered with. This brings to mind the legal obligation to keep records accessible for a period of time, and accessible must mean readable, even as technology evolves. This is where the plot thickens and where digital preservation enters: Files are read by software (which evolves), software is produced for a particular operating system (which also evolves) and that operating system is, most of the time, tied to a particular range of hardware (which also evolves). This is not as trivial as “simple” analog visual preservation, which allows us to read a 5,000-year-old papyrus.
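    To illustrate the kind of tamper-evident history log described above, here is a minimal sketch in Python. The entry structure and function names are my own invention, not drawn from any particular ECM product: each audit entry embeds a hash of the previous entry, so any later alteration breaks the chain and can be detected by an auditor.

```python
import hashlib
import json

def append_entry(log, user, action, document_id):
    """Append an audit entry whose hash also covers the previous
    entry's hash, chaining the whole history together."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"user": user, "action": action, "doc": document_id, "prev": prev_hash}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify_chain(log):
    """Recompute every hash in order; return False if any entry
    was altered or the chain was broken."""
    prev_hash = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev_hash:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True
```

    A production system would also anchor the chain externally (for example, by periodically publishing the latest hash), since an attacker who can rewrite the whole log could otherwise rebuild it; this sketch only shows the chaining idea.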

    Tools come and go, and this is another issue to think about. There are very specific tools made by specialized companies, and usually, these tools are very good. I define good as being fully configurable with no need to develop code, offering open interfaces so that other tools can read and process the same data, integrating well with other tools, being language-independent and requiring minimal user training. Large software companies have a wide range of products and are therefore likely to stay in the market longer, but on the other hand, their tools are not as specialized for certain markets. Dedicated companies will have very specific tools, but they will have to grow quickly on a global scale in order to offer some guarantee about the continued availability of their products. This is not to say that newcomers shouldn't be considered. Newcomers must be evaluated on the benefits of their products, and then it is up to us to decide whether they are a good bet. Nevertheless, until the bet is confirmed, operations should be based on a “known” tool, even if it is not new.

    Needs are a funny concept to put into perspective. Something I have learned (sometimes the hard way) is that nobody needs something they don't know exists. Yes, there are basic needs: I need to know that a document exists in my organization; I need to know where the document I need is located; and I need to know that a document will flow automatically among the relevant stakeholders. We are also faced with needs that arise from published standards. There are European standards, UK standards, US standards, Australian standards and so on. All these standards were meant to help companies and governments implement their solutions, but technology has evolved much faster than these publications, so today they may add more noise than help to the general approach to solutions. Then there are, of course, the needs specific to an organization. These have to do with particular business rules, with specific integration among applications and the like.

    Many companies decide to develop their own solution internally. In my view, this is usually the result of deficient project management, either in defining the functional requirements or in negotiating with an existing and reliable supplier. We should use products out of the box, preferably products with a global implementation and certainly products with highly flexible configuration (no code development) capabilities. I am not saying that code customizations cannot be done; they can, but they should be kept to a bare minimum. For instance, a document management tool may need to send data collected during an invoice approval to the enterprise resource planning (ERP) system. At the end of the associated workflow, a small customized component calls a web service on the ERP to send in the data. This is small stuff and, if properly documented, will not place the organization in any kind of jeopardy if the people who developed that small piece of code suddenly decide to do something else. Needs are satisfied by tools, but some needs also arise from tool functionality; the latter are the needs people don't know they have until they see what a solution can provide.
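    The invoice example above might look something like the following Python sketch. The endpoint URL, field names and helper functions are all hypothetical, since the real ERP's web service will define its own interface; the point is only how small such a customization can be.

```python
import json
import urllib.request

# Hypothetical endpoint -- a real ERP's API will differ.
ERP_ENDPOINT = "https://erp.example.com/api/invoices"

def build_invoice_payload(invoice_number, supplier, amount, approved_by):
    """Collect the data captured during the approval workflow
    into the structure the ERP's web service expects."""
    return {
        "invoiceNumber": invoice_number,
        "supplier": supplier,
        "amount": amount,
        "approvedBy": approved_by,
    }

def push_invoice_to_erp(payload):
    """Called once, at the end of the approval workflow: POST the
    collected invoice data to the ERP's web service."""
    request = urllib.request.Request(
        ERP_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status == 200
```

    Keeping the payload-building step separate from the network call also makes the customization easy to document and test, which is exactly what protects the organization if the original developer moves on.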

    Searches are on the rise. We want to search for anything: metadata, full text, audio within videos, the works. Social networks are fashionable not only for social purposes but also as a problem-solving tool in a business-to-business (B2B) environment. Although answers can be provided on an ad hoc basis, those answers should somehow be properly archived to ease the task of answering similar questions later. Which social network should be used, and how, will be a decision for the exceptional players.
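    As a toy illustration of combining metadata and full-text criteria in a single query, consider the sketch below. A real ECM search engine would use an index rather than a linear scan, and the field names here are invented for the example:

```python
def search(documents, text_query=None, **metadata_filters):
    """Filter a list of document dicts by exact metadata matches
    and a naive case-insensitive full-text substring match."""
    results = []
    for doc in documents:
        if any(doc.get(k) != v for k, v in metadata_filters.items()):
            continue
        if text_query and text_query.lower() not in doc.get("body", "").lower():
            continue
        results.append(doc)
    return results
```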

    Big Data was certainly one of the buzzwords of 2015. To put it simply, Big Data is the accumulation of files, records and documents and the data we can squeeze from them. The analysis of that data is usually performed by specialized tools in order to provide specific knowledge. The Internet of Things (IoT) is one of the big contributors to Big Data and can help within a wide range of contexts related to records management, from social networks to protecting critical information.

    In order to be an excellent player and, hence, skate to where the puck is going to be, we need to practice, practice, practice. Practice makes perfect. We must read; we must see what is happening with providers and customers; we must develop our own network. We must share our knowledge so that we can all benefit from 2016.

    Joao Penha-Lopes has specialized in document management since 1998. He holds two postgraduate degrees in document management from the University Lusofona (Lisbon) and a PhD from Universidad de Alcala de Henares (Madrid), earned in 2013 with a thesis on the economic benefits of electronic document management (EDM). He is an ARMA collaborator for publications and acts professionally as an advisor on critical information flows, mostly for private corporations. Follow him on Twitter @JoaoPL1000.
