Picking a winning EDRMS team

“I tried that but it didn’t work. I’ve tried everything and it does not matter what I do, I can’t get the executive to budge, nor the sponsor to really believe in the project”, the records manager opined down the phone. I almost let out an audible sigh as I continued the interview, one of many conducted as part of our research, sponsored by Records and Information Management Professionals of Australasia (RIMPA), into the drivers behind successful and failed Electronic Document and Records Management Systems (EDRMS) implementations.

What I had just heard was a common refrain from failed implementations, and even from implementations deemed successful in our earlier quantitative survey but which, on closer qualitative analysis, proved less successful than first thought. Nor was this refrain an isolated sign that skill gaps in the implementation team were undermining the sustainability of EDRMS implementations.

Picking an EDRMS implementation team is not a trivial task. There is a clear need to have the right skills on the team, as evidenced by Change Factory and Linked Training’s 2011 quantitative research, which surveyed 107 organisations that had implemented an EDRMS.

Although it is not necessary to assign each skill to a specific role, the skills themselves must be available within the team’s membership.

Whilst we use the broad term ‘skills’ quite a lot in our articles and research findings, it is important to describe and assess the more specific attributes of attitude, knowledge and skills when picking your EDRMS implementation team.

An example of what we mean with regard to attitude is that the team as a whole must be consistent in its approach to people and in the tactics used to create the environment for change. If the team is fragmented in its approach, and the tactics used by individuals appear incongruent, the validity of what the team is saying will be severely diminished.

One team we know had some members whose attitude was that people would simply do as they were told, because the records management policy was signed off by the Managing Director and the culture of the organisation was one of command and control. Other members held the view that, whilst that was so, the key to driving individual adoption was to show people how they benefited. The result was that the attitude driving engagements with end users differed markedly across the team. One part of the team, in fulfilling the task of engagement, asked fewer questions than the other, gave much more concrete direction and spent less time educating people about the benefits of an EDRMS and good recordkeeping practices. The consequence was a disjointed view amongst end users about what was driving the program, and disbelief that there was any coherent rationale for adopting the system. It would have been better to adopt one of the tactics completely than to settle for a half-way house built on differing attitudes to the change.

Knowledge needs to be measured in terms of its depth. For example, it is one thing to be able to describe well how to drive a car. It is another to be able to describe how to drive a Formula One racing car on specific circuits around the world. The depth of knowledge is quite different. It is the same when describing something like training needs analysis. It is one thing to be able to describe in general terms which processes might benefit from the functionality in an EDRMS; I could list some whilst sitting here typing this article, e.g. recruitment, procurement, accounts payable, asset management and talent management. It is another level of depth to know which process steps in a particular business unit of an organisation could benefit from, for instance, using ‘Actions’ to manage and control workflow.

Often we find people nominated for positions who, by virtue of a title such as Change Manager, are somehow expected to be bestowed with the knowledge of an experienced change manager. One organisation had a change manager who was expected to know how to create a script for an animated video, and to understand how to deconstruct images, voice-overs, sounds, text and animation from eLearning content and reuse them in the reconstructed animated video, because she was the ‘Change Manager’ and had responsibility for managing communications in her job description. She had never been involved in the creation of a video before, nor in eLearning. Of course, she struggled (as did the video company) to translate what the project manager, to whom she reported, wanted into reality.

Skills are undoubtedly the broadest category to evaluate. Skills, like knowledge, are assessed at different levels of depth. For example, two different people conducting a training needs analysis using the same methodology will arrive at very different results, both in the data collection and interpretation phase and in the analysis and training model design phase.

In the data-gathering phase, differing abilities in building rapport, asking questions in a facilitative style, sensing when information is incomplete or missing, comprehending how the information given connects together, and determining the criteria by which information should be sifted for a given project will result in vastly different information sets being delivered.

In the analysis and training model design phase, the difference between a workable model, in which end users and support staff are educated to create a sustainable outcome, and an unworkable one may depend on differing levels of capability to:

  • Cascade the goal of the project into learning objectives.
  • Evaluate and describe the culture of the organisation.
  • Determine the current skill sets of end users and support staff and their impact on training model design and the change management tactics to be used.
  • Evaluate the training technology available and its impact on training model design.
  • Determine the internal and external budget available and its impact on the training model design.
  • Evaluate leadership strength and style and determine their impact on the timelines of training and on the degree and nature of change management tactics to be used.

It is, therefore, insufficient when building an EDRMS team to ‘tick the box’ when assessing attitude, knowledge and skills. The questions we ask of our team should be detailed, and the answers should clearly identify any gaps we have to fill, or at least allow us to create contingency plans for the risks those gaps create.
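
One way to make that assessment concrete is to record, for each attribute the project requires, the depth available on the team and flag any shortfall. The Python sketch below is purely illustrative: the attribute names and the 1–5 depth scale are our own assumptions for the example, not a prescribed standard.

    # Illustrative only: a simple skills-gap record for an EDRMS team.
    # Attribute names and the 1-5 depth scale are assumed for this example.

    REQUIRED_DEPTH = {
        "change management": 4,
        "training needs analysis": 4,
        "recordkeeping practice": 5,
        "end-user engagement": 3,
    }

    # Each member's assessed depth per attribute (e.g. from detailed interviews).
    team = {
        "Change Manager": {"change management": 2, "end-user engagement": 4},
        "Records Manager": {"recordkeeping practice": 5, "training needs analysis": 2},
    }

    def gap_report(team, required):
        """Return attributes where the best depth on the team falls short."""
        gaps = []
        for attribute, needed in required.items():
            best = max((skills.get(attribute, 0) for skills in team.values()), default=0)
            if best < needed:
                gaps.append((attribute, needed, best))
        return gaps

    for attribute, needed, best in gap_report(team, REQUIRED_DEPTH):
        print(f"Gap in {attribute}: need depth {needed}, best available is {best}")

Each gap flagged this way becomes either a recruitment or training task, or an explicit entry in the project’s risk and contingency plan.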

To download our comprehensive checklist, click here.


© 2012 Change Factory and Linked Training

