DME Forensics Blog

Validation Versus Performance Verification

October 27, 2014 / Technical Posts

So you just got in the latest and greatest version of XYZ software. This will be a useful tool in your workflow and, according to the manufacturer’s specifications, will greatly increase your capacity and efficiency. You install the software on your workstation and now you’re ready to go. Or are you?

For starters, let’s talk about the difference between “validation” and “performance verification.” These terms are often used interchangeably, yet they can have different meanings in different disciplines: in the biology and drug chemistry worlds they mean something different than they do in the digital and multimedia evidence (DME) disciplines. How can that be? Let’s start with some general definitions.

A validation is intensive and systematic testing of a methodology, system, tool, or process against known standards and controls to ensure the tested item meets accepted performance criteria and produces results of known accuracy within established statistical limits. In simpler terms, it is very thorough testing to ensure the tool works and produces consistent and accurate results across many different situations. Within the digital and multimedia evidence field, this type of testing is not always feasible, or in some situations even possible.

So you have your shiny new tool; without validation, how will you know it is going to work correctly? This is where performance verification comes in. Performance verification is a simple, logical test using known samples, run through your workflow on the workstation you plan to use to process evidence, to determine whether the hardware/software produces consistent and expected results. Essentially, does it work on my system the way the manufacturer says it will?

Although hardware and software manufacturers perform extensive testing before releasing their products, they cannot account for every variable in your workflow or on your workstation. What type of video card are you using? How much RAM do you have? What operating system updates have you applied? There is no way a manufacturer can test every variable.

One of the critical ingredients in a performance verification is test media. You must have a known starting point in order to know what the expected outcome should be. When you run your test media through the tool, you can then compare the actual results against the expected results. Let’s use an example to make this clear.

Let’s say you just got a new video screen capture program. According to the manufacturer, this tool will capture every single frame from the DVR player without duplication or any loss of video. A straightforward way to test this would be to create a simple set of test media on your nonlinear editor. Most nonlinear editors come with a set of known test files, for example, an NTSC bars-and-tone file. You can import this file, set it to a specific raster size (the number of horizontal and vertical pixels) and frame rate, and then export it in different file formats. You then use these files to test your new tool: open the test files, capture the data according to the manufacturer’s specifications, and review the results. You can use other tools from your toolbox, such as QuickTime Pro, VirtualDub, GSpot, your original nonlinear editor, or any other previously verified tool, to evaluate the file you captured. For this example, if the raster size, the number of frames, and the duration of the footage are the same, then the tool performed as expected on your workstation. It has passed the performance verification.
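
As a rough illustration of that comparison step, the short Python sketch below reads the captured file back and checks its raster size, frame rate, and frame count against known values. This is a minimal sketch, not part of the original workflow: it assumes Python with OpenCV (cv2) installed, and the file name and expected values are hypothetical placeholders for your own test media.

    # Compare a captured test file's properties against the known values of
    # the original test media. File name and expected values are placeholders.
    import cv2

    EXPECTED = {
        "width": 720,        # horizontal pixels (raster size)
        "height": 480,       # vertical pixels (raster size)
        "fps": 29.97,        # frame rate
        "frame_count": 900,  # total frames in the original test clip
    }

    cap = cv2.VideoCapture("captured_test_clip.avi")  # file produced by the tool under test
    actual = {
        "width": int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
        "height": int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)),
        "fps": round(cap.get(cv2.CAP_PROP_FPS), 2),
        "frame_count": int(cap.get(cv2.CAP_PROP_FRAME_COUNT)),
    }
    cap.release()

    for name, expected in EXPECTED.items():
        status = "PASS" if actual[name] == expected else "FAIL"
        print(f"{name}: expected {expected}, got {actual[name]} -> {status}")

Note that the frame count reported by a container can be approximate for some formats, which is one more reason to cross-check the result with a second, previously verified tool.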

Using multiple tools to evaluate the results of your test helps ensure their accuracy. Most of us in the criminal justice world know that “if it isn’t written, it didn’t happen,” so it is recommended that you write a very simple report documenting your test media, your procedure, the other tools you used to verify the results, the results themselves, and any limitations you found.
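
If it helps, here is one minimal, hypothetical way to keep that record as a plain-text log. It assumes Python with psutil installed, and the file name, tool name, and results text are placeholders rather than a prescribed format.

    # Append a simple performance verification record, including the
    # workstation configuration the test ran on. All values are illustrative.
    import datetime
    import platform
    import psutil

    with open("performance_verification_log.txt", "a") as log:
        log.write(f"Date: {datetime.date.today()}\n")
        log.write(f"OS: {platform.platform()}\n")
        log.write(f"CPU: {platform.processor()} ({psutil.cpu_count(logical=True)} logical cores)\n")
        log.write(f"RAM: {psutil.virtual_memory().total // (1024 ** 3)} GB\n")
        log.write("Tool under test: XYZ screen capture (hypothetical)\n")
        log.write("Test media: NTSC bars and tone, 720x480, 29.97 fps\n")
        log.write("Verified with: VirtualDub, nonlinear editor\n")
        log.write("Result: PASS - raster size, frame count, and duration matched\n\n")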

Now let’s put this in a real-world context. My previous employer upgraded our workstations and purchased high-performance computers with 32 GB of RAM, SSD hard drives, built-in RAID, dual video cards, 6-core processors, and so on. Before conducting live casework, I ran a simple performance verification on the typical tools I was going to be using in casework. When I ran my screen capture test on known video files, I noticed that after around two minutes of capture the program started dropping frames. How could this be? I had more than enough hardware; how could I be dropping frames?

If I had not had known test samples, I would never have known this was occurring. After much testing and troubleshooting, we were able to identify the issue: the screen capture tool would not operate properly with all processing cores active. As soon as we disabled four cores, the program worked as expected.
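
For what it’s worth, a similar restriction can also be applied per process rather than system-wide by setting CPU affinity. The sketch below is only an illustration of that idea, not the fix described above; it assumes Python with psutil installed, and the executable path and core numbers are hypothetical.

    # Launch a capture tool and pin it to two CPU cores via CPU affinity.
    # Path and core numbers are placeholders.
    import subprocess
    import psutil

    proc = subprocess.Popen([r"C:\Tools\ScreenCapture.exe"])  # hypothetical tool
    psutil.Process(proc.pid).cpu_affinity([0, 1])             # restrict to cores 0 and 1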

A few key points about performance verification that will make your life easier:

  1. Keep it simple! There is no need to develop elaborate and complicated tests and test media.
  2. Make sure your test media reflects the typical evidence you will encounter during casework.
  3. Verify your results using at least two different verified tools/software.
  4. Briefly document what you did.

In digital and multimedia evidence, validation is not required and, in some cases, not even possible. However, conducting a simple performance verification on your tools (hardware and software) will give you confidence that the results you obtain when working on your evidence are consistent and accurate. This will be of great benefit if your organization is accredited (or plans to be accredited), as well as if you have to testify in court about your results. A little bit of work on the front end will save you an enormous amount of time in the long run.

Written by Jason Latham