Loading Strategies

From truxwiki.com

The way you load data into Truxton mostly depends on the amount of data you have to load. If you're the kind of person who likes to type, you can use the command line. From Windows File Explorer, you can right-click a file or folder and load it into Truxton. You can even script the load.

Single Media

When you have just a few pieces of media to load, you can use the Desktop GUI or Windows Explorer (right-click → Load...). Manually starting a load suits single-user environments where everything is on a single machine and the amount of data is small: a couple of hard drives, half a dozen phones, etc.

You can also load from the command line.

"C:\Program Files\Truxton\Loader\Load.exe" WinXP.E01
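Scripted loads can be built on the same command line. The sketch below assumes only what the example above shows: that Load.exe accepts the path of one image. The installation path and the idea of looping over several images are illustrative, not a documented Truxton API.

```python
import subprocess

# Loader path from the example above; adjust for your installation.
LOADER = r"C:\Program Files\Truxton\Loader\Load.exe"

def build_load_command(image: str) -> list[str]:
    """Build the command line for loading one forensic image."""
    return [LOADER, image]

def load_images(images: list[str]) -> None:
    """Load each image sequentially, stopping on the first failure."""
    for image in images:
        subprocess.run(build_load_command(image), check=True)
```

A batch file or PowerShell loop would do the same job; the point is that the loader is just a program taking a path, so any scripting language can drive it.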

A Walk Through

  1. You seize a device
  2. You make an image of it
  3. You load that disk image into Truxton
  4. You examine the Truxton output using the Analyst Desktop
  5. Once Truxton's exploitation is finished, read the generated reports
  6. Arrest scumbag

Several Media

When you regularly have more than several hard drives to process at a time, use a dedicated loader machine. At this level, overloading the machine becomes a problem: there's too much data to load and not enough time to sit and watch the loaders. A Load List lets you specify the media to load and how many simultaneous load processes to use, so you can balance your machine's resources against the amount of data to load.
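Generating a load list lends itself to scripting. The sketch below writes one media path per line; that format is an assumption for illustration only — consult the Truxton documentation for the actual load list format and its per-media options.

```python
from pathlib import Path

def make_load_list(media_dir: str, list_path: str,
                   patterns=("*.E01", "*.dd")) -> int:
    """Collect forensic images under media_dir and write one path per line.
    The one-path-per-line format is hypothetical; the real Truxton load
    list format may carry more metadata per entry."""
    paths = sorted(p for pat in patterns for p in Path(media_dir).glob(pat))
    Path(list_path).write_text("\n".join(str(p) for p in paths))
    return len(paths)
```

The returned count gives a quick sanity check that the script found everything you expected before you hand the list to the loader.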

A Walk Through

  1. You seize several devices
  2. You make images of all of them
  3. You create a load list
  4. You load the list file just like any other file into Truxton
  5. You examine the Truxton output using the Analyst Desktop
  6. Once Truxton's exploitation is finished, read the generated reports
  7. Arrest scumbags

Hundreds of Media

When you have a never-ending stream of incoming media, you need something even more automatic than load lists, so multiple machines dedicated to loading are pressed into service. Like other ETL processes, the loader has its own message queue that loading tasks can be sent to. When a load process on one of those machines finishes, it grabs another load task from the message queue. One way of dealing with the volume is to use a workflow-automation tool such as Apache NiFi. If you have a no-human-in-the-loop architecture, you can automate the publishing of freshly loaded media.
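The grab-another-task-when-finished behavior described above is a classic worker pool. The sketch below models it with Python's standard library in place of Truxton's actual message queue; `process_load` is a hypothetical stand-in for spawning a loader process.

```python
import queue
import threading

def run_loader_pool(tasks, process_load, workers=4):
    """Drain a queue of load tasks with a fixed pool of workers.
    `process_load` stands in for launching a Truxton loader (hypothetical);
    each worker grabs another task as soon as its previous load finishes."""
    q = queue.Queue()
    for t in tasks:
        q.put(t)

    def worker():
        while True:
            try:
                task = q.get_nowait()
            except queue.Empty:
                return          # queue drained; this worker retires
            process_load(task)
            q.task_done()

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
```

In a real multi-machine deployment the queue lives on the network (Truxton's own queue, or something NiFi feeds), but the scheduling idea is the same.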

A Walk Through

The following is an example of an imaginary workflow.

  1. People have seized many devices
  2. Images of the devices have been made and copies put onto external hard drives
  3. The hard drives are transported to a central facility
  4. As the images arrive, they are copied from the arrival media onto rack storage
  5. Once safely on rack storage and verified, they are moved to some sort of folder dedicated to loading into Truxton
  6. Something watching for incoming files puts them onto a load queue
  7. Truxton monitors the queue and spawns loaders accordingly
  8. Teams of analysts examine the Truxton output using the Analyst Desktop
  9. Once Truxton's exploitation is finished, read the generated reports
  10. Drop bombs on scumbags
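Steps 6 and 7 above — something watching for incoming files and putting them onto a load queue — can be approximated with a simple polling watcher. A real deployment would use NiFi or a proper message broker; this sketch is only an illustration of the pattern.

```python
import queue
from pathlib import Path

def poll_drop_folder(folder: str, seen: set, load_queue: "queue.Queue") -> int:
    """One polling pass over the drop folder: enqueue every file we have
    not seen before. Returns how many files were enqueued this pass."""
    added = 0
    for path in sorted(Path(folder).iterdir()):
        if path.is_file() and path not in seen:
            seen.add(path)
            load_queue.put(str(path))
            added += 1
    return added
```

Run the pass on a timer; whatever consumes `load_queue` (the loader pool, for instance) never needs to know how the files arrived.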

Quality Controlled

In this strategy, you have broken the exploitation process into separate loading and publishing phases. Data is loaded into a separate system and checked by a human for errors or misconfiguration, while the investigators analyze the Truxton output on a separate server. If the freshly loaded data looks good, the human publishes it from the loader machine to the investigation server. Oftentimes, the loader machine is reset to an initial state before loading the next media.

When using this strategy, depot storage is a concern: you don't want to move the depots if you don't have to. A shared storage solution is usually best, where the loader machines and the analysis server have the same view of storage. The analysis Truxton has no knowledge of the depot files from the loader machines until the media has been published to the analysis server, which allows the quality control technicians to freely delete depot files from bad loads without interfering with analysts.
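The human check before publishing is essentially a quality gate: inspect the load for errors, and only publish if it is clean. A minimal sketch of such a gate, assuming a hypothetical log format where failures contain marker words (match the markers to your actual loader logs):

```python
def load_passed_qc(log_text: str, error_markers=("ERROR", "FAILED")) -> bool:
    """Return True when no line of the load log contains an error marker.
    The marker words are illustrative, not Truxton's actual log vocabulary."""
    return not any(marker in line
                   for line in log_text.splitlines()
                   for marker in error_markers)
```

A clean result would trigger the publish step; a failure would leave the load on the loader machine for the QC technician to delete or re-run.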

A Walk Through

The following is an example of an imaginary workflow.

  1. People have seized many devices
  2. Images of the devices have been made and copies put onto external hard drives
  3. The hard drives are transported to a central facility
  4. As the images arrive, they are copied from the arrival media onto rack storage
  5. Someone reviews the incoming media for completeness, relative priority, surrounding metadata (who gave it to us and why), etc.
  6. When it has been determined that the data should be loaded, it is processed on an independent instance of Truxton (can be multi-node)
  7. Once Truxton's exploitation is finished, someone reviews the load logs, browses through the data using the Analyst Desktop. If no problems are found the media is exported.
  8. The processed media is then imported into the enterprise wide Truxton (or a Truxton dedicated to the analysis team)
  9. Teams of analysts examine the Truxton output using the Analyst Desktop
  10. Send drones towards scumbags

Integrated into Your Existing Enterprise Workflow

This scenario is one where you already have your own enterprise level system and want to use Truxton to feed it. You don't want all of your users to have Truxton. You just want the results of Truxton's processing to be put into your system.

A Walk Through

Here's the steps a script would perform to use Truxton as an extraction tool.

  1. Create a new virtual machine from a base virtual machine image with Truxton installed on it.
  2. Place the media to be loaded onto a share that the virtual machine can access.
  3. Create a load list containing the media metadata and place it in the media share.
  4. Execute Load.exe on the new virtual machine and give it the path to the load list.
  5. Monitor the status of the load, waiting for it to finish
  6. Execute a script to extract files or artifacts in a format of your choosing to a place on the share
  7. Export the loaded media to a shared folder (TPIF)
  8. Destroy the virtual machine
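The steps above can be sketched as a small orchestration function. The four callables are hypothetical hooks for your own VM and Truxton scripting; the one property worth encoding is that step 8 always runs, so a failed load never leaves a throwaway VM behind.

```python
def run_extraction_workflow(create_vm, load_media, export_results, destroy_vm):
    """Run the VM-based extraction pipeline. All four callables are
    hypothetical placeholders for your own infrastructure tooling.
    The VM is always destroyed, even if loading or export fails."""
    vm = create_vm()
    try:
        load_media(vm)       # steps 2-5: stage media, run Load.exe, wait
        export_results(vm)   # steps 6-7: extract artifacts, export TPIF
    finally:
        destroy_vm(vm)       # step 8: never leave the throwaway VM running
```

The try/finally is the design point: since each load gets a fresh machine, cleanup must be unconditional or the VM pool slowly leaks.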

This workflow ensures no data contamination since all media is loaded on a virgin machine. Should a user on your enterprise system want to see the media that an interesting artifact came from, they can import the media (TPIF) into their instance of Truxton.

Field Exploitation

Sometimes you may have a one-person office far away from a server room. In this case, data is loaded in the field. The user exports the media without the depots and sends the archive file (tmpf) to a central office. This file is tiny compared to the size of the exploited media. Analysts at the central office import the data into their own Truxton instance and review it. They will not have any file contents, but they will be able to see everything else (artifacts, tags, file metadata, etc.). The fielded system can be configured to generate this archive automatically.

This strategy is useful when you have limited bandwidth between offices or sites. Sometimes it is better to push exploitation to the edge and send back the results rather than raw data. The raw data and depots can be forwarded later (via FedEx or when you get to a broadband connection).

A Walk Through

The following is an example of an imaginary workflow.

  1. Someone in a small town seized several devices
  2. They were able to image them
  3. They loaded the images into Truxton
  4. They reviewed the artifacts and reports in the Analyst Desktop and determined that a higher-level agency should look at it
  5. They emailed the Media Summary Reports to state analysts who requested the Truxton data immediately
  6. The investigation was exported from Truxton and, due to the remote nature of the town, transmitted via dial-up modem to the capital city
  7. The state level organization imported the investigation into their Truxton, correlated the artifacts with the rest of their holdings and requested the rest of the media
  8. The small town copied the depot files and images to an external hard drive and flew them to the capital

History Major

If you are not a technical person at all, Truxton has an Easy Button. It presumes that you have one piece of seized media per folder within a parent folder on an external drive. It navigates all of the sub-folders looking for forensic images, moves Truxton's database to that external drive, loads all of the media into an investigation, generates the reports, then pops up a browser at a top-level web page for that investigation. The GUI desktop is not needed. The user can dismount the Easy Button load and FedEx the external drive to a central office. All of the reports are in the results folder.

Triage

If you are in a hurry, you can perform a Turbo Triage load. This type of load skips freespace and carving and loads only those files and folders that you have specified. The names and patterns are stored in the [TriageFile] table in the database; any file or folder that doesn't match an entry in that table is ignored.
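The match-or-ignore rule above amounts to filename pattern matching. The sketch below illustrates the behavior with shell-style wildcards; the patterns shown are examples, not the actual contents of the [TriageFile] table, and Truxton's real matching rules may differ in detail.

```python
from fnmatch import fnmatch

def triage_keep(path: str, patterns) -> bool:
    """Return True if the file's name matches any triage pattern
    (case-insensitive), mirroring the load-only-what-matches rule.
    The pattern semantics here are an assumption for illustration."""
    name = path.replace("\\", "/").rsplit("/", 1)[-1].lower()
    return any(fnmatch(name, pat.lower()) for pat in patterns)
```

Anything for which `triage_keep` returns False would simply never be loaded, which is what makes a triage pass fast.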