While on vacation, Joe saw something weird happen on his machine, and now he thinks he might be owned. From the comfort of your desk: collect common persistence mechanisms and submit the binaries to your bulk malware analysis pipeline, grab a netstat and a process listing, and check recent browsing history. See something interesting? Grab a process listing from memory, collect deleted files, find the badness. Now check every machine in your fleet for the same malware within 30 minutes.
Use cases like this pushed Google to start work on GRR, an open-source remote live-forensics system, back in 2011. For the past three years we've been using it to analyze Joe's machine and do all of the above. Recently, we've added the ability to write and share simple definitions for forensic artifacts and perform large scale binary collection to hunt for badness across the fleet.
Greg will introduce GRR's capabilities with some use cases and discuss the difficulties of running the tool across different environments. He will explain and demonstrate GRR artifact collection and cover some of the aspects that make artifacts powerful but challenging to implement. He'll finish with a discussion of future directions for artifacts, integration with other open-source forensics projects, IOCs, and the GRR project in general.