Friday, October 10, 2014
Use tcpdump to Filter and Merge Multiple pcap Files
The other day I had a couple dozen pcap files (each just under 1 GB in size) from which I wanted to filter out the traffic of one host. A couple of different options come to mind - merge the pcap files together and then filter, or filter each pcap separately and then merge the results. Both of those are pretty sloppy compared to doing it all in one line:
# mergecap -w /dev/stdout file1.pcap file2.pcap file3.pcap | tcpdump -r - -w output.pcap host 192.168.1.10
mergecap reads the list of files at the end as input and writes them out to /dev/stdout, where tcpdump reads them in and writes the result to output.pcap after applying the filter (host 192.168.1.10).
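For comparison, the two-step approach mentioned above would look something like this (just a sketch, reusing the same placeholder file names and filter as the one-liner):
$ for f in file1.pcap file2.pcap file3.pcap; do tcpdump -r "$f" -w "filtered_$f" host 192.168.1.10; done
$ mergecap -w output.pcap filtered_file1.pcap filtered_file2.pcap filtered_file3.pcap
Two passes over the data and a pile of intermediate files - exactly what the single pipeline avoids.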
Wednesday, October 1, 2014
Single Line Base64 Decoder
If you have a chunk of Base64 encoded data and want to decode it, the quickest method is usually to find some online decoder. If you're worried about the sensitivity of the data, or don't have access to a web browser or even the Internet, you'll want to decode it locally.
To do this you'll need perl (it should be installed on most Linux distros). Given any file containing only Base64 encoded text, for example:
$ file base64_file
base64_file: ASCII text, with CRLF line terminators
$
(NOTE - the file must contain ONLY Base64 encoded text - any existing decoded data will break the process)
The following command will decode the text:
$ perl -MMIME::Base64 -e 'print decode_base64(join("",<>))' < base64_file > output
If done correctly the output file should contain the decoded data.
$ file output
output: HTML document, ASCII text, with CRLF line terminators
$
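If perl isn't available, the base64 utility from GNU coreutils (or openssl) can usually do the same job - a sketch, assuming one of those tools is installed and reusing the base64_file/output names from above:
$ base64 -d -i base64_file > output
$ openssl enc -base64 -d -in base64_file -out output
The -i (--ignore-garbage) flag tells coreutils base64 to skip non-alphabet characters, which helps with input like this that has CRLF line terminators.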