Monday, October 27, 2014

Back to basics - Convert ICS to HTML and CSV

The discrete nature of calendar entries makes it very difficult to see the overall picture or, in an investigation, to spot a pattern of events.  We need to be able to see the events in chronological order in a single document that we can use as a report, or chart the values so that non-technical professionals can easily understand the sequence of events.

Thunderbird is one of the most useful and versatile applications when it comes to Internet communication.  In this blog post, I will explore its capability to convert .ics files, the only format that Google Calendar exports.

I also created a video to accompany this blog post: http://youtu.be/WbBRhP6VXbs

So, in order to follow this process, you need to download and install Thunderbird, https://www.mozilla.org/en-US/thunderbird/download.

Log in to your Google Calendar and create a new calendar.

Add new schedules to the new calendar and export the calendar as an .ics file.  Notice in the exported .ics file below that the date and time stamps are not very user friendly to read, so they might need to be converted manually to make sense to non-technical professionals.  On the other hand, the HTML and CSV exports below show the date and time stamps in a user-friendly format that can be reported and charted for easy interpretation without any manual conversion or risk of human error.
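
For illustration ( the original screenshot is not reproduced here ), a hypothetical event in a raw .ics export stores its timestamps in the compact iCalendar UTC format:

BEGIN:VEVENT
DTSTART:20141027T140000Z
DTEND:20141027T150000Z
SUMMARY:Status meeting
END:VEVENT

A value like 20141027T140000Z has to be mentally parsed as October 27, 2014 at 14:00:00 UTC, while the HTML and CSV exports render the same moment in plain, readable date and time columns.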


Import the .ics file into Thunderbird's Lightning add-on, which adds the calendar feature to Thunderbird.

Export the calendar in .ics, .html, or .csv format.


The HTML document can be used directly as a report, but the CSV format gives more flexibility to analyze the data or create a chart to show clear patterns of events.



Thus, digital forensics is about pattern recognition, but in some cases a pattern cannot emerge from data in its native format.  So, we need to focus on each application's capability to import certain file types and to export the data into different formats that can aid our analysis and help identify patterns to solve cases.

Back to basics - SQL and XSS

This post is accompanied by a video explaining this process and what you can do about it.

http://youtu.be/-W3efiMT8H0

Sample web page to test JavaScript in a browser.  Save the following code in a text file, name it test.html, and open it in your browser to see what it does.

<HTML>
<HEAD>
              <script> window.open('http://zoltandfw.blogspot.com/','_blank')</script>
              <script> alert(document.cookie)</script>
              <script> alert("Your account has been compromised, please call (111)222-3333 to report!!!")               </script>
</HEAD>
<BODY>
              Just a test for JavaScripts
</BODY>
</HTML>

Sample log file entries showing what information might be collected in log files, whether to investigate after the fact or to monitor for real-time response.

141027  7:39:45  122 Connect root@localhost on 
 122 Init DB badbank
 122 Query SELECT userid, accountnumber FROM badbank_accounts WHERE username='zoltan' AND password='9f1c050c2b226c2154d17a3ff9a602f6'
 122 Quit
141027  7:41:55  123 Connect root@localhost on 
 123 Init DB badbank
 123 Query SELECT userid, accountnumber FROM badbank_accounts WHERE username='zoltan' -- ' AND password='d41d8cd98f00b204e9800998ecf8427e'
 123 Quit
141027  8:00:30  124 Connect root@localhost on 
 124 Init DB badbank
 124 Quit
 125 Connect root@localhost on 
 125 Init DB badbank
 125 Quit
141027  8:42:47  126 Connect ODBC@localhost as  on 
 126 Query select @@version_comment limit 1
141027  8:42:55  126 Query show databases
141027  8:43:26  126 Query SELECT DATABASE()
 126 Init DB Access denied for user ''@'localhost' to database 'badbank'
141027  8:43:41  126 Quit

...

141027  9:04:20  130 Query select * from badbank_transactions
141027  9:05:22  213 Connect root@localhost on 
 213 Init DB badbank
 213 Query SELECT balance FROM badbank_accounts WHERE userid=61
 213 Quit
141027  9:05:37  214 Connect root@localhost on 
 214 Init DB badbank
 214 Query SELECT balance FROM badbank_accounts WHERE userid=61
 214 Query SELECT userid FROM badbank_accounts WHERE username='victim1'
 214 Query UPDATE badbank_accounts SET balance=balance-1 WHERE userid=61
 214 Query UPDATE badbank_accounts SET balance=balance+1 WHERE userid=60
 214 Query INSERT INTO badbank_transactions (userid,time,withdrawn,transactor,transfernote) VALUES (61,NOW(),1,60,'<script> alert(document.cookie)</script>')
 214 Query INSERT INTO badbank_transactions (userid,time,deposited,transactor,transfernote) VALUES (60,NOW(),1,61,'<script> alert(document.cookie)</script>')
 214 Quit
141027  9:05:41  215 Connect root@localhost on 
 215 Init DB badbank
 215 Quit
 216 Connect root@localhost on 
 216 Init DB badbank
 216 Quit
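
Two attack patterns stand out in these entries.  Connection 123 shows a classic SQL injection: the attacker entered zoltan' --  as the username, and the -- sequence comments out the password check ( d41d8cd98f00b204e9800998ecf8427e is the MD5 of an empty string ).  The two INSERT statements in connection 214 show a stored XSS payload being written into the transfernote column, the same payload as in the test.html page above.  Below is a minimal sketch of how the vulnerable query in connection 123 was likely built; this is a hypothetical reconstruction assuming naive string concatenation, not the application's actual source.

$username = "zoltan' -- "       # attacker-supplied input ( hypothetical variable names )
$password = ""                  # password field left empty; md5('') = d41d8cd98f00b204e9800998ecf8427e
$hashBytes = [System.Security.Cryptography.MD5]::Create().ComputeHash([System.Text.Encoding]::ASCII.GetBytes($password))
$md5 = ($hashBytes | ForEach-Object { $_.ToString('x2') }) -join ''
# naive concatenation produces exactly the query seen in log entry 123
"SELECT userid, accountnumber FROM badbank_accounts WHERE username='$username' AND password='$md5'"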

Sunday, October 26, 2014

Back to Basics - Information Assurance - Robots.txt

Note: If you like these blog posts, please click the +1 !

In some cases, you might need to, so to speak, crawl a web site to gather keywords or email addresses. Web sites can use robots.txt files to prevent simple automated crawling of the entire website or part of it. The robots.txt file gives instructions to web robots about what is not allowed on the web site using the Robots Exclusion Protocol. So, a website might contain a robots.txt like:

User-Agent: * 
Disallow: / 

This robots.txt will disallow all robots from visiting all pages on the web site. So, if a robot tried to visit a page like http://www.domain.topdomain/examplepage.html, the robots.txt in the root of the website, http://www.domain.topdomain/robots.txt, would not permit the robot to access it. The robots.txt file can be ignored by many web crawlers, so it should not be used as a security measure to hide information. We should also be able to ignore such a simple security measure to investigate or to test web site security. I have mentioned in previous posts the tool called wget, which is very useful for downloading a web page, a website, or malware from the command line. This simple tool can also be configured to ignore the robots.txt file, but by default it respects it, so you need to specifically tell the tool to ignore its directions.

wget -e robots=off --wait 1 -m http://domain.topdomain 
FINISHED --2014-10-26 11:12:36-- 
Downloaded: 35 files, 22M in 19s (1.16 MB/s) 

Omitting the robots=off option, on the other hand, produces the following result.

wget -m http://domain.topdomain 
FINISHED --2014-10-26 11:56:53-- 
Downloaded: 1 files, 5.5K in 0s (184 MB/s)

This example makes it clear that we would have missed 34 files by not being familiar with this simple file and its purpose.

Using "User-Agent: * " is a great option to block robots of unknown name blocked unless the robots use other methods to get to the website contents. Let's try and see what will happen if we use wget without robots=off.

 
As you can see the User-Agent is set to wget/1.11( default Wget/version ), so as you can see in the list below, a robots.txt with the content list below would catch this utility and prevent it from getting the website contents.

Note: The orange highlighted three packets are the 3-way handshake, so the request for the resource with the User-agent setting is the first packet following the three-way handshake.  That might be a good pattern for alarm settings.

wget also has an option to change the user-agent default string to anything the user wants to use.

wget --user-agent=ZOLTAN -m http://domain.topdomain 



As you can see in the packet capture, the user-agent was overwritten as the option promised, but the website still only allowed a single file download due to the User-agent: * rule that caught the unknown string.  So, robots.txt can help protect a website to a certain extent, but the -e robots=off option did retrieve the whole website content even though the packet contained an unmodified User-agent setting.

robots.txt can have specific contents to keep unsafe robots away from a web site or to provide basic protection from these "pests".   ( This list is not exhaustive, but it can be a good source to learn about malicious packet contents and a good resource for further reading on each one of these software tools. )

User-agent: Aqua_Products
Disallow: /

User-agent: asterias
Disallow: /

User-agent: b2w/0.1
Disallow: /

User-agent: BackDoorBot/1.0
Disallow: /

User-agent: Black Hole
Disallow: /

User-agent: BlowFish/1.0
Disallow: /

User-agent: Bookmark search tool
Disallow: /

User-agent: BotALot
Disallow: /

User-agent: BuiltBotTough
Disallow: /

User-agent: Bullseye/1.0
Disallow: /

User-agent: BunnySlippers
Disallow: /

User-agent: Cegbfeieh
Disallow: /

User-agent: CheeseBot
Disallow: /

User-agent: CherryPicker
Disallow: /

User-agent: CherryPicker /1.0
Disallow: /

User-agent: CherryPickerElite/1.0
Disallow: /

User-agent: CherryPickerSE/1.0
Disallow: /

User-agent: CopyRightCheck
Disallow: /

User-agent: cosmos
Disallow: /

User-agent: Crescent
Disallow: /

User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow: /

User-agent: DittoSpyder
Disallow: /

User-agent: EmailCollector
Disallow: /

User-agent: EmailSiphon
Disallow: /

User-agent: EmailWolf
Disallow: /

User-agent: EroCrawler
Disallow: /

User-agent: ExtractorPro
Disallow: /

User-agent: FairAd Client
Disallow: /

User-agent: Flaming AttackBot
Disallow: /

User-agent: Foobot
Disallow: /

User-agent: Gaisbot
Disallow: /

User-agent: GetRight/4.2
Disallow: /

User-agent: grub
Disallow: /

User-agent: grub-client
Disallow: /

User-agent: Harvest/1.5
Disallow: /

User-agent: hloader
Disallow: /

User-agent: httplib
Disallow: /

User-agent: humanlinks
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: ia_archiver/1.6
Disallow: /

User-agent: InfoNaviRobot
Disallow: /

User-agent: Iron33/1.0.2
Disallow: /

User-agent: JennyBot
Disallow: /

User-agent: Kenjin Spider
Disallow: /

User-agent: Keyword Density/0.9
Disallow: /

User-agent: larbin
Disallow: /

User-agent: LexiBot
Disallow: /

User-agent: libWeb/clsHTTP
Disallow: /

User-agent: LinkextractorPro
Disallow: /

User-agent: LinkScan/8.1a Unix
Disallow: /

User-agent: LinkWalker
Disallow: /

User-agent: LNSpiderguy
Disallow: /

User-agent: lwp-trivial
Disallow: /

User-agent: lwp-trivial/1.34
Disallow: /

User-agent: Mata Hari
Disallow: /

User-agent: Microsoft URL Control
Disallow: /

User-agent: Microsoft URL Control - 5.01.4511
Disallow: /

User-agent: Microsoft URL Control - 6.00.8169
Disallow: /

User-agent: MIIxpc
Disallow: /

User-agent: MIIxpc/4.2
Disallow: /

User-agent: Mister PiX
Disallow: /

User-agent: moget
Disallow: /

User-agent: moget/2.1
Disallow: /

User-agent: mozilla/4
Disallow: /

User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 2000)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 95)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 98)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows ME)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows NT)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows XP)
Disallow: /

User-agent: mozilla/5
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: NetAnts
Disallow: /

User-agent: NetMechanic
Disallow: /

User-agent: NICErsPRO
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Openbot
Disallow: /

User-agent: Openfind
Disallow: /

User-agent: Openfind data gathere
Disallow: /

User-agent: Oracle Ultra Search
Disallow: /

User-agent: PerMan
Disallow: /

User-agent: ProPowerBot/2.14
Disallow: /

User-agent: ProWebWalker
Disallow: /

User-agent: psbot
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: QueryN Metasearch
Disallow: /

User-agent: Radiation Retriever 1.1
Disallow: /

User-agent: RepoMonkey
Disallow: /

User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow: /

User-agent: RMA
Disallow: /

User-agent: searchpreview
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: SpankBot
Disallow: /

User-agent: spanner
Disallow: /

User-agent: suzuran
Disallow: /

User-agent: Szukacz/1.4
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: Telesoft
Disallow: /

User-agent: The Intraformant
Disallow: /

User-agent: TheNomad
Disallow: /

User-agent: TightTwatBot
Disallow: /

User-agent: Titan
Disallow: /

User-agent: toCrawl/UrlDispatcher
Disallow: /

User-agent: True_Robot
Disallow: /

User-agent: True_Robot/1.0
Disallow: /

User-agent: turingos
Disallow: /

User-agent: URL Control
Disallow: /

User-agent: URL_Spider_Pro
Disallow: /

User-agent: URLy Warning
Disallow: /

User-agent: VCI
Disallow: /

User-agent: VCI WebViewer VCI WebViewer Win32
Disallow: /

User-agent: Web Image Collector
Disallow: /

User-agent: WebAuto
Disallow: /

User-agent: WebBandit
Disallow: /

User-agent: WebBandit/3.50
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: WebEnhancer
Disallow: /

User-agent: WebmasterWorldForumBot
Disallow: /

User-agent: WebSauger
Disallow: /

User-agent: Website Quester
Disallow: /

User-agent: Webster Pro
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebZip
Disallow: /

User-agent: WebZip/4.0
Disallow: /

User-agent: Wget
Disallow: /

User-agent: Wget/1.5.3
Disallow: /

User-agent: Wget/1.6
Disallow: /

User-agent: WWW-Collector-E
Disallow: /

User-agent: Xenu's
Disallow: /

User-agent: Xenu's Link Sleuth 1.1c
Disallow: /

User-agent: Zeus
Disallow: /

User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /

User-agent: Zeus Link Scout
Disallow: /

Saturday, October 25, 2014

Back to Basics - Intellectual Property

This post is about practicing critical thinking when it comes to intellectual property cases and tracking down old or previous websites using copyrighted material. We can also use this technique to locate images where only a portion of the image is used or relevant to the case.


Sunday, October 19, 2014

Back to basics - Time revisited

It is very strange that PowerShell would return a date where the year is wrong ( 0x451a1eb0d869cc01 ).

PS> get-date 129594868675516997
Saturday, September 3, 0411 1:27:47 AM

Try more than just one value and see if there is a pattern to the miscalculation.

PS> get-date 128989761240000000
Friday, October 2, 0409 4:55:24 PM
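
Notice that both results are exactly 1600 years early, which suggests an epoch mismatch rather than a random miscalculation: get-date treats the raw number as .NET ticks counted from January 1, 0001, while FileTime values count 100-nanosecond intervals from January 1, 1601.  A quick check of this assumption:

PS> (Get-Date 129594868675516997).AddYears(1600)
Saturday, September 3, 2011 1:27:47 AM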

Always use a tool to validate your findings and keep you on the right track. Here, I'm using FTK Imager to decode the date/time of a little endian value to make sure I get the same results by hand and to identify any mistakes I might make with the manual conversion.  This way, I can also double check if PowerShell interprets the values correctly.





The same value returns the correct date and time if used in this format:

PS>[datetime]::fromfiletime("129594868675516997")

Friday, September 2, 2011 8:27:47 PM

UTC time also returns the correct date and time.  Notice that the time of day matches the earlier get-date output exactly; get-date applies no time zone offset, it only counts from the wrong ( year 0001 ) epoch.

PS> [datetime]::fromfiletimeUTC("129594868675516997")

Saturday, September 3, 2011 1:27:47 AM


Converting the hex values in Excel and working with the rounded scientific notation should be avoided due to the rounding error.

PS> [datetime]::fromfiletime("1.29595E+17")

Saturday, September 3, 2011 12:06:40 AM

If you know the epoch of a time stamp, then you can easily adjust PowerShell to give you the correct time by adding the elapsed seconds to the epoch's origin date.

[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1970').addseconds("1337329458"))


Friday, May 18, 2012 3:24:18 AM

Microsoft counts 100-nanosecond intervals, so the time value needs to be divided by 1e7 to get the seconds elapsed since the epoch: 129594868675516997 / 1e7 = 12959486867.55169

[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1601').addseconds(12959486867.55169))


Friday, September 2, 2011 8:27:47 PM

Thus, when analyzing Mozilla Firefox, we can examine the places.sqlite database for downloaded files in the moz_annos table.  We can see values under the content column like:
(C:\Users\<UID>\AppData\Roaming\Mozilla\Firefox\Profiles\<random>.default-<random>)

{"state":1,"endTime":1413773919879,"fileSize":4210920}

Based on the given file size ( in bytes ), we can correlate an exfiltrated file even if its name was changed.  In order to find the time when the download completed ( Firefox tracks it in milliseconds, so divide the value by 1000 ), we can run PowerShell with the Unix epoch date:

[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1970').addseconds("1413773919.879"))


Sunday, October 19, 2014 9:58:39 PM
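
If you have to process many of these entries, the JSON value can be parsed instead of picking the numbers out by hand.  A minimal sketch, assuming PowerShell 3.0 or later for ConvertFrom-Json:

# parse one content value from moz_annos and convert its endTime in one step
$anno = '{"state":1,"endTime":1413773919879,"fileSize":4210920}' | ConvertFrom-Json
[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1970').AddSeconds($anno.endTime / 1000))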

This value can be verified by Decode
( http://www.digital-detective.net/digital-forensic-software/free-tools/ )



Thus, testing, verification, and validation should be part of every analysis, especially before a new tool or a tool update is implemented.  Risk management is as important a part of forensic analysis as technical knowledge.

Back to basics - Drive transfer rate

Maybe it is not relevant to most investigators, but knowing your devices and your hardware can help in determining how long acquisition or indexing of evidence might take.  Measuring the performance of your storage devices is just as important as analyzing a case for relevant evidence.  You have to be detail-oriented and have the drive to understand technology in order to move toward becoming an expert.  The first step of education is to ask questions and find the best answers possible, not by "googling" for answers others found.

In this case, we examine our storage device's transfer rate in a USB 2.0 and in a USB 3.0 port.  I'm lucky enough to have both of these ports on my laptop to test, but if you ignore the port speed, you will never know why you sometimes get better performance.

USB 1.x supports rates of 1.5 Mbit/s (Low-Bandwidth) and 12 Mbit/s (Full-Bandwidth).

USB 2.0 supports a higher maximum signaling rate of 480 Mbit/s, but is limited to an effective throughput of about 280 Mbit/s.  The port is usually black, but the USB symbol might be the best way to distinguish the port types. In the image below, I have a USB 3.0 port on the left side, while only a USB 2.0 port on the right side.  Thus, plugging a device into one port vs. the other will make a huge performance difference.





USB 3.0 ( SuperSpeed mode ) has a usable data rate of up to 4 Gbit/s ( out of a 5 Gbit/s signaling rate ). A USB 3.0 port is usually colored blue and is backwards compatible with USB 2.0.  In the image below, you can see that it will not matter which port you use on this side of the laptop, since both of the ports are USB 3.0.
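
Why these rates matter for casework: below is a rough estimate of the imaging time for the 2 TB drive listed in Appendix B.  The sustained rates are assumptions for illustration ( roughly 35 MB/s on USB 2.0 and 100 MB/s on USB 3.0 ), not measurements.

# back-of-the-envelope acquisition-time estimate ( assumed rates, not measured ones )
$driveBytes = 2000363420160                                # the WD My Passport from Appendix B
"USB 2.0: {0:N1} hours" -f ($driveBytes / 35MB / 3600)     # roughly 15 hours
"USB 3.0: {0:N1} hours" -f ($driveBytes / 100MB / 3600)    # roughly 5.3 hours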





You can see in Windows which port the device is plugged into.


So, what are the effective transfer rates on actual devices and not just in theory?  There are many ways to test performance, and most of them will not produce very accurate results, but they will give a good indication of device transfer rates to calculate with.  In many cases, an approximation of the data transfer rate is good enough to calculate with and to prepare a quote for clients.

One way is to use the Windows System Assessment Tool ( winsat ) to do this test.  Since we are talking about sequential writes of the data, we can test the sequential write rate of the E: drive, in my case, like this.

winsat disk -seq -write -count 6 -v -drive E

Sequential reads are just as easy to test.


winsat disk -seq -read -count 6 -v -drive E

Another way would be to use the SQLIO Disk Subsystem Benchmark Tool.

You can create a script to test the performance of the drive with many different configurations in order to find the optimal settings.

I have the following in my batch file:

"C:\Program Files (x86)\SQLIO\sqlio" -kW -s10 -frandom -o8 -dE -b8 -LS -Fparam.txt 
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -dE -o8 -b8 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -b8 -dE -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -dE -o8 -b8 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b512 -LS -Fparam.txt

The param.txt file does not contain anything but a single line showing where to place the test file, in this case on the E: drive, since that is the drive I'd like to test.

e:\testfile.data 2 0x0 100

The testfile.data was created with dcfldd like this:

C:\>dcfldd-1.3.4.x86win32\dcfldd.exe pattern=61 of=testfile.data bs=8388608 count=1

The results can then be added to a spreadsheet to chart the data for easier analysis.


USB 3.0 performance.

USB 2.0 performance.

The best performance results are highlighted in red, but we can see that USB 3.0 has far fewer latency issues than USB 2.0, so we should definitely use USB 3.0 whenever we can.

So, no matter how obvious the outcome is or how much you know about technology, you should always aim to test your devices and keep performance data available, charting your results to reveal patterns that might not emerge just by looking at the data itself.  This is the process of determining an answer by empirical data analysis.  You can never get close to scientific thinking unless you realize the power of testing and measuring.  This way, you will always be confident in your conclusions, since they rest on data points you have created, documented, and analyzed.

Let me know if you know any better ways to reliably test storage device performance.




Appendices

A. System Environment

> Command Line 'winsat  disk -seq -write -count 6 -v -drive E'
> DWM running... leaving it on
> System processor power policy saved and set to 'max performance'
> Running: Feature Enumeration ''
> Gathering System Information
> Operating System                        : 6.3 Build-9600
> Processor                               : Intel(R) Core(TM) i7-4702HQ CPU @ 2.20GHz
> TSC Frequency                           : 0
> Number of Processors                    : 1
> Number of Cores                         : 4
> Number of CPUs                          : 8
> Number of Cores per Processor           : 4
> Number of CPUs Per Core                 : 2
> Cores have logical CPUs                 : YES
> L1 Cache and line Size                  : 32768  64
> L2 Cache and line Size                  : 262144  64
> L3 Cache and line Size                  : 6291456  64
> Total physical mem available to the OS  : 15.9 GB (17,078,214,656 bytes)
> Adapter Description                     : Intel(R) HD Graphics 4600
> Adapter Manufacturer                    : Intel Corporation
> Adapter Driver Provider                 : Intel Corporation
> Adapter Driver Version                  : 10.18.10.3345
> Adapter Driver Date (yy/mm/dd)          : 2013\10\31
> Has DX9 or better                       : Yes
> Has Pixel shader 2.0 or better          : Yes
> Has LDDM Driver                         : Yes
> Dedicated (local) video memory          : 0MB
> System memory dedicated as video memory : 0MB
> System memory shared as video memory    : 1792MB
> Primary Monitor Size                    : 1600 X 900  (1440000 total pixels)
> WinSAT is Official                       : Yes
Mode Flags = 0x02000001
Disk Number = 2
Iterations = 6
IO Count = 1000
Sequential IO Size = 65536

Random IO Size = 16384

B. Drive tested

C:\>wmic diskdrive get name, size, model
Model                           Name                Size
WD My Passport 0748 USB Device  \\.\PHYSICALDRIVE2  2000363420160

C. User Manual and downloads






Tuesday, October 14, 2014

Advanced topics - Search by FileTime

This is a work in progress, but the main idea is that we should be able to find FileTime ranges in $MFT, FAT directory entries, and in many SQLite databases or log files by directly searching for the stored time stamp.

We can use PowerShell to give us a range of FileTime values for a particular date range, which allows us to search the evidence for artifacts that we might not even know about yet but that store the time stamp in their structure.

PS C:\> (Get-Date -Date "2014-10-14T00:00:00").ToFileTime()
130577364000000000
PS C:\> (Get-Date -Date "2014-10-14T23:59:59").ToFileTime()
130578227990000000
PS C:\> [convert]::tostring((Get-Date -Date "2014-10-14T23:59:59").ToFileTime(),16)
1cfe834debe7180
PS C:\> [convert]::tostring((Get-Date -Date "2014-10-14T00:00:00").ToFileTime(),16)
1cfe76bb4ed4800

So, now that we know a time stamp range, we can reverse the time stamps to little endian, if needed, and locate values matching the range.
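
A small sketch of turning one of the boundary values into a little endian byte pattern for a hex search; BitConverter already emits the bytes in little endian order on x86/x64 hardware:

# express a range boundary as the byte sequence it would have on disk
$ticks = (Get-Date -Date "2014-10-14T00:00:00").ToFileTime()
([System.BitConverter]::GetBytes($ticks) | ForEach-Object { $_.ToString("x2") }) -join ' '
# 00 48 ed b4 6b e7 cf 01 for the value shown above ( the exact value is time zone dependent )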

The image below is just a sample of a simple regular expression based search for a pattern matching a time range.



Also, get the date and time by entering the FileTime value:

PS > Get-Date 129442497539436142
or
PS > [datetime]::FromFileTime("129442497539436142")

What did I do? - Pattern is the key

Here, you need to decide if this case contains anything that might be relevant to illegal animal trade.

Our hypothetical law states that it is illegal to own and store any image of animals with feathers or fur.

You are given the suspect's computer and you see the following in the C:\temp\images folder.  The scope of the investigation restricts you to this single folder, so you need to write your report examining the data in the given folder.

You can ask for any other information you'll need to find the answer.


Remember, computers store data in visible, deleted, or hidden form; the data can be clear text, encoded, or encrypted, and generated by the operating system, applications, or users.  So, as you analyze the image above, write your conclusion with these data characteristics in mind.

DON'T CHEAT, THE VIDEO IS FOR CHECKING IF YOU ARE ON THE RIGHT TRACK!!!





Monday, October 13, 2014

What did I do? - Google search

Sometimes you might think it would be valuable to validate your findings and establish a base for your opinion.  There are many analyses of user actions that are never validated against the actual user actions, so here we go.  I will give you specific scenarios with screenshots of the relevant evidence data here on this blog, but you will also be able to watch a video of the actual actions I performed that generated the relevant data.  Depending on the activity type, I can even provide the relevant evidence-specific artifact if needed.  All times displayed on the video will be Central Time with the actual daylight offset applied.

So, what was I doing last night?  Create your theory and the sequence of events that I performed.  Then, look at the video to find out if your analysis was correct.  Pay attention to the sequence of events and the timeline of actions performed.  Don't forget to predict how many times I visited each website and what links I clicked.  Look at the URL bar to find patterns for each action and compare them to the report to see any discrepancy.


DO NOT CHEAT, WATCH THE VIDEO AFTER YOUR THOROUGH ANALYSIS!!!

The file to analyze if you want to confirm these findings manually, or to load into a viewer such as ChromeHistoryView:
C:\Users\<UID>\AppData\Local\Google\Chrome\User Data\Default\History

System setup:
Windows 8.1 Pro
Intel Core i7
16GB RAM
64 bit OS

Google Chrome - Version 35.0.1916.153 m
ChromeHistoryView v1.17 - http://www.nirsoft.net/



Let me know how accurate you were in your prediction!!!

Saturday, October 11, 2014

Back to Basics - Security by Monitoring

Security vs. convenience, privacy vs. security, or freedom vs. control?  Sometimes we have a hard time deciding what is better for us and what makes sense.  For those in the cybersecurity field, convenience is insecurity; your privacy is protected by monitoring your activities to identify the normal patterns in order to alert on abnormal signs.  You cannot have it both ways; you do need to give up control ( not freedom ) in order for someone else to help provide you with the desired level of security.  The fundamental premise of security is monitoring.  Think about your kids: you cannot protect them unless you know where they are and what their plans are, so you can be preemptive instead of reactive.  Without this kind of access to their lives, you could not provide preventative services; you would always be reacting to events and would be late to protect anyone.  It is not about losing freedom, but about receiving protective services so you can be productive and focus on your assigned tasks instead of reducing your productivity for lack of the skills to protect yourself.  Keeping up with the skills required to provide meaningful security is a full-time job, so you have to outsource that skill to someone who is qualified for the job.  It is like mowing your own lawn because you do not want to give up control of your grass: it is more convenient, cheaper, and more efficient to let professionals handle trivial tasks.  Have you ever tried to do something yourself to save money, and it ended up costing you more time and money than if you had hired someone else to do the job for you?  I think everyone has.

So, think about cybersecurity and monitoring not as a loss of freedom, but as a service that allows you to focus on what you are good at; just give access only to those who have a vested interest in protecting you, not in profiting from it.

Many times, people are afraid of government agencies and ignore the businesses.  Agencies like the NSA have a vested interest in enforcing laws and protecting citizens, not in snooping or profiting from collected information.  Collection is part of providing security in a legally controlled manner where no one person has authority over all data and its usage.  On the other hand, businesses have a vested interest in continually and in real time monitoring as many individuals as possible in order to serve advertisements or directed sales pitches.  They thrive on knowing you better than you know yourself, regardless of law or regulation; if they can profit from it, they will sell this information to anyone who is willing to pay for it.  There is no right or wrong here, since we mostly use services provided for free, thus willingly giving up the privacy of our information.  Take this blog: by the end of this post, I will get advertisements based on the words I use in it and the websites I mention.  When I click on save, the words will be indexed and associated with my id, and a profile is built about me that will be marketed to anyone interested in targeting customers like me.

So, while the NSA might collect data on my international calls, it might be used to generate some basic profile about me, and if I break the law, that information can be pulled and analyzed to find out what made me change or act in a certain way.  For-profit organizations are like the wild, wild west; they hire the best of the best to find ways to make me buy things I don't need.  They are interested in all my button clicks, and even in clicks I was thinking about but decided not to make.  All this happens in real time and is marketed for profit.  We never even bother to read the policies of websites we sign up for and use.  We never question what businesses do with the information we share, where this information is stored, or even who owns the data we publish on the web.

My point is that cybersecurity is about monitoring to protect you, and that is what agencies do for you, so you can focus on creating your wealth in whatever business you are in.  Businesses are the entities we should be more concerned about, limiting what they do with the information we provide.  After all, the Internet was created for information sharing, and just because I'm being analyzed as I'm writing this, I did not give up my freedom to talk about what I feel strongly about.  I'm being analyzed so this post can reach many people in a secure and responsible way.

Technology poses challenges to those who provide services, since data grows exponentially and it is harder to distinguish approved traffic from malicious traffic.  Monitoring activities allows intelligent systems to identify normal traffic and learn consistent behaviors.  I like to fill up my car at the same gas station, to a value divisible by 10 plus $0.01.  Security is about establishing consistency, so if I see a charge on my credit card for $50.01 at a gas station, I can see that it is normal, but a charge of $50.54 is not.  Now, that is a pattern that can be coded and entered into a system, or an intelligent system can learn this pattern and alert on out-of-pattern charges.  I might make a mistake and fill up my car to $50.75, but that is just a false positive I can handle, even if I get alerted for that charge.
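
As a toy illustration of how such a rule could be coded ( hypothetical charges, not a real monitoring system ):

# flag gas station charges that do not fit the "divisible by 10 plus $0.01" habit
$charges = 50.01d, 50.54d, 40.01d, 50.75d       # the d suffix keeps the cents exact
foreach ($c in $charges) {
    if (($c * 100) % 1000 -eq 1) { "$c looks normal" }
    else { "$c is out of pattern - alert" }
}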

Security is consistency!

Learn about the types of monitoring that software can do, and think about the patterns that might help professionals in this field do their jobs effectively.  If you think about consistency and not monitoring, then you might appreciate monitoring and the purpose of cybersecurity.

http://youtu.be/RR3bS5g-KTE

Back to Basics - Little Endian in PowerShell

Reverse the little endian value of 64-bit FileTime entries and show the actual time value.

The basic concept of the code below is the rule of binary ANDing: any byte logically ANDed with 255 ( 0xFF ) results in the byte itself, and any bit logically ANDed with 0 results in zero.

Let's see how that works with an example:

The number is:  01101011 01010110
0x00FF:         00000000 11111111
=================================
The result:     00000000 01010110

Thus, you can see that using a binary operation, we can separate one byte from a sequence of binary values.  This also works in decimal: 23 AND 255 = 23.  So, if we have a longer Base-16 value like the little endian 64-bit FileTime, we can reverse it by logically ANDing it with 0x00000000000000FF, or just 0xFF, which leaves the rightmost byte.  We can then remove that byte by shifting the value to the right by 8 bits, so the second rightmost byte of the original string becomes the new rightmost byte.  Repeating the ANDing and shifting, and adding each byte at its appropriate Base-256 position, rebuilds the value with the byte order reversed.

  

function revEndian{
param($a=0x23BBCCDDEEFFAA01)
$result=0

#Binary AND to identify the lowest byte value
$temp=$a -band 0xFF
#Shift the binary string to the right by 8 bits to replace the lowest byte value
$a=$a -shr 8

#Keep identifying and shifting the bytes to the right and calculating the proper Base-256 value

for($i=7; $i -ge 1;$i--){
    $result=$result+$temp * [math]::pow(256,$i)
    $temp=$a -band 0xFF
    $a=$a -shr 8
   }
#Binary OR to add the last byte to the final value
$result=$result -bor $temp
return $result
}

#Call the function with specified Little Endian FileTime value
$value=revendian(0xE87B9127C826CF01)
write-host "The value in Base-16 is:",("{0:x}" -f [convert]::touint64(($value)))
write-host "... and the date value of it is",([datetime]::FromFileTime($value))

Run the above script and you should see the following.

PS C:\> .\Convert.ps1
The value in Base-16 is: 1cf26c827917be8
... and the date value of it is 2/10/2014 7:25:31 PM
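
For comparison, the same reversal can be done with .NET's BitConverter in a few lines.  This is a minimal alternative sketch, not part of the original script:

# reverse the byte order of a little endian FileTime value via .NET
$le = 0xE87B9127C826CF01                        # parsed as an Int64 bit pattern
$bytes = [System.BitConverter]::GetBytes($le)   # the value's bytes in memory order
[array]::Reverse($bytes)                        # flip to the opposite byte order
$be = [System.BitConverter]::ToInt64($bytes, 0)
"The value in Base-16 is: {0:x}" -f $be         # 1cf26c827917be8
[datetime]::FromFileTime($be)                   # 2/10/2014 7:25:31 PM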