Sunday, August 27, 2017

Installing Dropbox in Kali

Go to the Dropbox download page and get the version for Ubuntu (file extension .deb) -- https://www.dropbox.com/install-linux.

Downloading from that page gives you a file named in this format:
dropbox_2015.10.28_amd64.deb

Alternatively, from the command line, run:
wget https://www.dropbox.com/download?dl=packages/ubuntu/dropbox_2015.10.28_amd64.deb

After that:
sudo dpkg -i <package name>

Note that wget saves the file under the URL-encoded name:
download?dl=packages%2Fubuntu%2Fdropbox_2015.10.28_amd64.deb

Therefore:
sudo dpkg -i download?dl=packages%2Fubuntu%2Fdropbox_2015.10.28_amd64.deb

This will install the package.
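The sequence can also be tightened up with wget's -O option, which saves the package under a clean name so you never have to type the URL-encoded one (a sketch; the URL and version are the ones used in this post):

```shell
# -O sets the output filename, avoiding the "download?dl=..." name
wget -O dropbox_2015.10.28_amd64.deb \
  "https://www.dropbox.com/download?dl=packages/ubuntu/dropbox_2015.10.28_amd64.deb"
sudo dpkg -i dropbox_2015.10.28_amd64.deb
sudo apt-get install -f   # pull in any missing dependencies, if dpkg complains
```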

To confirm the GUI client was installed, go to Usual Applications > Internet and check that Dropbox appears there.

Tuesday, August 15, 2017

Further Carving

Not satisfied with the Autopsy results, I ran the application called foremost and got more files.

I then ran Autopsy again and found that the parameters chosen for evidence09 had probably been too restrictive: on the second pass, Autopsy captured the same number of files as foremost.

The only issue is that foremost recovers all the data without showing which path each unallocated block came from. With Autopsy, by contrast, you can filter on what was under Internet Explorer, for example, and ignore those files (most likely images downloaded temporarily while the user was browsing).

I was also curious to try another tool, and this time I'm using PhotoRec. This one requires you to mount the image first, though. Therefore I did the following:

1) Create a mount directory called /mnt/evi09mnt
2) Run the command to mount the image:
mount -o ro,loop,offset=32256 evidence09_sdb.dd /mnt/evi09mnt
3) Download the latest version of PhotoRec (this doesn't require installation; just download and unzip the files into a folder).
4) Run the file called ./photorec_static.

This will open the partition you want to test. You also need to select a different folder for the results.
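The four steps above can be sketched as commands (the archive and folder names depend on the PhotoRec/TestDisk version you download; the ones below are assumptions):

```shell
# 1) create a mount point for the image
mkdir -p /mnt/evi09mnt
# 2) mount the partition read-only at the computed byte offset
mount -o ro,loop,offset=32256 evidence09_sdb.dd /mnt/evi09mnt
# 3) unpack the PhotoRec/TestDisk download (no installation needed)
tar xjf testdisk-7.0.linux26-x86_64.tar.bz2
# 4) run the static binary; it is interactive, so select the mounted
#    partition and a separate output folder for the recovered files
cd testdisk-7.0 && ./photorec_static
```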

In my test, 22,420 files were recovered. I hadn't filtered for the files I wanted, so I went back and configured the following file types to be recovered:

accdb (Access Database, as part of Office)
bk (MS Backup File)
bmp (BMP bitmap image)
doc (Microsoft Office Document: doc/xl/ppt/vsd/... for 3ds Max, MetaStock, Wilcom ES)
evt (Windows Event Log)
gif (Graphic Interchange Format)
http (HTTP Cache)
jpg (JPG picture)
key (Synology AES key)
mov/mdat (Recover mdat atom as a separate file)
mov (mov/mp4/3gp/3g2/jp2)
mp3 (MP3 audio: MPEG ADTS, layer III, v1)
mpg (Moving Picture Experts Group video)
nsf (Lotus Notes)
one (Microsoft OneNote)
pcx (PCX bitmap image)
pdf (Portable Document Format, Adobe Illustrator)
png (Portable/JPEG/Multiple-Image network Graphics)
psb (Adobe Photoshop Image)
psd (Adobe Photoshop Image)
psf (Print Shop)
psp (Paint Shop Pro Image File)
pst (Outlook: pst/wab/dbx)
ra (Real Audio)
*rar (Rar archive)
reg (Windows Registry)
res (Microsoft Visual Studio Resource file)
riff (RIFF audio/video: wav, cdr, avi)
rm (Real Audio)
sqm (Windows Live messenger Log File)
tar (tar archive)
tif (Tag Image file Format and some raw file formats: pef/nef/dcr/sr2/cr2)
*tx? (Text files with header: rtf/xml/xhtml/mbox/imm/pm/ram/reg/sh/slk/stp/jad/url)
*txt (Other text files: txt/html/asp/bat/C/jsp/perl/php,py/emlx... scripts)
wks (Lotus 1-2-3)
xar (xar archive)
xml (Symantec encrypted xml files)
*zip (zip archive including OpenOffice and MSOffice 2007)


Monday, August 14, 2017

Confirming Serial Number of Hard Disk Image

Mount the image, then run the command lshw.

If this is not installed, run:

sudo apt-get install lshw

Further information here.

Mounting Images

For mounting images, first check the image structure, using the following command:

sfdisk -l evidence09_sdb.dd

You will obtain output similar to the screenshot (not reproduced here), listing the partitions in the image.

The partition starts at sector 63. Multiplying this by the sector size (512 bytes) gives the byte offset we have to use: 63 × 512 = 32256.

Therefore, the command to run is:

mount -o ro,loop,offset=32256 evidence09_sdb.dd /mnt/evi09mnt

Note: create the folder /mnt/evi09mnt first.
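Putting the pieces together, the offset arithmetic and the mount can be sketched as follows (image and mount-point names are the ones from this post):

```shell
# Byte offset into the image = start sector x sector size
offset=$((63 * 512))
echo "$offset"            # 32256

# With the offset in hand (run as root):
#   mkdir -p /mnt/evi09mnt
#   mount -o ro,loop,offset=$offset evidence09_sdb.dd /mnt/evi09mnt
```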

Further information here


Thursday, August 10, 2017

Check hard disk activity

All of a sudden I noticed that my hard disk was having a lot of activity, so I closed all my applications.

My VMware box stopped working too. I'm concerned that someone accessed my PC.

Therefore I'm going to monitor activity with iotop. Install and run it as follows:

sudo apt-get install iotop
sudo iotop --only

I can see that only I am connected, with root doing activities in the background. Coincidentally, the system is more responsive, but I can see that there are still activities running on behalf of VMware, even though I had already closed that window.

Doing a:

sudo ps -ef | grep vmware

I can see several tasks running on behalf of this application, but they are just open ports.

I restarted everything again, and it seems to be working now. I'll keep monitoring.
 

Monday, August 07, 2017

Install and Uninstall in Linux: That is the Question (part 2)

As mentioned earlier in a previous post, installing packages can be complicated.

I just faced a new challenge when trying to install WPS Office. First of all, when you access the WPS download page, you can see a generic WPS Office for Linux link there.

After clicking Download, you have different options for Alpha21, for this year and last year. The options are a .deb version and an .rpm version (screenshot not reproduced here).

Just a bit of background on this: RPM packages are precompiled and built for Red Hat-based Linux distributions, and can be installed only using yum, Zypper, and other RPM-based package managers.

Since Kali Linux is based on Debian, you cannot install an RPM package directly using the apt or dpkg package managers.

As you can see in the list above, the version for Debian is amd64. Running uname -a or uname -m, I can see my operating system is running in x86_64 mode. For further information, check this page here, which is quite handy...
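A quick way to double-check that x86_64 and amd64 refer to the same thing on a Debian-based box (a sketch; the outputs in the comments are what a 64-bit Intel/AMD machine typically reports):

```shell
uname -m                    # kernel architecture name, e.g. x86_64
dpkg --print-architecture   # Debian package architecture name, e.g. amd64
```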

Therefore, here are the steps to download and install WPS for x86_64:

1) Download the amd64 .deb version listed above
2) Rename it to something shorter, like wps-office.deb
3) sudo dpkg -i wps-office.deb
4) sudo apt-get -f install

This worked well for me.

Keyring Issue

I've just published another post where I mentioned that, when trying to access Google Chrome, I was being asked for my keyring password.

I don't remember changing it, so I tried the password I use to access the box itself. Unfortunately, this didn't work.

So if the password you remember doesn't work, you don't have much alternative apart from deleting the stored passwords and keys and re-creating them.

For doing that, run:

sudo su
cd /home/aviola/.local
cp -r keyrings keyrings.backup
cd keyrings
rm login.keyring


You'll notice I made a backup. This is just in case you one day remember the password you used, or you have issues with other applications.

Then I accessed Google Chrome again and set it up with the password I use to access the box (which is quite complex). Once that was set up, I was able to log in to Google Chrome without any issues.

Now I'll start the other applications I have to use for my dissertation. I'm hoping everything else will work fine, but if not, I'll definitely be writing back here!

Install and Uninstall in Linux: That is the Question (part 1)

I am still struggling, after all this time using Linux, to understand upgrades and ways to install and uninstall applications in my Kali Linux.

The issue probably is because I am using 3 operating systems at the same time: 1) Windows for Work; 2) Mac as base machine to connect to VDI at work or to use daily for research/entertainment; 3) Linux for University.

And let's face it: I haven't sat at my Linux box much at all these last few months... That changes now, and it will be my friend for a month.

In Windows, to install you download an executable file or go to Add/Remove Programs. For uninstalling, you can either run the Uninstall executable, which does all the job for you, or again go to Add/Remove Programs for a complete cleanup. And for upgrades, you can enable automatic updates, which recommend a new version when one is available.

To be honest, I don't even remember anymore whether all these steps are correct, as my virtual machine at work doesn't allow me to install or uninstall applications freely, nor to update them. I rely on the IT guy and his tools to do that remotely for me, and it is seamless.

On the Mac, instead, you install an app by just copying the file to the Apps 'icon', which copies the executable into an Applications folder, so you don't have files scattered everywhere. To uninstall, you just delete that file, and voilà: the app is gone. Upgrades are made through the App Store.

But in Linux... arghhh! Installing and uninstalling can actually be simple: you just need to keep in your head that the 'installer' is called apt-get, and then use its subcommands to install or uninstall an application from your box.

Example:
  
To install app:

apt-get install <app_name>

To uninstall an app:

apt-get remove <app_name>
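A few related apt-get subcommands worth knowing, using wps-office (installed in an earlier post) as the example package (standard apt-get behaviour):

```shell
sudo apt-get remove wps-office     # uninstall, keep configuration files
sudo apt-get purge wps-office      # uninstall including configuration files
sudo apt-get autoremove            # drop dependencies nothing needs anymore
```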

However, if you want to upgrade your system, things get more complicated. It is not always a straightforward process, especially if the application you need requires different libraries, or if you have public key issues when connecting to the repositories that provide those libraries.

To refresh the package lists, you need to run apt-get update. This refreshes your local package index against the remote repositories. When I ran this command while writing this post, I in fact got an error (screenshot not reproduced here) about missing public keys.

This is because under the /etc/apt/sources.list.d folder, I had the following files:
- google-chrome.list
- playonlinux.list

These are extra repositories that were added alongside my original sources. Every time I run apt-get update, these two are updated as well, but it seems the public keys used before no longer exist, for some reason.

I could try to sort the issue out, but first, I don't want to lose too much time; and secondly, one is for Google Chrome, which is easy to fix if I break it, and the other I don't even remember installing in the first place. So I simply deleted these files, and after running apt-get update again, I didn't get the errors any longer.
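What I effectively did can be sketched as follows (file names are the ones from this post; move the files aside instead of deleting them if you want a way back):

```shell
cd /etc/apt/sources.list.d
sudo rm google-chrome.list playonlinux.list
sudo apt-get update      # should now complete without the key errors
```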

However, when I then checked whether Chrome was still working, I started to get a message asking for my keyring password in order to use the application properly. I entered the password I use to access the box, which was the one I remembered setting up (bear in mind it had been a while since I last connected to this box, as mentioned above), and it didn't work.

Therefore, I wrote another post explaining how to fix this issue here.

Once the package lists are updated, the next step is to actually run apt-get upgrade, to install any updates found. This time I had nothing to upgrade, but once I do, if any issues come up, I'll update this post again.

You can also use the package manager for Linux called Synaptic. To install and run it:
sudo apt-get install synaptic
sudo synaptic 

You can check for any updates by clicking Reload, and then download and install them using the icon that says Mark All Upgrades.

Sunday, June 04, 2017

I was having several issues when trying to install applications such as LaTeX and others on my computer.

The errors I was obtaining were related to grub-efi-amd64 and a possible library not being found:
Setting up grub-efi-amd64 (2.02~beta3-5) ...
/var/lib/dpkg/info/grub-efi-amd64.config: 1: /etc/default/grub: et#: not found
dpkg: error processing package grub-efi-amd64 (--configure):
 subprocess installed post-installation script returned error exit status 127
Errors were encountered while processing:
 grub-efi-amd64
E: Sub-process /usr/bin/dpkg returned an error code (1)
Visiting several websites, I found the following recommendation:

sudo apt-get purge grub\*
sudo apt-get install grub-efi
sudo apt-get autoremove
sudo update-grub
 
With this, I completely remove anything related to grub, then reinstall and update it afterwards.

When I tried again to download and install the TeX Live packages (following the instructions provided here), I no longer obtained any errors, and I was able to run Texmaker with no issues:

aviola@kali:~$ sudo apt-get install texlive-full
Reading package lists... Done
Building dependency tree       
Reading state information... Done
texlive-full is already the newest version (2016.20170123-5).
0 upgraded, 0 newly installed, 0 to remove and 532 not upgraded.

aviola@kali:~$ sudo apt-get install texmaker
Reading package lists... Done
Building dependency tree       
Reading state information... Done
texmaker is already the newest version (4.5-1).
0 upgraded, 0 newly installed, 0 to remove and 532 not upgraded.
aviola@kali:~$ 

Sunday, April 30, 2017

Grub and Gnome

Grub is the multi-boot loader; it allows you to load multiple configurations or operating systems. Further general information about Kali can be read here.

Gnome is a desktop environment (theme) used by Ubuntu and, Kali being from the same family, it comes as the default theme once Kali is installed.

For configuring this, please follow the instructions from this page.


Sunday, April 23, 2017

Autopsy

I downloaded and installed Autopsy on a Windows 7 machine running as a virtual machine under VMware. See my earlier post on how to configure VMware under Kali Linux.

Once you have the evidence in a dd image, you can go to the menu and create a new case in Autopsy, then add the data source. As explained earlier, my VM is linked to Kali's download folder, so the dd images are visible from my main machine.

I have 10 images that I collected and that I am analysing one by one. The expected time per analysis is about 1 day. Once I have the results, I will create a new post.

Hash Database Help

As per the sleuthkit.org page, there are hash databases that can be used to identify known-good and known-bad files by using their MD5 or SHA-1 checksum values.

The different databases are:

  • NIST NSRL
  • Ignore
  • Alert

The Ignore and Alert databases require the investigator to create them. The NSRL one, instead, already contains a catalogue of files found in operating systems and software distributions. Therefore I will use the NIST NSRL database.

Although this one does not need to be created, I still have to attach the downloaded database and index it before it can be used.

Following instructions from the Autopsy page1 and page2, I first downloaded the NSRL database file from the Sourceforge page. For more configuration, see here.

Once the file is downloaded, extract it. You should see 2 index files plus a Word document with instructions.

After extracting the files, go to Autopsy (I have now updated to 4.3.0) and open Tools > Options > Hash Databases. Select the Import database option and then the path you used when extracting the files.

In the path, select the .idx file and click Open. Under Type of database, select the Known (NSRL or other) option. The NSRL database will then appear in the list. Click Apply and OK to complete.

Now go to Case, select a new one... and proceed as with a new case.

Sunday, January 22, 2017

Testing tools based on a NIST image

The National Software Reference Library (NSRL) and the National Institute of Standards and Technology (NIST) have worked together on a project to collect software from various sources and incorporate file profiles computed from that software into a Reference Data Set (RDS) of information.

The RDS can be used by law enforcement, government, and industry organisations to review files on a computer by matching file profiles in the RDS. This will help to alleviate much of the effort involved in determining which files are important as evidence on computers or file systems that have been seized as part of criminal investigations.

The RDS is a collection of digital signatures of known, traceable software applications. The hash set includes hash values for applications that may be considered malicious, i.e. steganography tools and hacking scripts.

Basically, the idea is to load these products and check that the tools I decided to use do not change or alter the evidence. Further information about the project and the NSRL can be found here.


Computer Forensics Tools Introduction


This program was created by NIST in order to test computer forensics tools. The main goal is to determine how well these tools perform core forensics functions such as imaging drives and extracting information from devices. Further information can be found here.

For creating images, as mentioned earlier, I decided to use the tool called dcfldd, and tests against this product are listed on page 12. It says that there are some issues with this tool when an anomaly is found on the disk. The complete statement reads as follows:

  • When a drive with faulty sectors was imaged (test case DA-09) the tool failed to completely acquire all readable sectors near the location of the faulty sectors. In test case DA-09, a source drive with faulty sectors was cloned to a target drive. Readable sectors that were near faulty sectors on the source drive were not acquired. The tool wrote zeros to the target drive in place of these sectors.
  • When a drive with faulty sectors was imaged (test case DA-09) the data cloned to the target drive became misaligned after faulty sectors were encountered on the source drive. For example, sector 6,160,448 on the target drive contained the contents of sector 6,160,392 from the source, sector 6,160,449 on the target contained the contents of source sector 6,160,393, and so on. The size of the offset or misalignment between the data on the source and target drives grew as more faulty sectors were encountered on the source.
The full report is no longer available on the cyberfetch.org website.

For deleted file recovery, I decided to use The Sleuth Kit (TSK) / Autopsy. The report for this product is listed on page 216, and it says that under certain circumstances information cannot be recovered successfully from the image. The causes for this might be:

  • The data are no longer present in the image, e.g., overwritten.
  • Sufficient meta-data to locate the data is not present or reachable.
  • The algorithm implemented by the tool does not use the meta-data that locates the missing data.
  • The implementation of the tool algorithm is incorrect.

The report is also no longer available on the cyberfetch.org website.