
Sunday, August 03, 2014

Fierce Domain Scan by FIERCE @ Kali Linux

1.   This post gives a step-by-step, screenshot-driven walkthrough of a relatively unknown but powerful tool called Fierce. It is a Perl script written by RSnake. Fierce tries multiple techniques to find all the IP addresses and hostnames used by a target, and is meant specifically to locate likely targets both inside and outside a corporate network. A very detailed and easy explanation is given at http://ha.ckers.org/fierce/

2.  To use Fierce, navigate to Information Gathering | DNS Analysis | Fierce. Fierce will load in a terminal window, as shown in the following screenshot.
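As a hedged illustration of a typical run (the target domain and name server below are placeholders of mine, not taken from the original screenshots), the classic fierce.pl invocation looks like this:

# Scan a target domain's DNS for hosts and IP ranges (example.com is a placeholder)
fierce -dns example.com

# Optionally query a specific DNS server and slow the requests down
fierce -dns example.com -dnsserver ns1.example.com -delay 2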



DOMAIN INFORMATION GROPER : DIG @ Kali Linux

1.    Most high-value targets have a DNS name associated with an application. DNS names make it easier for users to access a particular service and add a layer of professionalism to a system. For example, if you want to access Google for information, you could open a browser and type in 74.125.68.138, or simply type www.google.com.
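As a one-line illustration of that mapping (the host utility ships with Kali; the exact addresses returned will vary over time and by location):

# Map the friendly DNS name back to its IP address(es)
host www.google.com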

2.  DNS information about a particular target can be extremely useful to a Penetration Tester, as it allows him to map out systems and subdomains. To use Dig, open a command prompt and type dig hostname, where hostname represents the target domain.

3.  Dig lookups will show the DNS records for the given host or domain: network addresses, mail exchangers, name servers, host information, arbitrary text strings, and start-of-authority records. By default, Dig uses your operating system's DNS settings to query the hostname. You can also point Dig at a custom DNS server by adding @server to the command. The example in the following screenshot illustrates using Dig on www.hacklabs.com:
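As a rough sketch of the two forms (the Google public resolver 8.8.8.8 is my own illustrative choice, not from the original post):

# Query using the operating system's default resolver
dig www.hacklabs.com

# Query a specific DNS server by prefixing it with @
dig @8.8.8.8 www.hacklabs.com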

 
4.   The -t option tells Dig which type of DNS record to query; asking for ns records returns the authoritative name servers for the zone. We type dig -t ns www.hacklabs.com in the example in the following screenshot:

5.  We see from the results that there are two authoritative DNS servers for the domain; they are ns51.domaincontrol.com and ns52.domaincontrol.com.
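The same -t switch covers other record types. As an illustrative sketch (reusing the domain from the post; run this only against domains you are authorized to probe):

# Mail exchanger records for the domain
dig -t mx hacklabs.com

# TXT records (SPF entries and the like)
dig -t txt hacklabs.com

# Start-of-authority record
dig -t soa hacklabs.com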

6.   Thanks to the book Web Penetration Testing with Kali Linux by Joseph Muniz & Aamir Lakhani.

HTTrack : Clone a Website @ KALI LINUX

1.    This post will introduce you to a well-known tool to clone a website: HTTrack. It comes built into Kali, though older versions may not have it. The purpose of HTTrack is to copy a website. It allows a Penetration Tester to look at the entire content of a website, all its pages and files, offline and in his own controlled environment. Needless to emphasize the importance and usefulness of having a copy of a website: it could be used to develop fake phishing websites, which can be incorporated into other Penetration Testing toolsets. To install HTTrack if it is not already present in Kali, open a Terminal window and type the following, as shown in the screenshot:

apt-get install httrack 


2.  First, we create a directory to store the copied website. The following screenshot shows a directory named testwebsite created using the mkdir command.
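As a minimal sketch, mirroring the directory name used in the post:

# Create a working directory for the mirrored site and move into it
mkdir /root/testwebsite
cd /root/testwebsite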

3.   To start HTTrack, type httrack in the command window and give the project a name, as shown in the following screenshot:

4.   The next step is to select a directory to save the website. The example in the following screenshot shows the folder created in the previous step, /root/testwebsite, used for the directory:

5.   Enter the URL of the site you want to capture. The example in the following screenshot shows www.hackershandbook.org, but this can be any website. Most attacks use a website accessed by clients at your target, such as popular social media websites or the target's internal websites. You are then asked what you want to do with the captured site. Option 2, mirroring a website with the wizard, is the easiest method, as shown in the following screenshot:

6.  Next, you can specify whether you want to use a proxy to launch the attack. You can also specify which types of files you want to download (the example in the following screenshot shows * for all files), and define any command-line options or flags you might want to set. The example in the following screenshot shows no additional options. Before HTTrack runs, it displays the full command it is about to execute; you can reuse that command in the future if you want to run httrack without going through the wizard again. The following screenshots show httrack cloning www.hackershandbook.org:
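As a hedged sketch of what that generated command typically looks like (the filter and flags here are illustrative assumptions, not copied from the original screenshots):

# Mirror the site into /root/testwebsite, following links within the domain, verbosely
httrack "http://www.hackershandbook.org/" -O "/root/testwebsite" "+*.hackershandbook.org/*" -v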

7.   After you are done cloning the website, navigate to the directory where you saved it. Inside, you will find all your files and web pages, as shown in the following screenshot:
8.   Thanks to the book Web Penetration Testing with Kali Linux by Joseph Muniz & Aamir Lakhani.

Thursday, March 05, 2009

Fight of the Browsers : ACID3 TEST

1.         I recall my first interaction with the web, through Internet Explorer 3.0, sometime in 1997. The only alternative then was Netscape Navigator 3.0. Between the two, in those days, there was stiff competition, and the choice between them came down to personal preference. I liked browsing with Netscape Navigator, and Ujjwal (my only IT-savvy friend then...) liked IE 3.0.

2.         So there were no standards then to decide which one was actually better. Time moved on to the early 2000s, and in came many more browsers, each claiming to be the best of the lot. These included Amaya, Konqueror, Phoenix, Galeon, and the list is endless. Anyone interested in these names, mostly unheard of today, may just google them... now well on their way to superannuation!!!

3.         Those which endured the decade-long fight of the browsers included Safari, Opera, and Mozilla, to name a few, along with a few new ones, including Google Chrome and Flock.

4.         Now, after a decade of evolution, a few standards sprang up to finally adjudicate which one is actually better. The standard I am going to mention is the ACID3 TEST. Nothing to do with sulphuric or hydrochloric acid.

5.         Acid3 is a test page from the Web Standards Project that checks how well a web browser follows certain web standards, especially relating to the Document Object Model and JavaScript.

 

6.         When successful, the Acid3 test displays a step-by-step increasing percentage counter with colored rectangles in the background. I tried it on mine, i.e. Google Chrome, and it scored 78/100 (picture in set). The percentage displayed is based on the number of sub-tests passed. It does not represent an actual percentage of conformance, as the test does not keep track of how many of the tests were actually started (100 is assumed). In addition, the browser also has to render the page exactly as the reference page is rendered in the same browser.

7.            Further to this, I inquired why Chrome scored only 78: before running the test, the browser is supposed to be reset to its original default settings, which I had not done. So I surfed a few forums on what the scores of other browsers have been. This led me to the conclusion that, as of today, Chrome scores over the others, at least in the ACID3 test, and being a sincere Chrome user myself, I believed it quickly... thanks to Wikipedia for the info!
