- [Instructor] In the mapping phase, we can take a few different approaches. We can do a manual crawl. We can also do an automated crawl, which, most of the time, is called spidering. And we can take a hybrid approach, doing both manual and automated. We'll talk more about that in a minute.

Remember, the idea of the mapping phase is to uncover all of the corners of the application we want to attack. Sometimes the approach is determined by the actual application. For instance, if the application is using a technology that is not understood by, or compatible with, an automated scanner or crawler, then you would need to move to a manual approach.

I would normally start out first by using an automated scanner, commercial or open-source. This tool will follow each link in the application to determine its full scope. Manual scanning is literally the process of clicking each link throughout the application to map it out. You can also take that hybrid approach. What this means is that you click through some of the application to give the automated scanner an idea of where to look.
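The automated crawl can be sketched in a few lines of Python: fetch a page, pull out its links, and queue each in-scope link for the next fetch. This is a toy illustration using only the standard library, not how any particular scanner works; the function names are my own.

```python
# A minimal spidering sketch. Real scanners handle JavaScript, forms,
# sessions, and much more -- this only follows plain <a href> links.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html, base_url):
    """Return absolute URLs for every link found in an HTML page."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]


def spider(start_url, max_pages=50):
    """Breadth-first crawl: follow each link to map the application."""
    seen, queue, site_map = set(), [start_url], []
    while queue and len(site_map) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page; move on
        site_map.append(url)
        # Only queue links that stay inside the target's scope.
        queue.extend(u for u in extract_links(html, url)
                     if u.startswith(start_url))
    return site_map
```

A scanner that can't parse the page's technology finds no links here, which is exactly the situation where you'd fall back to manual mapping.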
From there, the scanner will dig deeper and report back what it finds.

So in the mapping process, we need to have a starting point. This is true with both manual and automated approaches. For instance, our starting point would normally be the root of any site, so the first page or the index page. Here, you can see we're starting with Hackazon.net. The index page will typically have links on it that will lead us to the next level of pages, so, in a manual process, we would essentially start clicking on these different links, like FAQ, Contact Us, Wish List, the registration page. And while we're mapping, we're gonna be building out our map of all the pages, so, under Contact Us, there may be phone or email.

Each of these individual pages also has some sort of link to lead us to the next level, so this is the way a normal web scanner might work. We would also follow the same method for manual mapping.

Another approach we can take is to just brute force the files and folders. This method takes a dictionary list of commonly found pages and folders that you would find on websites across the internet, and it sends a request for each of them.
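That brute-force idea can be sketched as follows: build a URL for every entry in the wordlist, request each one, and keep the ones that answer. The wordlist and helper names here are illustrative assumptions, not a specific tool's behavior.

```python
# A sketch of file/folder brute forcing with a small dictionary list.
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen

# A tiny illustrative wordlist; real tools ship lists with thousands
# of commonly found paths.
COMMON_PATHS = ["admin/", "backup/", "login.php", "robots.txt", ".git/"]


def build_candidates(base_url, wordlist):
    """Turn each dictionary entry into a full URL to request."""
    return [urljoin(base_url, path) for path in wordlist]


def brute_force(base_url, wordlist=COMMON_PATHS):
    """Request every candidate and report the ones that exist."""
    found = []
    for url in build_candidates(base_url, wordlist):
        try:
            if urlopen(url).status == 200:  # a 200 means the page is there
                found.append(url)
        except (HTTPError, URLError):
            continue  # 404s and connection errors are skipped
    return found
```

Because each path is requested directly, this finds pages even when nothing on the site links to them.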
As it gets back responses, the tool will report pages that were found. These are typically indicated by a 200 response. This method is good for finding hidden pages that a spider might miss and that we would normally not find manually.

The mapping process is walking through the application to map out all the menus and links that are included in the scope. A good way to do this is to proxy your browser through an application like Burp Suite. This way, you can capture all of the application traffic while browsing. From there, you can utilize those results in your attacks.

Let's take a look at how we do this. First, we'll open up Burp Suite. We need to turn off the interception feature of Burp Suite so that we collect all of our data instead of stopping and having to forward the request each and every time. Next, we need to make sure our browser is configured to use the proxy. We can see here in Burp Suite, in HTTP History, that right now we don't have anything to look at, so let's start browsing our application.
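Scripted traffic can be routed through the same proxy so it also lands in Burp's HTTP History. The sketch below uses the standard library and assumes Burp's default listener address of 127.0.0.1:8080; the helper names are my own.

```python
# Pointing scripted HTTP requests at a local intercepting proxy such as
# Burp Suite. 127.0.0.1:8080 is Burp's default proxy listener; adjust
# host/port if your Burp instance is configured differently.
from urllib.request import ProxyHandler, build_opener


def burp_proxies(host="127.0.0.1", port=8080):
    """Proxy settings routing HTTP and HTTPS through the Burp listener."""
    addr = f"http://{host}:{port}"
    return {"http": addr, "https": addr}


def make_opener(host="127.0.0.1", port=8080):
    """An opener whose requests show up in Burp's HTTP History."""
    return build_opener(ProxyHandler(burp_proxies(host, port)))


# Usage (with Burp running and interception turned off):
# opener = make_opener()
# opener.open("http://hackazon.net/").read()
```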
Again, the application that we're gonna be using in most of these demos is Hackazon.net. Now, what we wanna do is just start browsing through the different sections of the website. Maybe put some test data into the forms and submit. It doesn't have to be correct. We're just collecting data at this point. Click on a few of the items, maybe add one to a cart. That should be enough.

Now let's go back over to Burp Suite. Here, we can see in HTTP History all of the web requests and responses that we've sent. And if we go over to the Target tab, we can actually see how the Burp Suite application built out a map of the website for us. This is exactly what we're trying to do in the mapping phase. So, from here, we can move on to our vulnerability discovery phase.
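The kind of map the Target tab builds from captured traffic can be sketched like this: fold each requested URL into a nested tree keyed by host and path segment. The tree format is my own illustration, not Burp Suite's internal representation.

```python
# Folding captured request URLs into a nested site map, similar in
# spirit to what Burp's Target tab displays.
from urllib.parse import urlparse


def build_site_map(urls):
    """Group captured URLs into a nested dict keyed by path segment."""
    tree = {}
    for url in urls:
        parts = urlparse(url)
        node = tree.setdefault(parts.netloc, {})
        for segment in filter(None, parts.path.split("/")):
            node = node.setdefault(segment, {})
    return tree
```

Feeding it a handful of browsed URLs shows the same hierarchy you would see in the Target tab: one entry per host, branching by folder and page.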