There are a number of additional settings that can be configured for AMP Web Tests. These settings let you control which content is tested and reduce repetitive testing.
Browser Emulation - The Browser Emulation feature passes the user agent string for the browser on the selected device through Firefox and then resizes the browser window to match the actual device size. After that, AMP tests the rendered DOM in the browser. The default browser is 'Firefox'; the chosen browser is used to render the page on the server and then perform the automated testing. Users can also choose iPhone 6 and iPad Mini 4, which use the emulation process described above.
Maximum Page Count - The total number of pages that the spider should attempt to collect.
Maximum Depth - Defines how far the spider should branch out from the start location as it runs. At the extreme minimum, a depth of zero, only the start location is spidered. In general, depth should be set to at least one, which means that AMP would diagnose the base page as well as any page linked directly off of the base page. A depth of two means the spider should diagnose the base page, pages linked from the base page, and also pages linked from those pages: conceptually, "grandchildren" or third-generation pages. By default, the depth is set to unlimited, and the primary restriction on spider size is Maximum Page Count.
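To make the depth behavior concrete, here is a minimal sketch of a depth-limited crawl. The link graph is a hypothetical stand-in for fetching and parsing real pages, and this is an illustration of the depth concept only, not AMP's actual spider code:

```python
from collections import deque

def spider(start, links, max_depth):
    """Depth-limited breadth-first crawl over a link graph.

    `links` maps each URL to the URLs it links to (a hypothetical
    stand-in for fetching and parsing pages). A max_depth of 0
    visits only the start location; 1 adds directly linked pages;
    2 adds "grandchildren", and so on.
    """
    seen = {start}
    queue = deque([(start, 0)])
    visited = []
    while queue:
        url, depth = queue.popleft()
        visited.append(url)
        if depth == max_depth:
            continue  # do not branch out past the configured depth
        for child in links.get(url, []):
            if child not in seen:
                seen.add(child)
                queue.append((child, depth + 1))
    return visited
```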
Maximum Argument Count - Defines the number of unique argument pages that should be captured for a given base URL. As an example, take the URL https://www.acme.com/prod.php?product_id=123. This URL has one argument, product_id=123. If the Maximum Argument Count were set to 5, AMP would only test the first 5 pages it finds that have the argument 'product_id'. After 5, it would not spider the additional pages.
Maximum Argument Count = 5
www.acme.com/prod.php?product_id=123 (would be spidered)
www.acme.com/prod.php?product_id=124 (would be spidered)
www.acme.com/prod.php?product_id=156 (would be spidered)
www.acme.com/prod.php?product_id=158 (would be spidered)
www.acme.com/prod.php?product_id=155 (would be spidered)
www.acme.com/prod.php?product_id=876 (would NOT be spidered, as it is the 6th page with the same argument and the maximum argument count is set to 5)
www.acme.com/prod.php?product_id=743&type=onsale (would be spidered based on the addition of a new argument in the URL)
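The example above can be sketched as follows. Grouping URLs by their path plus the set of argument names is an assumption about how the limit is applied, made to match the behavior described, and is not AMP's actual implementation:

```python
from urllib.parse import urlparse, parse_qs

def filter_by_argument_count(urls, max_args):
    """Keep at most `max_args` URLs per (path, argument-name) signature.

    Assumption: URLs count against the same limit when they share a
    path and the same set of argument names; adding a new argument
    (e.g. &type=onsale) creates a new signature with its own count.
    """
    counts = {}
    kept = []
    for url in urls:
        parsed = urlparse(url)
        signature = (parsed.path, frozenset(parse_qs(parsed.query)))
        counts[signature] = counts.get(signature, 0) + 1
        if counts[signature] <= max_args:
            kept.append(url)
    return kept
```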
Positive Filters - Defines paths that the spider should follow, expressed as regular expressions (AMP follows standard PHP regular expression syntax). These can be used to specify particular pages or hosts that you would like the spider to examine, essentially extending the Page Restriction setup. Positive Filters are | (pipe) delimited. As an example, a few regular expressions have been provided below:
- To implement the 'Restrict to Path' requirement the positive filter is set to .*www\.ssbbartgroup\.com/contact.php.*
- To implement the 'Restrict to Host' requirement the positive filter is set to .*www\.ssbbartgroup\.com.*
- To implement the 'Restrict to Domain' requirement the positive filter is set to .*ssbbartgroup\.com.*
Negative Filters - Defines a regular expression matching URLs that the spider should not follow. These can be used to specify particular pages that you do not want the spider to visit. A good example is .*cgi-bin.*, which would ignore all URLs that contain "cgi-bin". Negative Filters are | (pipe) delimited.
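The combined effect of positive and negative filters can be sketched in Python. Because | is the regex alternation operator, a pipe-delimited filter field can be matched as a single pattern; this is an illustration of the documented semantics, not AMP's own matching code:

```python
import re

def should_follow(url, positive=None, negative=None):
    """Decide whether the spider should follow `url`.

    `positive` and `negative` are pipe-delimited filter fields as
    described above; since | is regex alternation, each field can
    be applied as one pattern. Negative filters win over positive
    ones (an assumption about precedence).
    """
    if negative and re.fullmatch(negative, url):
        return False  # explicitly excluded
    if positive and not re.fullmatch(positive, url):
        return False  # a positive filter is set and this URL misses it
    return True
```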
XPath Exclusion - This field allows you to specify a common 'section' of the site, such as a specific div element, to exclude from automated testing during the spidering process. XPath Exclusions should be formatted as follows: /html/body/div[@id="RandomAds"]. To add additional XPath Exclusions, separate the XPaths with commas (e.g. /html/body, /html/body/div[@id="RandomAds"], /html/head).
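A rough sketch of what an exclusion does, stripping the excluded subtrees from a page before testing. This uses Python's ElementTree, whose XPath support is limited, so the field's absolute /html/... paths are rewritten as relative ones and single-quoted predicates are used; AMP's own engine supports full XPath, and this is not its implementation:

```python
import xml.etree.ElementTree as ET

def apply_xpath_exclusions(markup, exclusions):
    """Remove subtrees named by a comma-separated XPath Exclusion field.

    Simplifications (assumptions for this sketch): the markup must be
    well-formed XML, paths may not contain commas, and predicates use
    single quotes (e.g. div[@id='RandomAds']) for ElementTree.
    """
    root = ET.fromstring(markup)
    # ElementTree has no parent pointers, so build a child -> parent map.
    parent_map = {c: p for p in root.iter() for c in p}
    for xpath in (x.strip() for x in exclusions.split(",")):
        # Rewrite the absolute /html/... path relative to the root element.
        relative = "." + xpath[len("/html"):] if xpath.startswith("/html") else xpath
        for node in root.findall(relative):
            parent_map[node].remove(node)
    return ET.tostring(root, encoding="unicode")
```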
Publish Document Inventory - Selecting this option causes AMP to catalog any documents (e.g. .pdf, .doc, .txt, .xls) it spiders and create a list of all the documents and document locations it finds. It will also pick up images if there are links to images at a URL.
Scope - Defines the basic restriction on the type of pages that should be spidered.
- Restrict to Path will ensure that only pages at or below the path of the Start Location will be spidered. For example, if the Start Location is http://www.ssbbartgroup.com/contact.php, only pages in the contact directory or its sub-directories on the www.ssbbartgroup.com server will be spidered.
- Restrict to Host will ensure that only pages on the Start Location's host will be spidered. For example, if the Start Location is http://www.ssbbartgroup.com/contact.php, only pages on www.ssbbartgroup.com will be spidered.
- Restrict to Domain will ensure that only pages on the Start Location's domain will be spidered. For example, if the Start Location is http://www.ssbbartgroup.com/contact.php, pages on www.ssbbartgroup.com, amp.ssbbartgroup.com, or any other server in the ssbbartgroup.com domain will be spidered.
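The three scope levels can be sketched as a URL check. Approximating the registered domain as the last two host labels is an assumption made for illustration (it mishandles suffixes like .co.uk), and this is not AMP's actual scope logic:

```python
from urllib.parse import urlparse
import posixpath

def in_scope(candidate, start, scope):
    """Check `candidate` against a scope of 'path', 'host', or 'domain'.

    `start` is the Start Location. Assumption: the registered domain
    is approximated as the last two labels of the hostname.
    """
    cand, base = urlparse(candidate), urlparse(start)
    if scope == "domain":
        return cand.hostname.split(".")[-2:] == base.hostname.split(".")[-2:]
    if cand.hostname != base.hostname:
        return False
    if scope == "host":
        return True
    # 'path': at or below the directory containing the Start Location
    return cand.path.startswith(posixpath.dirname(base.path))
```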