Default robots.txt

# Example robots.txt
# Created by Brandon
# Learn more at http://kb.mediatemple.net
# (mt) Forums - http://kb.mediatemple.net/questions/824/
# (mt) System Status - http://status.mediatemple.net
# (mt) Statement of Support - http://mediatemple.net/support/statement/

# How do I check that my robots.txt file is working as expected?
# http://www.google.com/support/webmasters/bin/answer.py?answer=35237
# Here is a list of Robots: http://www.robotstxt.org/db.html
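#
# A quick way to test your rules is Python's standard-library urllib.robotparser
# (a minimal sketch; mt-example.net below is just a placeholder domain):
#
#   from urllib.robotparser import RobotFileParser
#
#   rp = RobotFileParser()
#   rp.set_url("http://mt-example.net/robots.txt")
#   rp.read()                              # fetch and parse the live file
#   print(rp.can_fetch("*", "/private/"))  # False if /private/ is disallowed
#   print(rp.can_fetch("*", "/"))          # True if the root is allowed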

# Instructions
# Remove the "#" to uncomment any line that you wish to use, but be sure not to uncomment the descriptions (comment lines like this one).

# Grant Robots Access
#######################################################################################

# This example allows all robots to visit all files because the wildcard "*" specifies all robots:
#User-agent: *
#Disallow:

# To allow only a single robot, grant it access first (note that Google's
# crawler identifies itself as "Googlebot"):
#User-agent: Googlebot
#Disallow:

# ...and then keep every other robot out:
#User-agent: *
#Disallow: /

# Deny Robots Access
#######################################################################################

# This example keeps all robots out:
#User-agent: *
#Disallow: /

# This example tells all crawlers not to enter four directories of the site:
#User-agent: *
#Disallow: /cgi-bin/
#Disallow: /images/
#Disallow: /tmp/
#Disallow: /private/
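# Note: Disallow values are URL-path prefixes, so "Disallow: /tmp/" also blocks
# /tmp/cache.html and everything else under that directory.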

# This example tells a specific crawler not to enter one specific directory:
#User-agent: BadBot
#Disallow: /private/
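# Keep in mind that robots.txt is purely advisory: well-behaved crawlers honor
# it, but a badly behaved bot is free to ignore the file entirely.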

# This example tells all crawlers not to fetch one specific file, foo.html.
# Note that Disallow paths are URL paths relative to the site root, not
# filesystem paths on the server:
#User-agent: *
#Disallow: /foo.html
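
# Finally, remember that crawlers only look for this file at the root of the
# host, e.g. http://mt-example.net/robots.txt; a robots.txt stored anywhere
# else is ignored.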