
How can I update robots.txt on the Pantheon environment Live site?

I have tried the following options:
1) via FTP
2) via WordPress SEO >> Tools

Do I need to follow any special steps, since it's a WordPress instance?

2 Answers


  1. Nothing special. Two options here:

    1. Create a robots.txt file locally. Add desired statements. Upload to Pantheon via SFTP or Git.

    2. Pull down the existing robots.txt file from Pantheon, modify as necessary, and push back up via SFTP or Git.

    In both cases, keep in mind that Pantheon enforces a workflow: every site has Dev, Test, and Live environments. When you push, whether by Git or SFTP, you are pushing to the Dev environment. If you choose SFTP, the site must be in SFTP mode (not Git mode), and you should connect with the Dev environment's SFTP credentials. From there, you deploy up to the Live environment via the Pantheon Dashboard.
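    For the Git route, the steps look roughly like this. This is a sketch: the repository below is created locally for illustration, whereas on a real site you would clone it using the Git connection info from your Pantheon Dashboard (Pantheon's Dev environment tracks the master branch).

    ```shell
    # Simulate a site checkout locally; on a real site, clone from the
    # Dashboard's Git connection URL instead of running git init.
    set -e
    mkdir -p my-site && cd my-site
    git init -q
    git config user.email "you@example.com"
    git config user.name "Example User"

    # Create (or edit) robots.txt in the code root of the checkout.
    cat > robots.txt <<'EOF'
    User-agent: *
    Disallow: /wp-admin/
    EOF

    git add robots.txt
    git commit -q -m "Add robots.txt"
    git log --oneline -1

    # Against a real Pantheon clone you would now push, which lands on Dev:
    # git push origin master
    ```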

    EDIT:
    Since you are going the SFTP route, log in via SFTP to the Dev environment and upload to the /code directory, which is the root of the WordPress installation, so the file ends up at /code/robots.txt. After uploading, return to the Pantheon Dashboard, commit your change, and deploy it through Dev, Test, and Live.
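    The SFTP upload can be sketched as a batch session. The host, port, and username below are placeholders in Pantheon's usual format; copy the real values from the "Connect with SFTP" panel on your Dev environment Dashboard (and make sure the environment is in SFTP mode first):

    ```shell
    # Placeholder connection details -- substitute your site's real values
    # from the Dashboard. Pantheon SFTP listens on a non-standard port.
    sftp -P 2222 dev.YOUR_SITE_ID@appserver.dev.YOUR_SITE_ID.drush.in <<'EOF'
    cd code
    put robots.txt
    EOF
    ```

    After the upload, commit the change from the Dashboard and deploy it through Test to Live.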

    Hope this helps.

  2. If you do not have any experience with PHP, or don't feel comfortable modifying your theme's code for whatever reason, the solution above should work perfectly.

    Alternative PHP approach

    If this is a site you are developing or maintaining and you feel comfortable modifying the theme, there is another approach that will save you time in the long run.

    Filters to the Rescue!

    If you are unfamiliar with hooks and filters within WordPress, I'll refer you to either this article from the Treehouse blog or a quick Google search. The hooks and filters system plays a fundamental part in how plugins like Yoast SEO function, allowing them, for example, to modify the output of the robots.txt file.

    We can use this same robots_txt filter to modify the output of our site's robots.txt file without any external plugin or theme dependency. If you use Git or SVN to manage your theme or /wp-content/ directories, this approach allows you to keep any modifications under version control.

    The code below should live in your theme's functions.php file or another included PHP file of your choosing.

    <?php
    // Append our rules to the virtual robots.txt that WordPress generates.
    function so_robots_txt_50725645( $output ) {
        $output .= 'User-agent: *' . PHP_EOL;
        $output .= 'Disallow: /wp-includes/' . PHP_EOL;
        $output .= 'Disallow: /wp-content/uploads/' . PHP_EOL;

        return $output;
    }

    // Hook our function into the robots_txt filter.
    add_filter( 'robots_txt', 'so_robots_txt_50725645', 10, 1 );
    

    What's listed above is just an example; you could populate the $output variable with whatever content you wanted to appear on the robots.txt page. In this example we are appending new Disallow lines to the existing output via the .= operator.

    After all operations have completed, we return the modified $output and go on our way, never worrying about migrating pesky robots.txt files again. (Note that WordPress only serves this virtual, filterable robots.txt when no physical robots.txt file exists in the site root; a physical file takes precedence.)
